GKD-M Accuracy Alchemist [Loxx]
Giga Kaleidoscope GKD-M Accuracy Alchemist is a Metamorphosis module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
█ GKD-M Accuracy Alchemist
What is the Accuracy Alchemist?
The Accuracy Alchemist is designed to process up to 10 GKD-C indicators and create a compound signal that can be utilized in a GKD-BT backtest. It achieves this by applying an individual Solo Confirmation Simple backtest to each GKD-C indicator provided. The compound signal is derived from the cumulative accuracy rate of each candle within a specified date range.
To illustrate this process, consider the following scenario:
The Fisher Transform indicator has a 65% win rate for long positions on the current ticker.
The Vortex indicator has a 45% success rate on the current candle.
Suppose a long signal is generated by the Vortex indicator. However, this signal is disregarded because its accuracy is lower than that of the Fisher Transform. Now, imagine that the subsequent candle produces a long signal from the Fisher Transform indicator. This signal will be exported to the backtest. The GKD-C indicator that exhibits the highest accuracy for the current candle is chosen to generate the signal. The dominant indicator, determined by its accuracy, will always be used to generate signals. However, it is important to note that the current dominant indicator might not retain its dominance in the future if its accuracy rate falls below that of other indicators connected within the Accuracy Alchemist indicator.
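To make the selection rule concrete, here is a minimal Python sketch of the idea. The indicator itself is written in Pine Script, so the function and variable names below (pick_dominant_signal, signals, accuracies) are purely illustrative and not the script's actual internals:

```python
# Illustrative sketch: forward only the signal of the currently most accurate indicator.
def pick_dominant_signal(signals, accuracies):
    """signals: dict of indicator name -> 'long', 'short', or None for the current candle.
    accuracies: dict of indicator name -> running win rate (0.0 to 1.0)."""
    dominant = max(accuracies, key=accuracies.get)  # indicator with the highest accuracy
    return signals.get(dominant), dominant          # only its signal is exported

signals = {"Fisher Transform": None, "Vortex": "long"}
accuracies = {"Fisher Transform": 0.65, "Vortex": 0.45}
signal, dominant = pick_dominant_signal(signals, accuracies)
print(signal, dominant)  # None Fisher Transform -> the Vortex long is ignored
```

On the next candle, if the Fisher Transform produces a long signal while it is still the most accurate indicator, that signal would be the one exported to the backtest.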
The Accuracy Alchemist provides a comprehensive table that displays the dominant indicator based on accuracy, highlighted in green for the highest long accuracy rate and in red for the highest short accuracy rate. Additionally, the table presents the cumulative long and short accuracy rates for all indicators.
The functionality of the Accuracy Alchemist extends to several GKD-BT backtests, allowing for seamless integration. These backtests include:
-Solo Confirmation Simple
-Solo Confirmation Complex
-Solo Confirmation Super Complex
-Full GKD (as a Confirmation 1 indicator only)
-Confirmation Stack (as a Confirmation 1 indicator only)
By incorporating the Accuracy Alchemist, you gain the ability to evaluate and compare GKD-C Confirmation indicators within your full GKD trading system. It serves as an ideal tool to assess the performance of different confirmation indicators and aids in the selection process for determining which indicators to incorporate into your trading strategy.
Take Profit and Stoploss
The GKD system utilizes volatility-based take profits and stop losses, where each take profit and stop loss is calculated as a multiple of volatility. Users have the flexibility to adjust the multiplier values in the settings to suit their preferences. Accuracy Alchemist tests the accuracy of GKD-C Confirmation indicators and therefore has only one take profit and one stop loss. You can adjust the multipliers for both in the settings.
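As a simple illustration of how volatility-based exits work, here is a minimal Python sketch; the multiplier defaults and function name are assumptions for the example, not the indicator's actual settings:

```python
# Sketch of volatility-based exits: take profit and stop loss are multiples of a
# volatility measure (for example, ATR). Multipliers are user-adjustable.
def exit_levels(entry_price, volatility, tp_mult=1.0, sl_mult=1.5, is_long=True):
    direction = 1 if is_long else -1
    take_profit = entry_price + direction * tp_mult * volatility
    stop_loss = entry_price - direction * sl_mult * volatility
    return take_profit, stop_loss

print(exit_levels(100.0, 2.0))  # (102.0, 97.0) for a long entry with volatility = 2
```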
Setting up Accuracy Alchemist
To use this indicator, you must import GKD-C Confirmation indicators and then activate them in the Accuracy Alchemist settings. Import the value "Input into NEW GKD-BT Backtest" from a GKD-C indicator and then activate it by checking the box next to the import. See below:
Volatility Types Included
17 types of volatility are included in this indicator
Close-to-Close
Parkinson
Garman-Klass
Rogers-Satchell
Yang-Zhang
Garman-Klass-Yang-Zhang
Exponential Weighted Moving Average
Standard Deviation of Log Returns
Pseudo GARCH(2,2)
Average True Range
True Range Double
Standard Deviation
Adaptive Deviation
Median Absolute Deviation
Efficiency-Ratio Adaptive ATR
Mean Absolute Deviation
Static Percent
Close-to-Close
Close-to-Close volatility is a classic and widely used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a larger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only a stock's closing prices. It is the simplest volatility estimator. However, in many cases, it is not precise enough. Stock prices could jump significantly during a trading session and return to the opening value at the end. That means that a considerable amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
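For reference, a minimal Python sketch of the close-to-close estimator (the sample standard deviation of log returns); the 252-period annualization factor is a common convention and an assumption here, not a setting taken from the indicator:

```python
import math

def close_to_close_vol(closes, periods_per_year=252):
    # log returns between consecutive closes
    rets = [math.log(c1 / c0) for c0, c1 in zip(closes, closes[1:])]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)  # sample variance
    return math.sqrt(var) * math.sqrt(periods_per_year)         # annualized
```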
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. This is useful as close-to-close prices could show little difference while large price movements could have occurred during the day. Thus, Parkinson's volatility is considered more precise and requires less data for calculation than close-to-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after the market closes. Hence, it systematically undervalues volatility. This drawback is addressed in the Garman-Klass volatility estimator.
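A minimal sketch of the Parkinson estimator, which uses only each bar's high and low; again, the annualization factor is an assumption:

```python
import math

def parkinson_vol(highs, lows, periods_per_year=252):
    n = len(highs)
    s = sum(math.log(h / l) ** 2 for h, l in zip(highs, lows))
    return math.sqrt(s / (4 * math.log(2) * n)) * math.sqrt(periods_per_year)
```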
Garman-Klass
Garman-Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing prices. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change follows a continuous diffusion process (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremes.
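A minimal sketch of the Garman-Klass estimator, which combines the high-low range with the open-to-close return; the annualization factor is an assumption:

```python
import math

def garman_klass_vol(opens, highs, lows, closes, periods_per_year=252):
    terms = []
    for o, h, l, c in zip(opens, highs, lows, closes):
        hl = math.log(h / l) ** 2                      # high-low range term
        co = math.log(c / o) ** 2                      # open-to-close term
        terms.append(0.5 * hl - (2 * math.log(2) - 1) * co)
    return math.sqrt(sum(terms) / len(terms)) * math.sqrt(periods_per_year)
```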
Researchers Rogers and Satchell have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (mean return not equal to zero). As a result, it provides better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. This leads to an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
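A minimal sketch of the Rogers-Satchell estimator, whose per-bar term remains valid when the mean return (drift) is non-zero; the annualization factor is an assumption:

```python
import math

def rogers_satchell_vol(opens, highs, lows, closes, periods_per_year=252):
    terms = [math.log(h / c) * math.log(h / o) + math.log(l / c) * math.log(l / o)
             for o, h, l, c in zip(opens, highs, lows, closes)]
    return math.sqrt(sum(terms) / len(terms)) * math.sqrt(periods_per_year)
```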
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchel formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
Yang-Zhang volatility can be thought of as a combination of the overnight (close-to-open volatility) and a weighted average of the Rogers-Satchell volatility and the day’s open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
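A minimal sketch of the Yang-Zhang estimator as the combination described above (overnight variance, open-to-close variance, and the Rogers-Satchell term); the weighting constant k follows the commonly cited formula and is an assumption here:

```python
import math

def _var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def yang_zhang_vol(opens, highs, lows, closes, periods_per_year=252):
    n = len(closes) - 1
    overnight = [math.log(opens[i] / closes[i - 1]) for i in range(1, len(closes))]
    open_close = [math.log(closes[i] / opens[i]) for i in range(1, len(closes))]
    rs = [math.log(highs[i] / closes[i]) * math.log(highs[i] / opens[i])
          + math.log(lows[i] / closes[i]) * math.log(lows[i] / opens[i])
          for i in range(1, len(closes))]
    k = 0.34 / (1.34 + (n + 1) / (n - 1))
    var = _var(overnight) + k * _var(open_close) + (1 - k) * (sum(rs) / n)
    return math.sqrt(var) * math.sqrt(periods_per_year)
```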
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator incorporates the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e., it assumes that the underlying asset follows a Geometric Brownian Motion (GBM) process with zero drift. Therefore, the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
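A minimal sketch of the GKYZ estimator, which adds an overnight-jump term to the Garman-Klass formula; the annualization factor is an assumption:

```python
import math

def gkyz_vol(opens, highs, lows, closes, periods_per_year=252):
    terms = []
    for i in range(1, len(closes)):
        overnight = math.log(opens[i] / closes[i - 1]) ** 2          # close-to-open jump
        hl = 0.5 * math.log(highs[i] / lows[i]) ** 2                 # high-low range
        co = (2 * math.log(2) - 1) * math.log(closes[i] / opens[i]) ** 2
        terms.append(overnight + hl - co)
    return math.sqrt(sum(terms) / len(terms)) * math.sqrt(periods_per_year)
```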
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, with the main applications being technical analysis and volatility modeling.
The moving average is designed such that older observations are given lower weights. The weights decrease exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
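A minimal sketch of EWMA volatility, where each new squared return updates the running variance with weight (1 − lambda); the 0.94 default is the widely used RiskMetrics value and is an assumption here:

```python
import math

def ewma_vol(closes, lam=0.94):
    rets = [math.log(c1 / c0) for c0, c1 in zip(closes, closes[1:])]
    var = rets[0] ** 2                     # seed with the first squared return
    for r in rets[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)
```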
Standard Deviation of Log Returns
This is the simplest calculation of volatility. It's the standard deviation of ln(close/close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θavg(var; M) + (1 − θ)avg(var; N) = 2θvar/(M+1-(M-1)L) + 2(1-θ)var/(M+1-(M-1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1 var - avg(var; N) against avg(var; M) - avg(var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
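A minimal sketch of true range and a Wilder-style ATR; the 14-period default matches the convention mentioned above:

```python
def true_range(high, low, prev_close):
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def atr(highs, lows, closes, period=14):
    trs = [true_range(highs[i], lows[i], closes[i - 1]) for i in range(1, len(closes))]
    atr_val = sum(trs[:period]) / period                     # initial simple average
    for tr in trs[period:]:
        atr_val = (atr_val * (period - 1) + tr) / period     # Wilder smoothing
    return atr_val
```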
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Standard Deviation
Standard deviation is a statistic that measures the dispersion of a dataset relative to its mean and is calculated as the square root of the variance. The standard deviation is calculated as the square root of variance by determining each data point's deviation relative to the mean. If the data points are further from the mean, there is a higher deviation within the data set; thus, the more spread out the data, the higher the standard deviation.
Adaptive Deviation
By definition, the Standard Deviation (STD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values. In technical analysis, we usually use it to measure the level of current volatility.
Standard Deviation is based on Simple Moving Average calculation for mean value. This version of standard deviation uses the properties of EMA to calculate what can be called a new type of deviation, and since it is based on EMA, we can call it EMA deviation. Additionally, Perry Kaufman's efficiency ratio is used to make it adaptive (since all EMA type calculations are nearly perfect for adapting).
The difference when compared to the standard deviation is significant--not just because of the EMA usage, but because the efficiency ratio makes it a "bit more logical" in very volatile market conditions.
Median Absolute Deviation
The median absolute deviation is a measure of statistical dispersion. Moreover, the MAD is a robust statistic, being more resilient to outliers in a data set than the standard deviation. In the standard deviation, the distances from the mean are squared, so large deviations are weighted more heavily, and thus outliers can heavily influence it. In the MAD, the deviations of a small number of outliers are irrelevant.
Because the MAD is a more robust estimator of scale than the sample variance or standard deviation, it works better with distributions without a mean or variance, such as the Cauchy distribution.
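A minimal sketch of the median absolute deviation (the median of the absolute distances from the median):

```python
import statistics

def median_abs_deviation(values):
    med = statistics.median(values)
    return statistics.median(abs(v - med) for v in values)

print(median_abs_deviation([1, 2, 3, 4, 100]))  # 1 -> the outlier barely matters
```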
Efficiency-Ratio Adaptive ATR
Average True Range (ATR) is a widely used indicator for many occasions in technical analysis. It is calculated as the RMA of the true range. This version adds a "twist": it uses Perry Kaufman's Efficiency Ratio to calculate adaptive true range.
Mean Absolute Deviation
The mean absolute deviation (MAD) is a measure of variability that indicates the average distance between observations and their mean. MAD uses the original units of the data, which simplifies interpretation. Larger values signify that the data points spread out further from the average. Conversely, lower values correspond to data points bunching closer to it. The mean absolute deviation is also known as the mean deviation and average absolute deviation.
This definition of the mean absolute deviation sounds similar to the standard deviation (SD). While both measure variability, they have different calculations. In recent years, some proponents of MAD have suggested that it replace the SD as the primary measure because it is a simpler concept that better fits real life.
Static Percent
Static Percent allows the user to insert their own constant percent that will then be used to create take profit and stop loss levels.
█ Giga Kaleidoscope Modularized Trading System
Core components of an NNFX algorithmic trading strategy
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are seven core components in the NNFX trading algorithm, plus an eighth, GKD-specific Metamorphosis component:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend
3. Confirmation 1 - a technical indicator used to identify trends
4. Confirmation 2 - a technical indicator used to identify trends
5. Continuation - a technical indicator used to identify trends
6. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdowns
7. Exit - a technical indicator used to determine when a trend is exhausted
8. Metamorphosis - a technical indicator that produces a compound signal from the combination of other GKD indicators*
*(not part of the NNFX algorithm)
What is Volatility in the NNFX trading system?
In the NNFX (No Nonsense Forex) trading system, ATR (Average True Range) is typically used to measure the volatility of an asset. It is used as a part of the system to help determine the appropriate stop loss and take profit levels for a trade. ATR is calculated by taking the average of the true range values over a specified period.
True range is calculated as the maximum of the following values:
-Current high minus the current low
-Absolute value of the current high minus the previous close
-Absolute value of the current low minus the previous close
ATR is a dynamic indicator that changes with changes in volatility. As volatility increases, the value of ATR increases, and as volatility decreases, the value of ATR decreases. By using ATR in NNFX system, traders can adjust their stop loss and take profit levels according to the volatility of the asset being traded. This helps to ensure that the trade is given enough room to move, while also minimizing potential losses.
Other types of volatility include True Range Double (TRD), Close-to-Close, and Garman-Klass.
What is a Baseline indicator?
The baseline is essentially a moving average, and is used to determine the overall direction of the market.
The baseline in the NNFX system is used to filter out trades that are not in line with the long-term trend of the market. The baseline is plotted on the chart along with other indicators, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR).
Trades are only taken when the price is in the same direction as the baseline. For example, if the baseline is sloping upwards, only long trades are taken, and if the baseline is sloping downwards, only short trades are taken. This approach helps to ensure that trades are in line with the overall trend of the market, and reduces the risk of entering trades that are likely to fail.
By using a baseline in the NNFX system, traders can have a clear reference point for determining the overall trend of the market, and can make more informed trading decisions. The baseline helps to filter out noise and false signals, and ensures that trades are taken in the direction of the long-term trend.
What is a Confirmation indicator?
Confirmation indicators are technical indicators that are used to confirm the signals generated by primary indicators. Primary indicators are the core indicators used in the NNFX system, such as the Average True Range (ATR), the Moving Average (MA), and the Relative Strength Index (RSI).
The purpose of the confirmation indicators is to reduce false signals and improve the accuracy of the trading system. They are designed to confirm the signals generated by the primary indicators by providing additional information about the strength and direction of the trend.
Some examples of confirmation indicators that may be used in the NNFX system include the Bollinger Bands, the MACD (Moving Average Convergence Divergence), and the MACD Oscillator. These indicators can provide information about the volatility, momentum, and trend strength of the market, and can be used to confirm the signals generated by the primary indicators.
In the NNFX system, confirmation indicators are used in combination with primary indicators and other filters to create a trading system that is robust and reliable. By using multiple indicators to confirm trading signals, the system aims to reduce the risk of false signals and improve the overall profitability of the trades.
What is a Continuation indicator?
In the NNFX (No Nonsense Forex) trading system, a continuation indicator is a technical indicator that is used to confirm a current trend and predict that the trend is likely to continue in the same direction. A continuation indicator is typically used in conjunction with other indicators in the system, such as a baseline indicator, to provide a comprehensive trading strategy.
What is a Volatility/Volume indicator?
Volume indicators, such as the On Balance Volume (OBV), the Chaikin Money Flow (CMF), or the Volume Price Trend (VPT), are used to measure the amount of buying and selling activity in a market. They are based on the trading volume of the market, and can provide information about the strength of the trend. In the NNFX system, volume indicators are used to confirm trading signals generated by the Moving Average and the Relative Strength Index. Volatility indicators include Average Direction Index, Waddah Attar, and Volatility Ratio. In the NNFX trading system, volatility is a proxy for volume and vice versa.
By using volume indicators as confirmation tools, the NNFX trading system aims to reduce the risk of false signals and improve the overall profitability of trades. These indicators can provide additional information about the market that is not captured by the primary indicators, and can help traders to make more informed trading decisions. In addition, volume indicators can be used to identify potential changes in market trends and to confirm the strength of price movements.
What is an Exit indicator?
The exit indicator is used in conjunction with other indicators in the system, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR), to provide a comprehensive trading strategy.
The exit indicator in the NNFX system can be any technical indicator that is deemed effective at identifying optimal exit points. Examples of exit indicators that are commonly used include the Parabolic SAR, the Average Directional Index (ADX), and the Chandelier Exit.
The purpose of the exit indicator is to identify when a trend is likely to reverse or when the market conditions have changed, signaling the need to exit a trade. By using an exit indicator, traders can manage their risk and prevent significant losses.
In the NNFX system, the exit indicator is used in conjunction with a stop loss and a take profit order to maximize profits and minimize losses. The stop loss order is used to limit the amount of loss that can be incurred if the trade goes against the trader, while the take profit order is used to lock in profits when the trade is moving in the trader's favor.
Overall, the use of an exit indicator in the NNFX trading system is an important component of a comprehensive trading strategy. It allows traders to manage their risk effectively and improve the profitability of their trades by exiting at the right time.
What is a Metamorphosis indicator?
The concept of a metamorphosis indicator involves the integration of two or more GKD indicators to generate a compound signal. This is achieved by evaluating the accuracy of each indicator and selecting the signal from the indicator with the highest accuracy. As an illustration, let's consider a scenario where we calculate the accuracy of 10 indicators and choose the signal from the indicator that demonstrates the highest accuracy.
The resulting output from the metamorphosis indicator can then be utilized in a GKD-BT backtest by occupying a slot that aligns with the purpose of the metamorphosis indicator. The slot can be a GKD-B, GKD-C, or GKD-E slot, depending on the specific requirements and objectives of the indicator. This allows for seamless integration and utilization of the compound signal within the GKD-BT framework.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v2.0 system has six types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 and Continuation module (Confirmation 1/2 and Continuation, Numbers 3, 4, and 5 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 6 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 7 in the NNFX algorithm)
6. GKD-M - Metamorphosis module (Metamorphosis, Number 8 in the NNFX algorithm, but not part of the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with the others by passing data along a chain to a backtest module, where the various components of the GKD system are combined to create a trading signal.
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Continuation indicator. The Continuation indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, therefore allowing for the testing of every possible combination of technical indicators that make up the six components of the NNFX algorithm.
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Full GKD Backtest
Baseline: Hull Moving Average
Volatility/Volume: Hurst Exponent
Confirmation 1: Composite RSI
Confirmation 2: uf2018
Continuation: Vortex
Exit: Rex Oscillator
Metamorphosis: Fisher Transform, Universal Oscillator, Aroon, and Vortex combined
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD system.
█ Giga Kaleidoscope Modularized Trading System Signals
Standard Entry
1. GKD-C Confirmation gives signal
2. Baseline agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Volatility/Volume agrees
1-Candle Standard Entry
1a. GKD-C Confirmation gives signal
2a. Baseline agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
Next Candle
1b. Price retraced
2b. Baseline agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Baseline Entry
1. GKD-B Baseline gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Volatility/Volume agrees
7. Confirmation 1 signal was less than 'Maximum Allowable PSBC Bars Back' prior
1-Candle Baseline Entry
1a. GKD-B Baseline gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSBC Bars Back' prior
Next Candle
1b. Price retraced
2b. Baseline agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Volatility/Volume Entry
1. GKD-V Volatility/Volume gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Baseline agrees
7. Confirmation 1 signal was less than 7 candles prior
1-Candle Volatility/Volume Entry
1a. GKD-V Volatility/Volume gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSVVC Bars Back' prior
Next Candle
1b. Price retraced
2b. Volatility/Volume agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Baseline agrees
Confirmation 2 Entry
1. GKD-C Confirmation 2 gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Volatility/Volume agrees
6. Baseline agrees
7. Confirmation 1 signal was less than 7 candles prior
1-Candle Confirmation 2 Entry
1a. GKD-C Confirmation 2 gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSC2C Bars Back' prior
Next Candle
1b. Price retraced
2b. Confirmation 2 agrees
3b. Confirmation 1 agrees
4b. Volatility/Volume agrees
5b. Baseline agrees
PullBack Entry
1a. GKD-B Baseline gives signal
2a. Confirmation 1 agrees
3a. Price is beyond 1.0x Volatility of Baseline
Next Candle
1b. Price inside Goldie Locks Zone Minimum
2b. Price inside Goldie Locks Zone Maximum
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Continuation Entry
1. Standard Entry, 1-Candle Standard Entry, Baseline Entry, 1-Candle Baseline Entry, Volatility/Volume Entry, 1-Candle Volatility/Volume Entry, Confirmation 2 Entry, 1-Candle Confirmation 2 Entry, or Pullback entry triggered previously
2. Baseline hasn't crossed since entry signal trigger
3. Confirmation 1 agrees
4. Baseline agrees
5. Confirmation 2 agrees
█ Connecting to Backtests
All GKD indicators are chained indicators, meaning you export the value of the indicators to a specialized backtest to create your GKD trading system. Each indicator contains a proprietary signal generation algorithm that will only work with GKD backtests. You can find these backtests using the links below.
GKD-BT Giga Confirmation Stack Backtest:
GKD-BT Giga Stacks Backtest:
GKD-BT Full Giga Kaleidoscope Backtest:
GKD-BT Solo Confirmation Super Complex Backtest:
GKD-BT Solo Confirmation Complex Backtest:
GKD-BT Solo Confirmation Simple Backtest:
GKD-B Stepped Baseline [Loxx]
Giga Kaleidoscope GKD-B Stepped Baseline is a Baseline module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
█ GKD-B Stepped Baseline
This is a special implementation of GKD-B Baseline in that it allows the user to filter the selected moving average using the various types of volatility listed below. This additional filter allows the trader to identify longer trends that may be more conducive to a slow and steady trading style.
GKD Stepped Baseline includes 64 different moving averages:
Adaptive Moving Average - AMA
ADXvma - Average Directional Volatility Moving Average
Ahrens Moving Average
Alexander Moving Average - ALXMA
Deviation Scaled Moving Average - DSMA
Donchian
Double Exponential Moving Average - DEMA
Double Smoothed Exponential Moving Average - DSEMA
Double Smoothed FEMA - DSFEMA
Double Smoothed Range Weighted EMA - DSRWEMA
Double Smoothed Wilders EMA - DSWEMA
Double Weighted Moving Average - DWMA
Ehlers Optimal Tracking Filter - EOTF
Exponential Moving Average - EMA
Fast Exponential Moving Average - FEMA
Fractal Adaptive Moving Average - FRAMA
Generalized DEMA - GDEMA
Generalized Double DEMA - GDDEMA
Hull Moving Average (Type 1) - HMA1
Hull Moving Average (Type 2) - HMA2
Hull Moving Average (Type 3) - HMA3
Hull Moving Average (Type 4) - HMA4
IE/2 - Early T3 by Tim Tillson
Integral of Linear Regression Slope - ILRS
Instantaneous Trendline
Kalman Filter
Kaufman Adaptive Moving Average - KAMA
Laguerre Filter
Leader Exponential Moving Average
Linear Regression Value - LSMA ( Least Squares Moving Average )
Linear Weighted Moving Average - LWMA
McGinley Dynamic
McNicholl EMA
Non-Lag Moving Average
Ocean NMA Moving Average - ONMAMA
One More Moving Average - OMA
Parabolic Weighted Moving Average
Probability Density Function Moving Average - PDFMA
Quadratic Regression Moving Average - QRMA
Regularized EMA - REMA
Range Weighted EMA - RWEMA
Recursive Moving Trendline
Simple Decycler - SDEC
Simple Jurik Moving Average - SJMA
Simple Moving Average - SMA
Sine Weighted Moving Average
Smoothed LWMA - SLWMA
Smoothed Moving Average - SMMA
Smoother
Super Smoother
T3
Three-pole Ehlers Butterworth
Three-pole Ehlers Smoother
Triangular Moving Average - TMA
Triple Exponential Moving Average - TEMA
Two-pole Ehlers Butterworth
Two-pole Ehlers smoother
Variable Index Dynamic Average - VIDYA
Variable Moving Average - VMA
Volume Weighted EMA - VEMA
Volume Weighted Moving Average - VWMA
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
Zero-Lag Moving Average
Zero Lag TEMA - Zero Lag Triple Exponential Moving Average
Adaptive Moving Average - AMA
The Adaptive Moving Average (AMA) is a moving average that changes its sensitivity to price moves depending on the calculated volatility. It becomes more sensitive during periods when the price is moving smoothly in a certain direction and becomes less sensitive when the price is volatile.
ADXvma - Average Directional Volatility Moving Average
Linnsoft's ADXvma formula is a volatility-based moving average, with the volatility being determined by the value of the ADX indicator.
The ADXvma has the SMA in Chande's CMO replaced with an EMA , it then uses a few more layers of EMA smoothing before the "Volatility Index" is calculated.
A side effect is that those additional layers slow down the ADXvma when compared to Chande's Variable Index Dynamic Average (VIDYA).
The ADXVMA provides support during uptrends and resistance during downtrends and will stay flat for longer, but will create some of the most accurate market signals when it decides to move.
Ahrens Moving Average
Richard D. Ahrens's Moving Average promises "Smoother Data" that isn't influenced by the occasional price spike. It works by using the Open and the Close in his formula so that the only time the Ahrens Moving Average will change is when the candlestick is either making new highs or new lows.
Alexander Moving Average - ALXMA
This Moving Average uses an elaborate smoothing formula and utilizes a 7 period Moving Average. It corresponds to fitting a second-order polynomial to seven consecutive observations. This moving average is rarely used in trading but is interesting as this Moving Average has been applied to diffusion indexes that tend to be very volatile.
Deviation Scaled Moving Average - DSMA
The Deviation-Scaled Moving Average is a data smoothing technique that acts like an exponential moving average with a dynamic smoothing coefficient. The smoothing coefficient is automatically updated based on the magnitude of price changes. In the Deviation-Scaled Moving Average, the standard deviation from the mean is chosen to be the measure of this magnitude. The resulting indicator provides substantial smoothing of the data even when price changes are small while quickly adapting to these changes.
Donchian
Donchian Channels are three lines generated by moving average calculations that comprise an indicator formed by upper and lower bands around a midrange or median band. The upper band marks the highest price of a security over N periods while the lower band marks the lowest price of a security over N periods.
Double Exponential Moving Average - DEMA
The Double Exponential Moving Average (DEMA) combines a smoothed EMA and a single EMA to provide a low-lag indicator. Its primary purpose is to reduce the amount of "lagging entry" opportunities, and like all Moving Averages, the DEMA confirms uptrends whenever price crosses on top of it and closes above it, and confirms downtrends when the price crosses under it and closes below it - but with significantly less lag.
Double Smoothed Exponential Moving Average - DSEMA
The Double Smoothed Exponential Moving Average is a lot less laggy compared to a traditional EMA . It's also considered a leading indicator compared to the EMA , and is best utilized whenever smoothness and speed of reaction to market changes are required.
Double Smoothed FEMA - DSFEMA
Same as the Double Exponential Moving Average (DEMA), but uses a faster version of EMA for its calculation.
Double Smoothed Range Weighted EMA - DSRWEMA
The range weighted exponential moving average (EMA) is, unlike the "regular" range weighted average, calculated in a different way. Even though the basis - the range weighting - is the same, the calculation itself is completely different: by definition, this type of EMA is calculated as the ratio EMA(price*weight) / EMA(weight). The results are very different, and the two should be considered completely different types of averages. The heightened responsiveness to price changes when ranges increase remains in this EMA as well, and in those cases this EMA clearly leads the "regular" EMA. This version includes double smoothing.
Double Smoothed Wilders EMA - DSWEMA
Welles Wilder frequently used a "special" case of the EMA (Exponential Moving Average) that is, for that reason, sometimes called Wilder's EMA. This version adds double smoothing to Wilder's EMA in order to make it "faster" (more responsive to market prices than the original) while still keeping very smooth values.
Double Weighted Moving Average - DWMA
Double weighted moving average is an LWMA (Linear Weighted Moving Average). Instead of doing one cycle for calculating the LWMA, the indicator cycles the loop twice. That produces smoother values than the original LWMA.
Ehlers Optimal Tracking Filter - EOTF
The Ehlers Optimum Tracking Filter quickly adjusts to rapid shifts in price yet remains relatively smooth when price moves sideways. The operation of this filter is similar to Kaufman's Adaptive Moving Average.
Exponential Moving Average - EMA
The EMA places more significance on recent data points and moves closer to price than the SMA ( Simple Moving Average ). It reacts faster to volatility due to its emphasis on recent data and is known for its ability to give greater weight to recent and more relevant data. The EMA is therefore seen as an enhancement over the SMA .
Fast Exponential Moving Average - FEMA
An Exponential Moving Average with a short look-back period.
Fractal Adaptive Moving Average - FRAMA
The Fractal Adaptive Moving Average by John Ehlers is an intelligent adaptive Moving Average which takes the importance of price changes into account and follows price closely enough to display significant moves whilst remaining flat if price ranges. The FRAMA does this by dynamically adjusting the look-back period based on the market's fractal geometry.
Generalized DEMA - GDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages". Instead of using a fixed multiplication factor in the final DEMA formula, the generalized version allows you to change it. By varying the "volume factor" from 0 to 1, you apply different multiplications and thus produce a DEMA with different "speed" - the higher the volume factor, the "faster" the DEMA (but also the less smooth its slope). The volume factor is limited to 1 in the calculation since any volume factor larger than 1 increases overshooting to the extent that it makes the indicator unusable.
Generalized Double DEMA - GDDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages". This is an extension of the Generalized DEMA using Tim Tillson's (the inventor of T3) idea, and it uses GDEMA of GDEMA for calculation (which is the "middle step" of the T3 calculation). Since there are no versions showing that middle step, this version covers that too. The result is smoother than the Generalized DEMA but less smooth than T3 - one has to do some experimenting in order to find the optimal way to use it, but in any case, since it is "faster" than T3 (Tim Tillson's T3) and still smooth, it looks like a good compromise between speed and smoothness.
Hull Moving Average (Type 1) - HMA1
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMA for smoothing.
Hull Moving Average (Type 2) - HMA2
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses EMA for smoothing.
Hull Moving Average (Type 3) - HMA3
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses LWMA for smoothing.
Hull Moving Average (Type 4) - HMA4
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMMA for smoothing.
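For reference, a minimal Python sketch of the classic Hull Moving Average calculation (closest to the LWMA-smoothed Type 3 above); Types 1, 2, and 4 swap the final smoothing stage for SMA, EMA, and SMMA respectively, which this sketch does not show:

```python
def lwma(values, length):
    # linearly weighted moving average of the last `length` values
    window = values[-length:]
    weights = range(1, len(window) + 1)
    return sum(w * v for w, v in zip(weights, window)) / sum(weights)

def hull_ma(values, length):
    half, sqrt_len = max(1, length // 2), max(1, int(length ** 0.5))
    diffs = []
    for i in range(length, len(values) + 1):
        window = values[:i]
        diffs.append(2 * lwma(window, half) - lwma(window, length))  # de-lagged series
    return lwma(diffs, sqrt_len)                                     # final smoothing
```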
IE/2 - Early T3 by Tim Tillson and T3 new
The T3 moving average is a type of technical indicator used in financial analysis to identify trends in price movements. It is similar to the Exponential Moving Average (EMA) and the Double Exponential Moving Average (DEMA), but uses a different smoothing algorithm.
The T3 moving average is calculated using a series of exponential moving averages that are designed to filter out noise and smooth the data. The resulting smoothed data is then weighted with a non-linear function to produce a final output that is more responsive to changes in trend direction.
The T3 moving average can be customized by adjusting the length of the moving average, as well as the weighting function used to smooth the data. It is commonly used in conjunction with other technical indicators as part of a larger trading strategy.
Integral of Linear Regression Slope - ILRS
A Moving Average where the slope of a linear regression line is simply integrated as it is fitted in a moving window of length N (natural numbers in maths) across the data. The derivative of ILRS is the linear regression slope. ILRS is not the same as a SMA ( Simple Moving Average ) of length N, which is actually the midpoint of the linear regression line as it moves across the data.
Instantaneous Trendline
The Instantaneous Trendline is created by removing the dominant cycle component from the price information which makes this Moving Average suitable for medium to long-term trading.
Kalman Filter
Kalman filter is an algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies. This means that the filter was originally designed to work with noisy data. Also, it is able to work with incomplete data. Another advantage is that it is designed for and applied in dynamic systems; our price chart belongs to such systems. This version is true to the original design of the trade-ready Kalman Filter where velocity is the triggering mechanism.
Kalman Filter is a more accurate smoothing/prediction algorithm than the moving average because it is adaptive: it accounts for estimation errors and tries to adjust its predictions from the information it learned in the previous stage. Theoretically, Kalman Filter consists of measurement and transition components.
Kaufman Adaptive Moving Average - KAMA
Developed by Perry Kaufman, Kaufman's Adaptive Moving Average (KAMA) is a moving average designed to account for market noise or volatility. KAMA will closely follow prices when the price swings are relatively small and the noise is low.
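A minimal sketch of the KAMA recursion, where Kaufman's efficiency ratio scales the smoothing constant between a fast and a slow EMA constant; the default periods are the commonly cited 10/2/30 values and are assumptions here:

```python
def kama(closes, er_period=10, fast=2, slow=30):
    fast_sc, slow_sc = 2 / (fast + 1), 2 / (slow + 1)
    kama_val = closes[er_period - 1]                 # seed with an early price
    for i in range(er_period, len(closes)):
        change = abs(closes[i] - closes[i - er_period])
        noise = sum(abs(closes[j] - closes[j - 1])
                    for j in range(i - er_period + 1, i + 1))
        er = change / noise if noise else 0.0        # Kaufman efficiency ratio
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2
        kama_val += sc * (closes[i] - kama_val)
    return kama_val
```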
Laguerre Filter
The Laguerre Filter is a smoothing filter based on Laguerre polynomials. The filter requires the current price, three prior prices, and a user-defined factor called Alpha to complete its calculation.
Adjusting the Alpha coefficient is used to increase or decrease its lag and its smoothness.
Leader Exponential Moving Average
The Leader EMA was created by Giorgos E. Siligardos who created a Moving Average which was able to eliminate lag altogether whilst maintaining some smoothness. It was first described during his research paper "MACD Leader" where he applied this to the MACD to improve its signals and remove its lagging issue. This filter uses his leading MACD's "modified EMA" and can be used as a zero lag filter.
Linear Regression Value - LSMA ( Least Squares Moving Average )
LSMA as a Moving Average is based on plotting the end point of the linear regression line. It compares the current value to the prior value, and a determination is made of a possible trend, e.g., whether the linear regression line is pointing up or down.
Linear Weighted Moving Average - LWMA
LWMA reacts to price quicker than the SMA and EMA . Although it's similar to the Simple Moving Average , the difference is that a weight coefficient is multiplied to the price which means the most recent price has the highest weighting, and each prior price has progressively less weight. The weights drop in a linear fashion.
McGinley Dynamic
John McGinley created this Moving Average to track prices better than traditional Moving Averages. It does this by incorporating an automatic adjustment factor into its formula, which speeds (or slows) the indicator in trending, or ranging, markets.
McNicholl EMA
Dennis McNicholl developed this Moving Average to use as his center line for his "Better Bollinger Bands" indicator and was successful because it responded better to volatility changes over the standard SMA and managed to avoid common whipsaws.
Non-lag moving average
The Non Lag Moving Average follows price closely and gives very quick signals as well as early signals of price change. It should not be used on its own, but rather as an additional confluence tool for early signals.
Ocean NMA Moving Average - ONMAMA
Created by Jim Sloman, the NMA is a moving average that automatically adjusts to volatility without being programmed to do so. For more info, read his guide "Ocean Theory, an Introduction"
One More Moving Average (OMA)
The One More Moving Average (OMA) is a technical indicator that calculates a series of Jurik-style moving averages in order to reduce noise and provide smoother price data. It uses six exponential moving averages to generate the final value, with the length of the moving averages determined by an adaptive algorithm that adjusts to the current market conditions. The algorithm calculates the average period by comparing the signal to noise ratio and using this value to determine the length of the moving averages. The resulting values are used to generate the final value of the OMA, which can be used to identify trends and potential changes in trend direction.
Parabolic Weighted Moving Average
The Parabolic Weighted Moving Average is a variation of the Linear Weighted Moving Average . The Linear Weighted Moving Average calculates the average by assigning different weights to each element in its calculation. The Parabolic Weighted Moving Average is a variation that allows weights to be changed to form a parabolic curve. It is done simply by using the Power parameter of this indicator.
Probability Density Function Moving Average - PDFMA
Probability density function based MA is a sort of weighted moving average that uses probability density function to calculate the weights. By its nature it is similar to a lot of digital filters.
Quadratic Regression Moving Average - QRMA
A quadratic regression is the process of finding the equation of the parabola that best fits a set of data. This moving average is an obscure concept that was posted to Forex forums in around 2008.
Regularized EMA - REMA
The regularized exponential moving average (REMA) by Chris Satchwell is a variation on the EMA (see Exponential Moving Average) designed to be smoother but not introduce too much extra lag.
Range Weighted EMA - RWEMA
This indicator is a variation of the range weighted EMA. The variation comes from a possible need to make that indicator a bit less "noisy" when it comes to slope changes. The method used for calculating this variation is the method described by Lee Leibfarth in his article "Trading With An Adaptive Price Zone".
Recursive Moving Trendline
Dennis Meyers's Recursive Moving Trendline uses a recursive (repeated application of a rule) polynomial fit, a technique that uses a small number of past values estimations of price and today's price to predict tomorrow's price.
Simple Decycler - SDEC
The Ehlers Simple Decycler study is a virtually zero-lag technical indicator proposed by John F. Ehlers. The original idea behind this study (and several others created by John F. Ehlers) is that market data can be considered a continuum of cycle periods with different cycle amplitudes. Thus, trending periods can be considered segments of longer cycles, or, in other words, low-frequency segments. Applying the right filter might help identify these segments.
Simple Loxx Moving Average - SLMA
A three stage moving average combining an adaptive EMA, a Kalman Filter, and a Kauffman adaptive filter.
Simple Moving Average - SMA
The SMA calculates the average of a range of prices by adding recent prices and then dividing that figure by the number of time periods in the calculation average. It is the most basic Moving Average which is seen as a reliable tool for starting off with Moving Average studies. As reliable as it may be, the basic moving average will work better when it's enhanced into an EMA .
Sine Weighted Moving Average
The Sine Weighted Moving Average assigns the most weight at the middle of the data set. It does this by weighting from the first half of a Sine Wave Cycle and the most weighting is given to the data in the middle of that data set. The Sine WMA closely resembles the TMA (Triangular Moving Average).
Smoothed LWMA - SLWMA
A smoothed version of the LWMA
Smoothed Moving Average - SMMA
The Smoothed Moving Average is similar to the Simple Moving Average ( SMA ), but aims to reduce noise rather than reduce lag. SMMA takes all prices into account and uses a long lookback period. Due to this, it's seen as an accurate yet laggy Moving Average.
Smoother
The Smoother filter is a faster-reacting smoothing technique which generates considerably less lag than the SMMA (Smoothed Moving Average). It gives earlier signals but can also create false signals due to its earlier reactions. This filter is sometimes mistaken for the superior Jurik Smoothing algorithm.
Super Smoother
The Super Smoother filter uses John Ehlers's "Super Smoother", which consists of a Two pole Butterworth filter combined with a 2-bar SMA (Simple Moving Average) that suppresses the 22050 Hz Nyquist frequency, a characteristic of a sampler that converts a continuous function or signal into a discrete sequence.
Three-pole Ehlers Butterworth
The 3 pole Ehlers Butterworth (as well as the Two pole Butterworth) are both superior alternatives to the EMA and SMA . They aim at producing less lag whilst maintaining accuracy. The 2 pole filter will give you a better approximation for price, whereas the 3 pole filter has superior smoothing.
Three-pole Ehlers smoother
The 3 pole Ehlers smoother works almost as close to price as the above mentioned 3 Pole Ehlers Butterworth. It acts as a strong baseline for signals but removes some noise. Side by side, it hardly differs from the Three Pole Ehlers Butterworth but when examined closely, it has better overshoot reduction compared to the 3 pole Ehlers Butterworth.
Triangular Moving Average - TMA
The TMA is similar to the EMA but uses a different weighting scheme. Exponential and weighted Moving Averages assign weight to the most recent price data. Simple moving averages assign the weight equally across all the price data. With a TMA (Triangular Moving Average), the data is double smoothed (averaged twice), so the majority of the weight is assigned to the middle portion of the data.
Triple Exponential Moving Average - TEMA
The TEMA uses multiple EMA calculations as well as subtracting lag to create a tool which can be used for scalping pullbacks. As it follows price closely, its signals are considered very noisy and should only be used in extremely fast-paced trading conditions.
Two-pole Ehlers Butterworth
The 2 pole Ehlers Butterworth (as well as the three pole Butterworth mentioned above) is another filter that cuts out the noise and follows the price closely. The 2 pole is seen as a faster, leading filter over the 3 pole and follows price a bit more closely. Analysts will utilize both a 2 pole and a 3 pole Butterworth on the same chart using the same period, but having both on chart allows its crosses to be traded.
Two-pole Ehlers smoother
A smoother version of the Two pole Ehlers Butterworth. This filter is faster than the 3 pole Ehlers smoother. It does a decent job of cutting out market noise whilst following price more closely than the 3 pole Ehlers.
Variable Index Dynamic Average - VIDYA
Variable Index Dynamic Average Technical Indicator ( VIDYA ) was developed by Tushar Chande. It is an original method of calculating the Exponential Moving Average ( EMA ) with the dynamically changing period of averaging.
Variable Moving Average - VMA
The Variable Moving Average (VMA) is a study that uses an Exponential Moving Average being able to automatically adjust its smoothing factor according to the market volatility.
Volume Weighted EMA - VEMA
An EMA that uses a volume and price weighted calculation instead of the standard price input.
Volume Weighted Moving Average - VWMA
A Volume Weighted Moving Average is a moving average where more weight is given to bars with heavy volume than with light volume. Thus the value of the moving average will be closer to where most trading actually happened than it otherwise would be without being volume weighted.
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
John Ehlers's Zero Lag DEMA's aim is to eliminate the inherent lag associated with all trend following indicators which average a price over time. Because this is a Double Exponential Moving Average with Zero Lag, it has a tendency to overshoot and create a lot of false signals for swing trading. It can however be used for quick scalping or as a secondary indicator for confluence.
Zero-Lag Moving Average
The Zero Lag Moving Average is described by its creator, John Ehlers, as a Moving Average with absolutely no delay. It's for this reason that this filter will cause a lot of abrupt signals, which will not be ideal for medium to long-term traders. This filter is designed to follow price as closely as possible whilst de-lagging data instead of basing it on regular data. The way this is done is by attempting to remove the cumulative effect of the Moving Average.
Zero-Lag TEMA - Zero Lag Triple Exponential Moving Average
Just like the Zero Lag DEMA , this filter will give you the fastest signals out of all the Zero Lag Moving Averages. This is useful for scalping but dangerous for medium to long-term traders, especially during market Volatility and news events. Having no lag, this filter also has no smoothing in its signals and can cause some very bizarre behavior when applied to certain indicators.
Volatility Goldie Locks Zone
This volatility filter is the standard first-pass filter used for all NNFX systems, despite the additional volatility/volume filter used in step 5. For this filter, price must fall into a range of maximum and minimum values calculated using multiples of volatility. Unlike the standard NNFX systems, this version of volatility filtering is separated from the core Baseline and uses its own moving average with Loxx's Exotic Source Types. The green and red dots at the top of the chart denote whether a candle qualifies for a long or a short, respectively. The green and red triangles at the bottom of the chart denote whether the trigger has crossed up or down and qualifies inside the Goldie Locks zone. White coloring of the Goldie Locks Zone mean line indicates where volatility is too low to trade.
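As an illustration of the zone check itself, here is a minimal Python sketch; the minimum and maximum multipliers are hypothetical placeholders, not the indicator's defaults:

```python
def in_goldilocks_zone(price, zone_ma, volatility, min_mult=0.25, max_mult=1.0):
    # Price must sit between a minimum and a maximum multiple of volatility
    # away from the zone's own moving average to qualify for an entry.
    distance = abs(price - zone_ma)
    return min_mult * volatility <= distance <= max_mult * volatility
```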
Volatility Types Included
Close-to-Close
Close-to-Close volatility is a classic and widely used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a bigger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only a stock's closing prices. It is the simplest volatility estimator. But in many cases, it is not precise enough. Stock prices could jump considerably during a trading session and return to the opening value at the end. That means that a large amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close-to-close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than close-to-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after the market closes. Hence it systematically undervalues volatility. That drawback is addressed in the Garman-Klass volatility estimator.
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchell have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
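A minimal Pine Script v5 sketch of the Garman-Klass estimator described above, averaging the per-bar term over an illustrative 20-bar window.

```
//@version=5
indicator("Garman-Klass Volatility (sketch)")
len = input.int(20, "Lookback")
gk = 0.5 * math.pow(math.log(high / low), 2) - (2.0 * math.log(2.0) - 1.0) * math.pow(math.log(close / open), 2)
gkVol = math.sqrt(ta.sma(gk, len))
plot(gkVol, "Garman-Klass Volatility")
```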
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike the Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (mean return not equal to zero). As a result, it provides a better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. It means an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchell formula in the 2000s by Yang and Zhang. See Yang Zhang Volatility for more detail.
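A minimal Pine Script v5 sketch of the Rogers-Satchell estimator, again with an illustrative 20-bar lookback.

```
//@version=5
indicator("Rogers-Satchell Volatility (sketch)")
len = input.int(20, "Lookback")
rs = math.log(high / close) * math.log(high / open) + math.log(low / close) * math.log(low / open)
rsVol = math.sqrt(ta.sma(rs, len))
plot(rsVol, "Rogers-Satchell Volatility")
```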
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight (close-to-open) volatility and a weighted average of the Rogers-Satchell volatility and the day's open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
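A minimal Pine Script v5 sketch of that combination; the k weighting constant follows the commonly cited Yang-Zhang formulation, and the 20-bar lookback is an illustrative assumption.

```
//@version=5
indicator("Yang-Zhang Volatility (sketch)")
len = input.int(20, "Lookback")
k = 0.34 / (1.34 + (len + 1.0) / (len - 1.0))        // weighting constant from Yang & Zhang
overnight = math.log(open / close[1])                // close-to-open (overnight) return
openClose = math.log(close / open)                   // open-to-close return
rs = math.log(high / close) * math.log(high / open) + math.log(low / close) * math.log(low / open)
yzVar = ta.variance(overnight, len) + k * ta.variance(openClose, len) + (1.0 - k) * ta.sma(rs, len)
plot(math.sqrt(yzVar), "Yang-Zhang Volatility")
```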
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator consists of using the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e. it assumes that the underlying asset follows a GBM process with zero drift. Therefore the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
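A minimal Pine Script v5 sketch of the EWMA variance recursion; the 0.94 lambda default mirrors a value often used for daily data and is purely illustrative.

```
//@version=5
indicator("EWMA Volatility (sketch)")
lam = input.float(0.94, "Lambda", minval=0.5, maxval=0.999)   // weight on the previous variance estimate
r = math.log(close / close[1])
var float ewmaVar = na
ewmaVar := na(ewmaVar) ? math.pow(r, 2) : lam * ewmaVar + (1.0 - lam) * math.pow(r, 2)
plot(math.sqrt(ewmaVar), "EWMA Volatility")
```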
Standard Deviation of Log Returns
This is the simplest calculation of volatility: the standard deviation of ln(close / close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var / (M + 1 − (M − 1)L) + 2(1 − θ)·var / (N + 1 − (N − 1)L), where L is the lag operator.
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1var - avg (var; N) against avg (var; M) - avg (var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
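A minimal Pine Script v5 sketch of the true range and its Wilder-smoothed average, using the conventional 14-bar length.

```
//@version=5
indicator("Average True Range (sketch)")
len = input.int(14, "ATR Length")
trueRange = math.max(high - low, math.abs(high - close[1]), math.abs(low - close[1]))
atr = ta.rma(trueRange, len)        // Wilder's smoothing; ta.atr(len) gives the same result
plot(atr, "ATR")
```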
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Standard Deviation
Standard deviation is a statistic that measures the dispersion of a dataset relative to its mean and is calculated as the square root of the variance. If the data points are further from the mean, there is a higher deviation within the data set; thus, the more spread out the data, the higher the standard deviation.
Adaptive Deviation
By definition, the Standard Deviation (STD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values. In technical analysis we usually use it to measure the level of current volatility .
Standard Deviation is based on a Simple Moving Average calculation for the mean value. This version of standard deviation uses the properties of the EMA to calculate what can be called a new type of deviation; since it is based on the EMA, we can call it EMA deviation. In addition, Perry Kaufman's efficiency ratio is used to make it adaptive (EMA-type calculations lend themselves well to adaptation).
The difference compared to the standard version is significant--not just because of the EMA usage, but because the efficiency ratio makes it behave a "bit more logically" in very volatile market conditions.
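The exact construction used in this script is not spelled out here, so the following Pine Script v5 sketch is only one plausible reading: an EMA-style deviation whose smoothing constant is scaled by Kaufman's efficiency ratio. The 20-bar length is illustrative.

```
//@version=5
indicator("Efficiency-Ratio Adaptive EMA Deviation (sketch)")
len = input.int(20, "Length")
totalMove = math.sum(math.abs(ta.change(close)), len)
er = totalMove != 0 ? math.abs(close - close[len]) / totalMove : 0.0   // Kaufman efficiency ratio
alpha = (2.0 / (len + 1)) * er                                         // EMA alpha scaled by efficiency
var float m1 = na   // adaptive EMA of price
var float m2 = na   // adaptive EMA of squared price
m1 := na(m1) ? close : m1 + alpha * (close - m1)
m2 := na(m2) ? close * close : m2 + alpha * (close * close - m2)
plot(math.sqrt(math.max(m2 - m1 * m1, 0.0)), "Adaptive EMA Deviation")
```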
Median Absolute Deviation
The median absolute deviation is a measure of statistical dispersion. Moreover, the MAD is a robust statistic, being more resilient to outliers in a data set than the standard deviation. In the standard deviation, the distances from the mean are squared, so large deviations are weighted more heavily, and thus outliers can heavily influence it. In the MAD, the deviations of a small number of outliers are irrelevant.
Because the MAD is a more robust estimator of scale than the sample variance or standard deviation, it works better with distributions without a mean or variance, such as the Cauchy distribution.
For this indicator, I used a manual recreation of the quantile function in Pine Script. This is so users have a full inside view into how this is calculated.
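The script itself uses a manual quantile function; the sketch below leans on the built-in array.median instead, so treat it as an approximation of the idea rather than the script's actual code.

```
//@version=5
indicator("Median Absolute Deviation (sketch)")
len = input.int(20, "Lookback")
win = array.new_float()
for i = 0 to len - 1
    array.push(win, close[i])
med = array.median(win)                      // median of the lookback window
devs = array.new_float()
for i = 0 to len - 1
    array.push(devs, math.abs(close[i] - med))
mad = array.median(devs)                     // median of absolute deviations from that median
plot(mad, "MAD")
```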
Efficiency-Ratio Adaptive ATR
Average True Range (ATR) is a widely used indicator in technical analysis. It is calculated as the RMA of the true range. This version adds a "twist": it uses Perry Kaufman's efficiency ratio to calculate an adaptive true range.
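A hedged Pine Script v5 sketch of one way to make the true-range average adaptive with the efficiency ratio; the exact scaling used by the script may differ, and values only stabilize once enough history exists.

```
//@version=5
indicator("Efficiency-Ratio Adaptive ATR (sketch)")
len = input.int(14, "Length")
totalMove = math.sum(math.abs(ta.change(close)), len)
er = totalMove != 0 ? math.abs(close - close[len]) / totalMove : 0.0   // Kaufman efficiency ratio
alpha = (1.0 / len) * er                                               // RMA alpha scaled by efficiency
trueRange = math.max(high - low, math.abs(high - close[1]), math.abs(low - close[1]))
var float adaptAtr = na
adaptAtr := na(adaptAtr) ? trueRange : adaptAtr + alpha * (trueRange - adaptAtr)
plot(adaptAtr, "Adaptive ATR")
```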
Mean Absolute Deviation
The mean absolute deviation (MAD) is a measure of variability that indicates the average distance between observations and their mean. MAD uses the original units of the data, which simplifies interpretation. Larger values signify that the data points spread out further from the average. Conversely, lower values correspond to data points bunching closer to it. The mean absolute deviation is also known as the mean deviation and average absolute deviation.
This definition of the mean absolute deviation sounds similar to the standard deviation ( SD ). While both measure variability, they have different calculations. In recent years, some proponents of MAD have suggested that it replace the SD as the primary measure because it is a simpler concept that better fits real life.
For Pine Coders, this is the equivalent of using ta.dev().
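A short Pine Script v5 sketch showing ta.dev() next to the equivalent manual loop, with an illustrative 20-bar window.

```
//@version=5
indicator("Mean Absolute Deviation (sketch)")
len = input.int(20, "Lookback")
builtIn = ta.dev(close, len)                 // built-in mean absolute deviation
// manual equivalent: average distance of each window value from the window mean
mean = ta.sma(close, len)
float s = 0.0
for i = 0 to len - 1
    s += math.abs(close[i] - mean)
plot(builtIn, "ta.dev")
plot(s / len, "manual", color=color.orange)
```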
Additional features will be added in future releases.
█ Giga Kaleidoscope Modularized Trading System
Core components of an NNFX algorithmic trading strategy
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are seven core components in the NNFX trading algorithm:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend
3. Confirmation 1 - a technical indicator used to identify trends
4. Confirmation 2 - a technical indicator used to identify trends
5. Continuation - a technical indicator used to identify trends
6. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown
7. Exit - a technical indicator used to determine when a trend is exhausted
What is Volatility in the NNFX trading system?
In the NNFX (No Nonsense Forex) trading system, ATR (Average True Range) is typically used to measure the volatility of an asset. It is used as a part of the system to help determine the appropriate stop loss and take profit levels for a trade. ATR is calculated by taking the average of the true range values over a specified period.
True range is calculated as the maximum of the following values:
-Current high minus the current low
-Absolute value of the current high minus the previous close
-Absolute value of the current low minus the previous close
ATR is a dynamic indicator that changes with changes in volatility. As volatility increases, the value of ATR increases, and as volatility decreases, the value of ATR decreases. By using ATR in NNFX system, traders can adjust their stop loss and take profit levels according to the volatility of the asset being traded. This helps to ensure that the trade is given enough room to move, while also minimizing potential losses.
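A minimal Pine Script v5 sketch of volatility-based stop and target levels; the 1.5x and 1.0x multipliers and the hypothetical long entry at the current close are illustrative, not the defaults of any GKD module.

```
//@version=5
indicator("ATR Stop/Target Levels (sketch)", overlay=true)
atrLen = input.int(14, "ATR Length")
slMult = input.float(1.5, "Stop-Loss Multiplier")
tpMult = input.float(1.0, "Take-Profit Multiplier")
atr = ta.atr(atrLen)
// hypothetical long entry at the current close
longStop   = close - slMult * atr
longTarget = close + tpMult * atr
plot(longStop, "Long Stop", color=color.red)
plot(longTarget, "Long Target", color=color.green)
```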
Other types of volatility include True Range Double (TRD), Close-to-Close, and Garman-Klass
What is a Baseline indicator?
The baseline is essentially a moving average, and is used to determine the overall direction of the market.
The baseline in the NNFX system is used to filter out trades that are not in line with the long-term trend of the market. The baseline is plotted on the chart along with other indicators, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR).
Trades are only taken when the price is in the same direction as the baseline. For example, if the baseline is sloping upwards, only long trades are taken, and if the baseline is sloping downwards, only short trades are taken. This approach helps to ensure that trades are in line with the overall trend of the market, and reduces the risk of entering trades that are likely to fail.
By using a baseline in the NNFX system, traders can have a clear reference point for determining the overall trend of the market, and can make more informed trading decisions. The baseline helps to filter out noise and false signals, and ensures that trades are taken in the direction of the long-term trend.
What is a Confirmation indicator?
Confirmation indicators are technical indicators that are used to confirm the signals generated by primary indicators. Primary indicators are the core indicators used in the NNFX system, such as the Average True Range (ATR), the Moving Average (MA), and the Relative Strength Index (RSI).
The purpose of the confirmation indicators is to reduce false signals and improve the accuracy of the trading system. They are designed to confirm the signals generated by the primary indicators by providing additional information about the strength and direction of the trend.
Some examples of confirmation indicators that may be used in the NNFX system include the Bollinger Bands, the MACD (Moving Average Convergence Divergence), and the MACD Oscillator. These indicators can provide information about the volatility, momentum, and trend strength of the market, and can be used to confirm the signals generated by the primary indicators.
In the NNFX system, confirmation indicators are used in combination with primary indicators and other filters to create a trading system that is robust and reliable. By using multiple indicators to confirm trading signals, the system aims to reduce the risk of false signals and improve the overall profitability of the trades.
What is a Continuation indicator?
In the NNFX (No Nonsense Forex) trading system, a continuation indicator is a technical indicator that is used to confirm a current trend and predict that the trend is likely to continue in the same direction. A continuation indicator is typically used in conjunction with other indicators in the system, such as a baseline indicator, to provide a comprehensive trading strategy.
What is a Volatility/Volume indicator?
Volume indicators, such as the On Balance Volume (OBV), the Chaikin Money Flow (CMF), or the Volume Price Trend (VPT), are used to measure the amount of buying and selling activity in a market. They are based on the trading volume of the market, and can provide information about the strength of the trend. In the NNFX system, volume indicators are used to confirm trading signals generated by the Moving Average and the Relative Strength Index. Volatility indicators include Average Direction Index, Waddah Attar, and Volatility Ratio. In the NNFX trading system, volatility is a proxy for volume and vice versa.
By using volume indicators as confirmation tools, the NNFX trading system aims to reduce the risk of false signals and improve the overall profitability of trades. These indicators can provide additional information about the market that is not captured by the primary indicators, and can help traders to make more informed trading decisions. In addition, volume indicators can be used to identify potential changes in market trends and to confirm the strength of price movements.
What is an Exit indicator?
The exit indicator is used in conjunction with other indicators in the system, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR), to provide a comprehensive trading strategy.
The exit indicator in the NNFX system can be any technical indicator that is deemed effective at identifying optimal exit points. Examples of exit indicators that are commonly used include the Parabolic SAR, the Average Directional Index (ADX), and the Chandelier Exit.
The purpose of the exit indicator is to identify when a trend is likely to reverse or when the market conditions have changed, signaling the need to exit a trade. By using an exit indicator, traders can manage their risk and prevent significant losses.
In the NNFX system, the exit indicator is used in conjunction with a stop loss and a take profit order to maximize profits and minimize losses. The stop loss order is used to limit the amount of loss that can be incurred if the trade goes against the trader, while the take profit order is used to lock in profits when the trade is moving in the trader's favor.
Overall, the use of an exit indicator in the NNFX trading system is an important component of a comprehensive trading strategy. It allows traders to manage their risk effectively and improve the profitability of their trades by exiting at the right time.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v1.0 system has five types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 and Continuation module (Confirmation 1/2 and Continuation, Numbers 3, 4, and 5 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 6 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 7 in the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with every module by passing data between modules. Data is passed between each module as described below:
GKD-B => GKD-V => GKD-C(1) => GKD-C(2) => GKD-C(Continuation) => GKD-E => GKD-BT
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Continuation indicator. The Continuation indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, therefore allowing for the testing of every possible combination of technical indicators that make up the seven components of the NNFX algorithm.
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Strategy with 1-3 take profits, trailing stop loss, multiple types of PnL volatility, and 2 backtesting styles
Baseline: Hull Moving Average as shown on the chart above
Volatility/Volume: Hurst Exponent
Confirmation 1: Fisher Transform
Confirmation 2: Williams Percent Range
Continuation: Fisher Transform
Exit: Rex Oscillator
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD protocol chain.
Giga Kaleidoscope Modularized Trading System Signals (based on the NNFX algorithm)
Standard Entry
1. GKD-C Confirmation 1 Signal
2. GKD-B Baseline agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
Baseline Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
6. GKD-C Confirmation 1 signal was less than 7 candles prior
Volatility/Volume Entry
1. GKD-V Volatility/Volume signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-B Baseline agrees
6. GKD-C Confirmation 1 signal was less than 7 candles prior
Continuation Entry
1. Standard Entry, Baseline Entry, or Pullback Entry triggered previously
2. GKD-B Baseline hasn't crossed since entry signal trigger
3. GKD-C Confirmation Continuation Indicator signals
4. GKD-C Confirmation 1 agrees
5. GKD-B Baseline agrees
6. GKD-C Confirmation 2 agrees
1-Candle Rule Standard Entry
1. GKD-C Confirmation 1 signal
2. GKD-B Baseline agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
Next Candle:
1. Price retraced (Long: close < close[1] or Short: close > close[1])
2. GKD-B Baseline agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
1-Candle Rule Baseline Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 1 signal was less than 7 candles prior
Next Candle:
1. Price retraced (Long: close < close[1] or Short: close > close[1])
2. GKD-B Baseline agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume Agrees
1-Candle Rule Volatility/Volume Entry
1. GKD-V Volatility/Volume signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 1 signal was less than 7 candles prior
Next Candle:
1. Price retraced (Long: close < close[1] or Short: close > close[1])
2. GKD-V Volatility/Volume agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-B Baseline agrees
PullBack Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is beyond 1.0x Volatility of Baseline
Next Candle:
1. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
2. GKD-C Confirmation 1 agrees
3. GKD-C Confirmation 2 agrees
4. GKD-V Volatility/Volume Agrees
█ Setting up the GKD
The GKD system involves chaining indicators together. These are the steps to set this up.
Use a GKD-C indicator alone on a chart
1. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Simple"
Use a GKD-V indicator alone on a chart
**nothing, it's already usable on the chart without any settings changes
Use a GKD-B indicator alone on a chart
**nothing, it's already usable on the chart without any settings changes
Baseline (Baseline, Backtest)
1. Import the GKD-B Baseline into the GKD-BT Backtest: "Input into Volatility/Volume or Backtest (Baseline testing)"
2. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Baseline"
Volatility/Volume (Volatility/Volume, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Solo"
2. Inside the GKD-V indicator, change the "Signal Type" setting to "Crossing" (neither traditional nor both can be backtested)
3. Import the GKD-V indicator into the GKD-BT Backtest: "Input into C1 or Backtest"
4. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Volatility/Volume"
5. Inside the GKD-BT Backtest, a) change the setting "Backtest Type" to "Trading" if using a directional GKD-V indicator; or, b) change the setting "Backtest Type" to "Full" if using a directional or non-directional GKD-V indicator (non-directional GKD-V can only test Longs and Shorts separately)
6. If "Backtest Type" is set to "Full": Inside the GKD-BT Backtest, change the setting "Backtest Side" to "Long" or "Short
7. If "Backtest Type" is set to "Full": To allow the system to open multiple orders at one time so you test all Longs or Shorts, open the GKD-BT Backtest, click the tab "Properties" and then insert a value of something like 10 orders into the "Pyramiding" settings. This will allow 10 orders to be opened at one time which should be enough to catch all possible Longs or Shorts.
Solo Confirmation Simple (Confirmation, Backtest)
1. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Simple"
2. Import the GKD-C indicator into the GKD-BT Backtest: "Input into Backtest"
3. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Solo Confirmation Simple"
Solo Confirmation Complex without Exits (Baseline, Volatility/Volume, Confirmation, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Complex"
4. Import the GKD-V indicator into the GKD-C indicator: "Input into C1 or Backtest"
5. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full wo/ Exits"
6. Import the GKD-C into the GKD-BT Backtest: "Input into Exit or Backtest"
Solo Confirmation Complex with Exits (Baseline, Volatility/Volume, Confirmation, Exit, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Complex"
4. Import the GKD-V indicator into the GKD-C indicator: "Input into C1 or Backtest"
5. Import the GKD-C indicator into the GKD-E indicator: "Input into Exit"
6. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full w/ Exits"
7. Import the GKD-E into the GKD-BT Backtest: "Input into Backtest"
Full GKD without Exits (Baseline, Volatility/Volume, Confirmation 1, Confirmation 2, Continuation, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C 1 indicator, change the "Confirmation Type" setting to "Confirmation 1"
4. Import the GKD-V indicator into the GKD-C 1 indicator: "Input into C1 or Backtest"
5. Inside the GKD-C 2 indicator, change the "Confirmation Type" setting to "Confirmation 2"
6. Import the GKD-C 1 indicator into the GKD-C 2 indicator: "Input into C2"
7. Inside the GKD-C Continuation indicator, change the "Confirmation Type" setting to "Continuation"
8. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full wo/ Exits"
9. Import the GKD-E into the GKD-BT Backtest: "Input into Exit or Backtest"
Full GKD with Exits (Baseline, Volatility/Volume, Confirmation 1, Confirmation 2, Continuation, Exit, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C 1 indicator, change the "Confirmation Type" setting to "Confirmation 1"
4. Import the GKD-V indicator into the GKD-C 1 indicator: "Input into C1 or Backtest"
5. Inside the GKD-C 2 indicator, change the "Confirmation Type" setting to "Confirmation 2"
6. Import the GKD-C 1 indicator into the GKD-C 2 indicator: "Input into C2"
7. Inside the GKD-C Continuation indicator, change the "Confirmation Type" setting to "Continuation"
8. Import the GKD-C Continuation indicator into the GKD-E indicator: "Input into Exit"
9. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full w/ Exits"
10. Import the GKD-E into the GKD-BT Backtest: "Input into Backtest"
Baseline + Volatility/Volume (Baseline, Volatility/Volume, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Baseline + Volatility/Volume"
2. Inside the GKD-V indicator, make sure the "Signal Type" setting is set to "Traditional"
3. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
4. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Baseline + Volatility/Volume"
5. Import the GKD-V into the GKD-BT Backtest: "Input into C1 or Backtest"
6. Inside the GKD-BT Backtest, change the setting "Backtest Type" to "Full". For this backtest, you must test Longs and Shorts separately
7. To allow the system to open multiple orders at one time so you can test all Longs or Shorts, open the GKD-BT Backtest, click the tab "Properties" and then insert a value of something like 10 orders into the "Pyramiding" settings. This will allow 10 orders to be opened at one time which should be enough to catch all possible Longs or Shorts.
Requirements
Outputs
Chained or Standalone: GKD-BT or GKD-V
Stack 1: GKD-C Continuation indicator
Stack 2: GKD-C Continuation indicator
GKD-V Volatility Ratio [Loxx]Giga Kaleidoscope Volatility Ratio is a Volatility/Volume module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
What is Loxx's "Giga Kaleidoscope Modularized Trading System"?
The Giga Kaleidoscope Modularized Trading System is a trading system built on the philosophy of NNFX (No Nonsense Forex) algorithmic trading.
What is an NNFX algorithmic trading strategy?
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are six core components in the NNFX trading algorithm:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend (such as "Baseline" shown on the chart above)
3. Confirmation 1 - a technical indicator used to identify trends. This should agree with the "Baseline"
4. Confirmation 2 - a technical indicator used to identify trends. This filters/verifies the trend identified by "Baseline" and "Confirmation 1"
5. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown.
6. Exit - a technical indicator used to determine when a trend is exhausted.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v1.0 system has five types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 module (Confirmation 1/2, Numbers 3 and 4 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 5 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 6 in the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with every module by passing data between modules. Data is passed between each module as described below:
GKD-B => GKD-V => GKD-C(1) => GKD-C(2) => GKD-E => GKD-BT
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, therefore allowing for the testing of every possible combination of technical indicators that make up the six components of the NNFX algorithm.
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Strategy with 1-3 take profits, trailing stop loss, multiple types of PnL volatility, and 2 backtesting styles
Baseline: Hull Moving Average
Volatility/Volume: Volatility Ratio as shown on the chart above
Confirmation 1: Vortex
Confirmation 2: Fisher Transform
Exit: Rex Oscillator
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD protocol chain.
Now that you have a general understanding of the NNFX algorithm and the GKD trading system, let's go over what's inside the GKD-V Volatility Ratio itself.
What is Volatility Ratio?
Volatility Ratio is a comparison between volatility and its moving average. This indicator includes 11 different types of volatility as well as 63 different moving averages.
You can read about the moving average types here:
Volatility Types Included
v1.0 Included Volatility
Close-to-Close
Close-to-Close volatility is a classic and most commonly used volatility measure, sometimes referred to as historical volatility .
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a bigger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only a stock's closing prices. It is the simplest volatility estimator, but in many cases it is not precise enough. Stock prices could jump considerably during a trading session and return to the open value at the end, which means that a large amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close to close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than the close-close volatility .
One drawback of this estimator is that it doesn't take into account price movements after market close. Hence it systematically undervalues volatility . That drawback is taken into account in the Garman-Klass's volatility estimator.
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchell have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike the Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (mean return not equal to zero). As a result, it provides a better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. It means an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchell formula in the 2000s by Yang and Zhang. See Yang Zhang Volatility for more detail.
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight (close-to-open) volatility and a weighted average of the Rogers-Satchell volatility and the day's open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator consists of using the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e. it assumes that the underlying asset follows a GBM process with zero drift. Therefore the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
Standard Deviation of Log Returns
This is the simplest calculation of volatility: the standard deviation of ln(close / close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var / (M + 1 − (M − 1)L) + 2(1 − θ)·var / (N + 1 − (N − 1)L), where L is the lag operator.
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1var - avg (var; N) against avg (var; M) - avg (var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Signals
1. Traditional: The traditional signal indicates whether volatility is high or low; it is non-directional. When the Volatility Ratio is above 1, there is enough volatility to trade long or short. This signal type has a "bars rising" option that requires the Volatility Ratio to be not only above 1 but also rising for the last XX bars. (See the sketch after this list.)
2. Crossing: This is experimental. When a cross up above 1 or a cross down below 1 occurs, then, and only then, is there enough volatility to trade long or short. This is also non-directional.
3. Both Traditional and Crossing
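A hedged Pine Script v5 sketch of the two signal types described above; ATR and an SMA stand in for whichever volatility type and moving average are actually selected in the indicator, and the default lengths are illustrative.

```
//@version=5
indicator("Volatility Ratio Signals (sketch)")
volLen     = input.int(14, "Volatility Length")
maLen      = input.int(30, "MA Length")
risingBars = input.int(3, "Bars Rising")
vol = ta.atr(volLen)                          // ATR stands in for any of the volatility types above
ratio = vol / ta.sma(vol, maLen)              // SMA stands in for any of the 63 moving averages
traditionalOk = ratio > 1.0 and ta.rising(ratio, risingBars)
crossSignal = ta.crossover(ratio, 1.0) or ta.crossunder(ratio, 1.0)
plot(ratio, "Volatility Ratio")
hline(1.0, "Threshold")
plotshape(traditionalOk, "Traditional", style=shape.circle, location=location.bottom, color=color.green)
plotshape(crossSignal, "Crossing", style=shape.triangleup, location=location.top, color=color.blue)
```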
X-bar Rule
If a signal registers XX bars ago, then the signal is still valid. This is an optional feature.
Other things to note
The GKD trading system requires that a GKD-V indicator be present in the indicator chain, but the GKD-V indicator doesn't need to be active. You can turn on/off the Volatility Ratio as you wish so you can backtest your trading strategy with the filter on or off.
Additional features will be added in future releases.
This indicator is only available to ALGX Trading VIP group members . You can see the Author's Instructions below to get more information on how to get access.
GKD-B Baseline [Loxx]Giga Kaleidoscope Baseline is a Baseline module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
What is Loxx's "Giga Kaleidoscope Modularized Trading System"?
The Giga Kaleidoscope Modularized Trading System is a trading system built on the philosophy of NNFX (No Nonsense Forex) algorithmic trading.
What is an NNFX algorithmic trading strategy?
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are six core components in the NNFX trading algorithm:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend (such as "Baseline" shown on the chart above)
3. Confirmation 1 - a technical indicator used to identify trend. This should agree with the "Baseline"
4. Confirmation 2 - a technical indicator used to identify trend. This filters/verifies the trend identified by "Baseline" and "Confirmation 1"
5. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown.
6. Exit - a technical indicator used to determine when trend is exhausted.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v1.0 system has five types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 module (Confirmation 1/2, Numbers 3 and 4 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 5 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 6 in the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with every module by passing data between modules. Data is passed between each module as described below:
GKD-B => GKD-V => GKD-C(1) => GKD-C(2) => GKD-E => GKD-BT
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, therefore allowing for the testing of every possible combination of technical indicators that make up the six components of the NNFX algorithm.
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Strategy with 1-3 take profits, trailing stop loss, multiple types of PnL volatility, and 2 backtesting styles
Baseline: Hull Moving Average as shown on the chart above
Volatility/Volume: Jurik Volty
Confirmation 1: Vortex
Confirmation 2: Fisher Transform
Exit: Rex Oscillator
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD protocol chain.
Now that you have a general understanding of the NNFX algorithm and the GKD trading system, let's go over what's inside the GKD-B Baseline itself.
GKD Baseline Special Features and Notable Inputs
GKD Baseline v1.0 includes 63 different moving averages:
Adaptive Moving Average - AMA
ADXvma - Average Directional Volatility Moving Average
Ahrens Moving Average
Alexander Moving Average - ALXMA
Deviation Scaled Moving Average - DSMA
Donchian
Double Exponential Moving Average - DEMA
Double Smoothed Exponential Moving Average - DSEMA
Double Smoothed FEMA - DSFEMA
Double Smoothed Range Weighted EMA - DSRWEMA
Double Smoothed Wilders EMA - DSWEMA
Double Weighted Moving Average - DWMA
Ehlers Optimal Tracking Filter - EOTF
Exponential Moving Average - EMA
Fast Exponential Moving Average - FEMA
Fractal Adaptive Moving Average - FRAMA
Generalized DEMA - GDEMA
Generalized Double DEMA - GDDEMA
Hull Moving Average (Type 1) - HMA1
Hull Moving Average (Type 2) - HMA2
Hull Moving Average (Type 3) - HMA3
Hull Moving Average (Type 4) - HMA4
IE/2 - Early T3 by Tim Tillson
Integral of Linear Regression Slope - ILRS
Instantaneous Trendline
Kalman Filter
Kaufman Adaptive Moving Average - KAMA
Laguerre Filter
Leader Exponential Moving Average
Linear Regression Value - LSMA ( Least Squares Moving Average )
Linear Weighted Moving Average - LWMA
McGinley Dynamic
McNicholl EMA
Non-Lag Moving Average
Ocean NMA Moving Average - ONMAMA
Parabolic Weighted Moving Average
Probability Density Function Moving Average - PDFMA
Quadratic Regression Moving Average - QRMA
Regularized EMA - REMA
Range Weighted EMA - RWEMA
Recursive Moving Trendline
Simple Decycler - SDEC
Simple Jurik Moving Average - SJMA
Simple Moving Average - SMA
Sine Weighted Moving Average
Smoothed LWMA - SLWMA
Smoothed Moving Average - SMMA
Smoother
Super Smoother
T3
Three-pole Ehlers Butterworth
Three-pole Ehlers Smoother
Triangular Moving Average - TMA
Triple Exponential Moving Average - TEMA
Two-pole Ehlers Butterworth
Two-pole Ehlers smoother
Variable Index Dynamic Average - VIDYA
Variable Moving Average - VMA
Volume Weighted EMA - VEMA
Volume Weighted Moving Average - VWMA
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
Zero-Lag Moving Average
Zero Lag TEMA - Zero Lag Triple Exponential Moving Average
Adaptive Moving Average - AMA
Description. The Adaptive Moving Average (AMA) is a moving average that changes its sensitivity to price moves depending on the calculated volatility. It becomes more sensitive during periods when the price is moving smoothly in a certain direction and becomes less sensitive when the price is volatile.
ADXvma - Average Directional Volatility Moving Average
Linnsoft's ADXvma formula is a volatility-based moving average, with the volatility being determined by the value of the ADX indicator.
The ADXvma has the SMA in Chande's CMO replaced with an EMA , it then uses a few more layers of EMA smoothing before the "Volatility Index" is calculated.
A side effect is, those additional layers slow down the ADXvma when you compare it to Chande's Variable Index Dynamic Average VIDYA .
The ADXVMA provides support during uptrends and resistance during downtrends and will stay flat for longer, but will create some of the most accurate market signals when it decides to move.
Ahrens Moving Average
Richard D. Ahrens's Moving Average promises "Smoother Data" that isn't influenced by the occasional price spike. It works by using the Open and the Close in his formula so that the only time the Ahrens Moving Average will change is when the candlestick is either making new highs or new lows.
Alexander Moving Average - ALXMA
This Moving Average uses an elaborate smoothing formula and utilizes a 7 period Moving Average. It corresponds to fitting a second-order polynomial to seven consecutive observations. This moving average is rarely used in trading but is interesting as this Moving Average has been applied to diffusion indexes that tend to be very volatile.
Deviation Scaled Moving Average - DSMA
The Deviation-Scaled Moving Average is a data smoothing technique that acts like an exponential moving average with a dynamic smoothing coefficient. The smoothing coefficient is automatically updated based on the magnitude of price changes. In the Deviation-Scaled Moving Average, the standard deviation from the mean is chosen to be the measure of this magnitude. The resulting indicator provides substantial smoothing of the data even when price changes are small while quickly adapting to these changes.
Donchian
Donchian Channels are three lines generated by moving average calculations that comprise an indicator formed by upper and lower bands around a midrange or median band. The upper band marks the highest price of a security over N periods while the lower band marks the lowest price of a security over N periods.
Double Exponential Moving Average - DEMA
The Double Exponential Moving Average (DEMA) combines a smoothed EMA and a single EMA to provide a low-lag indicator. Its primary purpose is to reduce the number of "lagging entry" opportunities, and like all Moving Averages, the DEMA confirms uptrends whenever price crosses on top of it and closes above it, and confirms downtrends when the price crosses under it and closes below it - but with significantly less lag.
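A minimal Pine Script v5 sketch of the standard DEMA construction, with an illustrative 21-bar length.

```
//@version=5
indicator("DEMA (sketch)", overlay=true)
len = input.int(21, "Length")
e1 = ta.ema(close, len)
e2 = ta.ema(e1, len)
dema = 2.0 * e1 - e2      // the extra EMA term subtracts most of the single EMA's lag
plot(dema, "DEMA")
```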
Double Smoothed Exponential Moving Average - DSEMA
The Double Smoothed Exponential Moving Average is a lot less laggy compared to a traditional EMA . It's also considered a leading indicator compared to the EMA , and is best utilized whenever smoothness and speed of reaction to market changes are required.
Double Smoothed FEMA - DSFEMA
Same as the Double Exponential Moving Average (DEMA), but uses a faster version of EMA for its calculation.
Double Smoothed Range Weighted EMA - DSRWEMA
The range-weighted exponential moving average (EMA) is, unlike the "regular" range-weighted average, calculated in a different way. Even though the basis - the range weighting - is the same, the calculation is completely different. By definition, this type of EMA is calculated as a ratio of the EMA of price*weight to the EMA of weight. The results are very different, and the two should be considered completely different types of averages. The greater-than-EMA responsiveness to price changes when ranges increase remains in this EMA too, and in those cases this EMA clearly leads the "regular" EMA. This version includes double smoothing.
Double Smoothed Wilders EMA - DSWEMA
Welles Wilder frequently used one "special" case of the EMA (Exponential Moving Average), which, because he used it, is sometimes called Wilder's EMA. This version adds double smoothing to Wilder's EMA in order to make it "faster" (more responsive to market prices than the original) while still keeping very smooth values.
Double Weighted Moving Average - DWMA
The double weighted moving average is an LWMA (Linear Weighted Moving Average). Instead of running one cycle to calculate the LWMA, the indicator cycles the loop twice. That produces smoother values than the original LWMA.
Ehlers Optimal Tracking Filter - EOTF
The Ehlers Optimum Tracking Filter quickly adjusts to rapid shifts in the price and yet is relatively smooth when the price has a sideways action. The operation of this filter is similar to Kaufman's Adaptive Moving Average.
Exponential Moving Average - EMA
The EMA places more significance on recent data points and moves closer to price than the SMA ( Simple Moving Average ). It reacts faster to volatility due to its emphasis on recent data and is known for its ability to give greater weight to recent and more relevant data. The EMA is therefore seen as an enhancement over the SMA .
Fast Exponential Moving Average - FEMA
An Exponential Moving Average with a short look-back period.
Fractal Adaptive Moving Average - FRAMA
The Fractal Adaptive Moving Average by John Ehlers is an intelligent adaptive Moving Average which takes the importance of price changes into account and follows price closely enough to display significant moves whilst remaining flat if price ranges. The FRAMA does this by dynamically adjusting the look-back period based on the market's fractal geometry.
Generalized DEMA - GDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages." Instead of using a fixed multiplication factor in the final DEMA formula, the generalized version allows you to change it. By varying the "volume factor" from 0 to 1 you apply different multiplications, producing DEMAs with different "speed" - the higher the volume factor, the "faster" the DEMA (but also the less smooth its slope). The volume factor is limited in the calculation to 1, since any volume factor larger than 1 increases the overshooting to the extent that some volume factor values make the indicator unusable.
Generalized Double DEMA - GDDEMA
The double exponential moving average (DEMA), was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages''. This is an extension of the Generalized DEMA using Tim Tillsons (the inventor of T3) idea, and is using GDEMA of GDEMA for calculation (which is the "middle step" of T3 calculation). Since there are no versions showing that middle step, this version covers that too. The result is smoother than Generalized DEMA, but is less smooth than T3 - one has to do some experimenting in order to find the optimal way to use it, but in any case, since it is "faster" than the T3 (Tim Tillson T3) and still smooth, it looks like a good compromise between speed and smoothness.
Hull Moving Average (Type 1) - HMA1
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMA for smoothing (see the sketch after the four HMA types below).
Hull Moving Average (Type 2) - HMA2
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses EMA for smoothing.
Hull Moving Average (Type 3) - HMA3
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses LWMA for smoothing.
Hull Moving Average (Type 4) - HMA4
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMMA for smoothing.
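A minimal Pine Script v5 sketch of the Hull construction with the four final-smoothing variants described above; ta.rma is used as the closest built-in to SMMA, and the 21-bar length is illustrative.

```
//@version=5
indicator("Hull Moving Average (sketch)", overlay=true)
len = input.int(21, "Length")
halfLen = math.max(1, len / 2)                  // integer division of the length
sqrtLen = math.max(1, int(math.sqrt(len)))      // truncated square root of the length
raw  = 2.0 * ta.wma(close, halfLen) - ta.wma(close, len)
hma1 = ta.sma(raw, sqrtLen)    // Type 1: SMA final smoothing
hma2 = ta.ema(raw, sqrtLen)    // Type 2: EMA final smoothing
hma3 = ta.wma(raw, sqrtLen)    // Type 3: LWMA final smoothing (the classic HMA)
hma4 = ta.rma(raw, sqrtLen)    // Type 4: SMMA final smoothing (ta.rma used here)
plot(hma1, "HMA Type 1", color=color.gray)
plot(hma2, "HMA Type 2", color=color.teal)
plot(hma3, "HMA Type 3 (classic)", color=color.blue)
plot(hma4, "HMA Type 4", color=color.orange)
```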
IE/2 - Early T3 by Tim Tillson and T3 New
T3 is basically an EMA on steroids. You can read about T3 here:
Integral of Linear Regression Slope - ILRS
A Moving Average where the slope of a linear regression line is simply integrated as it is fitted in a moving window of length N (natural numbers in maths) across the data. The derivative of ILRS is the linear regression slope. ILRS is not the same as a SMA ( Simple Moving Average ) of length N, which is actually the midpoint of the linear regression line as it moves across the data.
Instantaneous Trendline
The Instantaneous Trendline is created by removing the dominant cycle component from the price information which makes this Moving Average suitable for medium to long-term trading.
Kalman Filter
Kalman filter is an algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies. This means that the filter was originally designed to work with noisy data. Also, it is able to work with incomplete data. Another advantage is that it is designed for and applied in dynamic systems; our price chart belongs to such systems. This version is true to the original design of the trade-ready Kalman Filter where velocity is the triggering mechanism.
Kalman Filter is a more accurate smoothing/prediction algorithm than the moving average because it is adaptive: it accounts for estimation errors and tries to adjust its predictions from the information it learned in the previous stage. Theoretically, Kalman Filter consists of measurement and transition components.
Kaufman Adaptive Moving Average - KAMA
Developed by Perry Kaufman, Kaufman's Adaptive Moving Average (KAMA) is a moving average designed to account for market noise or volatility. KAMA will closely follow prices when the price swings are relatively small and the noise is low.
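A minimal Pine Script v5 sketch of Kaufman's construction; the 10-bar efficiency-ratio window and the conventional 2/30 fast/slow lengths are illustrative defaults.

```
//@version=5
indicator("KAMA (sketch)", overlay=true)
erLen   = input.int(10, "Efficiency Ratio Length")
fastLen = input.int(2, "Fast Length")
slowLen = input.int(30, "Slow Length")
totalMove = math.sum(math.abs(ta.change(close)), erLen)
er = totalMove != 0 ? math.abs(close - close[erLen]) / totalMove : 0.0   // efficiency ratio
sc = math.pow(er * (2.0 / (fastLen + 1) - 2.0 / (slowLen + 1)) + 2.0 / (slowLen + 1), 2)   // smoothing constant
var float kama = na
kama := na(kama) ? close : kama + sc * (close - kama)
plot(kama, "KAMA")
```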
Laguerre Filter
The Laguerre Filter is a smoothing filter based on Laguerre polynomials. The filter requires the current price, three prior prices, and a user-defined factor called Alpha to complete its calculation.
Adjusting the Alpha coefficient increases or decreases its lag and its smoothness.
Leader Exponential Moving Average
The Leader EMA was created by Giorgos E. Siligardos who created a Moving Average which was able to eliminate lag altogether whilst maintaining some smoothness. It was first described during his research paper "MACD Leader" where he applied this to the MACD to improve its signals and remove its lagging issue. This filter uses his leading MACD's "modified EMA" and can be used as a zero lag filter.
Linear Regression Value - LSMA ( Least Squares Moving Average )
LSMA as a Moving Average is based on plotting the end point of the linear regression line. It compares the current value to the prior value and a determination is made of a possible trend, eg. the linear regression line is pointing up or down.
Linear Weighted Moving Average - LWMA
LWMA reacts to price quicker than the SMA and EMA . Although it's similar to the Simple Moving Average , the difference is that a weight coefficient is multiplied to the price which means the most recent price has the highest weighting, and each prior price has progressively less weight. The weights drop in a linear fashion.
McGinley Dynamic
John McGinley created this Moving Average to track prices better than traditional Moving Averages. It does this by incorporating an automatic adjustment factor into its formula, which speeds (or slows) the indicator in trending, or ranging, markets.
McNicholl EMA
Dennis McNicholl developed this Moving Average to use as his center line for his "Better Bollinger Bands" indicator and was successful because it responded better to volatility changes over the standard SMA and managed to avoid common whipsaws.
Non-lag moving average
The Non Lag Moving Average follows price closely and gives very quick signals as well as early signals of price change. It should not be used on its own, but as an additional confluence tool for early signals.
Ocean NMA Moving Average - ONMAMA
Created by Jim Sloman, the NMA is a moving average that automatically adjusts to volatility without being programmed to do so. For more info, read his guide "Ocean Theory, an Introduction".
Parabolic Weighted Moving Average
The Parabolic Weighted Moving Average is a variation of the Linear Weighted Moving Average . The Linear Weighted Moving Average calculates the average by assigning different weights to each element in its calculation. The Parabolic Weighted Moving Average is a variation that allows weights to be changed to form a parabolic curve. It is done simply by using the Power parameter of this indicator.
Probability Density Function Moving Average - PDFMA
Probability density function based MA is a sort of weighted moving average that uses a probability density function to calculate the weights. By its nature it is similar to a lot of digital filters.
Quadratic Regression Moving Average - QRMA
A quadratic regression is the process of finding the equation of the parabola that best fits a set of data. This moving average is an obscure concept that was posted to Forex forums around 2008.
Regularized EMA - REMA
The regularized exponential moving average (REMA) by Chris Satchwell is a variation on the EMA (see Exponential Moving Average) designed to be smoother but not introduce too much extra lag.
Range Weighted EMA - RWEMA
This indicator is a variation of the range weighted EMA. The variation comes from a possible need to make that indicator a bit less "noisy" when it comes to slope changes. The method used for calculating this variation is the method described by Lee Leibfarth in his article "Trading With An Adaptive Price Zone".
Recursive Moving Trendline
Dennis Meyers's Recursive Moving Trendline uses a recursive (repeated application of a rule) polynomial fit, a technique that uses a small number of past price estimates together with today's price to predict tomorrow's price.
Simple Decycler - SDEC
The Ehlers Simple Decycler study is a virtually zero-lag technical indicator proposed by John F. Ehlers. The original idea behind this study (and several others created by John F. Ehlers) is that market data can be considered a continuum of cycle periods with different cycle amplitudes. Thus, trending periods can be considered segments of longer cycles, or, in other words, low-frequency segments. Applying the right filter might help identify these segments.
Simple Loxx Moving Average - SLMA
A three-stage moving average combining an adaptive EMA, a Kalman Filter, and a Kaufman adaptive filter.
Simple Moving Average - SMA
The SMA calculates the average of a range of prices by adding recent prices and then dividing that figure by the number of time periods in the calculation average. It is the most basic Moving Average which is seen as a reliable tool for starting off with Moving Average studies. As reliable as it may be, the basic moving average will work better when it's enhanced into an EMA .
Sine Weighted Moving Average
The Sine Weighted Moving Average assigns the most weight to the middle of the data set. It does this by taking its weights from the first half of a Sine Wave cycle, so the most weight is given to the data in the middle of the data set. The Sine WMA closely resembles the TMA (Triangular Moving Average).
Smoothed LWMA - SLWMA
A smoothed version of the LWMA
Smoothed Moving Average - SMMA
The Smoothed Moving Average is similar to the Simple Moving Average ( SMA ), but aims to reduce noise rather than reduce lag. SMMA takes all prices into account and uses a long lookback period. Due to this, it's seen as an accurate yet laggy Moving Average.
Smoother
The Smoother filter is a faster-reacting smoothing technique which generates considerably less lag than the SMMA ( Smoothed Moving Average ). It gives earlier signals but can also create false signals due to its earlier reactions. This filter is sometimes mistaken for the superior Jurik Smoothing algorithm.
Super Smoother
The Super Smoother filter uses John Ehlers's "Super Smoother", which consists of a two-pole Butterworth filter combined with a 2-bar SMA ( Simple Moving Average ) that suppresses the Nyquist frequency: a characteristic of a sampler, which converts a continuous function or signal into a discrete sequence.
Three-pole Ehlers Butterworth
The 3 pole Ehlers Butterworth (as well as the Two pole Butterworth) are both superior alternatives to the EMA and SMA . They aim at producing less lag whilst maintaining accuracy. The 2 pole filter will give you a better approximation for price, whereas the 3 pole filter has superior smoothing.
Three-pole Ehlers smoother
The 3 pole Ehlers smoother works almost as close to price as the above mentioned 3 Pole Ehlers Butterworth. It acts as a strong baseline for signals but removes some noise. Side by side, it hardly differs from the Three Pole Ehlers Butterworth but when examined closely, it has better overshoot reduction compared to the 3 pole Ehlers Butterworth.
Triangular Moving Average - TMA
The TMA is similar to the EMA but uses a different weighting scheme. Exponential and weighted Moving Averages assign more weight to the most recent price data, while Simple Moving Averages assign the weight equally across all the price data. A TMA (Triangular Moving Average) is double smoothed (averaged twice), so the majority of the weight is assigned to the middle portion of the data.
Triple Exponential Moving Average - TEMA
The TEMA uses multiple EMA calculations as well as subtracting lag to create a tool which can be used for scalping pullbacks. As it follows price closely, its signals are considered very noisy and should only be used in extremely fast-paced trading conditions.
Two-pole Ehlers Butterworth
The 2 pole Ehlers Butterworth (as well as the three pole Butterworth mentioned above) is another filter that cuts out the noise and follows the price closely. The 2 pole is seen as a faster, leading filter over the 3 pole and follows price a bit more closely. Analysts will utilize both a 2 pole and a 3 pole Butterworth on the same chart using the same period, as having both on the chart allows their crosses to be traded.
Two-pole Ehlers smoother
A smoother version of the Two pole Ehlers Butterworth. It is the faster alternative to the 3 pole Ehlers Butterworth and does a decent job of cutting out market noise whilst following price more closely than the 3 pole Ehlers.
Variable Index Dynamic Average - VIDYA
Variable Index Dynamic Average Technical Indicator ( VIDYA ) was developed by Tushar Chande. It is an original method of calculating the Exponential Moving Average ( EMA ) with the dynamically changing period of averaging.
Variable Moving Average - VMA
The Variable Moving Average (VMA) is a study that uses an Exponential Moving Average being able to automatically adjust its smoothing factor according to the market volatility.
Volume Weighted EMA - VEMA
An EMA that uses a volume and price weighted calculation instead of the standard price input.
Volume Weighted Moving Average - VWMA
A Volume Weighted Moving Average is a moving average where more weight is given to bars with heavy volume than with light volume. Thus the value of the moving average will be closer to where most trading actually happened than it otherwise would be without being volume weighted.
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
John Ehlers's Zero Lag DEMA's aim is to eliminate the inherent lag associated with all trend following indicators which average a price over time. Because this is a Double Exponential Moving Average with Zero Lag, it has a tendency to overshoot and create a lot of false signals for swing trading. It can however be used for quick scalping or as a secondary indicator for confluence.
Zero-Lag Moving Average
The Zero Lag Moving Average is described by its creator, John Ehlers, as a Moving Average with absolutely no delay. For this reason, the filter produces many abrupt signals, which are not ideal for medium to long-term traders. It is designed to follow price as closely as possible while de-lagging the data, which it does by attempting to remove the cumulative lag effect of the Moving Average.
Zero-Lag TEMA - Zero Lag Triple Exponential Moving Average
Just like the Zero Lag DEMA , this filter will give you the fastest signals out of all the Zero Lag Moving Averages. This is useful for scalping but dangerous for medium to long-term traders, especially during market Volatility and news events. Having no lag, this filter also has no smoothing in its signals and can cause some very bizarre behavior when applied to certain indicators.
Exotic Triggers
This version of Baseline allows the user to select from exotic or source triggers. An exotic trigger determines trend by either slope or some other mechanism that is special to each moving average. A source trigger is one of 32 different source types from Loxx's Exotic Source Types. You can read about these source types here:
Volatility Goldie Locks Zone
This volatility filter is the standard first-pass filter used for all NNFX systems, despite the additional volatility/volume filter used in step 5. For this filter, price must fall into a range of maximum and minimum values calculated using multiples of volatility. Unlike standard NNFX systems, this version of volatility filtering is separated from the core Baseline and uses its own moving average with Loxx's Exotic Source Types. The green and red dots at the top of the chart denote whether a candle qualifies for a long or a short, respectively. The green and red triangles at the bottom of the chart denote whether the trigger has crossed up or down and qualifies inside the Goldie Locks zone. White coloring of the Goldie Locks Zone mean line marks where volatility is too low to trade.
Volatility Types Included
v1.0 Included Volatility
Close-to-Close
Close-to-Close volatility is a classic and most commonly used volatility measure, sometimes referred to as historical volatility .
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a bigger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only a stock's closing prices. It is the simplest volatility estimator. But in many cases, it is not precise enough. Stock prices could jump considerably during a trading session and return to the open value at the end. That means that a large amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
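For reference, a minimal Pine Script sketch of a close-to-close estimator is shown below. The 20-bar lookback and the 252-periods-per-year annualization factor are assumptions for illustration, not settings taken from this indicator.

```pine
//@version=5
indicator("Close-to-Close volatility (sketch)")
len    = input.int(20, "Lookback", minval = 2)
annual = input.float(252, "Periods per year")
logRet = math.log(close / close[1])                 // log return between closes
ctcVol = ta.stdev(logRet, len) * math.sqrt(annual)  // annualized close-to-close volatility
plot(ctcVol, "C2C volatility", color = color.blue)
```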
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close to close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than the close-close volatility .
One drawback of this estimator is that it doesn't take into account price movements after market close. Hence it systematically undervalues volatility . That drawback is taken into account in the Garman-Klass's volatility estimator.
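A minimal Pine Script sketch of the Parkinson estimator itself is shown below; the lookback and annualization inputs are assumptions, not the indicator's settings.

```pine
//@version=5
indicator("Parkinson volatility (sketch)")
len     = input.int(20, "Lookback", minval = 2)
annual  = input.float(252, "Periods per year")
hl      = math.pow(math.log(high / low), 2)   // squared log high/low range
parkVol = math.sqrt(ta.sma(hl, len) / (4 * math.log(2))) * math.sqrt(annual)
plot(parkVol, "Parkinson volatility", color = color.purple)
```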
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchell have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
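Before moving on, here is a minimal Pine Script sketch of the Garman-Klass estimator described above; as with the other sketches, the inputs are assumed values rather than this indicator's settings.

```pine
//@version=5
indicator("Garman-Klass volatility (sketch)")
len    = input.int(20, "Lookback", minval = 2)
annual = input.float(252, "Periods per year")
// Per-bar Garman-Klass term: 0.5*ln(H/L)^2 - (2*ln2 - 1)*ln(C/O)^2
gk     = 0.5 * math.pow(math.log(high / low), 2) - (2 * math.log(2) - 1) * math.pow(math.log(close / open), 2)
gkVol  = math.sqrt(ta.sma(gk, len)) * math.sqrt(annual)
plot(gkVol, "Garman-Klass volatility", color = color.orange)
```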
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike the Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (mean return not equal to zero). As a result, it provides a better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. It means an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchell formula in the 2000s by Yang and Zhang. See Yang Zhang Volatility for more detail.
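A minimal Pine Script sketch of the Rogers-Satchell estimator follows; inputs are assumed for illustration.

```pine
//@version=5
indicator("Rogers-Satchell volatility (sketch)")
len    = input.int(20, "Lookback", minval = 2)
annual = input.float(252, "Periods per year")
// Per-bar Rogers-Satchell term, which remains valid when the mean return is non-zero
rs     = math.log(high / close) * math.log(high / open) + math.log(low / close) * math.log(low / open)
rsVol  = math.sqrt(ta.sma(rs, len)) * math.sqrt(annual)
plot(rsVol, "Rogers-Satchell volatility", color = color.green)
```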
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight (close-to-open) volatility and a weighted average of the Rogers-Satchell volatility and the day's open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator consists of using the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e. it assumes that the underlying asset follows a GBM process with zero drift. Therefore the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
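The sketch below follows the convention used in this description, where lambda is the weight given to the newest squared log return; the default lambda and annualization factor are assumptions, not this indicator's settings.

```pine
//@version=5
indicator("EWMA volatility (sketch)")
lambda = input.float(0.1, "Lambda (weight of newest observation)", minval = 0.0, maxval = 1.0, step = 0.01)
annual = input.float(252, "Periods per year")
sqRet  = math.pow(math.log(close / close[1]), 2)   // squared log return
var float ewmaVar = na
ewmaVar := na(ewmaVar) ? sqRet : lambda * sqRet + (1 - lambda) * ewmaVar
plot(math.sqrt(ewmaVar) * math.sqrt(annual), "EWMA volatility", color = color.red)
```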
Standard Deviation of Log Returns
This is the simplest calculation of volatility. It's the standard deviation of ln(close / close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var / (M + 1 − (M − 1)L) + 2(1 − θ)·var / (N + 1 − (N − 1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1var - avg (var; N) against avg (var; M) - avg (var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
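A minimal Pine Script sketch of the true range and its Wilder smoothing is shown below, with the built-in ta.atr() plotted for comparison; the 14-bar length is the conventional default rather than a setting of this indicator.

```pine
//@version=5
indicator("ATR (sketch)")
len       = input.int(14, "ATR length", minval = 1)
prevClose = nz(close[1], close)
// True range: the greatest of (high - low), |high - previous close|, |low - previous close|
trueRange = math.max(high - low, math.max(math.abs(high - prevClose), math.abs(low - prevClose)))
atr       = ta.rma(trueRange, len)          // Wilder's smoothing of the true range
plot(atr, "ATR (manual)", color = color.blue)
plot(ta.atr(len), "ATR (built-in)", color = color.gray)
```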
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Additional features will be added in future releases.
This indicator is only available to ALGX Trading VIP group members . You can see the Author's Instructions below to get more information on how to get access.
Previous Day/Week High & Low + 50% w/ Alerts| by Octopu$
📈 Previous Day/Week High & Low + 50% w/ Alerts| by Octopu$
This Indicator includes Previous Day High and Low Levels and 50% (Half of High & Low)
As well as Previous Week High and Low Levels (Half of High & Low)
And also Pre-Market Session High and Low.
All of them with Built-in alerts.
Can be used in any timeframe with any ticker.
(Using SPY 5m just as an example:)
Features:
• D High: Green Top Line
• D Low: Red Bottom Line
• D 50%: White 50% Line
• Week High and Low: Blue Top and Bottom Lines
• Pre-Market and Afterhours Session: Gray Lines
• Labels for Identification
Options:
• Toggle on/off for Day High, Low and 50%
• Toggle on/off for Week High, Low and 50%
• Toggle on/off for PM and AH Sessions
• Show/Hide the Labels with names
• Show/Hide the Lines themselves
• Fully Customizable Style and Color
Alerts:
• Triggers for Day (above or below level)
• Triggers for Week (above or below level)
Notes:
v1.0
Release of the Indicator
Changes and updates can come in the future for additional functionalities or per requests.
Did you like it? Shoot me a message! I'd appreciate if you dropped by to say thanks.
- Octopu$
🐙
Syminfo [Epi]Hello! This little script tells you everything TradingView lets you access in a ticker's syminfo in Pine Script:
- description
- type: crypto, economic, forex, fund, futures, index, spread, stock
- tickerid, such as AMEX:BLOK
- prefix, such as AMEX
- Ticker, such as BLOK
- root: for derivatives such as futures contracts
- currency, such as USD
- base currency: returns 'BTC' for the ticker 'BTCUSD'
- mintick
- point value
- session: regular, extended
- timezone
Some surprises I found in my development:
- there are some more types than mentioned in the documentation,
- the tickerid takes on additional information if you adjust for dividends or show extended session,
- the prefix contains "_DL" additions depending on your data subscriptions, .e.g. "CME_MINI_DL:ES1!",
- with futures, TV will show session.regular both for the 'regular' and the 'electronic' session.
- Unfortunately, syminfo does not contain the 'sector', although TV has the information in its database (the sector is shown in the screener but is not accessible in Pine Script).
I use this little utility in my development and hope it's useful for the community. I see such a great number of contributions from the community and would like to give back, even if it's not much.
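Below is a minimal sketch of how a few of these fields can be read and printed on the chart. It is only an illustration of the syminfo namespace, not the published script.

```pine
//@version=5
indicator("Syminfo peek (sketch)", overlay = true)
if barstate.islast
    // Build a small multi-line summary of common syminfo fields
    txt = syminfo.tickerid + " (" + syminfo.type + ")"
    txt += "\nprefix: " + syminfo.prefix + "   ticker: " + syminfo.ticker
    txt += "\ncurrency: " + syminfo.currency + "   timezone: " + syminfo.timezone
    txt += "\nmintick: " + str.tostring(syminfo.mintick) + "   point value: " + str.tostring(syminfo.pointvalue)
    label.new(bar_index, high, txt, style = label.style_label_left)
```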
BD Markup V5BD Markup V5 — Multi-Time-Frame Extremes & Session Shading
Pine Script v6 • overlay = true
What the indicator does
BD Markup V5 automatically plots horizontal extreme levels (highs & lows) taken from completed candles on seven standard time-frames:
1 Year (Y) – Purple – “Show Yearly”
1 Month (M) – Red – “Show Monthly”
1 Week (W) – Orange – “Show Weekly”
1 Day (D) – Green – “Show Daily”
4 Hour (4H) – Black – “Show 4H”
1 Hour (H1) – Yellow – “Show H1”
15 Minutes (M15) – Maroon – “Show M15”
For every candle that closes the script:
Stores that candle’s high & low.
When a new candle of the same aggregation begins, draws two horizontal lines at those stored prices.
Extends each line to the right until price touches it; at the first touch the extension stops.
To keep the chart tidy you can set a Candle Count per time-frame so only the most-recent n highs & lows remain.
A Base Time-Frame filter hides all lower-TF lines (e.g. choosing “H4” hides H1 & M15).
The indicator also includes three custom intraday sessions whose backgrounds can be shaded with user-defined colours & transparency.
Key inputs
Base Time-Frame – show only levels whose aggregation order is equal or larger.
Show TF? check-boxes – enable/disable individual layers.
Colour pickers & Candle Count – customise appearance & history depth for each layer.
Chart Timezone – accepts any IANA name or UTC/GMT offset.
Session 1-3 – toggle, set session string (HHMM-HHMM), colour, and opacity.
How overlapping levels are handled
When a new line would be created at exactly the same price as an existing one:
If the new level comes from a higher time-frame, lower-TF duplicates are removed.
If it comes from a lower time-frame, the script skips drawing to preserve the higher-TF level.
This hierarchy keeps the chart clean and emphasises the most significant structure.
Practical uses
Highlight macro structure (yearly / monthly / weekly extremes).
Mark intraday reference points (previous session highs & lows).
Combine with your own strategies for bias, targets, or liquidity zones.
Visually compare price behaviour inside custom trading sessions.
Limitations & notes
Levels are based on completed candles only; they never repaint but always lag one full period.
Once price touches a level its extension stops, but the line remains until pruned by your Candle Count setting.
The script caps total line objects at 500 to stay within Pine runtime limits.
Disclaimer
This script is published for informational and educational purposes only.
It does not constitute financial advice and is not a solicitation to buy or sell any asset.
Trading and investing involve substantial risk; always perform your own research and consult a licensed professional before acting on any idea.
Past performance or historical levels do not guarantee future results.
By using this indicator you agree that the author is not liable for any loss or damage arising from its use.
Correlation Heatmap█ OVERVIEW
This indicator creates a correlation matrix for a user-specified list of symbols based on their time-aligned weekly or monthly price returns. It calculates the Pearson correlation coefficient for each possible symbol pair, and it displays the results in a symmetric table with heatmap-colored cells. This format provides an intuitive view of the linear relationships between various symbols' price movements over a specific time range.
█ CONCEPTS
Correlation
Correlation typically refers to an observable statistical relationship between two datasets. In a financial time series context, it usually represents the extent to which sampled values from a pair of datasets, such as two series of price returns, vary jointly over time. More specifically, in this context, correlation describes the strength and direction of the relationship between the samples from both series.
If two separate time series tend to rise and fall together proportionally, they might be highly correlated. Likewise, if the series often vary in opposite directions, they might have a strong anticorrelation . If the two series do not exhibit a clear relationship, they might be uncorrelated .
Traders frequently analyze asset correlations to help optimize portfolios, assess market behaviors, identify potential risks, and support trading decisions. For instance, correlation often plays a key role in diversification . When two instruments exhibit a strong correlation in their returns, it might indicate that buying or selling both carries elevated unsystematic risk . Therefore, traders often aim to create balanced portfolios of relatively uncorrelated or anticorrelated assets to help promote investment diversity and potentially offset some of the risks.
When using correlation analysis to support investment decisions, it is crucial to understand the following caveats:
• Correlation does not imply causation . Two assets might vary jointly over an analyzed range, resulting in high correlation or anticorrelation in their returns, but that does not indicate that either instrument directly influences the other. Joint variability between assets might occur because of shared sensitivities to external factors, such as interest rates or global sentiment, or it might be entirely coincidental. In other words, correlation does not provide sufficient information to identify cause-and-effect relationships.
• Correlation does not predict the future relationship between two assets. It only reflects the estimated strength and direction of the relationship between the current analyzed samples. Financial time series are ever-changing. A strong trend between two assets can weaken or reverse in the future.
Correlation coefficient
A correlation coefficient is a numeric measure of correlation. Several coefficients exist, each quantifying different types of relationships between two datasets. The most common and widely known measure is the Pearson product-moment correlation coefficient , also known as the Pearson correlation coefficient or Pearson's r . Usually, when the term "correlation coefficient" is used without context, it refers to this correlation measure.
The Pearson correlation coefficient quantifies the strength and direction of the linear relationship between two variables. In other words, it indicates how consistently variables' values move together or in opposite directions in a proportional, linear manner. Its formula is as follows:
𝑟(𝑥, 𝑦) = cov(𝑥, 𝑦) / (𝜎𝑥 * 𝜎𝑦)
Where:
• 𝑥 is the first variable, and 𝑦 is the second variable.
• cov(𝑥, 𝑦) is the covariance between 𝑥 and 𝑦.
• 𝜎𝑥 is the standard deviation of 𝑥.
• 𝜎𝑦 is the standard deviation of 𝑦.
In essence, the correlation coefficient measures the covariance between two variables, normalized by the product of their standard deviations. The coefficient's value ranges from -1 to 1, allowing a more straightforward interpretation of the relationship between two datasets than what covariance alone provides:
• A value of 1 indicates a perfect positive correlation over the analyzed sample. As one variable's value changes, the other variable's value changes proportionally in the same direction .
• A value of -1 indicates a perfect negative correlation (anticorrelation). As one variable's value increases, the other variable's value decreases proportionally.
• A value of 0 indicates no linear relationship between the variables over the analyzed sample.
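Pine Script exposes this coefficient directly through ta.correlation(). The sketch below computes a rolling Pearson correlation between the chart symbol's simple returns and those of a second, hypothetical symbol; note that it does not perform the time alignment described in the next section, so it is only a naive illustration of the formula.

```pine
//@version=5
indicator("Rolling Pearson correlation (sketch)")
otherSym = input.symbol("NASDAQ:MSFT", "Second symbol")   // hypothetical example symbol
len      = input.int(20, "Correlation length", minval = 2)
retA = close / close[1] - 1
retB = request.security(otherSym, timeframe.period, close / close[1] - 1)
r    = ta.correlation(retA, retB, len)                    // Pearson r over the last `len` bars
plot(r, "Pearson r", color = color.blue)
hline(0)
```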
Aligning returns across instruments
In a financial time series, each data point (i.e., bar) in a sample represents information collected in periodic intervals. For instance, on a "1D" chart, bars form at specific times as successive days elapse.
However, the times of the data points for a symbol's standard dataset depend on its active sessions , and sessions vary across instrument types. For example, the daily session for NYSE stocks is 09:30 - 16:00 UTC-4/-5 on weekdays, Forex instruments have 24-hour sessions that span from 17:00 UTC-4/-5 on one weekday to 17:00 on the next, and new daily sessions for cryptocurrencies start at 00:00 UTC every day because crypto markets are consistently open.
Therefore, comparing the standard datasets for different asset types to identify correlations presents a challenge. If two symbols' datasets have bars that form at unaligned times, their correlation coefficient does not accurately describe their relationship. When calculating correlations between the returns for two assets, both datasets must maintain consistent time alignment in their values and cover identical ranges for meaningful results.
To address the issue of time alignment across instruments, this indicator requests confirmed weekly or monthly data from spread tickers constructed from the chart's ticker and another specified ticker. The datasets for spreads are derived from lower-timeframe data to ensure the values from all symbols come from aligned points in time, allowing a fair comparison between different instrument types. Additionally, each spread ticker ID includes necessary modifiers, such as extended hours and adjustments.
In this indicator, we use the following process to retrieve time-aligned returns for correlation calculations:
1. Request the current and previous prices from a spread representing the sum of the chart symbol and another symbol ( "chartSymbol + anotherSymbol" ).
2. Request the prices from another spread representing the difference between the two symbols ( "chartSymbol - anotherSymbol" ).
3. Calculate half of the difference between the values from both spreads ( 0.5 * (requestedSum - requestedDifference) ). The results represent the symbol's prices at times aligned with the sample points on the current chart.
4. Calculate the arithmetic return of the retrieved prices: (currentPrice - previousPrice) / previousPrice
5. Repeat steps 1-4 for each symbol requiring analysis.
It's crucial to note that because this process retrieves prices for a symbol at times consistent with periodic points on the current chart, the values can represent prices from before or after the closing time of the symbol's usual session.
Additionally, note that the maximum number of weeks or months in the correlation calculations depends on the chart's range and the largest time range common to all the requested symbols. To maximize the amount of data available for the calculations, we recommend setting the chart to use a daily or higher timeframe and specifying a chart symbol that covers a sufficient time range for your needs.
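As a rough illustration of steps 1-5 above, the sketch below derives one symbol's chart-aligned price from the two spread requests and computes its arithmetic return. It is a simplified sketch only: the default symbol is hypothetical, and the confirmed-bar handling, extended-hours and dividend-adjustment modifiers used by the published script are omitted.

```pine
//@version=5
indicator("Spread-aligned return (sketch)")
otherSym = input.symbol("NASDAQ:MSFT", "Other symbol")    // hypothetical example symbol
tf       = input.timeframe("1M", "Returns timeframe")

sumTicker  = syminfo.tickerid + "+" + otherSym            // "chartSymbol + otherSymbol" spread
diffTicker = syminfo.tickerid + "-" + otherSym            // "chartSymbol - otherSymbol" spread

[sumNow, sumPrev]   = request.security(sumTicker,  tf, [close, close[1]])
[diffNow, diffPrev] = request.security(diffTicker, tf, [close, close[1]])

alignedNow  = 0.5 * (sumNow  - diffNow)                   // other symbol's price at chart-aligned times
alignedPrev = 0.5 * (sumPrev - diffPrev)
alignedRet  = (alignedNow - alignedPrev) / alignedPrev    // arithmetic return (step 4)

plot(alignedRet, "Aligned return", color = color.teal)
```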
█ FEATURES
This indicator analyzes the correlations between several pairs of user-specified symbols to provide a structured, intuitive view of the relationships in their returns. Below are the indicator's key features:
Requesting a list of securities
The "Symbol list" text box in the indicator's "Settings/Inputs" tab accepts a comma-separated list of symbols or ticker identifiers with optional spaces (e.g., "XOM, MSFT, BITSTAMP:BTCUSD"). The indicator dynamically requests returns for each symbol in the list, then calculates the correlation between each pair of return series for its heatmap display.
Each item in the list must represent a valid symbol or ticker ID. If the list includes an invalid symbol, the script raises a runtime error.
To specify a broker/exchange for a symbol, include its name as a prefix with a colon in the "EXCHANGE:SYMBOL" format. If a symbol in the list does not specify an exchange prefix, the indicator selects the most commonly used exchange when requesting the data.
Note that the number of symbols allowed in the list depends on the user's plan. Users with non-professional plans can compare up to 20 symbols with this indicator, and users with professional plans can compare up to 32 symbols.
Timeframe and data length selection
The "Returns timeframe" input specifies whether the indicator uses weekly or monthly returns in its calculations. By default, its value is "1M", meaning the indicator analyzes monthly returns. Note that this script requires a chart timeframe lower than or equal to "1M". If the chart uses a higher timeframe, it causes a runtime error.
To customize the length of the data used in the correlation calculations, use the "Max periods" input. When enabled, the indicator limits the calculation window to the number of periods specified in the input field. Otherwise, it uses the chart's time range as the limit. The top-left corner of the table shows the number of confirmed weeks or months used in the calculations.
It's important to note that the number of confirmed periods in the correlation calculations is limited to the largest time range common to all the requested datasets, because a meaningful correlation matrix requires analyzing each symbol's returns under the same market conditions. Therefore, the correlation matrix can show different results for the same symbol pair if another listed symbol restricts the aligned data to a shorter time range.
Heatmap display
This indicator displays the correlations for each symbol pair in a heatmap-styled table representing a symmetric correlation matrix. Each row and column corresponds to a specific symbol, and the cells at their intersections correspond to symbol pairs . For example, the cell at the "AAPL" row and "MSFT" column shows the weekly or monthly correlation between those two symbols' returns. Likewise, the cell at the "MSFT" row and "AAPL" column shows the same value.
Note that the main diagonal cells in the display, where the row and column refer to the same symbol, all show a value of 1 because any series of non-na data is always perfectly correlated with itself.
The background of each correlation cell uses a gradient color based on the correlation value. By default, the gradient uses blue hues for positive correlation, orange hues for negative correlation, and white for no correlation. The intensity of each blue or orange hue corresponds to the strength of the measured correlation or anticorrelation. Users can customize the gradient's base colors using the inputs in the "Color gradient" section of the "Settings/Inputs" tab.
█ FOR Pine Script® CODERS
• This script uses the `getArrayFromString()` function from our ValueAtTime library to process the input list of symbols. The function splits the "string" value by its commas, then constructs an array of non-empty strings without leading or trailing whitespaces. Additionally, it uses the str.upper() function to convert each symbol's characters to uppercase.
• The script's `getAlignedReturns()` function requests time-aligned prices with two request.security() calls that use spread tickers based on the chart's symbol and another symbol. Then, it calculates the arithmetic return using the `changePercent()` function from the ta library. The `collectReturns()` function uses `getAlignedReturns()` within a loop and stores the data from each call within a matrix . The script calls the `arrayCorrelation()` function on pairs of rows from the returned matrix to calculate the correlation values.
• For consistency, the `getAlignedReturns()` function includes extended hours and dividend adjustment modifiers in its data requests. Additionally, it includes other settings inherited from the chart's context, such as "settlement-as-close" preferences.
• A Pine script can execute up to 40 or 64 unique `request.*()` function calls, depending on the user's plan. The maximum number of symbols this script compares is half the plan's limit, because `getAlignedReturns()` uses two request.security() calls.
• This script can use the request.security() function within a loop because all scripts in Pine v6 enable dynamic requests by default. Refer to the Dynamic requests section of the Other timeframes and data page to learn more about this feature, and see our v6 migration guide to learn what's new in Pine v6.
• The script's table uses two distinct color.from_gradient() calls in a switch structure to determine the cell colors for positive and negative correlation values. One call calculates the color for values from -1 to 0 based on the first and second input colors, and the other calculates the colors for values from 0 to 1 based on the second and third input colors.
Look first. Then leap.
PARKER Currency Strength with RESETS v 3.00PARKER Currency Strength v3.00 is a comprehensive multi-currency strength indicator designed for Forex traders who want detailed insights into major currency performance. Here are some of its key features:
Customizable Session Resets:
The indicator supports automatic resets of currency strength calculations at the start of each major market session (Sydney, Tokyo, London, and New York). You can also enable a custom reset with a user-defined reset time and name.
User-Defined Market Hours:
With the new "Market Settings" section, you can set the open and close times for each market (Sydney, Tokyo, London, and New York) using hour and minute inputs. This allows you to tailor the session times to your local time zone or trading preferences.
Session Shading and Labels:
The background color of the indicator pane changes based on the active market session. Labels are generated at the start of each session to provide clear visual cues. Market session labels and times are also displayed on the chart for quick reference.
Dual Mode Display:
In addition to the reset-based currency strength calculations, the indicator can plot "normal" (continuous) currency strength lines at 50% transparency, allowing you to compare different calculation methods side by side.
Fully Customizable Appearance:
Customize line colors, widths, and offsets for each currency pair via user inputs, enabling a personalized and clear display that fits your trading style.
This indicator is ideal for Forex traders who require a dynamic and highly customizable tool to monitor currency strength, adapt to different market sessions, and make informed trading decisions based on real-time performance data.
Minute Markers ATT MethodStrategic Implementation Guide: Time-Based Market Analysis Indicator
Overview:
The Minute Markers indicator is designed to provide traders with precise time-based reference points throughout the trading session. By marking specific minutes of each hour with vertical lines, this tool enables traders to identify potential market turning points and execute trades with enhanced timing precision.
Key Features:
The indicator displays vertical lines at minutes 3, 11, 17, 29, 41, 47, 53, and 59 of each hour within user-defined trading hours. These specific time markers have been selected to align with common institutional trading patterns and market microstructure elements.
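A minimal Pine Script sketch of this timing logic is shown below; the start and end hours are assumed placeholders, and none of the published script's styling options are reproduced.

```pine
//@version=5
indicator("Minute markers (sketch)", overlay = true)
// Marker minutes taken from the description above
var int[] marks = array.from(3, 11, 17, 29, 41, 47, 53, 59)
startHour = input.int(9,  "Start hour", minval = 0, maxval = 23)   // assumed default
endHour   = input.int(16, "End hour",   minval = 0, maxval = 23)   // assumed default

inHours   = hour >= startHour and hour <= endHour
newMarker = array.includes(marks, minute) and minute != minute[1]

if inHours and newMarker
    line.new(bar_index, low, bar_index, high, extend = extend.both, color = color.new(color.gray, 40), style = line.style_dotted)
```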
Strategic Applications:
Market Structure Analysis:
The indicator helps traders identify recurring patterns in market behavior at specific times during each hour. This can be particularly valuable for understanding institutional order flow and potential price action patterns that may develop around these time points.
Trade Timing Optimization:
Traders can use these time markers to:
Refine entry and exit points for their trades
Avoid entering positions during potentially volatile time periods
Plan their trades around known institutional trading windows
Coordinate their trading activities with specific market events
Risk Management:
The customizable trading hours feature allows traders to focus on their preferred market sessions while avoiding periods of reduced liquidity or increased volatility. This can help in managing risk exposure during specific market conditions.
Implementation Recommendations:
Initial Observation Phase:
Begin by observing how your traded instruments behave around these time markers over several trading sessions. Document any recurring patterns or notable price action characteristics.
Pattern Recognition:
Pay particular attention to:
Price reaction at these time points
Volume changes around the marked times
Trend continuation or reversal patterns
Changes in volatility
Strategy Integration:
Incorporate these time markers into your existing trading strategy by:
Using them as potential entry or exit points
Setting time-based stop losses
Planning position sizing based on time-related volatility patterns
Adjusting trade management techniques around these specific times
Performance Optimization:
The indicator's customizable visual settings allow traders to:
Adjust line styles for better visibility
Modify colors to match their chart theme
Set specific trading hours to focus on their preferred sessions
Conclusion:
The Minute Markers indicator serves as a sophisticated timing tool that can enhance trading precision and market analysis capabilities. When properly integrated into a comprehensive trading strategy, it can provide valuable insights into market structure and help optimize trade execution timing.
ICT Digital open Daily DividersDescription for "ICT Digital Open Daily Dividers" TradingView Indicator
Overview
The "ICT Digital Open Daily Dividers" is a versatile and comprehensive TradingView Pine Script indicator designed for traders who utilize Institutional Order Flow methodologies, particularly in ICT (Inner Circle Trader) trading. This indicator provides a structured visual framework to assist traders in identifying key daily market sessions, critical opening prices, and distinguishing different trading days, especially focusing on the Sunday open, which is a crucial element in the ICT trading strategy.
Core Functionalities
Daily Vertical Lines: The script plots vertical lines at the start of each trading day, which helps to demarcate daily trading sessions. These lines are customizable, allowing traders to choose their color, style (solid, dashed, or dotted), and width. This feature helps in visually segmenting each trading day, making it easier to analyze daily price action patterns.
Sunday Open Differentiation: Unlike many other daily divider indicators, this script uniquely provides the option to highlight the Sunday open at 6 PM EST with distinct lines. This feature is especially valuable for ICT traders who consider the Sunday open as a critical reference point for weekly analysis. The color, style, and width of the Sunday open lines can be set separately, providing a clear visual distinction from regular weekday separators.
12 AM Open Toggle: For markets that are influenced by midnight opens, the indicator includes an option to shift the daily open line to 12 AM instead of the default 6 PM. This flexibility allows traders to adapt the indicator to different market dynamics or trading strategies.
Timezone Customization: The indicator allows traders to set the timezone for the open lines, ensuring that the vertical lines align accurately with the trader’s specific market hours, whether they follow New York time or any other timezone.
Session Time Filters: The script can hide or show specific trading session markers, such as the New York session open and close, which are pivotal for ICT traders. These markers help in focusing on the most active and liquid trading times.
Customizable Style Settings: The script includes comprehensive styling options for the plotted lines and session markers, allowing traders to personalize their charts to suit their visual preferences and improve clarity.
Day of the Week Labels: The indicator can plot labels for each day of the week, providing a quick reference to the day’s price action. This feature is particularly useful in reviewing weekly trading patterns and performance.
Use in ICT Trading
In ICT trading, the concept of the "open" is fundamental. The "ICT Digital Open Daily Dividers" indicator serves multiple purposes:
Market Structure Identification: By clearly marking daily opens, traders can easily identify market structure changes such as breakouts, retracements, or consolidations around these key levels.
Reference Points: The Sunday open is often a key level in ICT analysis, serving as a benchmark for assessing market direction for the upcoming week. This indicator’s ability to plot Sunday opens separately makes it uniquely suited for ICT strategies.
Time-based Analysis: ICT methodology often involves analyzing the market at specific times of the day. This indicator supports such analysis by marking significant session opens and closes.
Uniqueness and Advantages
The "ICT Digital Open Daily Dividers" stands out from other similar indicators due to its specialized features:
Sunday Open Highlighting: Few indicators offer the capability to specifically mark the Sunday open with distinct styling options.
Flexibility in Time Adjustments: With options to adjust the open time to either 6 PM or 12 AM, this indicator caters to a broader range of trading strategies and market conditions.
Enhanced Visualization: The wide range of customization options ensures that traders can tailor the indicator to their specific needs, enhancing the usability and visual clarity of their charts.
Compliance with TradingView's Pine Script Community Guidelines
The description adheres to TradingView's guidelines by being comprehensive, clear, and informative. It highlights the utility of the script, its unique features, and its application in trading strategies without making exaggerated claims about performance or profitability. The detailed customization options and unique functionalities are emphasized to differentiate this script from other standard daily divider indicators.
Working HoursWorking Hours Visualization
Description:
This script is designed to visually highlight specific "Working Hours" sessions on the chart using background colors. It is tailored and optimized for the 15-minute timeframe, ensuring accurate session representation and proper functionality. If you choose to use this script on other timeframes, adjustments may be necessary to maintain its effectiveness.
Key Features:
Working Hours Highlighting: Displays background colors to mark predefined working hours, helping you focus on specific trading sessions.
Future Session Projection: Highlights working hours for future candles, providing a clear visual guide for planning trades.
Customizable Appearance: Offers adjustable colors, transparency, and session timings to suit individual preferences.
Weekly Separators: Includes optional weekly separators to visually distinguish trading weeks.
Important Notes:
Timeframe Compatibility:
This script is optimized for the 15-minute timeframe.
Using it on other timeframes may require optimization of session inputs and related logic.
Please feel free to reach out if you need assistance with adjustments for different timeframes.
Customization:
You can customize session timings, colors, and transparency levels through the input settings.
Support:
If you encounter any issues or need help optimizing the script for your specific needs, don't hesitate to contact me.
Globex time (New York Time)This indicator is designed to highlight and analyze price movements within the Globex session. Primarily geared toward the Globex Trap trading strategy, this tool visually identifies the session's high and low prices, allowing traders to better assess price action during extended hours. Here’s a comprehensive breakdown of its features and functionality:
Purpose
The "Globex Time (New York Time)" indicator tracks price levels during the Globex trading session, providing a clear view of overnight market activity. This session, typically running from 6 p.m. ET (18:00) until the following morning at 8:30 a.m. ET, is a critical period where significant market positioning can occur before the regular session opens. In the Globex Trap strategy, the session high and low are essential levels, as price movements around these areas often indicate potential support, resistance, or reversal zones, which traders use to set up entries or exits when the regular trading session begins.
Key Features
Customizable Session Start and End Times
The indicator allows users to specify the exact start and end times of the Globex session in New York time. The default settings are:
Start: 6 p.m. ET (18:00)
End: 8:30 a.m. ET
These settings can be adjusted to align with specific market hours or personal preferences.
Session High and Low Identification
Throughout the defined session, the indicator dynamically calculates and tracks:
Session High: The highest price reached within the session.
Session Low: The lowest price reached within the session.
These levels are essential for the Globex Trap strategy, as price action around them can indicate likely breakout or reversal points when regular trading resumes.
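A minimal Pine Script sketch of this session high/low tracking is shown below. The "1800-0830" session string and New York timezone are assumptions matching the defaults described here; the published indicator's line drawing and reset handling are not reproduced.

```pine
//@version=5
indicator("Globex session range (sketch)", overlay = true)
sess       = input.session("1800-0830", "Globex session")
inSession  = not na(time(timeframe.period, sess, "America/New_York"))
newSession = inSession and not inSession[1]

var float sessHigh = na
var float sessLow  = na
if newSession
    sessHigh := high
    sessLow  := low
else if inSession
    sessHigh := math.max(sessHigh, high)
    sessLow  := math.min(sessLow,  low)

// Show the completed overnight range during the regular session
plot(not inSession ? sessHigh : na, "Globex high", color = color.red,   style = plot.style_linebr)
plot(not inSession ? sessLow  : na, "Globex low",  color = color.green, style = plot.style_linebr)
```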
Vertical Lines for Session Start and End
The indicator draws vertical lines at both the session start and end times:
Session Start Line: A solid line marking the exact beginning of the Globex session.
Session End Line: A similar vertical line marking the session’s conclusion.
Both lines are customizable in terms of color and thickness, making it easy to distinguish the session boundaries visually on the chart.
Horizontal Lines for Session High and Low
At the end of the session, the indicator plots horizontal lines representing the Globex session's high and low levels. Users can customize these lines:
Color: Define specific colors for the session high (default: red) and session low (default: green) to easily differentiate them.
Line Style: Options to set the line style (solid, dashed, or dotted) provide flexibility for visual preferences and chart organization.
Automatic Reset for Daily Tracking
To adapt to the next trading day, the indicator resets the session high and low data once the current session ends. This reset prepares it to start tracking new levels at the beginning of the next session without manual intervention.
Practical Application in the Globex Trap Strategy
In the Globex Trap strategy, traders are primarily interested in price behavior around the high and low levels established during the overnight session. Common applications of this indicator for this strategy include:
Breakout Trades: Watching for price to break above the Globex high or below the Globex low, indicating potential momentum in the breakout direction.
Reversal Trades: Monitoring for failed breakouts or traps where price tests and rejects the Globex high or low, suggesting a reversal as liquidity is trapped in these zones.
Support and Resistance Zones: Using the session high and low as key support and resistance levels during the regular trading session, with potential entry or exit points when price approaches these areas.
Additional Configuration Options
Vertical Line Color and Width: Define the color and thickness of the vertical session start and end lines to match your chart’s theme.
Upper and Lower Line Colors and Styles: Customize the appearance of the session high and low horizontal lines by setting color and line style (solid, dashed, or dotted), making it easy to distinguish these critical levels from other chart markings.
Summary
This indicator is a valuable tool for traders implementing the Globex Trap strategy. It visually segments the Globex session and marks essential price levels, helping traders analyze market behavior overnight. Through its customizable options and clear visual representation, it simplifies tracking overnight price activity and identifying strategic levels for potential trade setups during the regular session.
Initial Balance |ASE|Introduction
Initial Balance (IB) refers to the price data that is formed during the first hour of a trading session. It is an important concept in trading as it provides insights into the market's opening sentiment and potential trading opportunities or reversals for the day. There are multiple trading sessions throughout the day. The most popular, the NY Session, is open from 9:30 am to 4:00 pm EST, making the Initial Balance (IB) range the first hour (9:30-10:30). The other sessions include London, Tokyo, and Sydney.
IB Customization
The Initial Balance lines are fully customizable to fit the traders need.
Show Initial Balance
This setting will plot the Initial Balance
Fill/Extend IB Range
The Fill IB Range toggle fills the area in between the IB High and IB Low. Use the IB Fill Color option to change the fill color in the “Line Settings” group on the settings panel.
The Extend IB Range extends the IB lines until the market closes.
Show 1x/2x Extensions
The Show 1x Extension toggle displays 1 times the IB range line (IB High - IB Low) above IB High and 1 times the IB range line below IB Low.
The Show 2x Extension toggle displays the 2 times the IB range line (IB High - IB Low) above IB High and 2 times the IB range line below IB Low.
*Use the Extension Level Color in the “Line Settings” to change the color of the lines.
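As an illustration of how the range and its extensions relate, here is a minimal Pine Script sketch assuming the 09:30-10:30 New York session; it plots the levels continuously for simplicity and is not the published script.

```pine
//@version=5
indicator("Initial Balance extensions (sketch)", overlay = true)
ibSess = input.session("0930-1030", "Initial Balance session")   // assumed NY cash-open first hour
inIB   = not na(time(timeframe.period, ibSess, "America/New_York"))
newIB  = inIB and not inIB[1]

var float ibHigh = na
var float ibLow  = na
if newIB
    ibHigh := high
    ibLow  := low
else if inIB
    ibHigh := math.max(ibHigh, high)
    ibLow  := math.min(ibLow,  low)

ibRange = ibHigh - ibLow
plot(ibHigh,               "IB High",       color = color.green)
plot(ibLow,                "IB Low",        color = color.red)
plot(ibHigh + ibRange,     "+1x extension", color = color.gray)
plot(ibLow  - ibRange,     "-1x extension", color = color.gray)
plot(ibHigh + 2 * ibRange, "+2x extension", color = color.silver)
plot(ibLow  - 2 * ibRange, "-2x extension", color = color.silver)
```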
Show Middle Levels
The Show Middle Levels toggle shows all the 50% lines between the upper 2x and upper 1x line, upper 1x and IB high, IB high and IB low, IB low and lower 1x line, and the lower 1x and lower 2x line.
*Use the Mid Level Color in the “Line Settings” to change the color of the lines.
Delete Previous Day’s Levels
This setting will only show the current day's Initial Balance and delete all previous day levels to produce a clean chart.
How To Use:
The Initial Balance Range can support a bias as it shows the opening market sentiment. By watching price action interact with the Initial Balance Range we can watch for indications of trending or failing moves at the high or the low and overall a ranging or trending session.
The extension levels are projections as to where price could potentially reach in a trending market. If we are bullish and trending higher, we would want to see price reach the first extension; signs of strength at these levels can be used as confirmation to target other levels.
Overall, all these levels can and should be used as support and resistance levels, and as always, can not be used by themselves and require additional confirmation, whether that be an indicator or price action. Below you can see chart examples of these levels in action.
Auto Support & Resistance With Wick Signals & Percentage GapsThis auto support and resistance indicator uses percentage deviations from the previous session close to calculate levels. It provides arrows as signals when it detects 2 wicks in the last 5 bars from a support or resistance level. Includes alerts for price crossing any level as well as real time percentage gaps from current price to the next closest support and resistance level. You also have the option to set up to 3 major levels of your own for any levels that are very important on longer timeframes that you want included. Those will show on the chart as well as within your percentage gap table with color coded background. All features can be customized or turned off to suit your preferences.
SOURCE
This indicator uses the previous session close as a source by default but can be adjusted to use the previous session high or the previous session low. I find the close setting to provide the most accurate levels.
SESSION
The default setting for the previous session used is the daily session but can be adjusted to use the daily, weekly, monthly, quarterly or yearly session. Use longer sessions when looking at longer time frame charts.
SIGNALS
The signals by default are set to only show an arrow if there have been 2 bullish or bearish wicks off of a support or resistance level in the last 5 bars. This can be changed to one bullish wick off of support and one bearish wick off of resistance or it can be set to give a signal anytime a bar crosses a support or resistance level. This can be controlled in the indicator settings.
PERCENTAGE DEVIATION LEVELS
The default percentage deviation is set to 1% but can and should be adjusted according to whatever ticker you are using. For example use .25% or .5% when looking at forex intraday charts since they are not as volatile as other markets. For leveraged etfs used 1% multiplied by the leverage on the etf, so for SQQQ use 3% as it is a 3x leveraged etf. When looking at longer timeframes or highly volatile charts, set the percentage deviation to 2%, 5%, 10%, etc.
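A minimal Pine Script sketch of levels derived as percentage deviations from the previous daily close is shown below; the 1% default is an assumption, and only one level each side is drawn. Line colours flip between green and red depending on whether price is above or below the level, mirroring the behaviour described next.

```pine
//@version=5
indicator("Percentage deviation levels (sketch)", overlay = true)
devPct    = input.float(1.0, "Deviation %", minval = 0.01, step = 0.25) / 100
// Previous daily close, requested from the confirmed bar to avoid repainting
prevClose = request.security(syminfo.tickerid, "D", close[1], lookahead = barmerge.lookahead_on)
r1 = prevClose * (1 + devPct)
s1 = prevClose * (1 - devPct)
plot(prevClose, "Prev close", color = color.gray)
plot(r1, "R1", color = close > r1 ? color.green : color.red)
plot(s1, "S1", color = close > s1 ? color.green : color.red)
```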
LINE COLORS
The color of the lines will change from red to green depending on if the price is above or below that level. You can customize these colors in the settings.
MAJOR LEVELS
If you have major levels of support and resistance from longer timeframes and your own charting, you can add up to 3 major levels that will show on the chart as well as show the percentage gaps in the table. The label for each major level will be colored to match the color of the line on the chart individually.
PERCENTAGE GAP TABLE
The gap table will update live with percentages to go from current price to the next closest support and resistance levels so you don’t have to calculate them manually. The position of the percentage gap table can also be changed within the indicator settings.
TURN FEATURES ON/OFF
There are 3 toggle switches so you can easily turn on or off certain features such as: the support and resistance lines, the percentage gaps table and the arrow signals.
LINE WIDTHS
You can also set the line width of all levels and the line width of the starting level within the indicator settings.
***MARKETS***
This indicator can be used as a signal on all markets, including stocks, crypto, futures and forex.
***TIMEFRAMES***
This automatic support and resistance indicator can be used on all timeframes as long as there is enough data for the session used.
***TIPS***
Try using numerous indicators of ours on your chart so you can instantly see the bullish or bearish trend of multiple indicators in real time without having to analyze the data. Some of our favorites are our Volume Spike Scanner, Volume Profile, Momentum and Trend Friend in combination with this auto support and resistance indicator. They all have real time Bullish and Bearish labels as well so you can immediately understand each indicator's trend.
Quarterly Cycle Theory with DST time Adjusted
The Quarterly Theory removes ambiguity, as it gives specific time-based reference points to look for when entering trades. Before applying this theory to trading, one must first understand that time is fractal:
Yearly Quarters = 4 quarters of three months each.
Monthly Quarters = 4 quarters of one week each.
Weekly Quarters = 4 quarters of one day each (Monday - Thursday). Friday has its own specific function.
Daily Quarters = 4 quarters of 6 hours each = 4 trading sessions of a trading day.
Sessions Quarters = 4 quarters of 90 minutes each.
90 Minute Quarters = 4 quarters of 22.5 minutes each.
Yearly Cycle: Analogously to financial quarters, the year is divided into four sections of three months each:
Q1 - January, February, March.
Q2 - April, May, June (True Open, April Open).
Q3 - July, August, September.
Q4 - October, November, December.
S&P 500 E-mini Futures (daily candles) — Monthly Cycle.
Monthly Cycle: Since a month has roughly four weeks, the cycle starts on the first Monday of the month (regardless of the calendar day):
Q1 - Week 1: first Monday of the month.
Q2 - Week 2: second Monday of the month (True Open, Daily Candle Open Price).
Q3 - Week 3: third Monday of the month.
Q4 - Week 4: fourth Monday of the month.
S&P 500 E-mini Futures (4 hour candles) — Weekly Cycle.
Weekly Cycle: Daye determined that although the trading week is composed of 5 trading days, we should ignore Friday and the small portion of Sunday's price action:
Q1 - Monday.
Q2 - Tuesday (True Open, Daily Candle Open Price).
Q3 - Wednesday.
Q4 - Thursday.
S&P 500 E-mini Futures (1 hour candles) — Daily Cycle.
Daily Cycle: The day can be broken down into four 6-hour quarters. These roughly define the sessions of the trading day, reinforcing the theory's validity:
Q1 - 18:00 - 00:00 Asia.
Q2 - 00:00 - 06:00 London (True Open).
Q3 - 06:00 - 12:00 NY AM.
Q4 - 12:00 - 18:00 NY PM.
S&P 500 E-mini Futures (15 minute candles) — 6 Hour Cycle.
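As a rough illustration of how a script might label these 6-hour quarters, here is a minimal sketch assuming New York time and an 18:00 daily open; the names and display choices are illustrative, not this indicator's actual code:

```pine
//@version=5
indicator("Daily Quarters (sketch)", overlay=true)

// Assumed New York time; 18:00 opens the Asian quarter
string tz = "America/New_York"
int h = hour(time, tz)

// Map the bar's hour onto the four 6-hour daily quarters
string q = h >= 18 ? "Q1 Asia" : h < 6 ? "Q2 London" : h < 12 ? "Q3 NY AM" : "Q4 NY PM"

// Shade the London quarter (the daily True Open block) and mark quarter changes
bgcolor(q == "Q2 London" ? color.new(color.blue, 90) : na)
plotchar(q != q[1], "New quarter", "•", location.bottom, color.orange, size=size.tiny)
```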
6 Hour Quarters, or the 90 Minute Cycle: each session is divided into four sections of 90 minutes each (EST/EDT):
Asian Session
Q1 - 18:00 - 19:30
Q2 - 19:30 - 21:00 (True Open)
Q3 - 21:00 - 22:30
Q4 - 22:30 - 00:00
London Session
Q1 - 00:00 - 01:30
Q2 - 01:30 - 03:00 (True Open)
Q3 - 03:00 - 04:30
Q4 - 04:30 - 06:00
NY AM Session
Q1 - 06:00 - 07:30
Q2 - 07:30 - 09:00 (True Open)
Q3 - 09:00 - 10:30
Q4 - 10:30 - 12:00
NY PM Session
Q1 - 12:00 - 13:30
Q2 - 13:30 - 15:00 (True Open)
Q3 - 15:00 - 16:30
Q4 - 16:30 - 18:00
S&P 500 E-mini Futures (5 minute candles) — 90 Minute Cycle.
Micro Cycles: Dividing the 90 Minute Cycle yields 22.5 Minute Quarters, also known as Micro Sessions or Micro Quarters:
Asian Session
Q1/1 18:00:00 - 18:22:30
Q1/2 18:22:30 - 18:45:00
Q1/3 18:45:00 - 19:07:30
Q1/4 19:07:30 - 19:30:00
Q2/1 19:30:00 - 19:52:30 (True Session Open)
Q2/2 19:52:30 - 20:15:00
Q2/3 20:15:00 - 20:37:30
Q2/4 20:37:30 - 21:00:00
Q3/1 21:00:00 - 21:22:30
Q3/2 21:22:30 - 21:45:00, etc.
London Session
00:00:00 - 00:22:30 (True Daily Open)
00:22:30 - 00:45:00
00:45:00 - 01:07:30
01:07:30 - 01:30:00
01:30:00 - 01:52:30 (True Session Open)
01:52:30 - 02:15:00
02:15:00 - 02:37:30
02:37:30 - 03:00:00
03:00:00 - 03:22:30
03:22:30 - 03:45:00
03:45:00 - 04:07:30
04:07:30 - 04:30:00
04:30:00 - 04:52:30
04:52:30 - 05:15:00
05:15:00 - 05:37:30
05:37:30 - 06:00:00
New York AM Session
06:00:00 - 06:22:30
06:22:30 - 06:45:00
06:45:00 - 07:07:30
07:07:30 - 07:30:00
07:30:00 - 07:52:30 (True Session Open)
07:52:30 - 08:15:00
08:15:00 - 08:37:30
08:37:30 - 09:00:00
09:00:00 - 09:22:30
09:22:30 - 09:45:00
09:45:00 - 10:07:30
10:07:30 - 10:30:00
10:30:00 - 10:52:30
10:52:30 - 11:15:00
11:15:00 - 11:37:30
11:37:30 - 12:00:00
New York PM Session
12:00:00 - 12:22:30
12:22:30 - 12:45:00
12:45:00 - 13:07:30
13:07:30 - 13:30:00
13:30:00 - 13:52:30 (True Session Open)
13:52:30 - 14:15:00
14:15:00 - 14:37:30
14:37:30 - 15:00:00
15:00:00 - 15:22:30
15:22:30 - 15:45:00
15:45:00 - 16:07:30
16:07:30 - 16:30:00
16:30:00 - 16:52:30
16:52:30 - 17:15:00
17:15:00 - 17:37:30
17:37:30 - 18:00:00
S&P 500 E-mini Futures (30 second candles) — 22.5 Minute Cycle.
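To make the nesting concrete, here is a minimal sketch that derives the daily quarter, session quarter, and 22.5-minute micro quarter from the minutes elapsed since the assumed 18:00 New York open; the variable names are illustrative:

```pine
//@version=5
indicator("Session and micro quarters (sketch)", overlay=true)

string tz = "America/New_York"
// Minutes elapsed since the 18:00 daily open (0 .. 1439)
int minsSinceOpen = (hour(time, tz) * 60 + minute(time, tz) - 18 * 60 + 1440) % 1440

// 360-minute daily quarter, 90-minute session quarter, 22.5-minute micro quarter
dailyQ   = minsSinceOpen / 360 + 1
sessionQ = (minsSinceOpen % 360) / 90 + 1
microQ   = math.floor((minsSinceOpen % 90) / 22.5) + 1

// e.g. 19:40 NY -> 100 minutes after open -> daily Q1, session Q2, micro Q1
plot(dailyQ,   "Daily quarter",   display=display.data_window)
plot(sessionQ, "Session quarter", display=display.data_window)
plot(microQ,   "Micro quarter",   display=display.data_window)
```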
Cz ASR indicator
Average session range (ASR) indicator built by me. A great tool for gauging volatility and intraday reversal zones. It is especially useful for FX, as an included table shows the range in pips; however, it can be applied across all assets as a volatility measure.
How it works:
The script measures the range of sessions, including Asia, London, and New York. The lookback period can be adjusted so you can find which length works best and is most accurate. The ranges are then averaged to produce the ASR, giving an upper and lower bound within which price could potentially fluctuate, based on past session ranges. I have also added the 50% ASR, which is a super useful metric for reversals or continuations.
There is also a configurable UTC offset so you can adjust the indicator to accurately measure the range within each session.
Note - session start and stop times vary from market to market. I have set the code to the standard forex market opens; however, if you wish to change the times, you can do so by editing the variables in the script.
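Here is a minimal sketch of the averaging logic described above, assuming a single session window supplied as an input; the 0800-1600 default, the UTC timezone, and the variable names are illustrative, not the published script's code:

```pine
//@version=5
indicator("Average Session Range (sketch)", overlay=true)

int    lookback = input.int(10, "Sessions to average", minval=1)
string sess     = input.session("0800-1600", "Session window")

bool inSession  = not na(time(timeframe.period, sess, "UTC"))
bool newSession = inSession and not inSession[1]

// Track the developing high/low of the current session
var float sHigh = na
var float sLow  = na
var float[] ranges = array.new_float()

if newSession
    // Store the completed session's range before resetting
    if not na(sHigh)
        array.push(ranges, sHigh - sLow)
        if array.size(ranges) > lookback
            array.shift(ranges)
    sHigh := high
    sLow  := low
else if inSession
    sHigh := math.max(sHigh, high)
    sLow  := math.min(sLow, low)

float asr = array.size(ranges) > 0 ? array.avg(ranges) : na

// Project bounds around the current session open (assumed anchor)
var float sOpen = na
if newSession
    sOpen := open
plot(sOpen + asr,       "Upper ASR",     color=color.red)
plot(sOpen - asr,       "Lower ASR",     color=color.green)
plot(sOpen + asr * 0.5, "Upper 50% ASR", color=color.new(color.red, 50))
plot(sOpen - asr * 0.5, "Lower 50% ASR", color=color.new(color.green, 50))
```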
Enjoy :)
FVG Labels [wac]
Indicator Overview
This indicator identifies new sessions based on your chosen timeframe and automatically highlights any Fair Value Gaps (FVGs) that emerge within those sessions. By clearly differentiating between Bullish (rising) and Bearish (falling) FVGs, it allows traders to pinpoint imbalances in the market with ease.
Key Features
Session Range Tracking – Logs daily (or user-defined) session highs and lows, offering a clear snapshot of each session’s price behavior.
Automated FVG Detection & Labeling - Identifies fair value gap zones and marks them on the chart. Once an FVG is resolved (“FVG Filled”), you can instantly recognize the gap’s closure.
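For reference, a common three-bar definition of a fair value gap is a bar whose low sits above the high from two bars earlier (bullish) or whose high sits below the low from two bars earlier (bearish). Below is a minimal, hypothetical sketch of that detection; the published script's exact rules and styling may differ:

```pine
//@version=5
indicator("FVG detection (sketch)", overlay=true, max_boxes_count=200)

// Three-bar FVG definition: the middle bar leaves a gap between
// bar[2]'s extreme and the current bar's opposite extreme
bool bullFVG = low > high[2]
bool bearFVG = high < low[2]

if bullFVG
    // Bullish gap zone spans from the old high up to the current low
    box.new(bar_index - 2, low, bar_index, high[2], border_color=color.green, bgcolor=color.new(color.green, 85))
if bearFVG
    // Bearish gap zone spans from the old low down to the current high
    box.new(bar_index - 2, low[2], bar_index, high, border_color=color.red, bgcolor=color.new(color.red, 85))
```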
Multi-Timeframe VWAP Dashboard
**Multi-Timeframe VWAP Dashboard with Advanced Customization**
Unlock the power of **Volume-Weighted Average Price (VWAP)** across multiple timeframes with this highly customizable and feature-rich Pine Script. Designed for traders who demand precision and flexibility, this script provides a **comprehensive VWAP dashboard** that adapts to your trading style and strategy. Whether you're a day trader, swing trader, or long-term investor, this tool offers unparalleled insights into market trends and price levels.
---
### **Key Features:**
1. **Multi-Timeframe VWAP Calculation:**
- Calculate VWAP across **12-minute, 48-minute, 96-minute, 192-minute, daily, weekly, monthly, and even yearly timeframes**.
- Supports **custom timeframes** for tailored analysis.
2. **Price Source Selection:**
- Choose from multiple price sources for VWAP calculation, including **Open, High, Low, Close, HL2, HLC3, HLCC4, and All**.
- Optimize VWAP for **uptrends and downtrends** by selecting the most relevant price source.
3. **Customizable Labels:**
- Add **dynamic labels** to each VWAP line for quick reference.
- Customize label **colors, sizes, and offsets** to suit your chart setup.
- Display **price values** and **session types** (e.g., "12 Min", "Daily", "Weekly") directly on the chart.
4. **Advanced Session Detection:**
- Automatically detect new sessions for **intraday, daily, weekly, monthly, and yearly timeframes**.
- Ensures accurate VWAP calculations for each session.
5. **Plot Visibility Control:**
- Toggle the visibility of individual VWAP plots to **reduce clutter** and focus on the most relevant timeframes.
- Includes options for **short-term, medium-term, and long-term VWAPs**.
6. **Comprehensive Timeframe Coverage:**
- From **12-minute intervals** to **12-month intervals**, this script covers all major timeframes.
- Perfect for traders who analyze markets across multiple horizons.
7. **User-Friendly Inputs:**
- Intuitive input options for **timeframes, colors, labels, and offsets**.
- Easily customize the script to match your trading preferences.
8. **Dynamic Label Positioning:**
- Labels adjust automatically based on price movements and session changes.
- Choose from **multiple offset options** to position labels precisely.
9. **Miscellaneous Customization:**
- Adjust **text color, label size, and price display settings**.
- Enable or disable **price values** and **session type labels** for a cleaner chart.
---
### **Why Use This Script?**
- **Versatility:** Suitable for all trading styles, including scalping, day trading, swing trading, and long-term investing.
- **Precision:** Accurate VWAP calculations across multiple timeframes ensure you never miss key price levels.
- **Customization:** Tailor the script to your specific needs with a wide range of input options.
- **Clarity:** Dynamic labels and customizable plots make it easy to interpret market trends at a glance.
---
### **How It Works:**
1. **Select Your Price Source:**
- Choose the price source (e.g., Open, Close, HL2) for VWAP calculation based on your trading strategy.
2. **Choose Timeframes:**
- Define the timeframes for VWAP calculation, from intraday to yearly intervals.
3. **Customize Labels and Plots:**
- Enable or disable labels and plots for each timeframe.
- Adjust colors, sizes, and offsets to match your chart setup.
4. **Analyze Market Trends:**
- Use the VWAP lines and labels to identify **support/resistance levels**, **trend direction**, and **potential reversal points**.
5. **Adapt to Market Conditions:**
- Switch between timeframes and price sources to adapt to changing market conditions.
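For context on the math behind the VWAP lines described in the steps above, here is a minimal, hypothetical sketch of a session-anchored VWAP: cumulative price times volume divided by cumulative volume, reset whenever the chosen anchor timeframe rolls over. The input names are illustrative and this is not the dashboard's actual source:

```pine
//@version=5
indicator("Anchored VWAP (sketch)", overlay=true)

string anchorTF = input.timeframe("D", "VWAP anchor timeframe")
float  src      = input.source(hlc3, "Price source")

// Reset the cumulative sums whenever a new anchor period begins
bool newPeriod = timeframe.change(anchorTF)

var float cumPV  = 0.0
var float cumVol = 0.0
if newPeriod
    cumPV  := 0.0
    cumVol := 0.0
cumPV  += src * volume
cumVol += volume

float vwapValue = cumVol > 0 ? cumPV / cumVol : na
plot(vwapValue, "Session VWAP", color=color.orange, linewidth=2)
```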
---
### **Ideal For:**
- **Day Traders:** Use short-term VWAPs (e.g., 12-minute, 48-minute) to identify intraday trends and key levels.
- **Swing Traders:** Leverage medium-term VWAPs (e.g., 96-minute, daily) to spot swing opportunities.
- **Long-Term Investors:** Analyze long-term VWAPs (e.g., weekly, monthly) to gauge overall market direction.
---
### **How to Get Started:**
1. Add the script to your TradingView chart.
2. Customize the inputs to match your trading preferences.
3. Analyze the VWAP lines and labels to make informed trading decisions.
---
### **Pro Tip:**
Combine this script with other technical indicators (e.g., moving averages, RSI) for a **holistic view** of the market. Use the VWAP lines as dynamic support/resistance levels to enhance your entry and exit strategies.
This script is a must-have tool for traders who value precision, flexibility, and clarity. Whether you're a beginner or a seasoned professional, this **Multi-Timeframe VWAP Dashboard** can become an essential part of your toolkit.
Opening Range, Initial Balance, Opening Price, Pre-market Levels
### Description of the Indicator: **Opening Range, Initial Balance, Opening Price, Pre-market Levels**
This custom TradingView indicator provides a comprehensive view of key price levels for intraday trading, specifically designed to track important levels from the Opening Range (OR), Initial Balance (IB), Opening Price (OP), and Pre-market session (PM). These levels are essential for traders to gauge potential market movements and identify critical areas of support and resistance.
#### **Features:**
1. **Opening Range (OR):**
- This is the high and low of the first 30 minutes of the regular market session (09:30 - 10:00 EST).
- The OR high and low act as significant levels that may influence price movement for the rest of the day.
- The mid-level of the Opening Range (OR Mid) is also plotted to give a more detailed view of potential price action.
2. **Initial Balance (IB):**
- The Initial Balance is the range created during the first hour of market activity (09:30 - 10:30 EST).
- This range often sets the tone for the market's direction. The IB high and low, along with the IB midline, are plotted for quick reference.
3. **Opening Price (OP):**
- The opening price of the market is marked as a circle and labeled "OP."
- This level provides context for market sentiment when compared to the high and low levels.
4. **Pre-market Levels (PM):**
- The pre-market session (04:00 - 09:30 EST) has its own important levels that are calculated for the high, low, and mid range (PM High, PM Low, and PM Mid).
- These levels are plotted and are useful for traders to understand where the market stood before the regular session opened.
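As a rough illustration of how the OR and IB ranges described above might be tracked, here is a minimal, hypothetical sketch assuming New York time and the 09:30-10:00 and 09:30-10:30 windows (the pre-market range follows the same pattern with a 04:00-09:30 window); the session strings and names are illustrative, not the indicator's actual code:

```pine
//@version=5
indicator("Opening Range / Initial Balance (sketch)", overlay=true)

// Assumed windows in New York time
bool inOR = not na(time(timeframe.period, "0930-1000", "America/New_York"))
bool inIB = not na(time(timeframe.period, "0930-1030", "America/New_York"))

var float orHigh = na
var float orLow  = na
var float ibHigh = na
var float ibLow  = na

// Reset at the first bar of each window, then expand with new highs/lows
if inOR and not inOR[1]
    orHigh := high
    orLow  := low
else if inOR
    orHigh := math.max(orHigh, high)
    orLow  := math.min(orLow, low)

if inIB and not inIB[1]
    ibHigh := high
    ibLow  := low
else if inIB
    ibHigh := math.max(ibHigh, high)
    ibLow  := math.min(ibLow, low)

plot(orHigh, "OR High", color=color.teal)
plot(orLow,  "OR Low",  color=color.teal)
plot(math.avg(orHigh, orLow), "OR Mid", color=color.gray)
plot(ibHigh, "IB High", color=color.purple)
plot(ibLow,  "IB Low",  color=color.purple)
```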
#### **Customization Options:**
- **Exchange Timezone:** You can choose whether to display the times in the exchange's local timezone or in your own preferred timezone.
- **Mid Levels Display:** You can toggle whether the mid levels for each range (OR, IB, PM) should be shown on the chart.
- **Level Color Change:** The colors of the plotted levels (high, low, mid) change based on whether the price is above or below the respective level, making it easy to visualize potential support and resistance.
- **Label Positions:** The position of the labels (OR, IB, OP, PM) on the chart can be customized to avoid overlap with other data points.
#### **Key Use Cases:**
- **Intraday Trend Analysis:** Use the OR and IB to identify key levels for the day, providing insights into the possible trend or range for the day.
- **Pre-market Insights:** The PM levels are crucial for understanding where the market stood during the pre-market hours and can be used as reference points during the regular session.
- **Potential Support and Resistance:** The high and low levels of the OR, IB, and PM sessions can act as potential support or resistance, which are useful for setting stop-loss and take-profit levels.
#### **How to Use:**
- Pay attention to the levels provided for OR, IB, and PM as potential entry and exit points.
- Watch for breakouts or reversals around these levels, especially when combined with other technical indicators or price action patterns.
- The mid levels offer an additional reference to assess price direction or identify possible areas of consolidation.
This indicator is perfect for day traders who rely on key intraday levels and pre-market activity to make informed trading decisions. It helps to streamline the process of identifying potential breakouts, reversals, and ranges in the market.
Liquidity + SP y RS + Zones [AlgoRich]
This indicator is designed to identify key areas in the market, such as support and resistance levels, liquidity zones, and important price structures.
Additionally, it highlights operational areas based on specific time frames, facilitating technical analysis and decision-making in trading.
How does it work?
1. Identification of Pivot Levels
The indicator identifies local highs and lows on the chart, known as pivot levels, which are zones where the price tends to react, such as:
Support zones: Areas where the price is likely to stop falling.
Resistance zones: Areas where the price might encounter obstacles to keep rising.
These levels are calculated by analyzing a range of bars around the current price and are highlighted with lines, boxes, and labels on the chart.
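As a rough illustration (not this indicator's actual source), pivot levels like these can be detected with TradingView's built-in pivot functions; the 10-bar left/right counts below are assumptions:

```pine
//@version=5
indicator("Pivot Levels (sketch)", overlay=true)

int leftBars  = input.int(10, "Pivot bars left",  minval=1)
int rightBars = input.int(10, "Pivot bars right", minval=1)

// A pivot high is confirmed once rightBars bars have printed after it
float ph = ta.pivothigh(high, leftBars, rightBars)
float pl = ta.pivotlow(low,  leftBars, rightBars)

// Draw a horizontal line from each confirmed pivot, extended to the right
if not na(ph)
    line.new(bar_index - rightBars, ph, bar_index, ph, extend=extend.right, color=color.red)
if not na(pl)
    line.new(bar_index - rightBars, pl, bar_index, pl, extend=extend.right, color=color.green)
```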
2. Liquidity Zones
Liquidity zones are defined as areas where there has been an accumulation of orders, either for buying or selling. These zones are significant because they often signal future price movements.
The indicator creates visual boxes around these levels, allowing traders to quickly identify areas where the price might react.
3. Support and Resistance Lines
Horizontal lines are drawn at the identified highs and lows, representing support and resistance levels on the chart.
These lines can be extended forward until the price touches them, showing whether the level has been respected or "broken."
4. Visual Labels
The indicator can also display labels at key levels to provide additional information, such as whether the level corresponds to a high or low.
5. Operational Zones
In addition to support and resistance levels, the indicator allows users to mark specific time periods, referred to as operational sessions.
These zones highlight user-defined periods, such as:
New York session
London session
Daily session
This helps focus analysis on the most active market periods.
6. Customization
The user can customize the following:
Pivot sizes (how many bars to consider to the left and right).
Colors and styles of the lines, boxes, and labels.
Visibility of elements such as boxes, lines, and labels.
Whether to extend the levels forward until the price reaches them.
What is this indicator used for?
Identifying key areas in the market: Support, resistance levels, and liquidity zones are essential for understanding where the price is most likely to react.
Defining entry and exit points: Highlighted zones help determine when to open or close trades.
Highlighting key market moments: With operational sessions, you can focus on the most relevant periods for your strategy.
Simplifying technical analysis: By visualizing levels and zones directly on the chart, it reduces the time needed to identify critical areas.
Benefits for Traders
This indicator is ideal for traders who:
Want to analyze key market levels quickly and efficiently.
Are looking for high-probability zones to trade, based on support, resistance, and liquidity areas.
Need a visual approach to highlight operational levels and important time frames on their charts.
In summary, this indicator serves as a comprehensive tool that combines advanced technical analysis with a user-friendly visual interface, allowing traders to make more informed and precise decisions.
NVOL Normalized Volume & Volatility
OVERVIEW
Plots a normalized volume (or volatility) relative to a given bar's typical value across all charted sessions. The concept is similar to Relative Volume (RVOL) and Average True Range (ATR), but rather than using a moving average, this script uses bar data from previous sessions to more accurately separate what's normal from what's anomalous. Compatible on all timeframes and symbols.
Having volume and volatility processed within a single indicator not only allows you to toggle between the two for a consistent data display, it also allows you to measure how correlated they are. These measurements are available in the data table.
DATA & MATH
The core formula used to normalize each bar is:
( Value / Basis ) × Scale
Value
The current bar's volume or volatility (see INPUTS section). When set to volume, it's exactly what you would expect (the volume of the bar). When set to volatility, it's the bar's range (high - low).
Basis
A statistical threshold (Mean, Median, or Q3) plus a Sigma multiple (standard deviations). The default is set to the Mean + Sigma × 3 , which represents 99.7% of data in a normal distribution. The values are derived from the current bar's equivalent in other sessions. For example, if the current bar time is 9:30 AM, all previous 9:30 AM bars would be used to get the Mean and Sigma. Thus Mean + Sigma × 3 would represent the Normal Bar Vol at 9:30 AM.
Scale
Depends on the Normalize setting, where it is 1 when set to Ratio, and 100 when set to Percent. This simply determines the plot's scale (ie. 0 to 1 vs. 0 to 100).
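To make the formula concrete, here is a simplified, hypothetical sketch of the normalization for volume, assuming the Mean + Sigma × 3 basis and using the bar's position within the day as the grouping key. The array bookkeeping and variable names are illustrative, not the script's actual implementation:

```pine
//@version=5
indicator("Normalized Volume (sketch)")

float sigmaMult = input.float(3.0, "Sigma multiple", minval=0)

// Index of the current bar within its session (0, 1, 2, ...)
var int barInDay = -1
barInDay := timeframe.change("D") ? 0 : barInDay + 1

// Running sums per intraday slot, used to derive mean and standard deviation
var float[] counts = array.new_float()
var float[] sums   = array.new_float()
var float[] sumsq  = array.new_float()
while array.size(sums) <= barInDay
    array.push(counts, 0.0)
    array.push(sums, 0.0)
    array.push(sumsq, 0.0)

// Statistics for this time-of-day slot, built only from prior sessions
float n    = array.get(counts, barInDay)
float mean = n > 0 ? array.get(sums, barInDay) / n : na
float variance = n > 1 ? (array.get(sumsq, barInDay) - n * mean * mean) / (n - 1) : na
float sd   = na(variance) ? na : math.sqrt(math.max(variance, 0))

// Basis = mean + sigma multiple; a ratio above 1 marks an unusually large bar
float basis = mean + sigmaMult * sd
float nvol  = basis > 0 ? volume / basis : na
plot(nvol, "Normalized volume", style=plot.style_columns)
hline(1.0, "Signal")

// Fold the current bar into the running sums once it has closed
if barstate.isconfirmed
    array.set(counts, barInDay, n + 1)
    array.set(sums,   barInDay, array.get(sums, barInDay) + volume)
    array.set(sumsq,  barInDay, array.get(sumsq, barInDay) + volume * volume)
```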
INPUTS
While the default configuration is recommended for a majority of use cases (see BEST PRACTICES), settings should be adjusted so most of the Normalized Plot and Linear Regression are below the Signal Zone. Only the most extreme values should exceed this area.
Normalize
Allows you to specify what should be normalized (Volume or Volatility) and how it should be measured (as a Ratio or Percentage). This sets the value and scale in the core formula.
Basis
Specifies the statistical threshold (Mean, Median, or Q3) and how many standard deviations should be added to it (Sigma). This is the basis in the core formula.
Mean is the sum of values divided by the quantity of values. It's what most people think of when they say "average."
Median is the middle value, where 50% of the data will be lower and 50% will be higher.
Q3 is short for Third Quartile, where 75% of the data will be lower and 25% will be higher (think three quarters).
Sample
Determines the maximum sample size.
All Charted Bars is the default and recommended option, and ignores the adjacent lookback number.
Lookback is not recommended, but it is available for comparisons. It uses the adjacent lookback number and is likely to produce unreliable results outside a very specific context that is not suitable for most traders. Normalization is not a moving average. Unless you have a good reason to limit the sample size, do not use this option and instead use All Charted Bars .
Show Vol. name on plot
Overlays "VOLUME" or "VOLATILITY" on the plot (whichever you've selected).
Lin. Reg.
Polynomial regressions are great for capturing non-linear patterns in data. TradingView offers a "linear regression curve", which this script uses as a substitute. If you're unfamiliar with either term, think of this like a better moving average.
You're able to specify the color, length, and multiple (how much to amplify the value). The linear regression derives its value from the normalized values.
Norm. Val.
This is the color of the normalized value of the current bar (see DATA & MATH section). You're able to specify the default, within signal, and beyond signal colors. As well as the plot style.
Fade in colors between zero and the signal
Programmatically adjust the opacity of the primary plot color based on its normalized value. When enabled, values equal to 0 will be fully transparent, become more opaque as they move away from 0, and be fully opaque at the signal. Adjusting opacity in this way helps make differences more obvious.
Plot relative to bar direction
If enabled, the normalized value will be multiplied by -1 when a bar's open is greater than the bar's close, mirroring price direction.
Technically, volume and volatility are directionless, meaning there's really no such thing as buy volume, sell volume, positive volatility, or negative volatility. There is just volume (1 buy = 1 sell = 1 volume) and volatility (high - low). Even so, visually reflecting the net effect of pricing pressure can still be useful. That's all this setting does.
Sig. Zone
Signal zones make identifying extremes easier. They do not signal if you should buy or sell, only that the current measurement is beyond what's normal. You are able to adjust the color and bounds of the zone.
Int. Levels
Interim levels can be useful when you want to visually bracket values into high / medium / low. These levels can have a value anywhere between 0 and 1. They will automatically be multiplied by 100 when the scale is set to Percent.
Zero Line
This setting allows you to specify the visibility of the zero line to best suit your trading style.
Volume & Volatility Stats
Displays a table of core values for both volume and volatility. Specifically the actual value, threshold (mean, median, or Q3), sigma (standard deviation), basis, normalized value, and linear regression.
Correlation Stats
Displays a table of correlation statistics for the current bar, as well as the data set average. Specifically the coefficient, R2, and P-Value.
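For reference, a rough sketch (not the indicator's actual code) of how a volume-to-range correlation coefficient and R² could be computed with the built-in ta.correlation function; the 20-bar length is an assumed parameter:

```pine
//@version=5
indicator("Volume vs. volatility correlation (sketch)")

int len = input.int(20, "Correlation length", minval=2)

// Pearson correlation between per-bar volume and per-bar range (high - low)
float coeff = ta.correlation(volume, high - low, len)
float r2    = coeff * coeff

plot(coeff, "Coefficient", color=color.blue)
plot(r2,    "R squared",   color=color.orange)
hline(0)
```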
Indices & Sample Size
Displays a table of mixed data. Specifically the current bar's index within the session, the current bar's index within the sample, and the sample size used to normalize the current bar's value.
BEST PRACTICES
NVOL can tell you what's normal for 9:30 AM. RVOL and ATR can only tell you if the current value is higher or lower than a moving average.
In a normal distribution (bell curve) 99.7% of data occurs within 3 standard deviations of the mean. This is why the default basis is set to "Mean, 3"; it includes the typical day-to-day fluctuations, better contextualizing what's actually normal, minimizing false positives.
This means a ratio value greater than 1 only occurs 0.3% of the time. A series of these values warrants your attention, which is why the default signal zone is between 1 and 2. Ratios beyond 2 would be considered extreme with the default settings.
Inversely, ratio values less than 1 (the normal daily fluctuations) also tell a story. We should expect most values to occur around the middle 3rd, which is why interim levels default to 0.33 and 0.66, visually simplifying a given move's participation. These can be set to whatever you like and only serve as visual aids for your specific trading style.
It's worth noting that the linear regression oscillates when plotted directionally, which can help clarify short term move exhaustion and continuation. Akin to a relative strength index (RSI), it may be used to inform a trading decision, but it should not be the only factor.
Profitability Visualization with Bid-Ask Spread Approximation
Overview
The " Profitability Visualization with Bid-Ask Spread Approximation " indicator is designed to assist traders in assessing potential profit and loss targets in relation to the current market price or a simulated entry price. It provides flexibility by allowing users to choose between two methods for calculating the offset from the current price:
Bid-Ask Spread Approximation: The indicator attempts to estimate the bid-ask spread by using the highest (high) and lowest (low) prices within a given period (typically the current bar or a user-defined timeframe) as proxies for the ask and bid prices, respectively. This method provides a dynamic offset that adapts to market volatility.
Percentage Offset: Alternatively, users can specify a fixed percentage offset from the current price. This method offers a consistent offset regardless of market conditions.
Key Features
Dual Offset Calculation Methods: Choose between a dynamic bid-ask spread approximation or a fixed percentage offset to tailor the indicator to your trading style and market analysis.
Entry Price Consideration: The indicator can simulate an entry price at the beginning of each trading session (or the first bar on the chart if no sessions are defined). This feature enables a more realistic visualization of potential profit and loss levels based on a hypothetical entry point.
Profit and Loss Targets: When the entry price consideration is enabled, the indicator plots profit target (green) and loss target (red) lines. These lines represent the price levels at which a trade entered at the simulated entry price would achieve a profit or incur a loss equivalent to the calculated offset amount.
Offset Visualization: Regardless of whether the entry price is considered, the indicator always displays upper (aqua) and lower (fuchsia) offset lines. These lines represent the calculated offset levels based on the chosen method (bid-ask approximation or percentage offset).
Customization: Users can adjust the percentage offset, toggle the bid-ask approximation and entry price consideration, and customize the appearance of the lines through the indicator's settings.
Inputs
useBidAskApproximation: A boolean (checkbox) input that determines whether to use the bid-ask spread approximation (true) or the percentage offset (false). Default is false.
percentageOffset: A float input that allows users to specify the percentage offset to be used when useBidAskApproximation is false. The default value is 0.63.
considerEntryPrice: A boolean input that enables the consideration of a simulated entry price for calculating and displaying profit and loss targets. Default is true.
Calculations
Bid-Ask Approximation (if enabled):
- bidApprox = request.security(syminfo.tickerid, timeframe.period, low): approximates the bid price using the lowest price (low) of the current period.
- askApprox = request.security(syminfo.tickerid, timeframe.period, high): approximates the ask price using the highest price (high) of the current period.
- spreadApprox = askApprox - bidApprox: calculates the approximate spread.
Offset Amount:
- offsetAmount = useBidAskApproximation ? spreadApprox / 2 : close * (percentageOffset / 100): determines the offset amount based on the selected method. If useBidAskApproximation is true, the offset is half of the approximated spread; otherwise, it is the current closing price (close) multiplied by the percentageOffset.
Entry Price (if enabled):
- var entryPrice = 0.0: initializes a variable to store the entry price.
- if considerEntryPrice: checks whether entry price consideration is enabled.
- if barstate.isnew: checks whether the current bar is the first bar of a new session.
- entryPrice := close: sets entryPrice to the closing price of the first bar of the session.
Profit and Loss Targets (if entry price is considered):
- profitTarget = entryPrice + offsetAmount: calculates the profit target price level.
- lossTarget = entryPrice - offsetAmount: calculates the loss target price level.
Plotting
Profit Target Line: Plotted in green (color.green) with a dashed line style (plot.style_linebr) and increased linewidth (linewidth=2) when considerEntryPrice is true.
Loss Target Line: Plotted in red (color.red) with a dashed line style (plot.style_linebr) and increased linewidth (linewidth=2) when considerEntryPrice is true.
Upper Offset Line: Always plotted in aqua (color.aqua) to show the offset level above the current price.
Lower Offset Line: Always plotted in fuchsia (color.fuchsia) to show the offset level below the current price.
Limitations
Approximation: The bid-ask spread approximation is based on high and low prices and may not perfectly reflect the actual bid-ask spread of a specific broker, especially during periods of high volatility or low liquidity.
Simplified Entry: The entry price simulation is basic and assumes entry at the beginning of each session. It does not account for specific entry signals or order types.
No Order Execution: This indicator is purely for visualization and does not execute any trades.
Data Discrepancies: The high and low values used for approximation might not always align with real-time bid and ask prices due to differences in data aggregation and timing between TradingView and various brokers.
Disclaimer
This indicator is for educational and informational purposes only and should not be considered financial advice. Trading involves substantial risk, and past performance is not indicative of future results. Always conduct thorough research and consider your own risk tolerance before making any trading decisions. It is recommended to combine this indicator with other technical analysis tools and a well-defined trading strategy.