📌 Cluster Reversal Zones – Smart Market Turning Point Detector
📌 Category: Public (Restricted/Closed-Source) Indicator
📌 Designed for: Traders looking for high-accuracy reversal zones based on price clustering & liquidity shifts.
🔍 Overview
The Cluster Reversal Zones Indicator is an advanced market reversal detection tool that helps traders identify key turning points using a combination of price clustering, order flow analysis, and liquidity tracking. Instead of relying on static support and resistance levels, this tool dynamically adjusts to live market conditions, ensuring traders get the most accurate reversal signals possible.
📊 Core Features:
✅ Real-Time Reversal Zone Mapping – Detects high-probability market turning points using price clustering & order flow imbalance.
✅ Liquidity-Based Support/Resistance Detection – Identifies strong rejection zones based on real-time liquidity shifts.
✅ Order Flow Sensitivity for Smart Filtering – Filters out weak reversals by detecting real market participation behind price movements.
✅ Momentum Divergence for Confirmation – Aligns reversal zones with momentum divergences to increase accuracy.
✅ Adaptive Risk Management System – Adjusts risk parameters dynamically based on volatility and trend state.
🔒 Justification for Mashup
The Cluster Reversal Zones Indicator contains custom-built methodologies that extend beyond traditional support/resistance indicators:
✔ Smart Price Clustering Algorithm: Instead of plotting fixed support/resistance lines, this system analyzes historical price clustering to detect active reversal areas.
✔ Order Flow Delta & Liquidity Shift Sensitivity: The tool tracks real-time order flow data, identifying price zones with the highest accumulation or distribution levels.
✔ Momentum-Based Reversal Validation: Unlike traditional indicators, this tool requires a momentum shift confirmation before validating a potential reversal.
✔ Adaptive Reversal Filtering Mechanism: Uses a combination of historical confluence detection + live market validation to improve accuracy.
🛠️ How to Use:
• Works well for reversal traders, scalpers, and swing traders seeking precise turning points.
• Best combined with VWAP, Market Profile, and Delta Volume indicators for confirmation.
• Suitable for Forex, Indices, Commodities, Crypto, and Stock markets.
🚨 Important Note:
For educational & analytical purposes only.
Ehlers Maclaurin Ultimate Smoother [CT]
Introduction
The Ehlers Maclaurin Ultimate Smoother is an innovative enhancement of the classic Ehlers SuperSmoother. By leveraging advanced Maclaurin series approximations, this indicator offers superior market analysis and signal generation.
The indicator combines Ehlers' Ultimate Smoother with Maclaurin series approximations to create a more efficient and accurate smoothing mechanism:
Input price data passes through the initial smoothing phase
Maclaurin series approximates trigonometric functions
Enhanced high-pass filter removes market noise
Final smoothing phase produces the output signal
Why the Maclaurin Approach?
The Maclaurin series is a special form of the Taylor series, centered around 0. It provides an efficient way to approximate complex functions using polynomial terms. In this indicator, we use the Maclaurin approach to improve the sine and cosine functions, resulting in:
Faster Calculations: By using polynomial approximations, we significantly reduce computational complexity.
Improved Stability: The approximation provides a more stable numerical basis for calculations.
Preservation of Precision: Despite the approximation, we maintain the precision needed for price smoothing.
Calculations
The indicator employs several key mathematical components:
Maclaurin Series Approximation:
sin(x) ≈ x - x³/3! + x⁵/5! - x⁷/7! + x⁹/9!
cos(x) ≈ 1 - x²/2! + x⁴/4! - x⁶/6! + x⁸/8!
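For reference, the two series above drop straight into Pine Script. The sketch below is a minimal standalone illustration (not the indicator's actual code); it plots the approximation error against the built-in trigonometric functions for a fixed test angle:

```pine
//@version=5
indicator("Maclaurin sin/cos approximation (sketch)")

// 5-term Maclaurin approximations, matching the series written above.
f_sin(float x) =>
    x - math.pow(x, 3) / 6 + math.pow(x, 5) / 120 - math.pow(x, 7) / 5040 + math.pow(x, 9) / 362880

f_cos(float x) =>
    1 - math.pow(x, 2) / 2 + math.pow(x, 4) / 24 - math.pow(x, 6) / 720 + math.pow(x, 8) / 40320

// Approximation error versus the built-ins for a fixed test angle (small angles keep the error tiny).
angle = 0.5
plot(f_sin(angle) - math.sin(angle), "sin error")
plot(f_cos(angle) - math.cos(angle), "cos error")
```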
Smoothing Algorithm:
Uses exponential smoothing with optimized coefficients
Implements high-pass filtering for noise reduction
Applies dynamic weighting based on market conditions
Mathematical Foundation
Utilizes Maclaurin series for trigonometric approximation
Implements Ehlers' smoothing principles
Incorporates advanced filtering techniques
Technical Advantages
Signal Processing:
Lag Reduction: Faster signal detection with less delay.
Noise Filtration: Effective elimination of high-frequency noise.
Precision Enhancement: Preservation of critical price movements.
Adaptive Processing: Dynamic response to market volatility.
Visual Enhancements:
Smart color intensity mapping.
Real-time visualization of trend strength.
Adaptive opacity based on movement significance.
Implementation
Core Configuration:
Plot Type: Choose between the original and the Maclaurin enhanced version.
Length: Default set to 30, optimal for daily timeframes.
hpLength: Default set to 10 for enhanced noise reduction.
Advanced Parameters:
The indicator offers advanced control with:
Dual processing modes (Original/Maclaurin).
Dynamic color intensity system.
Customizable smoothing parameters.
Professional Analysis Tools:
Accurate trend reversal identification.
Advanced support/resistance detection.
Superior performance in volatile markets.
Technical Specifications
Maclaurin Series Implementation:
The indicator employs a 5-term Maclaurin series approximation for both sine and cosine, ensuring efficient and accurate computation.
Performance Metrics
Improved processing efficiency.
Reduced memory utilization.
Increased signal accuracy.
Licensing & Attribution
© 2024 Mupsje aka CasaTropical
Professional Credits
Original Ultimate and SuperSmoother concept: John F. Ehlers
Maclaurin enhancement: Casa Tropical (CT)
www.mathsisfun.com
True Amplitude Envelopes (TAE)
The True Envelopes indicator is an adaptation of the True Amplitude Envelope (TAE) method, based on the research paper "Improved Estimation of the Amplitude Envelope of Time Domain Signals Using True Envelope Cepstral Smoothing" by Caetano and Rodet. This indicator aims to create an asymmetric price envelope with strong predictive power, closely following the methodology outlined in the paper.
Due to the inherent limitations of Pine Script, the indicator utilizes a Kernel Density Estimator (KDE) in place of the original Cepstral Smoothing technique described in the paper. While this approach was chosen out of necessity rather than superiority, the resulting method is designed to be as effective as possible within the constraints of the Pine environment.
This indicator is ideal for traders seeking an advanced tool to analyze price dynamics, offering insights into potential price movements while working within the practical constraints of Pine Script. Whether used in dynamic mode or with a static setting, the True Envelopes indicator helps in identifying key support and resistance levels, making it a valuable asset in any trading strategy.
Key Features:
Dynamic Mode: The indicator dynamically estimates the fundamental frequency of the price, optimizing the envelope generation process in real-time to capture critical price movements.
High-Pass Filtering: Uses a high-pass filtered signal to identify and smoothly interpolate price peaks, ensuring that the envelope accurately reflects significant price changes.
Kernel Density Estimation: Although implemented as a workaround, the KDE technique allows for flexible and adaptive smoothing of the envelope, aimed at achieving results comparable to the more sophisticated methods described in the original research.
Symmetric and Asymmetric Envelopes: Provides options to select between symmetric and asymmetric envelopes, accommodating various trading strategies and market conditions.
Smoothness Control: Features adjustable smoothness settings, enabling users to balance between responsiveness and the overall smoothness of the envelopes.
The True Envelopes indicator comes with a variety of input settings that allow traders to customize the behavior of the envelopes to match their specific trading needs and market conditions. Understanding each of these settings is crucial for optimizing the indicator's performance.
Main Settings
Source: This is the data series on which the indicator is applied, typically the closing price (close). You can select other price data like open, high, low, or a custom series to base the envelope calculations.
History: This setting determines how much historical data the indicator should consider when calculating the envelopes. A value of 0 will make the indicator process all available data, while a higher value restricts it to the most recent n bars. This can be useful for reducing the computational load or focusing the analysis on recent market behavior.
Iterations: This parameter controls the number of iterations used in the envelope generation algorithm. More iterations will typically result in a smoother envelope, but can also increase computation time. The optimal number of iterations depends on the desired balance between smoothness and responsiveness.
Kernel Style: The smoothing kernel used in the Kernel Density Estimator (KDE). Available options include Sinc, Gaussian, Epanechnikov, Logistic, and Triangular. Each kernel has different properties, affecting how the smoothing is applied. For example, Gaussian provides a smooth, bell-shaped curve, while Epanechnikov is more efficient computationally with a parabolic shape.
Envelope Style: This setting determines whether the envelope should be Static or Dynamic. The Static mode applies a fixed period for the envelope, while the Dynamic mode automatically adjusts the period based on the fundamental frequency of the price data. Dynamic mode is typically more responsive to changing market conditions.
High Q: This option controls the quality factor (Q) of the high-pass filter. Enabling this will increase the Q factor, leading to a sharper cutoff and more precise isolation of high-frequency components, which can help in better identifying significant price peaks.
Symmetric: This setting allows you to choose between symmetric and asymmetric envelopes. Symmetric envelopes maintain an equal distance from the central price line on both sides, while asymmetric envelopes can adjust differently above and below the price line, which might better capture market conditions where upside and downside volatility are not equal.
Smooth Envelopes: When enabled, this setting applies additional smoothing to the envelopes. While this can reduce noise and make the envelopes more visually appealing, it may also decrease their responsiveness to sudden market changes.
Dynamic Settings
Extra Detrend: This setting toggles an additional high-pass filter that can be applied when using a long filter period. The purpose is to further detrend the data, ensuring that the envelope focuses solely on the most recent price oscillations.
Filter Period Multiplier: This multiplier adjusts the period of the high-pass filter dynamically based on the detected fundamental frequency. Increasing this multiplier will lengthen the period, making the filter less sensitive to short-term price fluctuations.
Filter Period (Min) and Filter Period (Max): These settings define the minimum and maximum bounds for the high-pass filter period. They ensure that the filter period stays within a reasonable range, preventing it from becoming too short (and overly sensitive) or too long (and too sluggish).
Envelope Period Multiplier: Similar to the filter period multiplier, this adjusts the period for the envelope generation. It scales the period dynamically to match the detected price cycles, allowing for more precise envelope adjustments.
Envelope Period (Min) and Envelope Period (Max): These settings establish the minimum and maximum bounds for the envelope period, ensuring the envelopes remain adaptive without becoming too reactive or too slow.
Static Settings
Filter Period: In static mode, this setting determines the fixed period for the high-pass filter. A shorter period will make the filter more responsive to price changes, while a longer period will smooth out more of the price data.
Envelope Period: This setting specifies the fixed period used for generating the envelopes in static mode. It directly influences how tightly or loosely the envelopes follow the price action.
TAE Smoothing: This controls the degree of smoothing applied during the TAE process in static mode. Higher smoothing values result in more gradual envelope curves, which can be useful in reducing noise but may also delay the envelope’s response to rapid price movements.
Visual Settings
Top Band Color: This setting allows you to choose the color for the upper band of the envelope. This band represents the resistance level in the price action.
Bottom Band Color: Similar to the top band color, this setting controls the color of the lower band, which represents the support level.
Center Line Color: This is the color of the central price line, often referred to as the carrier. It represents the detrended price around which the envelopes are constructed.
Line Width: This determines the thickness of the plotted lines for the top band, bottom band, and center line. Thicker lines can make the envelopes more visible, especially when overlaid on price data.
Fill Alpha: This controls the transparency level of the shaded area between the top and bottom bands. A lower alpha value will make the fill more transparent, while a higher value will make it more opaque, helping to highlight the envelope more clearly.
The envelopes generated by the True Envelopes indicator are designed to provide a more precise and responsive representation of price action compared to traditional methods like Bollinger Bands or Keltner Channels. The core idea behind this indicator is to create a price envelope that smoothly interpolates the significant peaks in price action, offering a more accurate depiction of support and resistance levels.
One of the critical aspects of this approach is the use of a high-pass filtered signal to identify these peaks. The high-pass filter serves as an effective method of detrending the price data, isolating the rapid fluctuations in price that are often lost in standard trend-following indicators. By filtering out the lower frequency components (i.e., the trend), the high-pass filter reveals the underlying oscillations in the price, which correspond to significant peaks and troughs. These oscillations are crucial for accurately constructing the envelope, as they represent the most responsive elements of the price movement.
The algorithm works by first applying the high-pass filter to the source price data, effectively detrending the series and isolating the high-frequency price changes. This filtered signal is then used to estimate the fundamental frequency of the price movement, which is essential for dynamically adjusting the envelope to current market conditions. By focusing on the peaks identified in the high-pass filtered signal, the algorithm generates an envelope that is both smooth and adaptive, closely following the most significant price changes without overfitting to transient noise.
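To make the detrending idea concrete (this is not the indicator's actual filter, which is more elaborate), subtracting a moving average from price already behaves as a crude high-pass filter, leaving only the faster oscillations the envelope is built from:

```pine
//@version=5
indicator("High-pass detrending (sketch)")

length = input.int(40, "High-pass period")  // assumed default, for illustration only

// Crude one-pole high-pass: price minus its own EMA removes the slow trend component.
hp = close - ta.ema(close, length)

plot(hp, "Detrended price", color = color.teal)
hline(0, "Zero")
```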
Compared to traditional envelopes and bands, such as Bollinger Bands and Keltner Channels, the True Envelopes indicator offers several advantages. Bollinger Bands, which are based on standard deviations, and Keltner Channels, which use the average true range (ATR), both tend to react to price volatility but do not necessarily follow the peaks and troughs of the price with precision. As a result, these traditional methods can sometimes lag behind or fail to capture sudden shifts in price momentum, leading to either false signals or missed opportunities.
In contrast, the True Envelopes indicator, by using a high-pass filtered signal and a dynamic period estimation, adapts more quickly to changes in price behavior. The envelopes generated by this method are less prone to the lag that often affects standard deviation or ATR-based bands, and they provide a more accurate representation of the price's immediate oscillations. This can result in better predictive power and more reliable identification of support and resistance levels, making the True Envelopes indicator a valuable tool for traders looking for a more responsive and precise approach to market analysis.
In conclusion, the True Envelopes indicator is a powerful tool that blends advanced theoretical concepts with practical implementation, offering traders a precise and responsive way to analyze price dynamics. By adapting the True Amplitude Envelope (TAE) method through the use of a Kernel Density Estimator (KDE) and high-pass filtering, this indicator effectively captures the most significant price movements, providing a more accurate depiction of support and resistance levels compared to traditional methods like Bollinger Bands and Keltner Channels. The flexible settings allow for extensive customization, ensuring the indicator can be tailored to suit various trading strategies and market conditions.
Hybrid Adaptive Double Exponential Smoothing
🙏🏻 This is HADES (Hybrid Adaptive Double Exponential Smoothing): a fully data-driven & adaptive exponential smoothing method that gains all the necessary info directly from the data in the most natural way and needs no subjective parameters & no optimizations. It gets applied to the data itself -> to fit residuals & one-point forecast errors, all at O(1) algo complexity. I designed it for streaming high-frequency univariate time series data, such as medical sensor readings, orderbook data, tick charts, requests generated by a backend, etc.
The HADES method is:
fit & forecast = a + b * (1 / alpha + T - 1)
T = 0 provides the in-sample fit for the current datum, and T = n provides the forecast n datapoints ahead.
y = input time series
a = y, if no previous data exists
b = 0, if no previous data exists
otherwise:
a = alpha * y + (1 - alpha) * a
b = alpha * (a - a[1]) + (1 - alpha) * b
alpha = 1 / sqrt(len * 4)
len = min(ceil(exp(1 / sig)), available data)
sig = sqrt(Absolute net change in y / Sum of absolute changes in y)
For the start datapoint when both numerator and denominator are zeros, we define 0 / 0 = 1
...
The same set of operations gets applied to the data first, then to resulting fit absolute residuals to build prediction interval, and finally to absolute forecasting errors (from one-point ahead forecast) to build forecasting interval:
prediction interval = data fit +- residuals fit * k
forecasting interval = data opf +- errors fit * k
where k = multiplier regulating intervals width, and opf = one-point forecasts calculated at each time t
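Putting the formulas above together, here is a minimal Pine sketch of the core adaptive fit only (the prediction and forecasting intervals are omitted; the numeric guards against division by zero and overflow are my additions, not part of the stated method):

```pine
//@version=5
indicator("HADES core fit (sketch)", overlay = true)

src = input.source(close, "Source")

// Expanding-window fractal efficiency: |net change| / sum of |bar-to-bar changes|, with 0 / 0 := 1.
var float firstVal = src
sumAbsChange = ta.cum(math.abs(nz(ta.change(src))))
netChange    = math.abs(src - firstVal)
ratio        = sumAbsChange == 0.0 ? 1.0 : netChange / sumAbsChange
sig          = math.sqrt(ratio)

// Adaptive window and alpha: len = min(ceil(exp(1 / sig)), available data), alpha = 1 / sqrt(len * 4).
expArg = math.min(1.0 / math.max(sig, 1e-6), 20.0)   // numeric guard, my addition
len    = math.min(math.ceil(math.exp(expArg)), bar_index + 1)
alpha  = 1.0 / math.sqrt(len * 4)

// Brown-style double exponential smoothing with the adaptive alpha.
var float a = na
var float b = na
if na(a)
    a := src
    b := 0.0
else
    prevA = a
    a := alpha * src + (1 - alpha) * a
    b := alpha * (a - prevA) + (1 - alpha) * b

fit = a + b * (1.0 / alpha - 1.0)   // T = 0 in the fit & forecast formula
plot(fit, "HADES fit", color = color.orange)
```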
...
How-to:
0) Apply to your data where it makes sense, eg. tick data;
1) Use power transform to compensate for multiplicative behavior in case it's there;
2) If you have complete data or only the data you need, like the full history of adjusted close prices: go to the next step; otherwise, guided by your goal & analysis, adjust the 'start index' setting so the calculations will start from this point;
3) Use prediction interval to detect significant deviations from the process core & make decisions according to your strategy;
4) Use one-point forecast for nowcasting;
5) Use forecasting intervals to ~ understand where the next datapoints will emerge, given the data-generating process will stay the same & lack structural breaks.
I advise k = 1 or 1.5 or 4 depending on your goal, but 1 is the most natural one.
...
Why exponential smoothing at all? Why the double one? Why adaptive? Why not Holt's method?
1) Its O(1) algo complexity & recursive nature allow it to be applied in an online fashion to high-frequency streaming data; otherwise, it makes more sense to use other methods;
2) Double exponential smoothing ensures we are taking trends into account; also, in order to model more complex time series patterns such as seasonality, we need detrended data, and this method can be used to do it;
3) The goal of adaptivity is to eliminate the window size question, in cases where it doesn't make sense to use cumulative moving typical value;
4) Holt's method creates a certain interaction between level and trend components, so its results lack symmetry and similarity with other non-recursive methods such as quantile regression or linear regression. Instead, I decided to base my work on the original double exponential smoothing method published by Rob Brown in 1956 (here's the original source; it's really hard to find online). This cool dude is considered the one who dropped exponential smoothing into open access for the first time 🤘🏻
R&D log & explanations
If you wanna read this, you gotta know you're taking a great responsibility for this long journey, and it's gonna be one hell of a trip hehe
Machine learning, apprentissage automatique, машинное обучение, digital signal processing, statistical learning, data mining, deep learning, etc., etc., etc.: all these are just artificial categories created by the local population of this wonderful world, but what really separates entities globally in the Universe is solution complexity / algorithmic complexity.
In order to get the game a lil better, it's gonna be useful to read the HTES script description first. Secondly, let me guide you through the whole R&D process.
Discovering (not inventing) the fundamental universal principle of what exponential smoothing really IS required reviewing the whole concept, understanding that many things don't add up and don't make much sense in the currently available mainstream info, and building it all from the beginning while avoiding these very basic logical & implementation flaws.
Given a complete time t, and yet, always growing time series population that can't be logically separated into subpopulations, the very first question is, 'What amount of data do we need to utilize at time t?'. Two answers: 1 and all. You can't really gain much info from 1 datum, so go for the second answer: we need the whole dataset.
So, given the sequential & incremental nature of time series, the very first and most basic thing we can do on the whole dataset is to calculate a cumulative statistic, such as a cumulative moving mean or a cumulative moving median.
Now we need to extend this logic to exponential smoothing, which doesn't use dataset length info directly, but that's all cool: it can be done via a formula that quantifies the relationship between alpha (the smoothing parameter) and length. The popular formulas used in the mainstream are:
alpha = 1 / length
alpha = 2 / (length + 1)
The funny part starts when you realize that Cumulative Exponential Moving Averages with these 2 alpha formulas Exactly match Cumulative Moving Average and Cumulative (Linearly) Weighted Moving Average, and the same logic goes on:
alpha = 3 / (length + 1.5), matches Cumulative Weighted Moving Average with quadratic weights, and
alpha = 4 / (length + 2), matches Cumulative Weighted Moving Average with cubic weights, and so on...
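These equivalences are easy to verify empirically. The sketch below plots the difference between a cumulative EMA (using the first two alpha formulas above, with length = number of bars so far) and the cumulative moving average / cumulative linearly weighted moving average; both differences stay at zero up to floating-point error:

```pine
//@version=5
indicator("Cumulative EMA vs CMA / CWMA (check)")

n = bar_index + 1   // number of bars so far

// alpha = 1 / length reproduces the cumulative moving average...
var float ema1 = close
ema1 := ema1 + (close - ema1) / n
cma = ta.cum(close) / n

// ...and alpha = 2 / (length + 1) reproduces the cumulative linearly weighted moving average.
var float ema2 = close
ema2 := ema2 + 2.0 / (n + 1) * (close - ema2)
cwma = ta.cum(close * n) / ta.cum(n)

plot(ema1 - cma,  "EMA(1/len) - CMA")
plot(ema2 - cwma, "EMA(2/(len+1)) - CWMA")
```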
It all just cries on your shoulder that we need to discover another, native length->alpha formula that leverages the recursive nature of exponential smoothing, because otherwise it doesn't make sense to use it at all, since the usual CMA and CWMA can be computed incrementally at O(1) algo complexity just like exponential smoothing.
From now on I will not mention 'cumulative' or 'linearly weighted / weighted' anymore, it's gonna be implied all the time unless stated otherwise.
What we can do is to approach the thing logically and model the response with a little help from synthetic data; a sine wave would suffice. Then we can think of relationships:
Based on algo complexity, from lower to higher, we have this sequence: exponential smoothing @ O(1) -> parametric statistics (mean) @ O(n) -> non-parametric statistics (50th percentile / median) @ O(n log n).
Based on initial response, from slow to fast: mean -> median.
Based on convergence with the real expected value, from slow to fast: mean (infinitely approaches it) -> median (gets it quite fast).
Based on these inputs, we need to discover such a length->alpha formula so the resulting fit will have the slowest initial response out of all 3, and have the slowest convergence with expected value out of all 3. In order to do it, we need to have some non-linear transformer in our formula (like a square root) and a couple of factors to modify the response the way we need. I ended up with this formula to meet all our requirements:
alpha = sqrt(1 / length * 2) / 2
which simplifies to:
alpha = 1 / sqrt(len * 8)
^^ as you can see on the screenshot, where the red line is the median, the blue line is the mean, and the purple line is exponential smoothing with the formulas you've just seen: we've met all the requirements.
Now we just have to do the same procedure to discover the length->alpha formula but for double exponential smoothing, which models trends as well, not just level as in single exponential smoothing. For this comparison, we need to use linear regression and quantile regression instead of the mean and median.
Quantile regression requires a non-closed form solution to be solved that you can't really implement in Pine Script, but that's ok, so I made the tests using Python & sklearn:
paste.pics
^^ on this screenshot, you can see the same relationship as on the previous screenshot, but now between the responses of quantile regression & linear regression.
I followed the same logic as before for designing alpha for double exponential smoothing (also considered the initial overshoots, but that's a little detail), and ended up with this formula:
alpha = sqrt(1 / length) / 2
which simplifies to:
alpha = 1 / sqrt(len * 4)
Btw, given the pattern you see in the resulting formulas for single and double exponential smoothing, if you ever want to do triple (not Holt & Winters) exponential smoothing, you'll need len * 2, and just len * 1 for quadruple exponential smoothing. I hope that, based on this sequence, you see the hint that maybe 4 rounds is enough.
Now since we've dealt with the length->alpha formula, we can deal with the adaptivity part.
Logically, it doesn't make sense to use a slower-than-O(1) method to generate input for an O(1) method, so it must be something universal and minimalistic: something that will help us measure consistency in our data, yet something far away from statistics and close enough to topology.
There's one perfect entity that can help us: fractal efficiency. The way I define fractal efficiency can be checked at the very beginning of the post; what matters is that I add a square root to the formula that is not typically added.
As explained in the description of my metric QSFS, one of the reasons for SQRT-transformed values of fractal efficiency applied in moving window mode is that they start to closely resemble the normal distribution, yet with support of (0, 1). Data with this interesting property (normally distributed yet with finite support) can be modeled with the beta distribution.
Another reason is, in infinitely expanding window mode, fractal efficiency of every time series that exhibits randomness tends to infinitely approach zero, sqrt-transform kind of partially neutralizes this effect.
Yet another reason is, the square root might better reflect the dimensional inefficiency or degree of fractal complexity, since it could balance the influence of extreme deviations from the net paths.
And finally, fractals exhibit power-law scaling -> measures like length, area, or volume scale in a non-linear way. Adding a square root acknowledges this intrinsic property, while connecting our metric with the nature of fractals.
---
I suspect that, given analogies and connections with other topics in geometry, topology, fractals and most importantly positive test results of the metric, it might be that the sqrt transform is the fundamental part of fractal efficiency that should be applied by default.
Now the last part of the ballet is to convert our fractal efficiency to length value. The part about inverse proportionality is obvious: high fractal efficiency aka high consistency -> lower window size, to utilize only the last data that contain brand new information that seems to be highly reliable since we have consistency in the first place.
The non-obvious part is now we need to neutralize the side effect created by previous sqrt transform: our length values are too low, and exponentiation is the perfect candidate to fix it since translating fractal efficiency into window sizes requires something non-linear to reflect the fractal dynamics. More importantly, using exp() was the last piece that let the metric shine, any other transformations & formulas alike I've tried always had some weird results on certain data.
That exp() in the len formula was the last piece that made it all work both on synthetic and on real data.
^^ a standalone script calculating optimal dynamic window size
Omg, THAT took time to write. Comment and/or text me if you need
...
"Versace Pip-Boy, I'm a young gun coming up with no bankroll" 👻
∞
Fourier Extrapolation of Price
This advanced algorithm leverages Fourier analysis to predict price trends by decomposing historical price data into its frequency components. Unlike traditional algorithms that often operate in lower-dimensional spaces, this method harnesses a multidimensional approach to capture intricate market behaviors. By utilizing additional dimensions, the algorithm identifies and extrapolates subtle patterns and oscillations that are typically overlooked, providing a more robust and nuanced forecast.
Ideal for traders seeking a deeper understanding of market dynamics, this tool offers an enhanced predictive capability by aligning its calculations with the complexity of real-world financial systems.
Volume Based Price Prediction [EdgeTerminal]
This indicator combines price action, volume analysis, and trend prediction to forecast potential future price movements. The indicator creates a dynamic prediction zone with confidence bands, helping you visualize possible price trajectories based on current market conditions.
Key Features
Dynamic price prediction based on volume-weighted trend analysis
Confidence bands showing potential price ranges
Volume-based candle coloring for enhanced market insight
VWAP and Moving Average overlay
Customizable prediction parameters
Real-time updates with each new bar
Technical Components:
Volume-Price Correlation: The indicator analyzes the relationship between price movements and volume, identifies stronger trends through volume confirmation, and uses the Volume-Weighted Average Price (VWAP) for price equilibrium
Trend Strength Analysis: Calculates trend direction using exponential moving averages, weights trend strength by relative volume and incorporates momentum for improved accuracy
Prediction Algorithm: Combines current price, trend, and volume metrics, projects future price levels using weighted factors, and generates confidence bands based on price volatility
Customizable Parameters:
Moving Average Length: Controls the smoothing period for calculations
Volume Weight Factor: Adjusts how much volume influences predictions
Prediction Periods: Number of bars to project into the future
Confidence Band Width: Controls the width of prediction bands
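The description intentionally leaves the exact projection formula unstated, so the following is only a hypothetical sketch of how the four parameters above could interact. The variable names and the projection/band formulas are my assumptions, not the indicator's actual code:

```pine
//@version=5
indicator("Volume-weighted projection (hypothetical sketch)", overlay = true)

maLen     = input.int(20, "Moving Average Length")
volWeight = input.float(1.0, "Volume Weight Factor", step = 0.1)
periods   = input.int(10, "Prediction Periods", minval = 1)
bandWidth = input.float(1.0, "Confidence Band Width", step = 0.1)

// Hypothetical projection: trend slope from an EMA, scaled by relative volume so high-volume moves count more.
maLine   = ta.ema(close, maLen)
relVol   = volume / math.max(ta.sma(volume, maLen), 1e-10)
projStep = ta.change(maLine) * (1 + volWeight * (relVol - 1))

// Project "periods" bars ahead from the current close; band width grows with volatility and horizon.
projection = close + projStep * periods
band       = ta.atr(maLen) * bandWidth * math.sqrt(periods)

plot(projection, "Projected level", color = color.blue)
plot(projection + band, "Upper band", color = color.new(color.green, 60))
plot(projection - band, "Lower band", color = color.new(color.red, 60))
```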
How to use it:
Look for strong volume confirmation with green candles, watch for prediction line slope changes, use confidence bands to gauge potential volatility and compare predictions with key support/resistance levels
Some useful tips:
Start with default settings and adjust gradually
Use wider confidence bands in volatile markets
Consider prediction lines as zones rather than exact levels
Best applications of this indicator:
Trend continuation probability assessment
Potential reversal point identification
Risk management through confidence bands
Volume-based trend confirmation
MACD Cloud with Moving Average and ATR Bands
The algorithm implements a technical analysis indicator that combines the MACD Cloud, Moving Averages (MA), and volatility bands (ATR) to provide signals on market trends and potential reversal points. It is divided into several sections:
🎨 Color Bars:
Activated based on user input.
Controls bar color display according to price relative to ATR levels and moving average (MA).
Logic:
⚫ Black: Potential bearish reversal (price above the upper ATR band).
🔵 Blue: Potential bullish reversal (price below the lower ATR band).
🟢 Green: Bullish trend (price between the MA and upper ATR band).
🔴 Red: Bearish trend (price between the lower ATR band and MA).
📊 MACD Bars:
Description:
The MACD Bars section is activated by default and can be modified based on user input.
🔴 Red: Indicates a bearish trend, shown when the MACD line is below the Signal line (Signal line is a moving average of MACD).
🔵 Blue: Indicates a bullish trend, shown when the MACD line is above the Signal line.
Matching colors between MACD Bars and MACD Cloud visually confirms trend direction.
MACD Cloud Logic: The MACD Cloud is based on Moving Average Convergence Divergence (MACD), a momentum indicator showing the relationship between two moving averages of price.
MACD and Signal Lines: The cloud visualizes the MACD line relative to the Signal line. If the MACD line is above the Signal line, it indicates a potential bullish trend, while below it suggests a potential bearish trend.
☁️ MA Cloud:
The MA Cloud uses three moving averages to analyze price direction:
Moving Average Relationship: Three MAs of different periods are plotted. The cloud turns green when the shorter MA is above the longer MA, indicating an uptrend, and red when below, suggesting a downtrend.
Trend Visualization: This graphical representation shows the trend direction.
📉 ATR Bands:
The ATR bands calculate overbought and oversold limits using a weighted moving average (WMA) and ATR.
Center (matr): Shows general trend; prices above suggest an uptrend, while below indicate a downtrend.
Up ATR 1: Marks the first overbought level, suggesting a potential bearish reversal if the price moves above this band.
Down ATR 1: Marks the first oversold level, suggesting a possible bullish reversal if the price moves below this band.
Up ATR 2: Extends the overbought range to an extreme, reinforcing the possibility of a bearish reversal at this level.
Down ATR 2: Extends the oversold range to an extreme, indicating a stronger bullish reversal possibility if price reaches here.
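The band construction described above reduces to a WMA center line plus ATR offsets. A minimal sketch follows; the length and multipliers are assumed values, not the script's actual settings:

```pine
//@version=5
indicator("WMA center + ATR bands (sketch)", overlay = true)

len  = input.int(20, "WMA / ATR length")   // assumed defaults
mult = input.float(1.5, "ATR multiplier")

matr   = ta.wma(close, len)   // center line ("matr")
atrVal = ta.atr(len)

upAtr1 = matr + atrVal * mult       // first overbought band
dnAtr1 = matr - atrVal * mult       // first oversold band
upAtr2 = matr + atrVal * mult * 2   // extreme overbought band
dnAtr2 = matr - atrVal * mult * 2   // extreme oversold band

plot(matr,   "Center (matr)", color = color.gray)
plot(upAtr1, "Up ATR 1",   color = color.red)
plot(dnAtr1, "Down ATR 1", color = color.green)
plot(upAtr2, "Up ATR 2",   color = color.new(color.red, 50))
plot(dnAtr2, "Down ATR 2", color = color.new(color.green, 50))
```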
Third-order moment by Tonymontanov
The "Third-order moment" indicator is designed to help traders identify asymmetries and potential turning points in a financial instrument's price distribution over a specified period. By calculating the skewness of the price distribution, this indicator provides insights into the potential future movement direction of the market.
User Parameters:
- Length: This parameter defines the number of bars (or periods) used to compute the mean and third-order moment. A longer length provides a broader historical context, which may smooth out short-term volatility.
- Source: The data input for calculations, defaulting to the closing price of each bar, although users can select alternatives like open, high, low, or any custom value to suit their analysis preferences.
Operational Algorithm:
1. Mean Calculation:
- The indicator begins by calculating the arithmetic mean of the selected data source over the specified period.
2. Third-order Moment Calculation:
- A deviation from the mean is calculated for each data point. These deviations are then cubed to capture any asymmetry in the price distribution.
- The third-order moment is determined by summing these cubed deviations over the specified length and dividing by the number of periods, providing a measure of skewness.
3. Graphical Representation:
- The indicator plots the third-order moment as a column plot. The color of the columns changes based on the sign of the moment: green for positive and red for negative, suggesting bullish and bearish skewness, respectively.
- A zero line is included to help visualize transitions between positive and negative skewness clearly.
- Additionally, the background color shifts depending on whether the third-order moment is above or below zero, further highlighting the prevailing market sentiment.
The "Third-order moment" indicator is a valuable tool for traders looking to gauge the market's skewness, helping identify potential trend continuations or reversals. By understanding the dominance of positive or negative skewness, traders can make more informed decisions.
Asymmetric volatility
The "Asymmetric Volatility" indicator is designed to visualize the differences in volatility between upward and downward price movements of a selected instrument. It operates on the principle of analyzing price movements over a specified time period, with particular focus on the symmetrical evaluation of both price rises and falls.
User Parameters:
- Length: This parameter specifies the number of bars (candles) used to calculate the average volatility. The larger the value, the longer the time period, and the smoother the volatility data will be.
- Source: This represents the input data for the indicator calculations. By default, the close value of each bar is used, but the user can choose another data source (such as open, high, low, or any custom value).
Operational Algorithm:
1. Movement Calculation:
- UpMoves: Computed as the positive difference between the current bar value and the previous bar value, if it is greater than zero.
- DownMoves: Computed as the positive difference between the previous bar value and the current bar value, if it is greater than zero.
2. Volatility Calculation:
- UpVolatility: This is the arithmetic mean of the UpMoves values over the specified period.
- DownVolatility: This is the arithmetic mean of the DownMoves values over the specified period.
3. Graphical Representation:
- The indicator displays two plots: upward and downward volatility, represented by green and red lines, respectively.
- The background color changes based on which volatility is dominant: a green background indicates that upward volatility prevails, while a red background indicates downward volatility.
The indicator allows traders to quickly assess in which direction the market is more volatile at the moment, which can be useful for making trading decisions and evaluating the current market situation.
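The calculation above maps almost one-to-one onto Pine; a minimal sketch:

```pine
//@version=5
indicator("Asymmetric volatility (sketch)")

length = input.int(14, "Length", minval = 1)
src    = input.source(close, "Source")

diff      = ta.change(src)
upMoves   = math.max(diff, 0)    // positive bar-to-bar change, else 0
downMoves = math.max(-diff, 0)   // magnitude of negative change, else 0

upVolatility   = ta.sma(upMoves, length)
downVolatility = ta.sma(downMoves, length)

plot(upVolatility,   "Up volatility",   color = color.green)
plot(downVolatility, "Down volatility", color = color.red)
bgcolor(upVolatility > downVolatility ? color.new(color.green, 90) : color.new(color.red, 90))
```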
VATICAN BANK CARTEL - Precision Signal Detection for Buyers.
The VATICAN BANK CARTEL indicator is a highly sophisticated tool designed specifically for buyers, helping them identify key market trends and generate actionable buy signals. Utilizing advanced algorithms, this indicator employs a multi-variable detection mechanism that dynamically adapts to price movements, offering real-time insights to assist in executing profitable buy trades. This indicator is optimized solely for identifying buying opportunities, ensuring that traders are equipped to make well-timed entries and exits, without signals for shorting or selling.
The recommended settings for the VATICAN BANK CARTEL indicator are as follows:
Depth Engine = 20,30,40,50,100.
Deviation Engine = 3,5,7,15,20.
Backstep Engine = 15,17,20,25.
NOTE: You can also use this indicator with your own settings; whichever setting gives you the best results, use that one.
Key Features:
1.Adaptive Depth, Deviation, and Backstep Inputs:
The core of this indicator is its customizable Depth Engine, Deviation Engine, and Backstep Engine parameters. These inputs allow traders to adjust the sensitivity of the trend detection algorithm based on specific market conditions:
Depth: Defines how deep the indicator scans historical price data for potential trend reversals.
Deviation: Determines the minimum required price fluctuation to confirm a market movement.
Backstep: Sets the retracement level to filter false signals and maintain the accuracy of trend detection.
2. Visual Signal Representation:
The VATICAN BANK CARTEL plots highly visible labels on the chart to mark trend reversals. These labels are customizable in terms of size and transparency, ensuring clarity in various chart environments. Traders can quickly spot buying opportunities with green labels and potential square-off points with red labels, focusing exclusively on buy-side signals.
3.Real-Time Alerts:
The indicator is equipped with real-time alert conditions to notify traders of significant buy or square-off buy signals. These alerts, which are triggered based on the indicator’s internal signal logic, ensure that traders never miss a critical market movement on the buy side.
4.Custom Label Size and Transparency:
To enhance visual flexibility, the indicator allows the user to adjust label size (from small to large) and transparency levels. This feature provides a clean, adaptable view suited for different charting styles and timeframes.
How It Works:
The VATICAN BANK CARTEL analyzes the price action using a sophisticated algorithm that considers historical low and high points, dynamically detecting directional changes. When a change in market direction is detected, the indicator plots a label at the key reversal points, helping traders confirm potential entry points:
- Buy Signal (Green): Indicates potential buying opportunities based on a trend reversal.
- Square-Off Buy Signal (Red): Marks the exit point for open buy positions, allowing traders to take profits or protect capital from potential market reversals.
Note: This indicator is exclusively designed to provide signals for buyers. It does not generate sell or short signals, making it ideal for traders focused solely on identifying optimal buying opportunities in the market.
Customizable Parameters:
- Depth Engine: Fine-tunes the historical data analysis for signal generation.
- Deviation Engine: Adjusts the minimum price change required for detecting trends.
- Backstep Engine: Controls the indicator's sensitivity to retracements, minimizing false signals.
- Labels Transparency: Adjusts the opacity of the labels, ensuring they integrate seamlessly into any chart layout.
- Buy and Sell Colors: Customizable color options for buy and square-off buy labels to match your preferred color scheme.
- Label Size: Select between five different label sizes for optimal chart visibility.
Ideal For:
This indicator is ideal for both beginner and experienced traders looking to enhance their buying strategy with a highly reliable, visual, and alert-driven tool. The VATICAN BANK CARTEL adapts to various timeframes, making it suitable for day traders, swing traders, and long-term investors alike—focused exclusively on buying opportunities.
Benefits and Applications:
1.Intraday Trading: The VATICAN BANK CARTEL indicator is particularly well-suited for intraday trading, as it provides accurate and timely "buy" and "square-off buy" signals based on the current market dynamics.
2.Trend-following Strategies: Traders who employ trend-following strategies can leverage the indicator's ability to identify the overall market direction, allowing them to align their trades with the dominant trend.
3.Swing Trading: The dynamic price tracking and signal generation capabilities of the indicator can be beneficial for swing traders, who aim to capture medium-term price movements.
Security Measures:
1. The code includes a security notice at the beginning, indicating that it is subject to the Mozilla Public License 2.0, which is a reputable open-source license.
2. The code does not appear to contain any obvious security vulnerabilities or malicious content that could compromise user data or accounts.
NOTE:- This indicator is provided under the Mozilla Public License 2.0 and is subject to its terms and conditions.
Disclaimer: Using the VATICAN BANK CARTEL indicator might or might not contribute to profits or losses in your trading capital (money), and the author is not responsible for either outcome.
IMPORTANT NOTICE:
While the indicator aims to provide reliable "buy" and "square-off buy" signals, it is crucial to understand that the market can be influenced by unpredictable events, such as natural disasters, political unrest, changes in monetary policies, or economic crises. These unforeseen situations may occasionally lead to false signals generated by the VATICAN BANK CARTEL indicator.
Users should exercise caution and diligence when relying on the indicator's signals, as the market's behavior can be unpredictable, and external factors may impact the accuracy of the signals. It is recommended to thoroughly backtest the indicator's performance in various market conditions and to use it as one of the many tools in a comprehensive trading strategy, rather than solely relying on its output.
Ultimately, the success of the VATICAN BANK CARTEL indicator will depend on the user's ability to adapt it to their specific trading style, market conditions, and risk management approach. Continuous monitoring, analysis, and adjustment of the indicator's settings may be necessary to maintain its effectiveness in the ever-evolving financial markets.
DEVELOPER:- yashgode9
PineScript:- version:- 5
This indicator aims to enhance trading decision-making by combining DEPTH, DEVIATION, BACKSTEP with custom signal generation, offering a comprehensive tool for traders seeking clear "buy" and "square-off buy" signals on the TradingView platform.
Stationarity Test: Dickey-Fuller & KPSS [Pinescriptlabs]
📊 Kwiatkowski-Phillips-Schmidt-Shin Model Indicator & Dickey-Fuller Test 📈
This algorithm performs two statistical tests on the price spread between two selected instruments: the first from the current chart and the second determined in the settings. The purpose is to determine if their relationship is stationary. It then uses this information to generate **visual signals** based on how far the current relationship deviates from its historical average.
⚙️ Key Components:
• **🧪 ADF Test (Augmented Dickey-Fuller):** Checks if the spread between the two instruments is stationary.
• **🔬 KPSS Test (Kwiatkowski-Phillips-Schmidt-Shin):** Another test for stationarity, complementing the ADF test.
• **📏 Z-Score Calculation:** Measures how many standard deviations the current spread is from its historical mean.
• **📊 Dynamic Threshold:** Adjusts the trading signal threshold based on recent market volatility.
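The ADF and KPSS test statistics are too involved to reproduce here, but the Z-score and hedge-ratio components can be sketched. The rolling least-squares hedge ratio and the fixed ±2 thresholds below are simplifying assumptions; the actual script derives its threshold dynamically from volatility:

```pine
//@version=5
indicator("Spread Z-score (sketch)")

sym2   = input.symbol("OANDA:EURUSD", "Second instrument")   // assumed default
length = input.int(100, "Lookback", minval = 2)

price1 = close
price2 = request.security(sym2, timeframe.period, close)

// Rolling hedge ratio from a simple least-squares fit of price1 on price2 (a simplification).
cov   = ta.sma(price1 * price2, length) - ta.sma(price1, length) * ta.sma(price2, length)
varP2 = ta.variance(price2, length)
hedge = varP2 != 0 ? cov / varP2 : na

// Spread and its Z-score versus the rolling mean and standard deviation.
spread = price1 - hedge * price2
z      = (spread - ta.sma(spread, length)) / ta.stdev(spread, length)

plot(z, "Z-score", color = color.green)
hline(2,  "Upper threshold", color = color.red)   // fixed threshold for illustration only
hline(-2, "Lower threshold", color = color.red)
```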
🔍 What the Values Mean:
The indicator displays several key values in a table:
• **📈 ADF Stationarity:** Shows "Stationary" or "Non-Stationary" based on the ADF test result.
• **📉 KPSS Stationarity:** Shows "Stationary" or "Non-Stationary" based on the KPSS test result.
• **📏 Current Z-Score:** The current Z-score of the spread.
• **🔗 Hedge Ratio:** The relationship coefficient between the two instruments.
• **🌐 Market State:** Describes the current market condition based on the Z-score.
📊 How to Interpret the Chart:
• The main chart displays the Z-score of the spread over time.
• The green and red lines represent the upper and lower thresholds for trading signals.
• The area between the **Z-score** and the thresholds is filled when a trading signal is active.
• Additional charts show the **statistics of the ADF and KPSS tests** and their critical values.
**📉 Practical Example: NVIDIA Corporation (NVDA)**
Looking at the chart for **NVIDIA Corporation (NVDA)**, we can see how the indicator applies in a real case:
1. **Main Chart (Top):**
• Shows the **historical price** of NVIDIA on a weekly scale.
• A general **uptrend** is observed with periods of consolidation.
2. **KPSS & ADF Indicator (Bottom):**
• The lower chart shows the KPSS & ADF Model indicator applied to NVIDIA.
• The **green line** represents the Z-score of the spread.
• The **green shaded areas** indicate periods where the Z-score exceeded the thresholds, generating trading signals.
3. **📋 Current Values in the Table:**
• **ADF Stationarity:** Non-Stationary
• **KPSS Stationarity:** Non-Stationary
• **Current Z-Score:** 3.45
• **Hedge Ratio:** -164.8557
• **Market State:** Moderate Volatility
4. **🔍 Interpretation:**
• A Z-score of **3.45** suggests that NVIDIA’s price is significantly above its historical average relative to **EURUSD**.
• Both the **ADF** and **KPSS** tests indicate **non-stationarity**, suggesting **caution** when using mean reversion signals at this moment.
• The market state "Moderate Volatility" indicates noticeable deviation, but not extreme.
---
**💡 Usage:**
• **When Both Tests Show Stationarity:**
• **🔼 If Z-score > Upper Threshold:** Consider **buying the first instrument** and **selling the second**.
• **🔽 If Z-score < Lower Threshold:** Consider **selling the first instrument** and **buying the second**.
• **When Either Test Shows Non-Stationarity:**
• Wait for the relationship to become **stationary** before trading.
• **Market State:**
• Use this information to evaluate **general market conditions** and adjust your trading strategy accordingly.
**Mirror Comparison Using the Same Symbol as Symbol 2 🔄📊**
**📊 Table Values:**
• **Extreme Volatility Threshold:** This value is displayed when the **Z-score** exceeds **100%**, indicating **extreme deviation**. It signals a potential **trading opportunity**, as the spread has reached unusually high or low levels, suggesting a **reversion or correction** in the market.
• **Mean Reversion Threshold:** Appears when the **Z-score** begins returning towards the mean after a period of **high or extreme volatility**. It indicates that the spread between the assets is returning to normal levels, suggesting a phase of **stabilization**.
• **Neutral Zone:** Displayed when the **Z-score** is near **zero**, signaling that the spread between assets is within expected limits. This indicates a **balanced market** with no significant volatility or clear trading opportunities.
• **Low Volatility Threshold:** Appears when the **Z-score** is below **70%** of the dynamic threshold, reflecting a period of **low volatility** and market stability, indicating fewer trading opportunities.
Smoothed SuperTrend with VWAP Confirmation [CHE]
Smoothed SuperTrend with Automated Optimization and VWAP Confirmation
Overview
The "Smoothed SuperTrend with VWAP Confirmation" is an advanced technical analysis indicator designed for precise trend identification and trading signal generation. This script integrates a smoothed version of the popular SuperTrend indicator with an additional layer of confirmation using the Volume-Weighted Average Price (VWAP). The combination of these two elements offers traders a powerful tool for identifying optimal entry and exit points in the market.
Key Features
1. Smoothed SuperTrend
- Super Smoother Algorithm: The SuperTrend in this script is not just a regular one; it is enhanced by the Super Smoother filter, which reduces market noise and provides more reliable trend signals.
- Customizable Parameters: Traders can adjust three different sets of SuperTrend parameters (factor and ATR length), allowing them to tailor the indicator to their specific trading strategies.
- Automatic Optimization: The script automatically evaluates the performance of each SuperTrend parameter set and selects the one with the best cumulative performance. This selection process can be set to pick either the best or the worst performing parameter set, depending on the trader's preference.
2. VWAP Confirmation
- Precise Trend Confirmation: Once the best-performing SuperTrend is identified, the script further refines the signals by using VWAP as a confirmation tool. VWAP is a highly respected indicator in the trading community, often used to assess the true average price of an asset.
- Long and Short Signal Generation: The script generates Long and Short signals only when the price action is confirmed by both the SuperTrend and VWAP. For a Long signal, the price must be above the VWAP, and for a Short signal, it must be below the VWAP. This dual confirmation ensures higher accuracy and reduces the likelihood of false signals.
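As a rough illustration of this dual confirmation (not the published code), the sketch below only flags a Long when a SuperTrend flip to bullish coincides with price above VWAP, and a Short when a flip to bearish coincides with price below VWAP. The factor, ATR length, and the session-anchored ta.vwap call are assumptions; the actual script optimizes its SuperTrend parameters and re-anchors VWAP on trend changes.
//@version=5
indicator("SuperTrend + VWAP Confirmation Sketch", overlay = true)
factor = input.float(3.0, "Factor")
atrLen = input.int(10, "ATR Length")
[st, dir] = ta.supertrend(factor, atrLen)
vwapLine  = ta.vwap(hlc3)                           // session VWAP stand-in; the real script re-anchors it on trend changes
longOK    = ta.change(dir) < 0 and close > vwapLine // SuperTrend turns bullish and price is above VWAP
shortOK   = ta.change(dir) > 0 and close < vwapLine // SuperTrend turns bearish and price is below VWAP
plot(st, "SuperTrend", dir < 0 ? color.teal : color.red)
plot(vwapLine, "VWAP", color.orange)
plotshape(longOK, title = "Confirmed Long", style = shape.labelup, location = location.belowbar, color = color.teal, text = "Long")
plotshape(shortOK, title = "Confirmed Short", style = shape.labeldown, location = location.abovebar, color = color.red, text = "Short")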
3. Visual and Informative Labels
- Signal Labels: Upon confirmation of a trend reversal by both the SuperTrend and VWAP, the script plots clear labels on the chart, indicating confirmed Long or Short signals. These labels are customizable in terms of color, text, and size, ensuring they fit seamlessly into any chart setup.
- Best Parameters Display: At the close of the most recent bar, the script displays a label that provides detailed information about the best-performing SuperTrend parameters and their cumulative performance. This feature keeps traders informed about which settings are currently most effective.
Input Customization Options
1. Super Smoother Length
- Traders can define the length of the Super Smoother filter, which is used to smooth both price data and ATR (Average True Range) values. This input allows traders to control the sensitivity of the indicator, with shorter lengths providing faster responses and longer lengths offering smoother trends.
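For reference, the widely published two-pole Ehlers Super Smoother can be sketched as below; the input name and default are assumptions, and the script's internal smoothing stage may differ in detail.
//@version=5
indicator("Super Smoother Sketch", overlay = true)
length = input.int(10, "Super Smoother Length")
superSmoother(src, len) =>
    a1 = math.exp(-1.414 * math.pi / len)
    b1 = 2 * a1 * math.cos(1.414 * math.pi / len)
    c2 = b1
    c3 = -a1 * a1
    c1 = 1 - c2 - c3
    ss = 0.0
    ss := c1 * (src + src[1]) / 2 + c2 * nz(ss[1], src) + c3 * nz(ss[2], src)
    ss
plot(superSmoother(close, length), "Super Smoother", color.blue)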
2. SuperTrend Parameters
- Factor: For each of the three SuperTrends, traders can set a unique factor that determines the distance of the SuperTrend bands from the average price. A higher factor results in wider bands and fewer signals, while a lower factor results in narrower bands and more signals.
- ATR Length: Traders can also specify the length of the ATR used in each SuperTrend calculation. A longer ATR period captures broader market volatility, while a shorter period focuses on more immediate price movements.
3. Label Settings
- Label Colors: The script allows full customization of label colors for Long and Short signals, ensuring that they match the trader’s chart aesthetics.
- Label Text Colors and Sizes: Traders can adjust the text color and size of the labels for Long, Short, and information labels, allowing them to prioritize visibility and readability on their charts.
4. Performance Selection Mode
- Best or Worst Performer: This input allows traders to select whether the script should optimize for the best or worst performing SuperTrend parameter set. This flexibility is useful in different market conditions, where a trader might want to analyze either the strongest trend or focus on a contrarian strategy.
5. VWAP Calculation
- The script automatically recalculates the VWAP based on trend changes, ensuring that the confirmation signals are as accurate and relevant as possible to the current market context.
Important Note
This script is designed to provide more accurate trend signals and confirmations, but like all technical indicators, it should not be used in isolation. It is recommended to use this tool as part of a broader trading strategy, including proper risk management and consideration of fundamental market conditions.
Conclusion
The "Smoothed SuperTrend with VWAP Confirmation" script is an innovative trading tool that combines the strengths of the SuperTrend and VWAP indicators. By integrating smoothing techniques and automatic parameter optimization, this indicator provides traders with more accurate and reliable trend signals. The added confirmation by VWAP further enhances the precision of the entry and exit points, making it an excellent choice for traders looking to improve their technical analysis and trading outcomes. This tool is especially valuable for those who prefer customizable inputs and a systematic approach to trading, ensuring that the indicator adapts to various market conditions and individual trading styles.
Best regards
Chervolino
Price Action Smart Money Concepts [BigBeluga]
THE SMART MONEY CONCEPTS Toolkit
The Smart Money Concepts [ BigBeluga ] is a comprehensive toolkit built around the principles of "smart money" behavior, which refers to the actions and strategies of institutional investors.
The Smart Money Concepts Toolkit brings together a suite of advanced indicators that are all interconnected and built around a unified concept: understanding and trading like institutional investors, or "smart money." These indicators are not just randomly chosen tools; they are features of a single overarching framework, which is why having them all in one place creates such a powerful system.
This all-in-one toolkit provides the user with a unique experience by automating most of the basic and advanced concepts on the chart, saving them time and improving their trading ideas.
Real-time market structure analysis simplifies complex trends by pinpointing key support, resistance, and breakout levels.
Advanced order block analysis leverages detailed volume data to pinpoint high-demand zones, revealing internal market sentiment and predicting potential reversals. This analysis utilizes bid/ask zones to provide supply/demand insights, empowering informed trading decisions.
Imbalance Concepts (FVG and Breakers) allows traders to identify potential market weaknesses and areas where price might be attracted to fill the gap, creating opportunities for entry and exit.
Swing failure patterns help traders identify potential entry points and rejection zones based on price swings.
Liquidity Concepts, our advanced liquidity algorithm, pinpoints high-impact events, allowing you to predict market shifts, strong price reactions, and potential stop-loss hunting zones. This gives traders an edge to make informed trading decisions based on liquidity dynamics.
🔵 FEATURES
The indicator has quite a lot of features that are provided below:
Swing market structure
Internal market structure
Mapping structure
Adjustable market structure
Strong/Weak H&L
Sweep
Volumetric Order block / Breakers
Fair Value Gaps / Breakers (multi-timeframe)
Swing Failure Patterns (multi-timeframe)
Deviation area
Equal H&L
Liquidity Prints
Buyside & Sellside
Sweep Area
Highs and Lows (multi-timeframe)
🔵 BASIC DEMONSTRATION OF ALL FEATURES
1. MARKET STRUCTURE
The preceding image illustrates the market structure functionality within the Smart Money Concepts indicator.
➤ Solid lines: These represent the core indicator's internal structure, forming the foundation for most other components. They visually depict the overall market direction and identify major reversal points marked by significant price movements (denoted as 'x').
➤ Internal Structure: These represent an alternative internal structure with the potential to drive more rapid market shifts. This is particularly relevant when a significant gap exists in the established swing structure, specifically between the Break of Structure (BOS) and the most recent Change of High/Low (CHoCH). Identifying these formations can offer opportunities for quicker entries and potential short-term reversals.
➤ Sweeps (x): These signify potential turning points in the market where liquidity is removed from the structure. This suggests a possible trend reversal and presents crucial entry opportunities. Sweeps are identified within both swing and internal structures, providing valuable insights for informed trading decisions.
➤ Mapping structure: A tool that automatically identifies and connects significant price highs and lows, creating a zig-zag pattern. It visualizes market structure, highlights trends, support/resistance levels, and potential breakouts. Helps traders quickly grasp price action patterns and make informed decisions.
➤ Color-coded candles based on market structure: These colors visually represent the underlying market structure, making it easier for traders to quickly identify trends.
➤ Extreme H&L: It visualizes market structure with extreme high and lows, which gives perspective for macro Market Structure.
2. VOLUMETRIC ORDER BLOCKS
Order blocks are specific areas on a financial chart where significant buying or selling activity has occurred. These are not just simple zones; they contain valuable information about market dynamics. Within each of these order blocks, volume bars represent the actual buying and selling activity that took place. These volume bars offer deeper insights into the strength of the order block by showing how much buying or selling power is concentrated in that specific zone.
Additionally, these order blocks can be transformed into Breaker Blocks. When an order block fails—meaning the price breaks through this zone without reversing—it becomes a breaker block. Breaker blocks are particularly useful for trading breakouts, as they signal that the market has shifted beyond a previously established zone, offering opportunities for traders to enter in the direction of the breakout.
Here's a breakdown:
➤ Bear Order Blocks (Red): These are zones where a lot of selling happened. Traders see these areas as places where sellers were strong, pushing the price down. When the price returns to these zones, it might face resistance and drop again.
➤ Bull Order Blocks (Green): These are zones where a lot of buying happened. Traders see these areas as places where buyers were strong, pushing the price up. When the price returns to these zones, it might find support and rise again.
These Order Blocks help traders identify potential areas for entering or exiting trades based on past market activity. The volume bars inside blocks show the amount of trading activity that occurred in these blocks, giving an idea of the strength of buying or selling pressure.
➤ Breaker Block: When an order block fails, meaning the price breaks through this zone without reversing, it becomes a breaker block. This indicates a significant shift in market liquidity and structure.
➤ A bearish breaker block occurs after a bullish order block fails. This typically happens when there's an upward trend, and a certain level that was expected to support the market's rise instead gives way, leading to a sharp decline. This decline indicates that sellers have overcome the buyers, absorbing liquidity and shifting the sentiment from bullish to bearish.
Conversely, a bullish breaker block is formed from the failure of a bearish order block. In a downtrend, when a level that was expected to act as resistance is breached, and the price shoots up, it signifies that buyers have taken control, overpowering the sellers.
3. FAIR VALUE GAPS:
A fair value gap (FVG), also referred to as an imbalance, is an essential concept in Smart Money trading. It highlights the supply and demand dynamics. This gap arises when there's a notable difference between the volume of buy and sell orders. FVGs can be found across various asset classes, including forex, commodities, stocks, and cryptocurrencies.
The FVG tool in this toolkit can also detect raids of an FVG, which helps identify potential price reversals.
The Mitigation option lets you change the source used to identify FVG mitigation: Close, Wicks, or AVG.
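For orientation, the basic three-bar FVG condition can be sketched as below (a bullish gap appears when the current low sits above the high from two bars ago, and vice versa). The toolkit's actual detection, raid logic, and mitigation sources are considerably more involved.
//@version=5
indicator("FVG Detection Sketch", overlay = true, max_boxes_count = 100)
// Bullish FVG: gap between the high two bars ago and the current low; bearish FVG: the mirror case
bullFVG = low > high[2]
bearFVG = high < low[2]
if bullFVG
    box.new(bar_index - 2, low, bar_index, high[2], border_color = color.teal, bgcolor = color.new(color.teal, 80))
if bearFVG
    box.new(bar_index - 2, low[2], bar_index, high, border_color = color.red, bgcolor = color.new(color.red, 80))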
4. SWING FAILURE PATTERN (SFP):
The Swing Failure Pattern is a liquidity engineering pattern, generally used to fill large orders. This means, the SFP generally occurs when larger players push the price into liquidity pockets with the sole objective of filling their own positions.
SFP is a technical analysis tool designed to identify potential market reversals. It works by detecting instances where the price briefly breaks a previous high or low but fails to maintain that breakout, quickly reversing direction.
How it works:
Pattern Detection: The indicator scans for price movements that breach recent highs or lows.
Reversal Confirmation: If the price quickly reverses after breaching these levels, it's identified as an SFP.
➤ SFP Display:
Bullish SFP: Marked with a green symbol when price drops below a recent low before reversing upwards.
Bearish SFP: Marked with a red symbol when price rises above a recent high before reversing downwards.
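A stripped-down sketch of those two conditions is shown below, using a fixed swing lookback as an assumption; the toolkit's multi-timeframe handling, volume threshold, and deviation levels are not reproduced here.
//@version=5
indicator("SFP Sketch", overlay = true)
len      = input.int(20, "Swing Lookback")
recentLo = ta.lowest(low[1], len)    // lowest low of the previous `len` bars
recentHi = ta.highest(high[1], len)  // highest high of the previous `len` bars
// Bullish SFP: price dips below the recent low but closes back above it
bullSFP  = low < recentLo and close > recentLo
// Bearish SFP: price pokes above the recent high but closes back below it
bearSFP  = high > recentHi and close < recentHi
plotshape(bullSFP, title = "Bullish SFP", style = shape.triangleup, location = location.belowbar, color = color.green)
plotshape(bearSFP, title = "Bearish SFP", style = shape.triangledown, location = location.abovebar, color = color.red)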
➤ Deviation Levels: After detecting an SFP, the indicator projects white lines showing potential price deviation:
For bullish SFPs, the deviation line appears above the current price.
For bearish SFPs, the deviation line appears below the current price.
These deviation levels can serve as a potential trading opportunity or areas where the reversal might lose momentum.
With Volume Threshold and Filtering of SFP traders can adjust their trading style:
Volume Threshold: This setting allows traders to filter SFPs based on the volume of the reversal candle. By setting a higher volume threshold, traders can focus on potentially more significant reversals that are backed by higher trading activity.
SFP Filtering: This feature enables traders to filter SFP detection using adjustable parameters.
5. LIQUIDITY CONCEPTS:
➤ Equal Lows (EQL) and Equal Highs (EQH) are important concepts in liquidity-based trading.
EQL: A series of two or more swing lows that occur at approximately the same price level.
EQH: A series of two or more swing highs that occur at approximately the same price level.
EQLs and EQHs are seen as potential liquidity pools where a large number of stop loss orders or limit orders may be clustered. They can be used as potential reverse points for trades.
This multi-period feature allows traders to select less and more significant EQL and EQH:
➤ Liquidity wicks:
Liquidity wicks are a minor representation of a stop-loss hunt during the retracement of a pivot point:
➤ Buy and Sell side liquidity:
The buy side liquidity represents a concentration of potential buy orders above the current price level. When price moves into this area, it can lead to increased buying pressure due to the execution of these orders.
The sell side liquidity indicates a pool of potential sell orders below the current price level. Price movement into this area can result in increased selling pressure as these orders are executed.
➤ Sweep Liquidation Zones:
Sweep Liquidation Zones are crucial for understanding market structure and potential future price movements. They provide insights into areas where significant market participants have been forced out of their positions, potentially setting up new trading opportunities.
🔵 USAGE & EXAMPLES
The core principle behind the success of this toolkit lies in identifying "confluence." This refers to the convergence of multiple trading indicators all signaling the same information at a specific point or area. By seeking such alignment, traders can significantly enhance the likelihood of successful trades.
MS + OBs
The chart illustrates a highly bullish setup where the price is rejecting from a bullish order block (POC), while simultaneously forming a bullish Swing Failure Pattern (SFP). This occurs after an internal structure change, marked by a bullish Change of Character (CHoCH). The price broke through a bearish order block, transforming it into a breaker block, further confirming the bullish momentum.
The combination of these elements—bullish order blocks, SFP, and CHoCH—creates a powerful bullish signal, reinforcing the potential for upward movement in the market.
SFP + Bear OB
This chart above displays a bearish setup with a high probability of a price move lower. The price is currently rejecting from a bear order block, which represents a key resistance area where significant selling pressure has previously occurred. A Swing Failure Pattern (SFP) has also formed near this bear order block, indicating that the price briefly attempted to break above a recent high but failed to sustain that upward movement. This failure suggests that buyers are losing momentum, and the market could be preparing for a move to the downside.
Additionally, we can toggle on the Deviation Area in the SFP section to highlight potential levels where price deviation might occur. These deviation areas represent zones where the price is likely to react after the Swing Failure Pattern:
BUY – SELL sides + EQL
The chart showcases a bullish setup with a high probability of price breaking out of the current sell-side resistance level. The market structure indicates a formation of Equal Lows (EQL), which often suggests a build-up of liquidity that could drive the price higher.
The presence of strong buy-side pressure (69%), indicated by the green zone at the bottom, reinforces this bullish outlook. This area represents a key support zone where buyers are outpacing sellers, providing the foundation for a potential upward breakout.
EQL + Bull ChoCh
This chart illustrates a potential bullish setup, driven by the formation of Equal Lows (EQL) followed by a bullish Change of Character (CHoCH). The presence of Equal Lows often signals a liquidity build-up, which can lead to a reversal when combined with additional bullish signals.
Liquidity grab + Bull ChoCh + FVGs
This chart demonstrates a strong bullish scenario, where several important market dynamics are at play. The price begins its upward momentum from Liquidity grab following a bullish Change of Character (CHoCH), signaling the transition from a bearish phase to a bullish one.
As the price progresses, it performs liquidity grabs, which serve to gather the necessary fuel for further movement. These liquidity grabs often occur before significant price surges, as large market participants exploit these areas to accumulate positions before pushing the price higher.
The chart also highlights a market imbalance area, showing strong momentum as the price moves swiftly through this zone.
In this examples, we see how the combination of multiple “smart money” tools helps identify a potential trade opportunities. This is just one of the many scenarios that traders can spot using this toolkit. Other combinations—such as order blocks, liquidity grabs, fair value gaps, and Swing Failure Patterns (SFPs)—can also be layered on top of these concepts to further refine your trading strategy.
🔵 SETTINGS
Window: limit calculation period
Swing: limit drawing function
Mapping structure: show structural points
Algorithmic Logic: (Extreme-Adjusted) Use max high/low or pivot point calculation
Algorithmic lookback: pivot point lookback
Show Last: Amount of Order block to display
Hide Overlap: hide overlapping order blocks
Construction: Size of the order blocks
Fair value gaps: Choose between normal FVG or Breaker FVG
Mitigation: (close - wick - avg) point to mitigate the order block/imbalance
SFP lookback: find a higher / lower point to improve accuracy
Threshold: remove less relevant SFP
Equal H&L: (short-mid-long term) display longer term
Liquidity Prints: Shows wicks of candles where liquidity was grabbed
Sweep Area: Identify Sweep Liquidation areas
By combining these indicators in one toolkit, traders are equipped with a comprehensive suite of tools that address every angle of the Smart Money Concept. Instead of relying on disparate tools spread across various platforms, having them integrated into a single, cohesive system allows traders to easily see confluence and make more informed trading decisions.
Trend Strength | Flux Charts
💎 GENERAL OVERVIEW
Introducing the new Trend Strength indicator! Latest trends and their strengths play an important role for traders. This indicator aims to make trend and strength detection much easier by coloring candlesticks based on the current strength of trend. More info about the process in the "How Does It Work" section.
Features of the new Trend Strength Indicator :
3 Trend Detection Algorithms Combined (RSI, Supertrend & EMA Cross)
Fully Customizable Algorithm
Strength Labels
Customizable Colors For Bullish, Neutral & Bearish Trends
📌 HOW DOES IT WORK ?
This indicator uses three different methods of trend detection and combines them all into one value. First, the RSI is calculated. The RSI outputs a value between 0 & 100, which this indicator maps into -100 <-> 100. Let this value be named RSI. Then, the Supertrend is calculated. Let SPR be -1 if the calculated Supertrend is bearish, and 1 if it's bullish. After that, the latest EMA Cross is calculated. This is done by checking the distance between the two EMAs adjusted by the user. Let EMADiff = EMA1 - EMA2. Then EMADiff is mapped from -ATR * 2 <-> ATR * 2 to -100 <-> 100.
Then a Total Strength (TS) is calculated by the given formula: RSI * 0.5 + SPR * 0.2 + EMADiff * 0.3
The TS value is between -100 <-> 100, -100 being fully bearish, 0 being true neutral and 100 being fully bullish.
Then the Total Strength is converted into a color adjusted by the user. The candlesticks in the chart will be presented with the calculated color.
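As a rough sketch of that blend (following the formula as written above, not the indicator's actual code), the mapping and weighting could look like this; the RSI length, SuperTrend settings, EMA lengths, and bar-color thresholds are assumptions.
//@version=5
indicator("Trend Strength Sketch", overlay = true)
emaFast   = input.int(21, "EMA 1")
emaSlow   = input.int(50, "EMA 2")
rsiMapped = (ta.rsi(close, 14) - 50) * 2                              // map RSI from 0..100 to -100..100
[stValue, dir] = ta.supertrend(3.0, 10)
spr       = dir < 0 ? 1 : -1                                          // +1 when Supertrend is bullish, -1 when bearish
emaDiff   = ta.ema(close, emaFast) - ta.ema(close, emaSlow)
emaMapped = math.max(-100, math.min(100, emaDiff / (2 * ta.atr(14)) * 100))  // map -ATR*2..ATR*2 to -100..100
ts        = rsiMapped * 0.5 + spr * 0.2 + emaMapped * 0.3             // Total Strength per the formula above
barcolor(ts > 20 ? color.teal : ts < -20 ? color.red : color.gray)    // illustrative thresholds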
If the Labels setting is enabled, each time the trend changes direction a label will appear indicating the new direction. The latest candlestick will always show the current trend with a label.
EMA = Exponential Moving Average
RSI = Relative Strength Index
ATR = Average True Range
🚩 UNIQUENESS
The main point that differentiates this indicator from others is its simplicity and customization options. The indicator interprets trend and strength detection in its own way, combining 3 different well-known trend detection methods: RSI, Supertrend & EMA Cross into one simple method. The algorithm is fully customizable and all styling options are adjustable to the user's liking.
⚙️ SETTINGS
1. General Configuration
Detection Length -> This setting determines the amount of candlesticks the indicator will look for trend detection. Higher settings may help the indicator find longer trends, while lower settings will help with finding smaller trends.
Smoothing -> Higher settings will result in longer periods of time required for trend to change direction from bullish to bearish and vice versa.
EMA Lengths -> You can enter two EMA Lengths here, the second one must be longer than the first one. When the shorter one crosses under the longer one, this will be a bearish sign, and if it crosses above it will be a bullish sign for the indicator.
Labels -> Enables / Disables trend strength labels.
Spectrum
Library "Spectrum"
This library includes spectrum analysis tools such as the Fast Fourier Transform (FFT).
method toComplex(data, polar)
Creates an array of complex type objects from a float type array.
Namespace types: array
Parameters:
data (array) : The float type array of input data.
polar (bool) : Initialization coordinates; the default is false (cartesian).
Returns: The complex type array of converted data.
method sAdd(data, value, end, start, step)
Performs scalar addition of a given float type array and a simple float value.
Namespace types: array
Parameters:
data (array) : The float type array of input data.
value (float) : The simple float type value to be added.
end (int) : The last index of the input array (exclusive) on which the operation is performed.
start (int) : The first index of the input array (inclusive) on which the operation is performed; the default value is 0.
step (int) : The step by which the function iterates over the input data array between the specified boundaries; the default value is 1.
Returns: The modified input array.
method sMult(data, value, end, start, step)
Performs scalar multiplication of a given float type array and a simple float value.
Namespace types: array
Parameters:
data (array) : The float type array of input data.
value (float) : The simple float type value to be multiplied.
end (int) : The last index of the input array (exclusive) on which the operation is performed.
start (int) : The first index of the input array (inclusive) on which the operation is performed; the default value is 0.
step (int) : The step by which the function iterates over the input data array between the specified boundaries; the default value is 1.
Returns: The modified input array.
method eMult(data, data02, end, start, step)
Performs elementwise multiplication of two given complex type arrays.
Namespace types: array
Parameters:
data (array type from RezzaHmt/Complex/1) : the first complex type array of input data.
data02 (array type from RezzaHmt/Complex/1) : The second complex type array of input data.
end (int) : The last index of the input arrays (exclusive) on which the operation is performed.
start (int) : The first index of the input arrays (inclusive) on which the operation is performed; the default value is 0.
step (int) : The step by which the function iterates over the input data array between the specified boundaries; the default value is 1.
Returns: The modified first input array.
method eCon(data, end, start, step)
Performs elementwise conjugation on a given complex type array.
Namespace types: array
Parameters:
data (array type from RezzaHmt/Complex/1) : The complex type array of input data.
end (int) : The last index of the input array (exclusive) on which the operation is performed.
start (int) : The first index of the input array (inclusive) on which the operation is performed; the default value is 0.
step (int) : The step by which the function iterates over the input data array between the specified boundaries; the default value is 1.
Returns: The modified input array.
method zeros(length)
Creates a complex type array of zeros.
Namespace types: series int, simple int, input int, const int
Parameters:
length (int) : The size of array to be created.
method bitReverse(data)
Rearranges a complex type array based on the bit-reverse permutations of its size after zero-padding.
Namespace types: array
Parameters:
data (array type from RezzaHmt/Complex/1) : The complex type array of input data.
Returns: The modified input array.
method R2FFT(data, inverse)
Calculates Fourier Transform of a time series using Cooley-Tukey Radix-2 Decimation in Time FFT algorithm, wikipedia.org
Namespace types: array
Parameters:
data (array type from RezzaHmt/Complex/1) : The complex type array of input data.
inverse (int) : Set to -1 for FFT and to 1 for iFFT.
Returns: The modified input array containing the FFT result.
method LBFFT(data, inverse)
Calculates Fourier Transform of a time series using Leo Bluestein's FFT algorithm, wikipedia.org. This function is nearly 4 times slower than the R2FFT function in practice.
Namespace types: array
Parameters:
data (array type from RezzaHmt/Complex/1) : The complex type array of input data.
inverse (int) : Set to -1 for FFT and to 1 for iFFT.
Returns: The modified input array containing the FFT result.
method DFT(data, inverse)
This is the original DFT algorithm. It is not suggested to be used regularly.
Namespace types: array
Parameters:
data (array type from RezzaHmt/Complex/1) : The complex type array of input data.
inverse (int) : Set to -1 for DFT and to 1 for iDFT.
Returns: The complex type array of DFT result.
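A hypothetical usage sketch follows. The import path (publisher/version) is an assumption and must be replaced with the library's actual import string; the window size and the choice to run the transform only on the last bar are also illustrative.
//@version=5
indicator("Spectrum Library Usage Sketch", overlay = false)
// Hypothetical import: replace with the library's actual publisher/version string
import RezzaHmt/Spectrum/1 as spec
n = input.int(64, "Window Size")
var int bins = 0
if barstate.islast
    // Collect the last `n` closes into a float array
    prices = array.new_float()
    for i = 0 to n - 1
        array.push(prices, close[n - 1 - i])
    // Convert to complex samples and run the Radix-2 FFT (inverse = -1 for the forward transform)
    samples = spec.toComplex(prices)
    fftOut  = spec.R2FFT(samples, -1)
    bins := array.size(fftOut)
plot(bins, "FFT Bins")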
Gaussian Weighted Moving Average with Forecast [CHE]
Presentation for TradingView: Gaussian Weighted Moving Average with Forecast
Introduction
Welcome to our presentation on the "Gaussian Weighted Moving Average with Forecast" (GWMA). This script, written in Pine Script™, offers an enhanced method for analyzing and predicting price movements on TradingView. The script combines Gaussian Weighted Moving Averages and polynomial regression to provide accurate and customizable forecasts.
Overview
Title: Gaussian Weighted Moving Average with Forecast
Author: chervolino
License: Mozilla Public License 2.0
Main Features
1. Gaussian Weighted Moving Average (GWMA):
- Calculates a weighted moving average using a Gaussian weighting function.
- Parameters for length and standard deviation allow fine-tuning of the smoothing effect.
2. Polynomial Regression with Forecast:
- Creates a model to predict future price movements.
- Adjustable length and degree of polynomial regression.
- Option to extrapolate predictions and visualize them.
3. Visual Representation:
- Uses lines and colors to depict trend changes.
- Customizable colors for upward and downward trends.
Input Parameters
Length: Length of the moving average (default: 50)
Standard Deviation: Standard deviation for Gaussian weighting (default: 10.0)
Width: Width of the plotted lines (default: 1)
Colors: Customizable colors for upward and downward trends
Forecast Length: Length of the forecast period (default: 20)
Extrapolate Length: Length of the extrapolation (default: 50)
Polynomial Degree: Degree of the polynomial regression (default: 3)
Lock Forecast: Option to lock and stabilize the forecast
Core Algorithms
1. Gaussian Weight Calculation:
gaussian_weight(x, std_dev) =>
1 / (std_dev * math.sqrt(2 * math.pi)) * math.exp(-0.5 * math.pow(x / std_dev, 2))
2. GWMA Calculation:
calculate_gwma(length, std_dev) =>
// Algorithm to calculate the weighted moving average (see the sketch after this list)
3. Initialize Lines for Polynomial Regression:
initialize_lines_array(extrapolate, length) =>
// Initialize array lines
4. Create Design Matrix for Polynomial Regression:
get_design_matrix(length, degree) =>
// Create the design matrix
5. Calculate and Plot Polynomial Regression:
calculate_polynomial_regression(src, length, degree, extrapolate, lines_arr, lock, width, upward_color, downward_color) =>
// Algorithm to calculate polynomial regression and plot the forecast
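Since the listing shows only the function headers, here is a minimal sketch (under stated assumptions) of how the Gaussian-weighted average from step 2 could be implemented; the published script's exact code may differ, and the extra src parameter is added here only to keep the sketch self-contained.
//@version=5
indicator("GWMA Sketch", overlay = true)
length  = input.int(50, "Length")
std_dev = input.float(10.0, "Standard Deviation")
gaussian_weight(x, sd) =>
    1 / (sd * math.sqrt(2 * math.pi)) * math.exp(-0.5 * math.pow(x / sd, 2))
calculate_gwma(src, len, sd) =>
    num = 0.0
    den = 0.0
    for i = 0 to len - 1
        w = gaussian_weight(i, sd)   // weight decays with the bar's distance from the current bar
        num += src[i] * w
        den += w
    num / den
plot(calculate_gwma(close, length, std_dev), "GWMA", color.blue)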
Combining Indicators: Originality and Usefulness
The combination of Gaussian Weighted Moving Average and polynomial regression provides traders with a robust tool for trend analysis and prediction. The GWMA smooths out price data while emphasizing recent prices, making it sensitive to short-term trends. Polynomial regression, on the other hand, offers a mathematical approach to model and forecast future prices based on historical data. By integrating these two methodologies, traders can achieve a more comprehensive view of market trends and potential future movements, making the tool highly valuable for decision-making.
Explanation for Users
Most TradingView users are not familiar with Pine Script, so a clear description is essential for understanding how to use the script.
Gaussian Weighted Moving Average (GWMA): This indicator calculates a moving average using Gaussian weights, which gives more importance to recent prices. The length and standard deviation parameters allow users to control the sensitivity and smoothness of the average.
Polynomial Regression with Forecast: This feature uses polynomial regression to model the price trend and predict future movements. Users can adjust the length of the historical data used, the degree of the polynomial, and the length of the forecast. The script plots these predictions, making it easier for traders to visualize potential future price paths.
Visualization of Results
1. GWMA Plotting:
plot(gaussian_ma_result, title="GWMA", color=line_color, linewidth=width_input)
2. Forecast Extrapolation:
plot(forecast_val, 'Extrapolation', offset=extrapolate_setting, linewidth=width_input, style=plot.style_circles)
Conclusion
The "Gaussian Weighted Moving Average with Forecast" script provides a powerful tool for analyzing and predicting price movements on TradingView. By combining Gaussian weighting and polynomial regression, it offers a precise and customizable method for trend analysis and forecasting.
Thank you for your attention! For any questions or further information, please feel free to reach out.
Venit A.I Trading V1
RSI indicator
This indicator is designed to provide buy and sell signals based on the Relative Strength Index (RSI). Here's a breakdown of its components and functionality:
1. **Input Parameters**:
- `Period`: This parameter allows the user to adjust the period used in calculating the RSI.
- `Upper Threshold` and `Lower Threshold`: These parameters define the overbought and oversold levels for the RSI.
- `Imverse Algorithm`: This parameter allows the user to toggle between different algorithms for generating buy and sell signals.
- `Show Lines`: This parameter toggles the visibility of lines on the chart indicating buy and sell signals.
- `Show Labels`: This parameter toggles the visibility of labels on the chart indicating buy and sell signals.
2. **RSI Calculation**:
- The RSI is calculated using the specified period (`myPeriod`), typically representing the closing prices of the asset.
3. **Buy and Sell Conditions**:
- Buy conditions are determined based on whether the RSI crosses below the lower threshold (`myThresholdDn`), indicating potential oversold conditions.
- Sell conditions are determined based on whether the RSI crosses above the upper threshold (`myThresholdUp`), indicating potential overbought conditions.
- The choice of buy and sell conditions can be toggled using the `Imverse Algorithm` parameter.
4. **Position Tracking**:
- The indicator maintains a variable `myPosition` to track the current position (buy or sell) based on the generated signals.
- If a buy signal occurs (`buy` condition is true), `myPosition` is set to 0. If a sell signal occurs (`sell` condition is true) or the previous position was a buy, `myPosition` is set to 1. Otherwise, `myPosition` remains unchanged.
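A simplified sketch of these conditions and the position tracking might look like this; the defaults and the exact crossing/position rules of the published script are assumptions, and the filtering options are omitted.
//@version=5
indicator("RSI Signal Sketch", overlay = true)
myPeriod      = input.int(14, "Period")
myThresholdUp = input.float(70, "Upper Threshold")
myThresholdDn = input.float(30, "Lower Threshold")
myRSI = ta.rsi(close, myPeriod)
buy   = ta.crossunder(myRSI, myThresholdDn)   // RSI crosses below the lower threshold (potential oversold)
sell  = ta.crossover(myRSI, myThresholdUp)    // RSI crosses above the upper threshold (potential overbought)
// Track the current position: 0 after a buy signal, 1 after a sell signal
var int myPosition = 1
myPosition := buy ? 0 : sell ? 1 : myPosition
plotshape(buy, title = "Buy", style = shape.labelup, location = location.belowbar, color = color.green, text = "Buy")
plotshape(sell, title = "Sell", style = shape.labeldown, location = location.abovebar, color = color.red, text = "Sell")
alertcondition(buy, "RSI Buy", "RSI crossed below the lower threshold")
alertcondition(sell, "RSI Sell", "RSI crossed above the upper threshold")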
5. **Visualization**:
- Buy and sell signals are plotted on the chart using shapes (`plotshape`) based on the `myLineToggle` and `myLabelToggle` parameters.
- Lines are drawn on the chart to visually represent buy and sell signals.
- Labels are placed on the chart indicating buy and sell signals.
6. **Alerts**:
- The indicator provides alerts for buy and sell signals using the `alertcondition` function.
Overall, this indicator aims to provide traders with signals based on RSI movements, helping them identify potential buying and selling opportunities in the market. The flexibility in parameters allows users to customize the indicator based on their trading preferences and strategies.
Fair Value Gap Screener | Flux Charts
💎 GENERAL OVERVIEW
Introducing our new Fair Value Gap Screener! This screener can provide information about the latest Fair Value Gaps in up to 5 tickers. You can also customize the algorithm that finds the Fair Value Gaps and the styling of the screener.
Features of the new Fair Value Gap (FVG) Screener :
Find Latest Fair Value Gaps Accross 5 Tickers
Shows Their Information Of :
Latest Status
Number Of Retests
Consumption Percent
Bullish & Bearish Volume
Customizable Algorithm / Styling
📌 HOW DOES IT WORK ?
A Fair Value Gap generally occurs when there is an imbalance in the market. They can be detected by specific formations within the chart. This screener then finds Fair Value Gaps across 5 different tickers and shows the latest information about them.
Status ->
Far -> The current price is far away from the FVG.
Approaching ⬆️/⬇️ -> The current price is approaching the FVG, and the direction it's approaching from.
Inside -> The price is currently inside the FVG.
Retests -> Retest means the price tried to invalidate the FVG, but failed to do so. Here you can see how many times the price retested the FVG.
Consumed -> FVGs get consumed when a Close / Wick enters the FVG zone. For example, if the price hits the middle of the FVG zone, the zone is considered 50% consumed.
Bullish / Bearish Volume -> Bullish & Bearish volume of a FVG is calculated by analyzing the bars that formed it. For example in a bullish FVG, the bullish volume is the total volume of the first 2 bars forming the FVG, and the bearish volume is the volume of the 3rd bar that forms it.
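To make the consumption idea concrete, the snippet below sketches how a consumed percentage could be computed for a bullish FVG once price trades back into the zone; the variable names are hypothetical placeholders rather than the screener's internal names, and only a single zone is tracked.
//@version=5
indicator("FVG Consumption Sketch", overlay = false)
// Hypothetical bullish FVG zone tracked from the latest three-bar pattern
var float fvgTop    = na
var float fvgBottom = na
var float maxPen    = 0.0
if low > high[2]                                        // basic bullish FVG condition
    fvgTop    := low
    fvgBottom := high[2]
    maxPen    := 0.0
if not na(fvgTop)
    maxPen := math.max(maxPen, fvgTop - low)            // deepest penetration into the zone so far
consumedPct = not na(fvgTop) ? math.min(100.0, maxPen / (fvgTop - fvgBottom) * 100) : 0.0
plot(consumedPct, "Consumed %", color.orange)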
🚩UNIQUENESS
This screener can detect the latest Fair Value Gaps and give information about them for up to 5 tickers. This saves the user time by showing them all in a dashboard at the same time. The screener also uniquely shows information about the number of retests and the consumed percent of the FVG, as well as its bullish & bearish volume. We believe that this extra information will help you spot reliable FVGs more easily.
⚙️SETTINGS
1. Tickers
You can set up to 5 tickers for the screener to scan Fair Value Gaps here. You can also enable / disable them and set their individual timeframes.
2. General Configuration
Zone Invalidation -> Select between Wick & Close price for FVG Zone Invalidation.
Zone Filtering -> With "Average Range" selected, the algorithm will find FVG zones in comparison with the average range of the last bars in the chart. With the "Volume Threshold" option, you may select a Volume Threshold % to spot FVGs with a larger total volume than average.
FVG Detection -> With the "Same Type" option, all 3 bars that formed the FVG should be the same type. (Bullish / Bearish). If the "All" option is selected, bar types may vary between Bullish / Bearish.
Detection Sensitivity -> You may select between Low, Normal or High FVG detection sensitivity. This will essentially determine the size of the spotted FVGs, with lower sensitivies resulting in spotting bigger FVGs, and higher sensitivies resulting in spotting all sizes of FVGs.
Adaptive Fisherized Z-score
Hello Fellas,
It's time for a new adaptive fisherized indicator of me, where I apply adaptive length and more on a classic indicator.
Today, I chose the Z-score, also called standard score, as indicator of interest.
Special Features
Advanced Smoothing: JMA, T3, Hann Window and Super Smoother
Adaptive Length Algorithms: In-Phase Quadrature, Homodyne Discriminator, Median and Hilbert Transform
Inverse Fisher Transform (IFT)
Signals: Enter Long, Enter Short, Exit Long and Exit Short
Bar Coloring: Presents the trade state as bar colors
Band Levels: Changes the band levels
Decision Making
When you create such a mod you need to think about which concepts are the best to conclude. I decided to take Inverse Fisher Transform instead of normalization to make a version which fits to a fixed scale to avoid the usual distortion created by normalization.
Moreover, I chose JMA, T3, Hann Window and Super Smoother, because JMA and T3 are the bleeding-edge MAs at the moment with the best balance of lag and responsiveness. Additionally, I chose Hann Window and Super Smoother because of their extraordinary smoothing capabilities and because Ehlers favours them.
Furthermore, I decided to choose the half length of the dominant cycle instead of the full dominant cycle to make the indicator more responsive which is very important for a signal emitter like Z-score. Signal emitters always need to be faster or have the same speed as the filters they are combined with.
Usage
The Z-score is a low timeframe scalper which works best during choppy/ranging phases. The direction you should trade is determined by the last trend change. E.g. when the last trend change was from bearish market to bullish market and you are now in a choppy/ranging phase confirmed by e.g. Chop Zone or KAMA slope you want to do long trades.
Interpretation
The Z-score indicator is a momentum indicator which shows the number of standard deviations by which the value of a raw score (price/source) is above or below the mean value of what is being observed or measured. Easily explained, it is almost the same as Bollinger Bands with another visual representation form.
Signals
B -> Buy -> Z-score crosses above lower band
S -> Short -> Z-score crosses below upper band
BE -> Buy Exit -> Z-score crosses above 0
SE -> Sell Exit -> Z-score crosses below 0
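For illustration, a stripped-down version of the pipeline (smoothed source -> z-score -> Inverse Fisher Transform -> band crosses) might look like the sketch below. A fixed length replaces the adaptive dominant-cycle length, a simple EMA stands in for the JMA/T3/Hann/Super Smoother options, and the band level is an assumption.
//@version=5
indicator("Fisherized Z-Score Sketch", overlay = false)
len  = input.int(20, "Length (fixed stand-in for the adaptive length)")
band = input.float(0.5, "Band Level")
src  = ta.ema(close, 3)                                  // light smoothing stand-in for JMA/T3/Hann/Super Smoother
z    = (src - ta.sma(src, len)) / ta.stdev(src, len)     // classic z-score
ift  = (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)     // Inverse Fisher Transform, compresses output to -1..+1
longEntry  = ta.crossover(ift, -band)                    // B: crosses above the lower band
shortEntry = ta.crossunder(ift, band)                    // S: crosses below the upper band
plot(ift, "Fisherized Z-Score", color.blue)
plot(band, "Upper Band", color.gray)
plot(-band, "Lower Band", color.gray)
hline(0)
plotshape(longEntry, title = "B", style = shape.triangleup, location = location.bottom, color = color.green)
plotshape(shortEntry, title = "S", style = shape.triangledown, location = location.top, color = color.red)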
If you were reading till here, thank you already. Now, follows a bunch of knowledge for people who don't know the concepts I talk about.
T3
The T3 moving average, short for "Tim Tillson's Triple Exponential Moving Average," is a technical indicator used in financial markets and technical analysis to smooth out price data over a specific period. It was developed by Tim Tillson, a software project manager at Hewlett-Packard, with expertise in Mathematics and Computer Science.
The T3 moving average is an enhancement of the traditional Exponential Moving Average (EMA) and aims to overcome some of its limitations. The primary goal of the T3 moving average is to provide a smoother representation of price trends while minimizing lag compared to other moving averages like Simple Moving Average (SMA), Weighted Moving Average (WMA), or EMA.
To compute the T3 moving average, it involves a triple smoothing process using exponential moving averages. Here's how it works:
Calculate the first exponential moving average (EMA1) of the price data over a specific period 'n.'
Calculate the second exponential moving average (EMA2) of EMA1 using the same period 'n.'
Calculate the third exponential moving average (EMA3) of EMA2 using the same period 'n.'
The formula for the T3 moving average is as follows:
T3 = 3 * (EMA1) - 3 * (EMA2) + (EMA3)
By applying this triple smoothing process, the T3 moving average is intended to offer reduced noise and improved responsiveness to price trends. It achieves this by incorporating multiple time frames of the exponential moving averages, resulting in a more accurate representation of the underlying price action.
JMA
The Jurik Moving Average (JMA) is a technical indicator used in trading to predict price direction. Developed by Mark Jurik, it’s a type of weighted moving average that gives more weight to recent market data rather than past historical data.
JMA is known for its superior noise elimination. It’s a causal, nonlinear, and adaptive filter, meaning it responds to changes in price action without introducing unnecessary lag. This makes JMA a world-class moving average that tracks and smooths price charts or any market-related time series with surprising agility.
In comparison to other moving averages, such as the Exponential Moving Average (EMA), JMA is known to track fast price movement more accurately. This allows traders to apply their strategies to a more accurate picture of price action.
Inverse Fisher Transform
The Inverse Fisher Transform is a transform used in DSP to alter the Probability Distribution Function (PDF) of a signal or in our case of indicators.
The result of using the Inverse Fisher Transform is that the output has a very high probability of being either +1 or –1. This bipolar probability distribution makes the Inverse Fisher Transform ideal for generating an indicator that provides clear buy and sell signals.
Hann Window
The Hann function (aka Hann Window) is named after the Austrian meteorologist Julius von Hann. It is a window function used to perform Hann smoothing.
Super Smoother
The Super Smoother uses a special mathematical process for the smoothing of data points.
The Super Smoother is a technical analysis indicator designed to be smoother and with less lag than a traditional moving average.
Adaptive Length
Length based on the dominant cycle length measured by a "dominant cycle measurement" algorithm.
Happy Trading!
Best regards,
simwai
---
Credits to
@cheatcountry
@everget
@loxx
@DasanC
@blackcat1402
DNA GRAVITY PRICE V1 PINESCRIPTLABS
We can observe that this indicator displays the range within which the asset fluctuates around the average price, and its behavior depends on the parameters of amplitude and angular frequency. "price_mas" is a measure calculated as part of the indicator. It is derived by adding an adjusted amplitude (A_mas) multiplied by the cosine of the combination of angular frequency (w), time, and a phase shift (phi) to the average price (P0). This calculated value oscillates around the actual asset price and is used to identify potential turning points and the range where the price has established itself within the specified lookback period.
At its core, the indicator utilizes the innovative concept of 'price_mas,' a calculated metric visualized in three essential colors: green to indicate low levels, blue for medium levels, and red for high levels. These colors reflect the position of the price in relation to a range determined by historical highs and lows.
In the context of the "DNA GRAVITY PRICE V1 " indicator, low, medium, and high levels specifically refer to the calculated value of 'price_mas,' which is a derived measure within the indicator. They do not directly refer to the actual asset price but rather to a calculated value that the indicator uses to analyze and predict the behavior of the asset's price.
This algorithm stands out for its ability to capture the 'strength' of the price through the 'price_mas' zones. Once the price exits the zones marked by the 'price_mas' (red, blue, and green plots), it tends to return with significant force. This behavior is crucial for traders, as it provides opportunities both to capitalize on price retracements and to anticipate potential trend changes.
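Based on that description, the 'price_mas' line appears to follow a cosine oscillation around the average price. A hedged sketch of that relationship is shown below; the input names, defaults, and the use of bar_index as the time variable are illustrative, not the script's actual parameters.
//@version=5
indicator("price_mas Sketch", overlay = true)
lenP0  = input.int(50, "Initial Dynamic Length of MAS Price")
ampPct = input.float(2.0, "MAS Amplitude Percentage") / 100
w      = input.float(0.1, "Angular Frequency of MAS")
phi    = input.float(0.0, "Phase Shift")
P0     = ta.sma(close, lenP0)                           // average price
A_mas  = P0 * ampPct                                    // amplitude expressed as a percentage of the average price
price_mas = P0 + A_mas * math.cos(w * bar_index + phi)  // oscillation around the average price
plot(price_mas, "price_mas", color.blue)
plot(P0, "Average Price", color.gray)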
Buy & Sell Signals:
Buy Signal: If the price and the Donchian lines cross above the high threshold, visually represented by red diamonds, it indicates a strong bullish momentum. This not only shows that the price is rising but also that the trend is strong enough to push the Donchian lines, which represent price extremes over a certain period, above the threshold. This convergence of movements, marked by the crossing over the red diamonds, suggests a higher probability of the bullish trend continuing.
Sell Signal: Similarly, if the price and the Donchian lines fall below the low threshold, visualized as green diamonds, this signals a significant bearish momentum. The simultaneous decline of the price and the Donchian lines below this threshold, marked by the green diamonds, indicates that not only is the price decreasing, but the bearish trend is strong enough to influence the price extremes calculated by the Donchian lines.
Configuration:
-The "Initial Dynamic Length of MAS Price" parameter controls the smoothness and sensitivity of the indicator. A high value smooths the Simple Moving Average (SMA), making the indicator less responsive to short-term price fluctuations. On the other hand, a low value makes the indicator more sensitive to short-term price fluctuations, generating faster and more volatile signals
-This parameter, "MAS Amplitude Percentage," determines the amplitude as a percentage. Increasing the Initial Dynamic Price will result in a larger amplitude relative to the price, leading to wider ranges for the indicator. Decreasing this value will have the opposite effect, reducing the amplitude relative to the price. Increasing "A_mas_pct" can make signals more extreme and less frequent, while decreasing it will make signals smoother and more frequent.
-This parameter, "Angular Frequency of MAS," affects the frequency of oscillations in the calculation of the "Initial Dynamic Price." A higher value of "w" will make the oscillations faster and more frequent, which means that the indicator will be more responsive to abrupt price changes. Conversely, a lower value will make the oscillations slower and smoother, making the indicator less sensitive to rapid price changes. Modifying ""Angular Frequency of MAS,"" directly impacts the frequency of oscillations in the indicator.
KNN Regression [SS]
Another indicator release, I know.
But note, this isn't intended to be a stand-alone indicator, this is just a functional addition for those who program Machine Learning algorithms in Pinescript! There isn't enough content here to merit creating a library for (it's only 1 function), but it's a really useful function for those who like machine learning and Nearest Known Neighbour Algos (or KNN).
About the indicator:
This indicator creates a function to perform KNN-based regression.
In contrast to traditional linear regression, KNN-based regression has the following advantages over linear regression:
Advantages of KNN Regression vs. Linear Regression:
🎯 Non-linearity: KNN is a non-parametric method, meaning it makes no assumptions about the underlying data distribution. This allows it to capture non-linear relationships between features and the target variable.
🎯Simple Implementation: KNN is conceptually simple and easy to understand. It doesn't require the estimation of parameters, making it straightforward to implement.
🎯Robust to Outliers: KNN is less sensitive to outliers compared to linear regression. Outliers can have a significant impact on linear regression models, but KNN tends to be less affected.
Disadvantages of KNN Regression vs. Linear Regression:
🎯 Resource Intensive for Computation: Because KNN operates on identifying the nearest neighbors in a dataset, each new instance has to be searched for and identified within the dataset, vs. linear regression which can create a coefficient-based model and draw from the coefficient for each new data point.
🎯Curse of Dimensionality: KNN performance can degrade with an increasing number of features, leading to a "curse of dimensionality." This is because, in high-dimensional spaces, the concept of proximity becomes less meaningful.
🎯Sensitive to Noise: KNN can be sensitive to noisy data, as it relies on the local neighborhood for predictions. Noisy or irrelevant features may affect its performance.
Which is better?
I am very biased, coming from a statistics background. I will always love linear regression and will always prefer it over KNN. But depending on what you want to accomplish, KNN makes sense. If you are using highly skewed data or data that you cannot identify linearity in, KNN is probably preferable.
However, if you require precise estimations of ranges and outliers, such as creating co-integration models, I would advise sticking with linear regression. However, out of curiosity, I exported the function into a separate dummy indicator and pulled in data from QQQ to predict SPY close, and the results are actually very admirable:
And plotted with showing the standard error variance:
Pretty impressive, I must say I was a little shocked, it's really giving linear regression a run for its money. In school I was taught LinReg is the gold standard for modeling, nothing else compares. So as with most things in trading, this is challenging some biases of mine ;).
Functionality of the function
I have permitted 3 types of KNN regression. Traditional KNN regression, as I understand it, revolves around clustering. ( Clustering refers to identifying a cluster, normally 3, of identical cases and averaging out the Dependent variable in each of those cases) . Clustering is great, but when you are working with a finite dataset, identifying exact matches for 2 or 3 clusters can be challenging when you are only looking back at 500 candles or 1000 candles, etc.
So to accommodate this, I have added a functionality to clustering called "Tolerance". It allows you to set a tolerance level for your Euclidean distance parameters. I have tested this with a default of 0.5 and it has worked great, with no need to change it even when working with large numbers such as NQ and ES1!
In addition, I have added 2 further regression types that can be done with KNN (a simplified sketch of these modes follows below):
#1 Regression by the last IDENTICAL instance, which finds the most recent instance of a similar Independent variable and pulls the Dependent variable from that instance; or
#2 Average from all IDENTICAL instances.
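As a rough illustration of how such a function can be structured, here is a simplified, self-contained Pine Script sketch of the tolerance-based matching and the last-identical-instance mode. It is not the published function: the symbols, input names, and defaults are assumptions, and for brevity the cluster mode and the average-of-all-identical-instances mode collapse into a single tolerance-based average.

```pine
//@version=5
indicator("KNN Regression Modes Sketch", overlay=false)

// Illustrative inputs; names, defaults, and the QQQ symbol are assumptions.
mode      = input.string("Average within Tolerance", "Mode",
     options=["Average within Tolerance", "Last Identical"])
lookback  = input.int(500, "Lookback", minval=10)
tolerance = input.float(0.5, "Euclidean Tolerance", step=0.1)

indep = request.security("NASDAQ:QQQ", timeframe.period, close)  // Independent variable
dep   = close                                                    // Dependent variable (chart symbol, e.g. SPY)

knnEstimate(float x, float xHist, float yHist, int len, float tol, string m) =>
    float sumY  = 0.0
    int   count = 0
    float lastY = na
    for i = 1 to len
        if math.abs(x - xHist[i]) <= tol   // "identical" within the Euclidean tolerance
            sumY  += yHist[i]
            count += 1
            if na(lastY)
                lastY := yHist[i]          // first hit is the most recent identical instance
    m == "Last Identical" ? lastY : count > 0 ? sumY / count : na

plot(knnEstimate(indep, indep, dep, lookback, tolerance, mode), "KNN Estimate", color=color.teal)
plot(dep, "Actual Close", color=color.new(color.gray, 60))
```

Widening the tolerance pulls in more neighbours and produces a smoother but less specific estimate; tightening it does the opposite.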
Using the function
The code contains instructions for integrating the function into your own scripts, along with the parameters, so I won't exhaust you with those details here.
Essentially, it exports 3 float variables: the Result, the Correlation, and the simplified R2.
As this is KNN regression, there are no coefficients, slopes, or intercepts and you do not need to test for linearity before applying it.
Also, the output can be a bit choppy, so I like to apply a bit of smoothing using the ta.sma function with a default length of 14.
For example, here is SPY predicted from QQQ, smoothed with a 14 SMA:
And here it is unsmoothed:
The two look fairly similar, but the smoothing does make a bit of an aesthetic difference. And at a length of 14 there is no data loss, and the output is still quite reactive to changes in the data.
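For reference, the snippet below sketches how the three outputs might be consumed and smoothed. The names are placeholders rather than the script's actual exports, the Result is replaced by a trivial stand-in series so the snippet compiles on its own, and treating the simplified R2 as the squared correlation is an assumption.

```pine
//@version=5
indicator("KNN Outputs and Smoothing Sketch")

len       = input.int(500, "Lookback", minval=2)
indep     = request.security("NASDAQ:QQQ", timeframe.period, close)  // Independent variable
dep       = close                                                    // Dependent variable
knnResult = dep[1]                          // placeholder for the function's Result output

knnCorr = ta.correlation(indep, dep, len)   // Correlation
knnR2   = knnCorr * knnCorr                 // simplified R2 (assumed to be squared correlation)

plot(knnResult,             "Raw Result",        color=color.new(color.gray, 50))
plot(ta.sma(knnResult, 14), "Smoothed (14 SMA)", color=color.teal)
plot(knnCorr, "Correlation", display=display.data_window)
plot(knnR2,   "R2",          display=display.data_window)
```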
And that's it! Hopefully you enjoy and find some interesting uses for this function in your own scripts :-).
Safe trades everyone!
IPDA Standard Deviations [DexterLab x TFO x toodegrees]
> Introduction and Acknowledgements
The IPDA Standard Deviations tool encompasses the Time and price relationship as studied by @TraderDext3r.
I am not the creator of this Theory, and I do not hold the answers to all the questions you may have; I suggest you study it from Dexter's tweets, videos, and material.
This tool was born from a collaboration between @TraderDext3r, @tradeforopp and I, with the objective of bringing a comprehensive IPDA Standard Deviations tool to Tradingview.
> Tool Description
This is purely a graphical aid for traders to be able to quickly determine Fractal IPDA Time Windows, and trace the potential Standard Deviations of the moves at their respective high and low extremes.
The disruptive value of this tool is that it allows traders to save Time by automatically adapting the Time Windows based on the current chart's Timeframe, as well as providing customizations to filter and focus on the appropriate Standard Deviations.
> IPDA Standard Deviations by TraderDext3r
The underlying idea is based on the Interbank Price Delivery Algorithm's lookback windows on the daily chart as taught by the Inner Circle Trader:
IPDA looks at the past three months of price action to determine how to deliver price in the future.
Additionally, the ICT concept of projecting specific manipulation moves prior to a large displacement upwards/downwards is used to navigate and interpret the aforementioned displacement move. We pay attention to specific Standard Deviations based on the current environment and overall narrative.
Dexter, being one of the most prominent Inner Circle Trader students, harnessed the fractal nature of price to derive fractal IPDA Lookback Time Windows for lower Timeframes, and studied the behaviour of price at specific Deviations.
For Example:
The -1 to -2 area can initiate an algorithmic retracement before continuation.
The -2 to -2.5 area can initiate an algorithmic retracement before continuation, or a Smart Money Reversal.
The -4 area should be seen as the ultimate objective, or the level at which the displacement will slow down.
Given that these ideas stem from ICT's concepts themselves, they are to be used hand in hand with all other ICT Concepts (PD Array Matrix, PO3, Institutional Price Levels, ...).
> Fractal IPDA Time Windows
The IPDA Lookback Types identified by Dexter are as follows (a small sketch mapping these Timeframes to window types follows the list):
Monthly – 1D Chart: one window per Month, highlighting the past three Months.
Weekly – 4H to 8H Chart: one window per Week, highlighting the past three Weeks.
Daily – 15m to 1H Chart: one window per Day, highlighting the past three Days.
Intraday – 1m to 5m Chart: one window per 4 Hours, highlighting the past 12 Hours.
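As a rough sketch of how the Lookback Type can be resolved from the chart's Timeframe, the snippet below maps the intervals listed above to a window label. It is illustrative only and is not the authors' implementation.

```pine
//@version=5
indicator("IPDA Window Type Sketch", overlay=true)

// Resolve the IPDA Lookback Type from the chart Timeframe (intervals as listed above)
ipdaWindowTF() =>
    tfMin = timeframe.in_seconds() / 60.0
    switch
        tfMin >= 1440                 => "Monthly – 1M windows"
        tfMin >= 240 and tfMin <= 480 => "Weekly – 1W windows"
        tfMin >= 15 and tfMin <= 60   => "Daily – 1D windows"
        tfMin <= 5                    => "Intraday – 4H windows"
        => "Unmapped Timeframe"

// Show the resolved Lookback Type on the last bar
if barstate.islast
    label.new(bar_index, high, "IPDA Lookback: " + ipdaWindowTF(), style=label.style_label_down)
```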
Inside these three respective Time Windows, the extreme High and Low will be identified, as well as the prior opposing short term market structure point. These represent the anchors for the Standard Deviation Projections.
> Tool Settings
The User is able to plot any type of Standard Deviation they want by inputting them in the settings, each on its own line of the text box. They will always be plotted from the Time Windows' extremes.
As previously mentioned, the User is also able to define their own Timeframe intervals for the respective IPDA Lookback Types. The specific Timeframes on which the different Lookback Types are plotted are edge-inclusive. In case of an overlap, the higher Timeframe Lookback will be prioritized.
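As an illustration of the text-box workflow, here is a minimal, self-contained sketch that parses one Deviation per line and projects the levels from a window's extremes. It is not the published tool: the window is approximated by a fixed number of bars, the levels are simple multiples of the window's high-to-low range, and only the projection from the window Low is drawn.

```pine
//@version=5
indicator("IPDA Deviation Projection Sketch", overlay=true)

winBars  = input.int(96, "Window length (bars)", minval=2)            // assumed fixed window
devsText = input.text_area("-1\n-2\n-2.5\n-4", "Deviations (one per line)")

winHigh = ta.highest(high, winBars)
winLow  = ta.lowest(low, winBars)
rng     = winHigh - winLow

if barstate.islast
    for devStr in str.split(devsText, "\n")
        mult = str.tonumber(devStr)
        if not na(mult)
            level = winLow + mult * rng   // negative multiples project below the window Low
            line.new(bar_index - winBars, level, bar_index, level, color=color.blue)
            label.new(bar_index, level, str.tostring(mult), style=label.style_label_left, textcolor=color.blue, color=color.new(color.blue, 100))
```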
Finally, the User is able to filter and remove Standard Deviations in two ways:
"Remove Once Invalidated" will automatically delete a Deviation once its outer anchor extreme is traded through.
Manual Toggles allow the User to remove the Upward or Downward Deviation of each Time Window at their discretion.
Major shoutout to Dexter and TFO for their Time; it was a pleasure to collaborate and create this tool with them.
GLGT!
Machine Learning Momentum Oscillator [ChartPrime]
The Machine Learning Momentum Oscillator brings together the K-Nearest Neighbors (KNN) algorithm and the predictive strength of TSI Momentum. This unique oscillator not only uses the insights from TSI Momentum but also taps into the power of machine learning, and is therefore designed to give traders a more comprehensive view of market momentum.
At its core, the Machine Learning Momentum Oscillator blends TSI Momentum with the capabilities of the KNN algorithm; introducing the KNN logic allows for better handling of noise in the data set. TSI Momentum is known for gauging how strong trends are and which direction they're headed, and the added layer of machine learning offers a deeper perspective on market trends. Visually and in use, this is a fairly classical oscillator.
Green bars show the trader when the asset is in an uptrend. On the flip side, red bars mean things are heading down, signaling a bearish move driven by selling pressure. These color cues make it easier to catch the sentiment and direction of the market at a glance.
The oscillator also displays yellow boxes. These boxes highlight potential turning points or peaks: when the market comes close to these points, they can provide a heads-up about possible changes in momentum or even a trend reversal, helping a trader make informed choices quickly. Simply put, they can be viewed as possible reversal areas.
Settings:
Users can adjust the number of neighbours in the KNN algorithm and choose the periods they prefer for analysis. This way, the tool becomes a part of a trader's strategy, adapting to different market conditions as they see fit. Users can also adjust the smoothing used by the oscillator via the smoothing input.
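For readers who want a feel for how KNN logic can filter a TSI-style momentum stream, here is a minimal, self-contained sketch. It is not ChartPrime's code: it assumes the TSI component behaves like the standard True Strength Index (ta.tsi), and the input names, defaults, and the simple k-nearest averaging filter are all illustrative assumptions.

```pine
//@version=5
indicator("KNN-Filtered TSI Sketch")

shortLen = input.int(13,  "TSI Short Length", minval=1)
longLen  = input.int(25,  "TSI Long Length",  minval=1)
k        = input.int(5,   "Neighbours (k)",   minval=1)
lookback = input.int(200, "Lookback",         minval=10)

tsi = ta.tsi(close, shortLen, longLen)

// KNN filter: average the TSI readings of the k historical bars whose TSI is
// closest to the current reading, which damps single-bar noise spikes.
knnFilter(float src, int len, int kN) =>
    float[] dists = array.new_float()
    float[] vals  = array.new_float()
    for i = 1 to len
        array.push(dists, math.abs(src - src[i]))
        array.push(vals, src[i])
    idx = array.sort_indices(dists, order.ascending)
    int   n   = math.min(kN, array.size(idx))
    float sum = 0.0
    for j = 0 to n - 1
        sum += array.get(vals, array.get(idx, j))
    sum / n

mlMom = knnFilter(tsi, lookback, k)

plot(mlMom, "ML Momentum", style=plot.style_columns, color=mlMom > 0 ? color.green : color.red)
hline(0)
```

Raising k averages over more neighbours and smooths the output further, at the cost of responsiveness.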