Hurst Exponent - Detrended Fluctuation Analysis
In stochastic processes, chaos theory and time series analysis, detrended fluctuation analysis (DFA) is a method for determining the statistical self-affinity of a signal. It is useful for analyzing time series that appear to be long-memory processes and noise.
█ OVERVIEW
We introduced the concept of the Hurst Exponent in our previous open indicator, Hurst Exponent (Simple). It is an indicator that measures market state from autocorrelation. Here, however, we apply a more advanced and accurate way to calculate the Hurst Exponent rather than a simple approximation, so we recommend using this version of the Hurst Exponent over our previous publication going forward. The method we use here is called detrended fluctuation analysis. (For folks who are not interested in the math behind the calculation, feel free to skip to the "Features" and "How to Use" sections. However, we recommend reading it all to gain a better understanding of the mathematical reasoning.)
█ Detrended Fluctuation Analysis
Detrended Fluctuation Analysis was first introduced by Peng, C.K. (Original Paper) in order to measure the long-range power-law correlations in DNA sequences. DFA measures the scaling behavior of second-moment fluctuations; the scaling exponent is a generalization of the Hurst exponent.
The traditional way of measuring the Hurst exponent is the rescaled range (RS) method. However, DFA provides the following benefits over the traditional RS method:
• Can be applied to non-stationary time series. While asset returns are generally stationary, DFA can measure Hurst more accurately in the instances where they are non-stationary.
• According to the asymptotic distributions of DFA and RS, the latter usually overestimates the Hurst exponent (even after the Anis-Lloyd correction), so the expected value of the RS Hurst is close to 0.54 instead of the 0.5 it should be, making it harder to judge autocorrelation against that benchmark. With the DFA method, the expected value of the Hurst Exponent (HE) is significantly closer to 0.5, which makes the 0.5 threshold much more useful.
• Lastly, DFA requires a lower sample size than the RS method. While the RS method generally requires thousands of observations to reduce the variance of HE, DFA only needs a sample size greater than a hundred to accomplish the same.
█ Calculation
DFA is a modified root-mean-square (RMS) analysis of a random walk. In short, DFA computes the RMS error of linear fits over progressively larger bins (non-overlapping "boxes" of similar size) of an integrated time series.
Our signal time series is the log returns. First, we subtract the mean from the log returns to obtain demeaned returns. Then we take the cumulative sum of the demeaned returns, so the resulting series (the "profile") is mean-centered and ready for the DFA method. Subtracting the mean eliminates the "global trend" of the signal. Applying the scaling analysis to this profile instead of the raw signal allows the original signal to be non-stationary when needed. (For example, this process converts an i.i.d. white-noise process into a random walk.)
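As a quick illustration, here is a minimal sketch of the profile construction described above (in Python rather than the indicator's Pine Script; the array and function names are only illustrative):

```python
import numpy as np

def signal_profile(prices):
    """Build the DFA 'profile': cumulative sum of demeaned log returns."""
    log_returns = np.diff(np.log(prices))        # r_t = log(P_t / P_{t-1})
    demeaned = log_returns - log_returns.mean()  # removes the global trend
    return np.cumsum(demeaned)                   # mean-centered cumulative sum
```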
We slice the cumulative sum into windows of equal size and run a linear regression on each window to measure its local linear trend. After each regression, we detrend the series by subtracting the regression line from the cumulative sum in that window. The fluctuation is the difference between the cumulative sum and the regression line.
We repeat this with different window sizes on the same cumulative-sum series. The window sizes are log-spaced, e.g. powers of 2: 2, 4, 8, 16... This is where the scale-free measurement comes in: it is how we measure the fractal nature and self-similarity of the time series, i.e. how well the smaller scales represent the larger scales.
As the window size decreases, we use more regression lines to measure the trend, so the regression fit should be better and the fluctuation smaller. It allows one to zoom into the "picture" to see the details. The linear regressions act like rulers: the more rulers you use to measure the smaller-scale details, the more precise the measurement.
The exponent we are measuring describes the relationship between window size and regression fit (the rate of change). The more complex the time series, the more the fit depends on decreasing the window size (using more linear regression lines to measure); the less complex, or the more trending, the time series, the less it depends on it. The fit is measured by the average of the root-mean-square errors (RMS) of the regressions in each window.
The root-mean-square error is the square root of the mean of the squared differences between the cumulative sum and the regression line. The following chart displays the average RMS for different window sizes. As the chart shows, the values for smaller window sizes show more detail due to the higher resolution of the measurement.
The last step is to measure the exponent. To measure the power-law exponent, we measure the slope on a log-log plot: the x-axis is the log of the window size and the y-axis is the log of the average RMS. We run a linear regression through the plotted points; the slope of that regression is the exponent. The relationship between RMS and window size is easy to see on the chart: a larger RMS means a worse regression fit. We know the RMS will increase (the fit will worsen) as we increase the window size (use fewer regressions to measure), so we focus on the rate at which the RMS increases (how fast) as the window size increases.
If the slope is < 0.5, the RMS increases slowly as the window size increases, so the fit is much better when the series is measured by a large number of linear regression lines. The series is therefore more complex (mean reversion, negative autocorrelation).
If the slope is > 0.5, the RMS increases faster as the window size increases, so even when the window size is large, the larger trend can be measured well by a small number of regression lines. The series therefore has a trend with positive autocorrelation.
If the slope = 0.5, the series follows a random walk.
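Putting the steps above together, the sketch below (Python, illustrative only; the published indicator is written in Pine Script) shows how the windowed detrending, per-scale RMS and log-log regression combine into a single Hurst estimate. The default scales are an assumption, not the script's settings.

```python
import numpy as np

def dfa_hurst(prices, scales=(4, 8, 16, 32, 64)):
    """Estimate the Hurst exponent via detrended fluctuation analysis."""
    profile = signal_profile(prices)                 # cumulative sum of demeaned log returns (see sketch above)
    avg_rms = []
    for s in scales:
        rms_values = []
        for w in range(len(profile) // s):           # non-overlapping windows of size s
            segment = profile[w * s:(w + 1) * s]
            x = np.arange(s)
            slope, intercept = np.polyfit(x, segment, 1)    # local linear trend
            residuals = segment - (slope * x + intercept)   # detrended fluctuation
            rms_values.append(np.sqrt(np.mean(residuals ** 2)))
        avg_rms.append(np.mean(rms_values))
    # slope of log(average RMS) vs log(window size) is the DFA exponent
    h, _ = np.polyfit(np.log(scales), np.log(avg_rms), 1)
    return h
```

Applied to a persistent (trending) series, this estimate should come out noticeably above 0.5, while a strongly mean-reverting series should come out below it.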
█ FEATURES
• Sample Size is the lookback period for the calculation. Even though DFA requires a lower sample size than RS, a sample size greater than 50 is recommended for an accurate measurement.
• When a larger sample size is used (for example, a lookback length of 1000), loading may be slower due to the longer calculation. Date Range limits the number of historical bars calculated. When loading is too slow, change the date range from "all" to a number of weeks/days/hours to reduce loading time. (Credit to allanster)
• The "show filter" option applies a smoothing moving average to the exponent.
• Log scale is my workaround for dynamic log-space scaling. Traditionally the smallest log spacing for bars is a power of 2, and an accurate regression requires at least 10 points, which would force the minimum lookback to be 1024. I made some changes to round fractional log spacings into integer bar counts, allowing the log spacing to be less than 2.
• For a more accurate calculation, a larger "Base Scale" and "Max Scale" should be selected. However, when the sample size is small, larger values can cause issues. Therefore, the general rule is: the larger the sample size, the larger the "Base Scale" and "Max Scale" that can be selected. Users are encouraged to try a larger scale as long as increasing the value doesn't cause issues.
The following chart shows how the value changes with various scales. As shown, increasing the scale sometimes makes the value noisy and prone to overshooting.
When using the lowest scale (4,2), the value seems stable. When we increase the scale to (8,2), the value is still alright. However, when we increase it to (8,4), it begins to look messy. And when we increase it to (16,4), it starts overshooting. Therefore, (8,2) seems to be optimal for our use.
█ How to Use
Similar to Hurst Exponent (Simple), 0.5 is the level for determining long-term memory.
• Under the efficient market hypothesis, the market follows a random walk and the Hurst exponent should be 0.5. When the Hurst Exponent is significantly different from 0.5, the market is inefficient.
• When the Hurst Exponent is > 0.5: positive autocorrelation. The market is trending. Positive returns tend to be followed by positive returns and vice versa.
• When the Hurst Exponent is < 0.5: negative autocorrelation. The market is mean-reverting. Positive returns tend to be followed by negative returns and vice versa.
However, we can't really tell whether a Hurst exponent value is generated by random chance by looking at the 0.5 level alone. Even if we measure a pure random walk, the Hurst Exponent will never be exactly 0.5; it will be close, such as 0.506, but not equal to 0.5. That's why we need a level to tell us whether the Hurst Exponent is significant.
So we also computed the 95% confidence interval according to Monte Carlo simulation. The confidence level adjusts itself to the sample size. When the Hurst Exponent is above the upper or below the lower confidence level, the value has statistical significance: the efficient market hypothesis is rejected and the market shows significant inefficiency.
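The confidence band can be approximated by simulating many random walks of the same sample size and recording the distribution of their measured exponents. A rough sketch follows; the simulation count, seed and volatility are arbitrary assumptions, not the script's parameters.

```python
import numpy as np

def hurst_confidence_interval(sample_size=100, n_sims=1000, level=0.95):
    """Null-hypothesis band for the DFA Hurst exponent on pure random walks."""
    rng = np.random.default_rng(1)
    h_values = []
    for _ in range(n_sims):
        walk = np.exp(np.cumsum(rng.normal(0.0, 0.01, sample_size)))  # simulated random-walk prices
        h_values.append(dfa_hurst(walk))                              # reuse the DFA sketch above
    lower = np.percentile(h_values, 100 * (1 - level) / 2)
    upper = np.percentile(h_values, 100 * (1 + level) / 2)
    return lower, upper
```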
The state of the market is painted in different colors, as the following chart shows. Users can also read the state from the table displayed on the right.
An important point is that the Hurst value only represents the market state according to past measurements, which means it only tells you the market state now and in the past. If the Hurst Exponent on a sample size of 100 shows a significant trend, it means that according to the past 100 bars the market has been trending significantly. It doesn't mean the market will continue to trend; it is not a forecast of the future market state.
However, this suggests another way to use it. The market is not always random, and it is not always inefficient; the state switches from time to time. But there is one pattern: when the market stays inefficient for too long, market participants notice and try to take advantage of it, so the inefficiency gets traded away. That's why the Hurst exponent won't stay in significant trend or mean reversion for too long: when it's significant, market participants see it as well and the market adjusts itself back to normal.
The Hurst Exponent can therefore be used as a mean-reverting oscillator itself. In a liquid market, the value tends to return inside the confidence interval after significant moves (in smaller markets, it can stay inefficient for a long time). So when the Hurst Exponent first shows significant values, the market has just entered a significant trend or mean-reversion state. However, when it stays outside the confidence interval for too long, it suggests the market might be closer to the end of the trend or mean reversion instead.
A larger sample size makes the Hurst Exponent statistics more reliable. Therefore, if users want to know whether long-term memory exists in general on the selected ticker, they can use a large sample size and maximize the log scale, e.g. a 1024 sample size with scale (16,4).
The following chart is Bitcoin on the daily timeframe with a 1024 lookback. It suggests the Bitcoin market tends to have long-term memory in general. It generally has a significant trend and was more inefficient at its early stage.
Hurst Exponent Trend filter
Hello Traders !!
Hurst Exponent Trend filter utilises the Hurst Exponent and VAWMA (one of my other unique indicators - check my script publishings to use it) to categorise the market and decide whether it's trending (H > 0.5), in random Geometric Brownian Motion (GBM) (H = 0.5) or mean reverting (contrarian) (H < 0.5). When trending, a trend-following indicator - the VAWMA - is color highlighted. By doing so, theoretically price noise is eliminated, leaving statistically true zones of price action trend.
What is The Hurst Exponent ?
Developed by the hydrologist Edwin Harold Hurst, the Hurst Exponent measures autocorrelation in time series sets. Its first applications were in the natural world, e.g. in measuring the volume of water in a river.
Since then it has found applications in finance; this may be largely due to autocorrelation functions being useful tools in univariate time series analysis.
The Hurst Exponent (H) aims to segment the market into three different states: Trending (H > 0.5), Random Geometric Brownian Motion (H = 0.5) and Mean Reverting / Contrarian (H < 0.5). In my interpretation this can be used as a trend filter that eliminates market noise, which may be achieved by only focusing on trending zones.
How to Interpret the Indicator :
Focusing on the above image: when H > 0.5 a trend is present. To decide the directional bias, both VAWMAs' positions are checked; given the fast VAWMA > slow VAWMA and the current close > the fast VAWMA, a bullish bias is present, signified by a vibrant green fill between the fast VAWMA and price action. Note the exact opposite logic for a bearish bias when H > 0.5 (signified by a vibrant red fill).
I will continue to update this Trading Indicator.
PS : That's given I can hopefully remember
Happy Trading !!
Hurst Momentum Oscillator | AlphaNatt
An adaptive oscillator that combines the Hurst Exponent - which identifies whether markets are trending or mean-reverting - with momentum analysis to create signals that automatically adjust to market regime.
"The Hurst Exponent reveals a hidden truth: markets aren't always trending. This oscillator knows when to ride momentum and when to fade it."
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📐 THE MATHEMATICS
Hurst Exponent (H):
Measures the long-term memory of time series:
H > 0.5: Trending (persistent) behavior
H = 0.5: Random walk
H < 0.5: Mean-reverting behavior
Originally developed for analyzing Nile river flooding patterns, now used in:
Fractal market analysis
Network traffic prediction
Climate modeling
Financial markets
The Innovation:
This oscillator multiplies momentum by the Hurst coefficient:
When trending (H > 0.5): Momentum is amplified
When mean-reverting (H < 0.5): Momentum is reduced
Result: Adaptive signals based on market regime
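The oscillator's source is not reproduced here, so the following Python fragment is only a rough sketch of the stated idea (a crude single-window rescaled-range Hurst estimate scaling a simple momentum reading); the exact estimator, lookbacks and normalization used by the published script are unknown.

```python
import numpy as np

def rs_hurst(prices, window=64):
    """Crude single-window rescaled-range Hurst estimate."""
    returns = np.diff(np.log(prices[-window:]))
    dev = np.cumsum(returns - returns.mean())
    return np.log((dev.max() - dev.min()) / returns.std()) / np.log(len(returns))

def hurst_weighted_momentum(prices, mom_length=14):
    """Momentum scaled by the Hurst regime: amplified when H > 0.5, damped when H < 0.5."""
    momentum = prices[-1] - prices[-mom_length]
    return momentum * rs_hurst(prices)
```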
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
💎 KEY ADVANTAGES
Regime Adaptive: Automatically adjusts to trending vs ranging markets
False Signal Reduction: Reduces momentum signals in mean-reverting markets
Trend Amplification: Stronger signals when trends are persistent
Mathematical Edge: Based on fractal dimension analysis
No Repainting: All calculations on historical data
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📊 TRADING SIGNALS
Visual Interpretation:
Cyan zones: Bullish momentum in trending market
Magenta zones: Bearish momentum or mean reversion
Background tint: Blue = trending, Pink = mean-reverting
Gradient intensity: Signal strength
Trading Strategies:
1. Trend Following:
Trade momentum signals when background is blue (trending)
2. Mean Reversion:
Fade extreme readings when background is pink
3. Regime Transition:
Watch for background color changes as early warning
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🎯 OPTIMAL USAGE
Best Conditions:
Strong trending markets (crypto bull runs)
Clear ranging markets (forex sessions)
Regime transitions
Multi-timeframe analysis
Market Applications:
Crypto: Excellent for identifying trend persistence
Forex: Detects when pairs are ranging
Stocks: Identifies momentum stocks
Commodities: Catches persistent trends
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Developed by AlphaNatt | Fractal Market Analysis
Version: 1.0
Classification: Adaptive Regime Oscillator
Not financial advice. Always DYOR.
Hurst Exponent Smoothed
Description:
The Hurst Exponent Smoothed indicator provides a dynamic analysis of market behavior by calculating the Hurst Exponent over a specified lookback period. This tool is especially useful for identifying whether a market is trending or mean-reverting.
Key Features:
Lookback Period: Set to 90 by default, this parameter controls how many periods the indicator considers for its calculations. Adjusting this value allows you to fine-tune the sensitivity of the indicator to recent price action.
Market Analysis: The Hurst Exponent gives insights into the nature of price movement:
A value near 0.5 suggests a random walk, indicating that the market is unpredictable.
Values above 0.5 indicate a trending market where price movements exhibit persistence, suggesting that the current trend may continue.
Values below 0.5 point to a mean-reverting market, where price movements tend to reverse, making it a potential signal for contrarian trading strategies.
Usage:
Trend Following: When the Hurst Exponent is consistently above 0.5, it may indicate a strong trend. Traders can use this information to align with the current market direction.
Mean Reversion: If the Hurst Exponent falls below 0.5, it could signal that the market is more likely to revert to the mean, offering opportunities for mean-reversion strategies.
Visuals:
The indicator displays a smooth line oscillating between values, giving traders a clear visual cue for the current market condition.
The script is optimized for various timeframes, as demonstrated on the BTCUSD pair on a 270-minute chart. Traders can adapt the lookback period based on their trading style and the specific asset being analyzed.
Open Source: This script is open-source and free to use. Feel free to customize and adapt it to your needs!
Hurst Dual-Channel + ECDF Early Reentry (Single Trigger)
Hello,
This indicator can be useful during ranging market phases, especially on short timeframes such as 5 minutes, within a statistically contrarian approach.
It combines two quantitative methodologies:
– Hurst-type adaptive channels, which measure short- and medium-term price deviations using the ATR (Average True Range);
– an Empirical Cumulative Distribution Function (ECDF), which locates the current price between its recent extremes (0 corresponding to the lower bound, 1 to the upper bound).
The goal is to identify relative overbought and oversold zones, where the price exceeds the channels and then begins to revert toward its statistical mean.
The indicator does not issue trading recommendations: it merely highlights specific statistical conditions for research and analytical purposes.
The “BUY” and “SELL” labels indicate such technical configurations:
– ECDF < 0.2 with price returning above the lower channels → bullish reentry.
– ECDF > 0.9 with price returning below the upper channels → bearish reentry.
The parameters (channel periods, ECDF window, smoothing) allow you to fine-tune the sensitivity of the analysis according to instrument volatility or chosen timeframe.
🟩 Buy Signal (BUY)
A buy signal is triggered when a strong downside deviation pushes the price below both channels, followed by a gradual reentry inside the bands.
More precisely:
– The low is below both channels (low < scb and low < mcb).
– The ECDF crosses back above 0.19 (exit from oversold).
– Both events occur within the last six bars.
– The price moves back above the lower channel (high > scb).
– No previous long signal is active.
This configuration represents a statistical reentry to the mean after an excessive drop.
🟥 Sell Signal (SELL)
Conversely, a sell signal appears when a strong upside deviation pushes the price above both channels, followed by a pullback below them:
– The high exceeds both channels (high > sct and high > mct).
– The ECDF crosses below 0.9 (exit from overbought).
– Both events occur within the last six bars.
– The price falls back below the upper channel (low < sct).
– No previous short signal is active.
This reflects a bearish reentry following a statistical overextension.
⚙️ Operating Logic
Each signal is triggered only once per cycle thanks to the variables triggered_long and triggered_short, preventing duplicates until a new extreme occurs.
The tool is designed for visual analysis and pattern research, not for automated execution.
🔍 ECDF Principle and Calculation
The ECDF is a non-parametric measure of a value’s position within its recent distribution:
ECDF(X) = (number of values ≤ X) / N
It expresses the empirical proportion of observations below the current value.
Example:
If, among the last 100 observations, 85 are below the current price, then
ECDF = 0.85
→ The price is at the 85th percentile, statistically high relative to recent history.
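A minimal Python sketch of that calculation (the window length is just an example, not the indicator's default):

```python
import numpy as np

def ecdf_position(series, window=100):
    """Empirical CDF of the latest value within its recent window (0 = lowest, 1 = highest)."""
    recent = np.asarray(series[-window:])
    return float(np.mean(recent <= recent[-1]))  # proportion of observations at or below the current value
```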
Strengths: robust, model-free, well-suited to asymmetric or non-normal market regimes.
Limitations: it does not measure amplitude and depends on the selected window size.
🌊 Intuitive Analogy: The River and the Gauge
Imagine a river with a depth gauge:
– The Z-Score tells you how many meters above the average level the water currently stands.
– The ECDF tells you in how many past cases the water level was lower than it is now.
The Z-Score assumes the river always follows the same symmetrical pattern.
The ECDF simply observes reality — adapting naturally, even when the current becomes unpredictable.
Final note:
This indicator is designed for visual and statistical exploration of price behavior.
The signals represent statistical states, not trade instructions.
Entering long or short positions based on them is entirely at your own discretion and risk.
HURST Channel Strategy
Based on the work of TJS / Trading Zoom / Svoboda
Strategy based on Hurst channel with loss averaging when an open position is below 0.5 channel range.
How it works:
1. opens the long position when the close price crosses over the lower band (from bottom to top)
2. opens additional position (double in size) when average position price is lower than average channel value (0.5)
3. closes the position when the close price crosses over the higher band (from top to bottom)
Works the best on :
- volatile and continuous instruments (futures)
- on timeframes above 15 minutes
- uptrends or consolidations
- downtrends require more capital to open double positions
Prometheus Analytics Hurst Exponent
This indicator uses market data to calculate the Hurst Exponent so traders can have knowledge of the long memory of the asset.
Users can control the lookback length for the H value (Hurst Exponent), lookback length for the SMA (Simple Moving Average) of the Hurst Exponent, to show either, and what to calculate the H value and SMA on.
Hurst Exponent:
The Hurst Exponent is a value between 0 and 1 with 0.5 as a midline.
An H value(Hurst Exponent) above 0.5 indicates a trending market, and a market that should have larger, longer moves.
An H value below 0.5 indicates a mean reverting market, and a market that should have smaller, shorter moves.
An H value of 0.5 indicates a random walk. This would mean the price would follow a Brownian Motion model and future prices would be independent of past prices.
Just because the H value is above 0.5 does not indicate that there should be an UP trend, just as a value below 0.5 does not indicate a DOWN trend. It indicates that there should be a trend, up or down.
Scenarios:
An intuitive way to use the Hurst Exponent: as an asset trends in either direction, the H value crossing below 0.5 indicates a reversal. It indicates that what was happening before isn't impacting what is happening now as much.
Steps explained from picture:
Step 1: Strong uptrend is identified with the asset moving up aggressively with H above 0.5.
Step 2: The H value crosses below 0.5 and prices stay elevated.
Step 3: Price reverts back down as the H value stays below 0.5
Just because the H value is above 0.5 doesn't mean the asset has to be uptrending. In this example we see the asset fall while the H value is above 0.5. Not only that, but every time it crosses below 0.5, the asset takes a breather on the way down.
Step 1: As the H value crosses above 0.5, we can expect trends to appear in the asset.
Step 2: After the trend switches to down, we only see a breather and some chop after the H value crosses back below 0.5.
Step 3: Once The H value crosses back over we see the downtrend continue and new lows be made.
Step 4: We see it once again, simply the area of chop is bigger. We don’t see a higher high, breaking the overall downtrend, but once the H value crosses over again the downturn continues and we see a lower low.
This can occur when no strong trend forms in either direction. The H value being above 0.5 does sometimes coincide with an uptrend.
Step 1: After the strong downtrend we see a break below 0.5 with some consolidation.
Step 2: No clear big move on the asset or H value.
Step 3: H value above 0.5 leads to a break of highs and a new uptrend.
Users have the option to decide what to calculate the H value on: close (the default) or dollar return per bar. Dollar return per bar can offer an H value that may give a better indication of when price moves will be small and sporadic.
Using dollar move per bar.
Step 1: H value cross above 0.5, we see large candles and fast moves.
Step 2: H value crosses below 0.5, the candles immediately following are shorter. The big red candles come right before the cross back above.
Step 3: H value cross back above 0.5, after some chop, large move down.
Similar story
Step 1: H value above 0.5, big trends either direction
Step 2: After the H value crosses below, the moves are short and choppy.
Settings:
Options to show or remove either the H value or its SMA.
Options to adjust the periods used; the default is (32, 16).
Dominant Cycle Detection Oscillator
This is a Dominant Cycle Detection Oscillator that searches multiple ranges of wavelengths within a spectrum. Choose one of 4 different dominant cycle detection methods (MESA MAMA cycle, Pearson Autocorrelation, Discrete Fourier Transform, and Phase Accumulation) to determine the most dominant cycles and see the historical results. Straight lines can indicate a steady dominant cycle, while wavy lines might indicate a varying dominant cycle length. The steadier the cycle, the easier it may be to predict future events in that cycle (keep the log scale in mind when considering steadiness). The presence of evenly divisible (or harmonic) cycle lengths may also indicate stronger cycles; for example, 19, 38, and 76 dominant lengths for the 2x, 4x, and 8x cycles. Practically, a trader can use these cycle outputs as the default settings for other Hurst/cycle indicators. For example, if you see dominant cycle oscillator outputs of 38 & 76 for the 4x and 8x cycles respectively, you might want to test/use defaults of 38 & 76 for the 4x & 8x lengths in the bandpass, diamond/semi-circle notation, moving average & envelope, and FLD instead of the defaults 40 & 80 for a more fine-tuned analysis.
Muting the oscillator's historical lines and overlaying the indicator on the chart can visually cue a trader to the cycle lengths without taking up extra panes. The DFT Cycle lengths with muted historical lines have been overlayed on the chart in the photo.
The y-axis scale for this indicator's pane (just the oscillator pane, not the chart) most likely needs to be changed to logarithmic to look normal, but it depends on the search ranges in your settings. There are instructions in the settings. In the photo, the MESA MAMA scale is set to regular (not logarithmic) which demonstrates how difficult it can be to read if not changed.
In the Spectral Analysis chapter of Hurst's book Profit Magic, he recommended doing a Fourier analysis across a spectrum of frequencies. Hurst acknowledged there were many ways to do this analysis but recommended the method described by Lanczos. Currently, the closest thing in this indicator to the method described by Lanczos is the DFT (Discrete Fourier Transform) method.
Shoutout to @lastguru for the dominant cycle library referenced in this code. He mentioned that he may add more methods in the future.
SemiCircle Cycle Notation Pivots
For decades, traders have sought to decode the rhythm of the markets through cycle theory. From the groundbreaking work of HM Gartley in the 1930s to modern-day cycle trading tools on TradingView, the concept remains the same: markets move in repeating waves with larger cycles influencing smaller ones in a fractal-like structure, and understanding their timing gives traders an edge to better anticipate future price movements🔮.
Traditional cycle analysis has always been manual, requiring traders to painstakingly plot semicircles, diamonds, or sine waves to estimate pivot points and time reversals. Drawing tools like semicircle & sine wave projections exist on TradingView, but they lack automation—forcing traders to adjust cycle lengths by eye, often leading to inconsistencies.
This is where SemiCircle Cycle Notation Pivots indicator comes in. Semicircle cycle chart notation appears to have evolved as a practical visualization tool among cycle theorists rather than being pioneered by a single individual; some key influences include HM Gartley, WD Gann, JM Hurst, Walter Bressert, and RayTomes. Built upon LonesomeTheBlue's foundational ZigZag Waves indicator , this indicator takes cycle visualization to the next level by dynamically detecting price pivots and then automatically plotting semicircles based on real-time cycle length calculations & expected rhythm of price action over time.
Key Features:
Automated Cycle Detection: The indicator identifies pivot points based on your preference—highs, lows, or both—and plots semicircle waves that correspond to Hurst's cycle notation.
Customizable Cycle Lengths: Tailor the analysis to your trading strategy with adjustable cycle lengths, defaulting to 10, 20, and 40 bars, allowing for flexibility across various timeframes and assets.
Dynamic Wave Scaling: The semicircle waves adapt to different price structures, ensuring that the visualization remains proportional to the detected cycle lengths and aiding in the identification of potential reversal points.
Automated Cycle Detection: Dynamically identifies price pivot points and automatically adjusts offsets based on real-time cycle length calculations, ensuring precise semicircle wave alignment with market structure.
Color-Coded Cycle Tiers: Each cycle tier is distinctly color-coded, enabling quick differentiation and a clearer understanding of nested market cycles.
Local Hurst Slope [Dynamic Regime]
1. HOW THE INDICATOR WORKS (Math → Market Edge)

| Step | Math | Market Intuition |
| --- | --- | --- |
| 1. Log-Returns | r_t = log(P_t / P_{t-1}) | Removes scale, makes the series stationary |
| 2. R/S per τ | R = max(cum_dev) - min(cum_dev); S = stdev(segment) | Measures memory strength over window τ |
| 3. Hurst per scale | H(τ) = log(R/S) / log(τ), Di Matteo (2007) | H > 0.5 → trend memory; H < 0.5 → mean-reversion |
| 4. Slope | Slope = dH/d(log τ), linear regression of H vs log(τ) | Slope > 0.12 → trend accelerating; Slope < -0.08 → reversion emerging |
LEADING EDGE: The slope changes 3–20 bars BEFORE price confirms
→ You enter before the crowd, exit before the trap
Slope > +0.12 + Strong Trend = Bullish = Long
Slope +0.05 to +0.12 = Weak Trend = Cautious = Hold/Trail
Slope -0.05 to +0.05 = Random = No Edge
Slope -0.08 to -0.05 = Weak Reversion = Bearish setup = Prepare Short
Slope < -0.08 = Strong Reversion = Bearish = Short
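The description above maps to a short computation: estimate H over several window lengths τ and regress H against log(τ). The sketch below is a simplified Python illustration (a single trailing segment per τ and arbitrary τ values), not the script's exact implementation.

```python
import numpy as np

def local_hurst_slope(prices, taus=(8, 16, 32, 64)):
    """Slope of H(tau) versus log(tau): positive = strengthening trend memory."""
    returns = np.diff(np.log(prices))
    h_values = []
    for tau in taus:
        segment = returns[-tau:]
        dev = np.cumsum(segment - segment.mean())
        r = dev.max() - dev.min()                     # rescaled range R
        s = segment.std()                             # standard deviation S
        h_values.append(np.log(r / s) / np.log(tau))  # H(tau) = log(R/S) / log(tau)
    slope, _ = np.polyfit(np.log(taus), h_values, 1)
    return slope  # > +0.12 trend accelerating, < -0.08 reversion emerging (per the table above)
```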
PRO TIPS
• Only trade in the direction of the 200-day SMA (filters false signals)
• Avoid trading 3 days before/after earnings (volatility kills edge)
• Use on ETFs such as SPY or QQQ (cleaner than single stocks)
• Combine with RSI(14): RSI < 30 + Hurst short = nuclear reversal
Stealthy Hurst Exponent
This is my attempt at a Hurst Exponent indicator.
Above 0.5 is supposed to indicate a trend is present.
Below 0.5 is noise.
0.5 is supposed to be Brownian Motion or regular market noise.
If you have corrections to the code you want to share, please post it.
I'm not an expert in math or coding, so this shouldn't be copied / ported.
This code didn't work very well as a filter, but you may have a fix or other use.
Bridge Bands ATR (Overlay) Shane
Hurst-Adaptive Volatility Bands
A fractal-inspired evolution of Bollinger and Keltner bands that adapts dynamically to both volatility and trend persistence.
This indicator estimates the Hurst exponent (H) — a measure of market memory — and adjusts a standard volatility band to lean in the direction of the prevailing trend.
When H > 0.5, markets exhibit persistence (trending behavior); the bands shift in the trend’s direction.
When H < 0.5, markets are mean-reverting; the bands flatten and recent extremes become potential fade zones.
Band width scales with recent volatility (σ), expanding in turbulent conditions and contracting during calm periods.
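The published description does not include the band formulas, so the following Python fragment is only a loose sketch of the stated behavior (a baseline tilted by H - 0.5 in the direction of the recent move, with width proportional to recent volatility); the constants, the crude Hurst estimate and the simple-mean baseline are all assumptions.

```python
import numpy as np

def hurst_adaptive_bands(prices, length=50, width_mult=2.0):
    """Volatility bands whose basis leans with the trend when H > 0.5 and flattens when H < 0.5."""
    closes = np.asarray(prices[-length:])
    returns = np.diff(np.log(closes))
    basis = closes.mean()                               # stand-in for the script's EMA baseline
    sigma = returns.std() * closes[-1]                  # recent volatility in price units
    dev = np.cumsum(returns - returns.mean())
    h = np.log((dev.max() - dev.min()) / returns.std()) / np.log(len(returns))  # crude Hurst estimate
    offset = (h - 0.5) * (closes[-1] - closes[0])       # lean only when behavior is persistent
    return basis + offset - width_mult * sigma, basis + offset + width_mult * sigma
```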
Key Features:
Adaptive offset using the Hurst exponent
Volatility-sensitive width for dynamic market regimes
EMA baseline with directional bias
Clear visual separation between trending and choppy phases
Inspired by Benoit Mandelbrot’s The Misbehavior of Markets and H.E. Hurst’s original work on long-term memory in time series.
Use it to identify regime shifts, trend-following entries, and volatility-adjusted stop levels.
Credit for this script goes to a number of people including Steve B, MichaalAngle, doc and joecat808. 500 day DEMA (double EMA) can be used as a longer term momentum line.
# Tensor Market Analysis Engine (TMAE)
## Advanced Multi-Dimensional Mathematical Analysis System
*Where Quantum Mathematics Meets Market Structure*
---
## 🎓 THEORETICAL FOUNDATION
The Tensor Market Analysis Engine represents a revolutionary synthesis of three cutting-edge mathematical frameworks that have never before been combined for comprehensive market analysis. This indicator transcends traditional technical analysis by implementing advanced mathematical concepts from quantum mechanics, information theory, and fractal geometry.
### 🌊 Multi-Dimensional Volatility with Jump Detection
**Hawkes Process Implementation:**
The TMAE employs a sophisticated Hawkes process approximation for detecting self-exciting market jumps. Unlike traditional volatility measures that treat price movements as independent events, the Hawkes process recognizes that market shocks cluster and exhibit memory effects.
**Mathematical Foundation:**
```
Intensity λ(t) = μ + Σ α(t - Tᵢ)
```
Where market jumps at times Tᵢ increase the probability of future jumps through the decay function α, controlled by the Hawkes Decay parameter (0.5-0.99).
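As a hedged illustration (not the indicator's Pine code), a common discrete approximation of this intensity keeps a self-excitation term that grows on each detected jump and decays geometrically bar by bar:

```python
def hawkes_intensity(jump_flags, mu=0.1, alpha=0.5, decay=0.85):
    """Recursive exponential approximation of a Hawkes jump intensity.

    `jump_flags` holds one boolean per bar (volatility jump detected or not);
    `decay` plays the role of the Hawkes Decay parameter (0.5-0.99).
    The baseline `mu` and excitation step `alpha` are illustrative values."""
    excitation = 0.0
    for jumped in jump_flags:
        excitation = decay * excitation + (alpha if jumped else 0.0)
    return mu + excitation  # conditional intensity lambda(t) after the last bar
```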
**Mahalanobis Distance Calculation:**
The engine calculates volatility jumps using multi-dimensional Mahalanobis distance across up to 5 volatility dimensions:
- **Dimension 1:** Price volatility (standard deviation of returns)
- **Dimension 2:** Volume volatility (normalized volume fluctuations)
- **Dimension 3:** Range volatility (high-low spread variations)
- **Dimension 4:** Correlation volatility (price-volume relationship changes)
- **Dimension 5:** Microstructure volatility (intrabar positioning analysis)
This creates a volatility state vector that captures market behavior impossible to detect with traditional single-dimensional approaches.
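Conceptually, the jump score is the Mahalanobis distance of the current volatility state vector from its recent history; a generic Python sketch (not the indicator's source) looks like this:

```python
import numpy as np

def mahalanobis_jump_score(state_history):
    """Distance of the latest multi-dimensional volatility state from its recent mean."""
    x = np.asarray(state_history)                  # rows = bars, columns = volatility dimensions
    mean = x.mean(axis=0)
    cov = np.cov(x, rowvar=False) + 1e-9 * np.eye(x.shape[1])  # small ridge for numerical stability
    diff = x[-1] - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))    # compare against the jump threshold (in sigmas)
```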
### 📐 Hurst Exponent Regime Detection
**Fractal Market Hypothesis Integration:**
The TMAE implements advanced Rescaled Range (R/S) analysis to calculate the Hurst exponent in real-time, providing dynamic regime classification:
- **H > 0.6:** Trending (persistent) markets - momentum strategies optimal
- **H < 0.4:** Mean-reverting (anti-persistent) markets - contrarian strategies optimal
- **H ≈ 0.5:** Random walk markets - breakout strategies preferred
**Adaptive R/S Analysis:**
Unlike static implementations, the TMAE uses adaptive windowing that adjusts to market conditions:
```
H = log(R/S) / log(n)
```
Where R is the range of cumulative deviations and S is the standard deviation over period n.
**Dynamic Regime Classification:**
The system employs hysteresis to prevent regime flipping, requiring sustained Hurst values before regime changes are confirmed. This prevents false signals during transitional periods.
### 🔄 Transfer Entropy Analysis
**Information Flow Quantification:**
Transfer entropy measures the directional flow of information between price and volume, revealing lead-lag relationships that indicate future price movements:
```
TE(X→Y) = Σ p(yₜ₊₁, yₜ, xₜ) log[ p(yₜ₊₁ | yₜ, xₜ) / p(yₜ₊₁ | yₜ) ]
```
**Causality Detection:**
- **Volume → Price:** Indicates accumulation/distribution phases
- **Price → Volume:** Suggests retail participation or momentum chasing
- **Balanced Flow:** Market equilibrium or transition periods
The system analyzes multiple lag periods (2-20 bars) to capture both immediate and structural information flows.
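For intuition, a minimal discrete (binary up/down) transfer-entropy estimator can be sketched as below; this is a generic textbook construction, not the indicator's implementation, and the symbolization and log base are assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, lag=1):
    """Estimate TE(source -> target) from binary up/down sequences, e.g. volume -> price."""
    s = (np.diff(source) > 0).astype(int)
    t = (np.diff(target) > 0).astype(int)
    triples = list(zip(t[lag:], t[:-lag], s[:-lag]))    # (y_{t+1}, y_t, x_t)
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((a, b) for a, b, _ in triples)        # joint counts of (y_{t+1}, y_t)
    p_z = Counter(b for _, b, _ in triples)              # counts of y_t
    p_zx = Counter((b, c) for _, b, c in triples)        # joint counts of (y_t, x_t)
    te = 0.0
    for (a, b, c), count in p_xyz.items():
        p_abc = count / n
        cond_num = count / p_zx[(b, c)]                  # p(y_{t+1} | y_t, x_t)
        cond_den = p_yz[(a, b)] / p_z[b]                 # p(y_{t+1} | y_t)
        te += p_abc * np.log2(cond_num / cond_den)
    return te
```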
---
## 🔧 COMPREHENSIVE INPUT SYSTEM
### Core Parameters Group
**Primary Analysis Window (10-100, Default: 50)**
The fundamental lookback period affecting all calculations. Optimization by timeframe:
- **1-5 minute charts:** 20-30 (rapid adaptation to micro-movements)
- **15 minute-1 hour:** 30-50 (balanced responsiveness and stability)
- **4 hour-daily:** 50-100 (smooth signals, reduced noise)
- **Asset-specific:** Cryptocurrency 20-35, Stocks 35-50, Forex 40-60
**Signal Sensitivity (0.1-2.0, Default: 0.7)**
Master control affecting all threshold calculations:
- **Conservative (0.3-0.6):** High-quality signals only, fewer false positives
- **Balanced (0.7-1.0):** Optimal risk-reward ratio for most trading styles
- **Aggressive (1.1-2.0):** Maximum signal frequency, requires careful filtering
**Signal Generation Mode:**
- **Aggressive:** Any component signals (highest frequency)
- **Confluence:** 2+ components agree (balanced approach)
- **Conservative:** All 3 components align (highest quality)
### Volatility Jump Detection Group
**Volatility Dimensions (2-5, Default: 3)**
Determines the mathematical space complexity:
- **2D:** Price + Volume volatility (suitable for clean markets)
- **3D:** + Range volatility (optimal for most conditions)
- **4D:** + Correlation volatility (advanced multi-asset analysis)
- **5D:** + Microstructure volatility (maximum sensitivity)
**Jump Detection Threshold (1.5-4.0σ, Default: 3.0σ)**
Standard deviations required for volatility jump classification:
- **Cryptocurrency:** 2.0-2.5σ (naturally volatile)
- **Stock Indices:** 2.5-3.0σ (moderate volatility)
- **Forex Major Pairs:** 3.0-3.5σ (typically stable)
- **Commodities:** 2.0-3.0σ (varies by commodity)
**Jump Clustering Decay (0.5-0.99, Default: 0.85)**
Hawkes process memory parameter:
- **0.5-0.7:** Fast decay (jumps treated as independent)
- **0.8-0.9:** Moderate clustering (realistic market behavior)
- **0.95-0.99:** Strong clustering (crisis/event-driven markets)
### Hurst Exponent Analysis Group
**Calculation Method Options:**
- **Classic R/S:** Original Rescaled Range (fast, simple)
- **Adaptive R/S:** Dynamic windowing (recommended for trading)
- **DFA:** Detrended Fluctuation Analysis (best for noisy data)
**Trending Threshold (0.55-0.8, Default: 0.60)**
Hurst value defining persistent market behavior:
- **0.55-0.60:** Weak trend persistence
- **0.65-0.70:** Clear trending behavior
- **0.75-0.80:** Strong momentum regimes
**Mean Reversion Threshold (0.2-0.45, Default: 0.40)**
Hurst value defining anti-persistent behavior:
- **0.35-0.45:** Weak mean reversion
- **0.25-0.35:** Clear ranging behavior
- **0.15-0.25:** Strong reversion tendency
### Transfer Entropy Parameters Group
**Information Flow Analysis:**
- **Price-Volume:** Classic flow analysis for accumulation/distribution
- **Price-Volatility:** Risk flow analysis for sentiment shifts
- **Multi-Timeframe:** Cross-timeframe causality detection
**Maximum Lag (2-20, Default: 5)**
Causality detection window:
- **2-5 bars:** Immediate causality (scalping)
- **5-10 bars:** Short-term flow (day trading)
- **10-20 bars:** Structural flow (swing trading)
**Significance Threshold (0.05-0.3, Default: 0.15)**
Minimum entropy for signal generation:
- **0.05-0.10:** Detect subtle information flows
- **0.10-0.20:** Clear causality only
- **0.20-0.30:** Very strong flows only
---
## 🎨 ADVANCED VISUAL SYSTEM
### Tensor Volatility Field Visualization
**Five-Layer Resonance Bands:**
The tensor field creates dynamic support/resistance zones that expand and contract based on mathematical field strength:
- **Core Layer (Purple):** Primary tensor field with highest intensity
- **Layer 2 (Neutral):** Secondary mathematical resonance
- **Layer 3 (Info Blue):** Tertiary harmonic frequencies
- **Layer 4 (Warning Gold):** Outer field boundaries
- **Layer 5 (Success Green):** Maximum field extension
**Field Strength Calculation:**
```
Field Strength = min(3.0, Mahalanobis Distance × Tensor Intensity)
```
The field amplitude adjusts to ATR and mathematical distance, creating dynamic zones that respond to market volatility.
**Radiation Line Network:**
During active tensor states, the system projects directional radiation lines showing field energy distribution:
- **8 Directional Rays:** Complete angular coverage
- **Tapering Segments:** Progressive transparency for natural visual flow
- **Pulse Effects:** Enhanced visualization during volatility jumps
### Dimensional Portal System
**Portal Mathematics:**
Dimensional portals visualize regime transitions using category theory principles:
- **Green Portals (◉):** Trending regime detection (appear below price for support)
- **Red Portals (◎):** Mean-reverting regime (appear above price for resistance)
- **Yellow Portals (○):** Random walk regime (neutral positioning)
**Tensor Trail Effects:**
Each portal generates 8 trailing particles showing mathematical momentum:
- **Large Particles (●):** Strong mathematical signal
- **Medium Particles (◦):** Moderate signal strength
- **Small Particles (·):** Weak signal continuation
- **Micro Particles (˙):** Signal dissipation
### Information Flow Streams
**Particle Stream Visualization:**
Transfer entropy creates flowing particle streams indicating information direction:
- **Upward Streams:** Volume leading price (accumulation phases)
- **Downward Streams:** Price leading volume (distribution phases)
- **Stream Density:** Proportional to information flow strength
**15-Particle Evolution:**
Each stream contains 15 particles with progressive sizing and transparency, creating natural flow visualization that makes information transfer immediately apparent.
### Fractal Matrix Grid System
**Multi-Timeframe Fractal Levels:**
The system calculates and displays fractal highs/lows across five Fibonacci periods:
- **8-Period:** Short-term fractal structure
- **13-Period:** Intermediate-term patterns
- **21-Period:** Primary swing levels
- **34-Period:** Major structural levels
- **55-Period:** Long-term fractal boundaries
**Triple-Layer Visualization:**
Each fractal level uses three-layer rendering:
- **Shadow Layer:** Widest, darkest foundation (width 5)
- **Glow Layer:** Medium white core line (width 3)
- **Tensor Layer:** Dotted mathematical overlay (width 1)
**Intelligent Labeling System:**
Smart spacing prevents label overlap using ATR-based minimum distances. Labels include:
- **Fractal Period:** Time-based identification
- **Topological Class:** Mathematical complexity rating (0, I, II, III)
- **Price Level:** Exact fractal price
- **Mahalanobis Distance:** Current mathematical field strength
- **Hurst Exponent:** Current regime classification
- **Anomaly Indicators:** Visual strength representations (○ ◐ ● ⚡)
### Wick Pressure Analysis
**Rejection Level Mathematics:**
The system analyzes candle wick patterns to project future pressure zones:
- **Upper Wick Analysis:** Identifies selling pressure and resistance zones
- **Lower Wick Analysis:** Identifies buying pressure and support zones
- **Pressure Projection:** Extends lines forward based on mathematical probability
**Multi-Layer Glow Effects:**
Wick pressure lines use progressive transparency (1-8 layers) creating natural glow effects that make pressure zones immediately visible without cluttering the chart.
### Enhanced Regime Background
**Dynamic Intensity Mapping:**
Background colors reflect mathematical regime strength:
- **Deep Transparency (98% alpha):** Subtle regime indication
- **Pulse Intensity:** Based on regime strength calculation
- **Color Coding:** Green (trending), Red (mean-reverting), Neutral (random)
**Smoothing Integration:**
Regime changes incorporate 10-bar smoothing to prevent background flicker while maintaining responsiveness to genuine regime shifts.
### Color Scheme System
**Six Professional Themes:**
- **Dark (Default):** Professional trading environment optimization
- **Light:** High ambient light conditions
- **Classic:** Traditional technical analysis appearance
- **Neon:** High-contrast visibility for active trading
- **Neutral:** Minimal distraction focus
- **Bright:** Maximum visibility for complex setups
Each theme maintains mathematical accuracy while optimizing visual clarity for different trading environments and personal preferences.
---
## 📊 INSTITUTIONAL-GRADE DASHBOARD
### Tensor Field Status Section
**Field Strength Display:**
Real-time Mahalanobis distance calculation with dynamic emoji indicators:
- **⚡ (Lightning):** Extreme field strength (>1.5× threshold)
- **● (Solid Circle):** Strong field activity (>1.0× threshold)
- **○ (Open Circle):** Normal field state
**Signal Quality Rating:**
Democratic algorithm assessment:
- **ELITE:** All 3 components aligned (highest probability)
- **STRONG:** 2 components aligned (good probability)
- **GOOD:** 1 component active (moderate probability)
- **WEAK:** No clear component signals
**Threshold and Anomaly Monitoring:**
- **Threshold Display:** Current mathematical threshold setting
- **Anomaly Level (0-100%):** Combined volatility and volume spike measurement
- **>70%:** High anomaly (red warning)
- **30-70%:** Moderate anomaly (orange caution)
- **<30%:** Normal conditions (green confirmation)
### Tensor State Analysis Section
**Mathematical State Classification:**
- **↑ BULL (Tensor State +1):** Trending regime with bullish bias
- **↓ BEAR (Tensor State -1):** Mean-reverting regime with bearish bias
- **◈ SUPER (Tensor State 0):** Random walk regime (neutral)
**Visual State Gauge:**
Five-circle progression showing tensor field polarity:
- **🟢🟢🟢⚪⚪:** Strong bullish mathematical alignment
- **⚪⚪🟡⚪⚪:** Neutral/transitional state
- **⚪⚪🔴🔴🔴:** Strong bearish mathematical alignment
**Trend Direction and Phase Analysis:**
- **📈 BULL / 📉 BEAR / ➡️ NEUTRAL:** Primary trend classification
- **🌪️ CHAOS:** Extreme information flow (>2.0 flow strength)
- **⚡ ACTIVE:** Strong information flow (1.0-2.0 flow strength)
- **😴 CALM:** Low information flow (<1.0 flow strength)
### Trading Signals Section
**Real-Time Signal Status:**
- **🟢 ACTIVE / ⚪ INACTIVE:** Long signal availability
- **🔴 ACTIVE / ⚪ INACTIVE:** Short signal availability
- **Components (X/3):** Active algorithmic components
- **Mode Display:** Current signal generation mode
**Signal Strength Visualization:**
Color-coded component count:
- **Green:** 3/3 components (maximum confidence)
- **Aqua:** 2/3 components (good confidence)
- **Orange:** 1/3 components (moderate confidence)
- **Gray:** 0/3 components (no signals)
### Performance Metrics Section
**Win Rate Monitoring:**
Estimated win rates based on signal quality with emoji indicators:
- **🔥 (Fire):** ≥60% estimated win rate
- **👍 (Thumbs Up):** 45-59% estimated win rate
- **⚠️ (Warning):** <45% estimated win rate
**Mathematical Metrics:**
- **Hurst Exponent:** Real-time fractal dimension (0.000-1.000)
- **Information Flow:** Volume/price leading indicators
- **📊 VOL:** Volume leading price (accumulation/distribution)
- **💰 PRICE:** Price leading volume (momentum/speculation)
- **➖ NONE:** Balanced information flow
- **Volatility Classification:**
- **🔥 HIGH:** Above 1.5× jump threshold
- **📊 NORM:** Normal volatility range
- **😴 LOW:** Below 0.5× jump threshold
### Market Structure Section (Large Dashboard)
**Regime Classification:**
- **📈 TREND:** Hurst >0.6, momentum strategies optimal
- **🔄 REVERT:** Hurst <0.4, contrarian strategies optimal
- **🎲 RANDOM:** Hurst ≈0.5, breakout strategies preferred
**Mathematical Field Analysis:**
- **Dimensions:** Current volatility space complexity (2D-5D)
- **Hawkes λ (Lambda):** Self-exciting jump intensity (0.00-1.00)
- **Jump Status:** 🚨 JUMP (active) / ✅ NORM (normal)
### Settings Summary Section (Large Dashboard)
**Active Configuration Display:**
- **Sensitivity:** Current master sensitivity setting
- **Lookback:** Primary analysis window
- **Theme:** Active color scheme
- **Method:** Hurst calculation method (Classic R/S, Adaptive R/S, DFA)
**Dashboard Sizing Options:**
- **Small:** Essential metrics only (mobile/small screens)
- **Normal:** Balanced information density (standard desktop)
- **Large:** Maximum detail (multi-monitor setups)
**Position Options:**
- **Top Right:** Standard placement (avoids price action)
- **Top Left:** Wide chart optimization
- **Bottom Right:** Recent price focus (scalping)
- **Bottom Left:** Maximum price visibility (swing trading)
---
## 🎯 SIGNAL GENERATION LOGIC
### Multi-Component Convergence System
**Component Signal Architecture:**
The TMAE generates signals through sophisticated component analysis rather than simple threshold crossing:
**Volatility Component:**
- **Jump Detection:** Mahalanobis distance threshold breach
- **Hawkes Intensity:** Self-exciting process activation (>0.2)
- **Multi-dimensional:** Considers all volatility dimensions simultaneously
**Hurst Regime Component:**
- **Trending Markets:** Price above SMA-20 with positive momentum
- **Mean-Reverting Markets:** Price at Bollinger Band extremes
- **Random Markets:** Bollinger squeeze breakouts with directional confirmation
**Transfer Entropy Component:**
- **Volume Leadership:** Information flow from volume to price
- **Volume Spike:** Volume 110%+ above 20-period average
- **Flow Significance:** Above entropy threshold with directional bias
### Democratic Signal Weighting
**Signal Mode Implementation:**
- **Aggressive Mode:** Any single component triggers signal
- **Confluence Mode:** Minimum 2 components must agree
- **Conservative Mode:** All 3 components must align
**Momentum Confirmation:**
All signals require momentum confirmation:
- **Long Signals:** RSI >50 AND price >EMA-9
- **Short Signals:** RSI <50 AND price 0.6):**
- **Increase Sensitivity:** Catch momentum continuation
- **Lower Mean Reversion Threshold:** Avoid counter-trend signals
- **Emphasize Volume Leadership:** Institutional accumulation/distribution
- **Tensor Field Focus:** Use expansion for trend continuation
- **Signal Mode:** Aggressive or Confluence for trend following
**Range-Bound Markets (Hurst <0.4):**
- **Decrease Sensitivity:** Avoid false breakouts
- **Lower Trending Threshold:** Quick regime recognition
- **Focus on Price Leadership:** Retail sentiment extremes
- **Fractal Grid Emphasis:** Support/resistance trading
- **Signal Mode:** Conservative for high-probability reversals
**Volatile Markets (High Jump Frequency):**
- **Increase Hawkes Decay:** Recognize event clustering
- **Higher Jump Threshold:** Avoid noise signals
- **Maximum Dimensions:** Capture full volatility complexity
- **Reduce Position Sizing:** Risk management adaptation
- **Enhanced Visuals:** Maximum information for rapid decisions
**Low Volatility Markets (Low Jump Frequency):**
- **Decrease Jump Threshold:** Capture subtle movements
- **Lower Hawkes Decay:** Treat moves as independent
- **Reduce Dimensions:** Simplify analysis
- **Increase Position Sizing:** Capitalize on compressed volatility
- **Minimal Visuals:** Reduce distraction in quiet markets
---
## 🚀 ADVANCED TRADING STRATEGIES
### The Mathematical Convergence Method
**Entry Protocol:**
1. **Fractal Grid Approach:** Monitor price approaching significant fractal levels
2. **Tensor Field Confirmation:** Verify field expansion supporting direction
3. **Portal Signal:** Wait for dimensional portal appearance
4. **ELITE/STRONG Quality:** Only trade highest quality mathematical signals
5. **Component Consensus:** Confirm 2+ components agree in Confluence mode
**Example Implementation:**
- Price approaching 21-period fractal high
- Tensor field expanding upward (bullish mathematical alignment)
- Green portal appears below price (trending regime confirmation)
- ELITE quality signal with 3/3 components active
- Enter long position with stop below fractal level
**Risk Management:**
- **Stop Placement:** Below/above fractal level that generated signal
- **Position Sizing:** Based on Mahalanobis distance (higher distance = smaller size)
- **Profit Targets:** Next fractal level or tensor field resistance
### The Regime Transition Strategy
**Regime Change Detection:**
1. **Monitor Hurst Exponent:** Watch for persistent moves above/below thresholds
2. **Portal Color Change:** Regime transitions show different portal colors
3. **Background Intensity:** Increasing regime background intensity
4. **Mathematical Confirmation:** Wait for regime confirmation (hysteresis)
**Trading Implementation:**
- **Trending Transitions:** Trade momentum breakouts, follow trend
- **Mean Reversion Transitions:** Trade range boundaries, fade extremes
- **Random Transitions:** Trade breakouts with tight stops
**Advanced Techniques:**
- **Multi-Timeframe:** Confirm regime on higher timeframe
- **Early Entry:** Enter on regime transition rather than confirmation
- **Regime Strength:** Larger positions during strong regime signals
### The Information Flow Momentum Strategy
**Flow Detection Protocol:**
1. **Monitor Transfer Entropy:** Watch for significant information flow shifts
2. **Volume Leadership:** Strong edge when volume leads price
3. **Flow Acceleration:** Increasing flow strength indicates momentum
4. **Directional Confirmation:** Ensure flow aligns with intended trade direction
**Entry Signals:**
- **Volume → Price Flow:** Enter during accumulation/distribution phases
- **Price → Volume Flow:** Enter on momentum confirmation breaks
- **Flow Reversal:** Counter-trend entries when flow reverses
**Optimization:**
- **Scalping:** Use immediate flow detection (2-5 bar lag)
- **Swing Trading:** Use structural flow (10-20 bar lag)
- **Multi-Asset:** Compare flow between correlated assets
### The Tensor Field Expansion Strategy
**Field Mathematics:**
The tensor field expansion indicates mathematical pressure building in market structure:
**Expansion Phases:**
1. **Compression:** Field contracts, volatility decreases
2. **Tension Building:** Mathematical pressure accumulates
3. **Expansion:** Field expands rapidly with directional movement
4. **Resolution:** Field stabilizes at new equilibrium
**Trading Applications:**
- **Compression Trading:** Prepare for breakout during field contraction
- **Expansion Following:** Trade direction of field expansion
- **Reversion Trading:** Fade extreme field expansion
- **Multi-Dimensional:** Consider all field layers for confirmation
### The Hawkes Process Event Strategy
**Self-Exciting Jump Trading:**
Understanding that market shocks cluster and create follow-on opportunities:
**Jump Sequence Analysis:**
1. **Initial Jump:** First volatility jump detected
2. **Clustering Phase:** Hawkes intensity remains elevated
3. **Follow-On Opportunities:** Additional jumps more likely
4. **Decay Period:** Intensity gradually decreases
**Implementation:**
- **Jump Confirmation:** Wait for mathematical jump confirmation
- **Direction Assessment:** Use other components for direction
- **Clustering Trades:** Trade subsequent moves during high intensity
- **Decay Exit:** Exit positions as Hawkes intensity decays
### The Fractal Confluence System
**Multi-Timeframe Fractal Analysis:**
Combining fractal levels across different periods for high-probability zones:
**Confluence Zones:**
- **Double Confluence:** 2 fractal levels align
- **Triple Confluence:** 3+ fractal levels cluster
- **Mathematical Confirmation:** Tensor field supports the level
- **Information Flow:** Transfer entropy confirms direction
**Trading Protocol:**
1. **Identify Confluence:** Find 2+ fractal levels within 1 ATR
2. **Mathematical Support:** Verify tensor field alignment
3. **Signal Quality:** Wait for STRONG or ELITE signal
4. **Risk Definition:** Use fractal level for stop placement
5. **Profit Targeting:** Next major fractal confluence zone
---
## ⚠️ COMPREHENSIVE RISK MANAGEMENT
### Mathematical Position Sizing
**Mahalanobis Distance Integration:**
Position size should inversely correlate with mathematical field strength:
```
Position Size = Base Size × (Threshold / Mahalanobis Distance)
```
**Risk Scaling Matrix:**
- **Low Field Strength (<2.0):** Standard position sizing
- **Moderate Field Strength (2.0-3.0):** 75% position sizing
- **High Field Strength (3.0-4.0):** 50% position sizing
- **Extreme Field Strength (>4.0):** 25% position sizing or no trade
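A minimal sketch of that sizing rule (capping at the base size below the threshold is an added assumption so the formula never scales a position up):

```python
def field_adjusted_size(base_size, mahalanobis_distance, threshold=3.0):
    """Shrink position size as the mathematical field strength grows."""
    if mahalanobis_distance <= threshold:
        return base_size                              # at or below threshold: full planned size
    return base_size * threshold / mahalanobis_distance
```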
### Signal Quality Risk Adjustment
**Quality-Based Position Sizing:**
- **ELITE Signals:** 100% of planned position size
- **STRONG Signals:** 75% of planned position size
- **GOOD Signals:** 50% of planned position size
- **WEAK Signals:** No position or paper trading only
**Component Agreement Scaling:**
- **3/3 Components:** Full position size
- **2/3 Components:** 75% position size
- **1/3 Components:** 50% position size or skip trade
### Regime-Adaptive Risk Management
**Trending Market Risk:**
- **Wider Stops:** Allow for trend continuation
- **Trend Following:** Trade with regime direction
- **Higher Position Size:** Trend probability advantage
- **Momentum Stops:** Trail stops based on momentum indicators
**Mean-Reverting Market Risk:**
- **Tighter Stops:** Quick exits on trend continuation
- **Contrarian Positioning:** Trade against extremes
- **Smaller Position Size:** Higher reversal failure rate
- **Level-Based Stops:** Use fractal levels for stops
**Random Market Risk:**
- **Breakout Focus:** Trade only clear breakouts
- **Tight Initial Stops:** Quick exit if breakout fails
- **Reduced Frequency:** Skip marginal setups
- **Range-Based Targets:** Profit targets at range boundaries
### Volatility-Adaptive Risk Controls
**High Volatility Periods:**
- **Reduced Position Size:** Account for wider price swings
- **Wider Stops:** Avoid noise-based exits
- **Lower Frequency:** Skip marginal setups
- **Faster Exits:** Take profits more quickly
**Low Volatility Periods:**
- **Standard Position Size:** Normal risk parameters
- **Tighter Stops:** Take advantage of compressed ranges
- **Higher Frequency:** Trade more setups
- **Extended Targets:** Allow for compressed volatility expansion
### Multi-Timeframe Risk Alignment
**Higher Timeframe Trend:**
- **With Trend:** Standard or increased position size
- **Against Trend:** Reduced position size or skip
- **Neutral Trend:** Standard position size with tight management
**Risk Hierarchy:**
1. **Primary:** Current timeframe signal quality
2. **Secondary:** Higher timeframe trend alignment
3. **Tertiary:** Mathematical field strength
4. **Quaternary:** Market regime classification
---
## 📚 EDUCATIONAL VALUE AND MATHEMATICAL CONCEPTS
### Advanced Mathematical Concepts
**Tensor Analysis in Markets:**
The TMAE introduces traders to tensor analysis, a branch of mathematics typically reserved for physics and advanced engineering. Tensors provide a framework for understanding multi-dimensional market relationships that scalar and vector analysis cannot capture.
**Information Theory Applications:**
Transfer entropy implementation teaches traders about information flow in markets, a concept from information theory that quantifies directional causality between variables. This provides intuition about market microstructure and participant behavior.
**Fractal Geometry in Trading:**
The Hurst exponent calculation exposes traders to fractal geometry concepts, helping understand that markets exhibit self-similar patterns across multiple timeframes. This mathematical insight transforms how traders view market structure.
**Stochastic Process Theory:**
The Hawkes process implementation introduces concepts from stochastic process theory, specifically self-exciting point processes. This provides mathematical framework for understanding why market events cluster and exhibit memory effects.
### Learning Progressive Complexity
**Beginner Mathematical Concepts:**
- **Volatility Dimensions:** Understanding multi-dimensional analysis
- **Regime Classification:** Learning market personality types
- **Signal Democracy:** Algorithmic consensus building
- **Visual Mathematics:** Interpreting mathematical concepts visually
**Intermediate Mathematical Applications:**
- **Mahalanobis Distance:** Statistical distance in multi-dimensional space
- **Rescaled Range Analysis:** Fractal dimension measurement
- **Information Entropy:** Quantifying uncertainty and causality
- **Field Theory:** Understanding mathematical fields in market context
**Advanced Mathematical Integration:**
- **Tensor Field Dynamics:** Multi-dimensional market force analysis
- **Stochastic Self-Excitation:** Event clustering and memory effects
- **Categorical Composition:** Mathematical signal combination theory
- **Topological Market Analysis:** Understanding market shape and connectivity
### Practical Mathematical Intuition
**Developing Market Mathematics Intuition:**
The TMAE serves as a bridge between abstract mathematical concepts and practical trading applications. Traders develop intuitive understanding of:
- **How markets exhibit mathematical structure beneath apparent randomness**
- **Why multi-dimensional analysis reveals patterns invisible to single-variable approaches**
- **How information flows through markets in measurable, predictable ways**
- **Why mathematical models provide probabilistic edges rather than certainties**
---
## 🔬 IMPLEMENTATION AND OPTIMIZATION
### Getting Started Protocol
**Phase 1: Observation (Week 1)**
1. **Apply with defaults:** Use standard settings on your primary trading timeframe
2. **Study visual elements:** Learn to interpret tensor fields, portals, and streams
3. **Monitor dashboard:** Observe how metrics change with market conditions
4. **No trading:** Focus entirely on pattern recognition and understanding
**Phase 2: Pattern Recognition (Week 2-3)**
1. **Identify signal patterns:** Note what market conditions produce different signal qualities
2. **Regime correlation:** Observe how Hurst regimes affect signal performance
3. **Visual confirmation:** Learn to read tensor field expansion and portal signals
4. **Component analysis:** Understand which components drive signals in different markets
**Phase 3: Parameter Optimization (Week 4-5)**
1. **Asset-specific tuning:** Adjust parameters for your specific trading instrument
2. **Timeframe optimization:** Fine-tune for your preferred trading timeframe
3. **Sensitivity adjustment:** Balance signal frequency with quality
4. **Visual customization:** Optimize colors and intensity for your trading environment
**Phase 4: Live Implementation (Week 6+)**
1. **Paper trading:** Test signals with hypothetical trades
2. **Small position sizing:** Begin with minimal risk during learning phase
3. **Performance tracking:** Monitor actual vs. expected signal performance
4. **Continuous optimization:** Refine settings based on real performance data
### Performance Monitoring System
**Signal Quality Tracking:**
- **ELITE Signal Win Rate:** Track highest quality signals separately
- **Component Performance:** Monitor which components provide best signals
- **Regime Performance:** Analyze performance across different market regimes
- **Timeframe Analysis:** Compare performance across different session times
**Mathematical Metric Correlation:**
- **Field Strength vs. Performance:** Higher field strength should correlate with better performance
- **Component Agreement vs. Win Rate:** More component agreement should improve win rates
- **Regime Alignment vs. Success:** Trading with mathematical regime should outperform
### Continuous Optimization Process
**Monthly Review Protocol:**
1. **Performance Analysis:** Review win rates, profit factors, and maximum drawdown
2. **Parameter Assessment:** Evaluate if current settings remain optimal
3. **Market Adaptation:** Adjust for changes in market character or volatility
4. **Component Weighting:** Consider if certain components should receive more/less emphasis
**Quarterly Deep Analysis:**
1. **Mathematical Model Validation:** Verify that mathematical relationships remain valid
2. **Regime Distribution:** Analyze time spent in different market regimes
3. **Signal Evolution:** Track how signal characteristics change over time
4. **Correlation Analysis:** Monitor correlations between different mathematical components
---
## 🌟 UNIQUE INNOVATIONS AND CONTRIBUTIONS
### Revolutionary Mathematical Integration
**First-Ever Implementations:**
1. **Multi-Dimensional Volatility Tensor:** First indicator to implement true tensor analysis for market volatility
2. **Real-Time Hawkes Process:** First trading implementation of self-exciting point processes
3. **Transfer Entropy Trading Signals:** First practical application of information theory for trade generation
4. **Democratic Component Voting:** First algorithmic consensus system for signal generation
5. **Fractal-Projected Signal Quality:** First system to predict signal quality at future price levels
### Advanced Visualization Innovations
**Mathematical Visualization Breakthroughs:**
- **Tensor Field Radiation:** Visual representation of mathematical field energy
- **Dimensional Portal System:** Category theory visualization for regime transitions
- **Information Flow Streams:** Real-time visual display of market information transfer
- **Multi-Layer Fractal Grid:** Intelligent spacing and projection system
- **Regime Intensity Mapping:** Dynamic background showing mathematical regime strength
### Practical Trading Innovations
**Trading System Advances:**
- **Quality-Weighted Signal Generation:** Signals rated by mathematical confidence
- **Regime-Adaptive Strategy Selection:** Automatic strategy optimization based on market personality
- **Anti-Spam Signal Protection:** Mathematical prevention of signal clustering
- **Component Performance Tracking:** Real-time monitoring of algorithmic component success
- **Field-Strength Position Sizing:** Mathematical volatility integration for risk management
---
## ⚖️ RESPONSIBLE USAGE AND LIMITATIONS
### Mathematical Model Limitations
**Understanding Model Boundaries:**
While the TMAE implements sophisticated mathematical concepts, traders must understand fundamental limitations:
- **Markets Are Not Purely Mathematical:** Human psychology, news events, and fundamental factors create unpredictable elements
- **Past Performance Limitations:** Mathematical relationships that worked historically may not persist indefinitely
- **Model Risk:** Complex models can fail during unprecedented market conditions
- **Overfitting Potential:** Highly optimized parameters may not generalize to future market conditions
### Proper Implementation Guidelines
**Risk Management Requirements:**
- **Never Risk More Than 2% Per Trade:** Regardless of signal quality
- **Diversification Mandatory:** Don't rely solely on mathematical signals
- **Position Sizing Discipline:** Use mathematical field strength for sizing, not confidence
- **Stop Loss Non-Negotiable:** Every trade must have predefined risk parameters
**Realistic Expectations:**
- **Mathematical Edge, Not Certainty:** The indicator provides probabilistic advantages, not guaranteed outcomes
- **Learning Curve Required:** Complex mathematical concepts require time to master
- **Market Adaptation Necessary:** Parameters must evolve with changing market conditions
- **Continuous Education Important:** Understanding underlying mathematics improves application
### Ethical Trading Considerations
**Market Impact Awareness:**
- **Information Asymmetry:** Advanced mathematical analysis may provide advantages over other market participants
- **Position Size Responsibility:** Large positions based on mathematical signals can impact market structure
- **Sharing Knowledge:** Consider educational contributions to trading community
- **Fair Market Participation:** Use mathematical advantages responsibly within market framework
### Professional Development Path
**Skill Development Sequence:**
1. **Basic Mathematical Literacy:** Understand fundamental concepts before advanced application
2. **Risk Management Mastery:** Develop disciplined risk control before relying on complex signals
3. **Market Psychology Understanding:** Combine mathematical analysis with behavioral market insights
4. **Continuous Learning:** Stay updated on mathematical finance developments and market evolution
---
## 🔮 CONCLUSION
The Tensor Market Analysis Engine represents a quantum leap forward in technical analysis, successfully bridging the gap between advanced pure mathematics and practical trading applications. By integrating multi-dimensional volatility analysis, fractal market theory, and information flow dynamics, the TMAE reveals market structure invisible to conventional analysis while maintaining visual clarity and practical usability.
### Mathematical Innovation Legacy
This indicator establishes new paradigms in technical analysis:
- **Tensor analysis for market volatility understanding**
- **Stochastic self-excitation for event clustering prediction**
- **Information theory for causality-based trade generation**
- **Democratic algorithmic consensus for signal quality enhancement**
- **Mathematical field visualization for intuitive market understanding**
### Practical Trading Revolution
Beyond mathematical innovation, the TMAE transforms practical trading:
- **Quality-rated signals replace binary buy/sell decisions**
- **Regime-adaptive strategies automatically optimize for market personality**
- **Multi-dimensional risk management integrates mathematical volatility measures**
- **Visual mathematical concepts make complex analysis immediately interpretable**
- **Educational value creates lasting improvement in trading understanding**
### Future-Proof Design
The mathematical foundations ensure lasting relevance:
- **Universal mathematical principles transcend market evolution**
- **Multi-dimensional analysis adapts to new market structures**
- **Regime detection automatically adjusts to changing market personalities**
- **Component democracy allows for future algorithmic additions**
- **Mathematical visualization scales with increasing market complexity**
### Commitment to Excellence
The TMAE represents more than an indicator—it embodies a philosophy of bringing rigorous mathematical analysis to trading while maintaining practical utility and visual elegance. Every component, from the multi-dimensional tensor fields to the democratic signal generation, reflects a commitment to mathematical accuracy, trading practicality, and educational value.
### Trading with Mathematical Precision
In an era where markets grow increasingly complex and computational, the TMAE provides traders with mathematical tools previously available only to institutional quantitative research teams. Yet unlike academic mathematical models, the TMAE translates complex concepts into intuitive visual representations and practical trading signals.
By combining the mathematical rigor of tensor analysis, the statistical power of multi-dimensional volatility modeling, and the information-theoretic insights of transfer entropy, traders gain unprecedented insight into market structure and dynamics.
### Final Perspective
Markets, like nature, exhibit profound mathematical beauty beneath apparent chaos. The Tensor Market Analysis Engine serves as a mathematical lens that reveals this hidden order, transforming how traders perceive and interact with market structure.
Through mathematical precision, visual elegance, and practical utility, the TMAE empowers traders to see beyond the noise and trade with the confidence that comes from understanding the mathematical principles governing market behavior.
Trade with mathematical insight. Trade with the power of tensors. Trade with the TMAE.
*"In mathematics, you don't understand things. You just get used to them." - John von Neumann*
*With the TMAE, mathematical market understanding becomes not just possible, but intuitive.*
— Dskyz, Trade with insight. Trade with anticipation.
Mandelbrot-Fibonacci Cascade Vortex (MFCV)
Mandelbrot-Fibonacci Cascade Vortex (MFCV) - Where Chaos Theory Meets Sacred Geometry
A Revolutionary Synthesis of Fractal Mathematics and Golden Ratio Dynamics
What began as an exploration into Benoit Mandelbrot's fractal market hypothesis and the mysterious appearance of Fibonacci sequences in nature has culminated in a groundbreaking indicator that reveals the hidden mathematical structure underlying market movements. This indicator represents months of research into chaos theory, fractal geometry, and the golden ratio's manifestation in financial markets.
The Theoretical Foundation
Mandelbrot's Fractal Market Hypothesis
Traditional efficient market theory assumes normal distributions and random walks. Mandelbrot proved markets are fractal - self-similar patterns repeating across all timeframes with power-law distributions. The MFCV implements this through:
Hurst Exponent Calculation: H = log(R/S) / log(n/2)
Where:
R = Range of cumulative deviations
S = Standard deviation
n = Period length
This measures market memory:
H > 0.5: Trending (persistent) behavior
H = 0.5: Random walk
H < 0.5: Mean-reverting (anti-persistent) behavior
Fractal Dimension: D = 2 - H
This quantifies market complexity, where higher dimensions indicate more chaotic behavior.
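As an illustration of the single-window estimate quoted above, a minimal Pine Script sketch is given below. It mirrors the R/S formula as written; the MFCV's internal implementation may differ in detail.
```pine
//@version=5
indicator("Single-Window Hurst Sketch (MFCV-style)", max_bars_back=500)

int n = input.int(100, "Period Length (n)", minval=10, maxval=400)

// Bar-to-bar changes, their mean and standard deviation over the window
float ret     = close - close[1]
float meanRet = ta.sma(ret, n)
float s       = ta.stdev(ret, n)

// R = range of the cumulative deviations from the mean over the window
float cum    = 0.0
float maxCum = -1e10
float minCum = 1e10
for i = 0 to n - 1
    cum    += ret[n - 1 - i] - meanRet
    maxCum := math.max(maxCum, cum)
    minCum := math.min(minCum, cum)
float r = maxCum - minCum

// H = log(R/S) / log(n/2), D = 2 - H, as quoted above
float hurst      = s > 0 and r > 0 ? math.log(r / s) / math.log(n / 2.0) : 0.5
float fractalDim = 2.0 - hurst

plot(hurst, "Hurst Exponent", color=color.teal)
plot(fractalDim, "Fractal Dimension", color=color.orange)
```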
Fibonacci Vortex Theory
Markets don't move linearly - they spiral. The MFCV reveals these spirals using Fibonacci sequences:
Vortex Calculation: Vortex(n) = Price + sin(bar_index × φ / Fn) × ATR(Fn) × Volume_Factor
Where:
φ = 0.618 (golden ratio)
Fn = Fibonacci number (8, 13, 21, 34, 55)
Volume_Factor = 1 + (Volume/SMA(Volume,50) - 1) × 0.5
This creates oscillating spirals that contract and expand with market energy.
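A compact Pine Script sketch of the spiral formula follows. The five Fibonacci periods and the 0.5 volume-dampening factor come straight from the formula above; the plotting choices are illustrative only.
```pine
//@version=5
indicator("Fibonacci Vortex Spiral (sketch)", overlay=true)

float phi = 0.618 // golden ratio conjugate used as the rotation speed

// One spiral per Fibonacci period, per the formula quoted above
vortex(simple int fib) =>
    float volumeFactor = 1.0 + (volume / ta.sma(volume, 50) - 1.0) * 0.5
    close + math.sin(bar_index * phi / fib) * ta.atr(fib) * volumeFactor

plot(vortex(8),  "Vortex 8",  color=color.new(color.blue, 0))
plot(vortex(13), "Vortex 13", color=color.new(color.blue, 20))
plot(vortex(21), "Vortex 21", color=color.new(color.blue, 40))
plot(vortex(34), "Vortex 34", color=color.new(color.blue, 60))
plot(vortex(55), "Vortex 55", color=color.new(color.blue, 80))
```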
The Volatility Cascade System
Markets exhibit volatility clustering - Mandelbrot's "Noah Effect." The MFCV captures this through cascading volatility bands:
Cascade Level Calculation: Level(i) = ATR(20) × φ^i
Each level represents a different fractal scale, creating a multi-dimensional view of market structure. The golden ratio spacing ensures harmonic resonance between levels.
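The level spacing can be sketched as below. The SMA(20) anchor is an assumption made to keep the example self-contained; the MFCV's actual band anchor and level count are governed by its inputs.
```pine
//@version=5
indicator("Volatility Cascade Bands (sketch)", overlay=true)

float phi     = 1.618             // golden ratio spacing between levels
float baseAtr = ta.atr(20)
float anchor  = ta.sma(close, 20) // illustrative anchor; the MFCV's own anchor may differ

// Level(i) = ATR(20) × φ^i, mirrored above and below the anchor
plot(anchor + baseAtr * math.pow(phi, 1), "Upper 1", color=color.new(color.green, 40))
plot(anchor + baseAtr * math.pow(phi, 2), "Upper 2", color=color.new(color.green, 60))
plot(anchor + baseAtr * math.pow(phi, 3), "Upper 3", color=color.new(color.green, 75))
plot(anchor - baseAtr * math.pow(phi, 1), "Lower 1", color=color.new(color.red, 40))
plot(anchor - baseAtr * math.pow(phi, 2), "Lower 2", color=color.new(color.red, 60))
plot(anchor - baseAtr * math.pow(phi, 3), "Lower 3", color=color.new(color.red, 75))
```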
Implementation Architecture
Core Components:
Fractal Analysis Engine
Calculates Hurst exponent over user-defined periods
Derives fractal dimension for complexity measurement
Identifies market regime (trending/ranging/chaotic)
Fibonacci Vortex Generator
Creates 5 independent spiral oscillators
Each spiral follows a Fibonacci period
Volume amplification creates dynamic response
Cascade Band System
Up to 8 volatility levels
Golden ratio expansion between levels
Dynamic coloring based on fractal state
Confluence Detection
Identifies convergence of vortex and cascade levels
Highlights high-probability reversal zones
Real-time confluence strength calculation
Signal Generation Logic
The MFCV generates two primary signal types:
Fractal Signals: Generated when:
Hurst > 0.65 (strong trend) AND volatility expanding
Hurst < 0.35 (mean reversion) AND RSI < 35
Trend strength > 0.4 AND vortex alignment
Cascade Signals: Triggered by:
RSI > 60 AND price > SMA(50) AND bearish vortex
RSI < 40 AND price < SMA(50) AND bullish vortex
Volatility expansion AND trend strength > 0.3
Both signals implement a 15-bar cooldown to prevent overtrading.
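The cooldown gate itself is simple to express. The sketch below uses a placeholder RSI crossover where the MFCV applies its full multi-factor conditions; only the 15-bar spacing logic is the point of the example.
```pine
//@version=5
indicator("Signal Cooldown Gate (sketch)", overlay=true)

int cooldownBars = 15

// Placeholder condition standing in for the MFCV's multi-factor signal logic
bool rawSignal = ta.crossover(ta.rsi(close, 14), 60)

// Accept a signal only when at least cooldownBars bars have passed since the last one
var int lastSignalBar = -1000
bool fire = rawSignal and bar_index - lastSignalBar >= cooldownBars
if fire
    lastSignalBar := bar_index

plotshape(fire, "Signal", style=shape.triangleup, location=location.belowbar, color=color.teal)
```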
Advanced Input System
Mandelbrot Parameters:
Cascade Levels (3-8):
Controls number of volatility bands
Crypto: 5-7 (high volatility)
Indices: 4-5 (moderate volatility)
Forex: 3-4 (low volatility)
Hurst Period (20-200):
Lookback for fractal calculation
Scalping: 20-50
Day Trading: 50-100
Swing Trading: 100-150
Position Trading: 150-200
Cascade Ratio (1.0-3.0):
Band width multiplier
1.618: Golden ratio (default)
Higher values for trending markets
Lower values for ranging markets
Fractal Memory (21-233):
Fibonacci retracement lookback
Uses Fibonacci numbers for harmonic alignment
Fibonacci Vortex Settings:
Spiral Periods:
Comma-separated Fibonacci sequence
Fast: "5,8,13,21,34" (scalping)
Standard: "8,13,21,34,55" (balanced)
Extended: "13,21,34,55,89" (swing)
Rotation Speed (0.1-2.0):
Controls spiral oscillation frequency
0.618: Golden ratio (balanced)
Higher = more signals, more noise
Lower = smoother, fewer signals
Volume Amplification:
Enables dynamic spiral expansion
Essential for stocks and crypto
Disable for forex (no central volume)
Visual System Architecture
Cascade Bands:
Multi-level volatility envelopes
Gradient coloring from primary to secondary theme
Transparency increases with distance from price
Fill between bands shows fractal structure
Vortex Spirals:
5 Fibonacci-period oscillators
Blue above price (bullish pressure)
Red below price (bearish pressure)
Multiple display styles: Lines, Circles, Dots, Cross
Dynamic Fibonacci Levels:
Auto-updating retracement levels
Smart update logic prevents disruption near levels
Distance-based transparency (closer = more visible)
Updates every 50 bars or on volatility spikes
Confluence Zones:
Highlighted boxes where indicators converge
Stronger confluence = stronger support/resistance
Key areas for reversal trades
Professional Dashboard System
Main Fractal Dashboard: Displays real-time:
Hurst Exponent with market state
Fractal Dimension with complexity level
Volatility Cascade status
Vortex rotation impact
Market regime classification
Signal strength percentage
Active indicator levels
Vortex Metrics Panel: Shows:
Individual spiral deviations
Convergence/divergence metrics
Real-time vortex positioning
Fibonacci period performance
Fractal Metrics Display: Tracks:
Dimension D value
Market complexity rating
Self-similarity strength
Trend quality assessment
Theory Guide Panel: Educational reference showing:
Mandelbrot principles
Fibonacci vortex concepts
Dynamic trading suggestions
Trading Applications
Trend Following:
High Hurst (>0.65) indicates strong trends
Follow cascade band direction
Use vortex spirals for entry timing
Exit when Hurst drops below 0.5
Mean Reversion:
Low Hurst (<0.35) signals reversal potential
Trade toward vortex spiral convergence
Use Fibonacci levels as targets
Tighten stops in chaotic regimes
Breakout Trading:
Monitor cascade band compression
Watch for vortex spiral alignment
Volatility expansion confirms breakouts
Use confluence zones for targets
Risk Management:
Position size based on fractal dimension
Wider stops in high complexity markets
Tighter stops when Hurst is extreme
Scale out at Fibonacci levels
Market-Specific Optimization
Cryptocurrency:
Cascade Levels: 5-7
Hurst Period: 50-100
Rotation Speed: 0.786-1.2
Enable volume amplification
Stock Indices:
Cascade Levels: 4-5
Hurst Period: 80-120
Rotation Speed: 0.5-0.786
Moderate cascade ratio
Forex:
Cascade Levels: 3-4
Hurst Period: 100-150
Rotation Speed: 0.382-0.618
Disable volume amplification
Commodities:
Cascade Levels: 4-6
Hurst Period: 60-100
Rotation Speed: 0.5-1.0
Seasonal adjustment consideration
Innovation and Originality
The MFCV represents several breakthrough innovations:
First Integration of Mandelbrot Fractals with Fibonacci Vortex Theory
Unique synthesis of chaos theory and sacred geometry
Novel application of Hurst exponent to spiral dynamics
Dynamic Volatility Cascade System
Golden ratio-based band expansion
Multi-timeframe fractal analysis
Self-adjusting to market conditions
Volume-Amplified Vortex Spirals
Revolutionary spiral calculation method
Dynamic response to market participation
Multiple Fibonacci period integration
Intelligent Signal Generation
Cooldown system prevents overtrading
Multi-factor confirmation required
Regime-aware signal filtering
Professional Analytics Dashboard
Institutional-grade metrics display
Real-time fractal analysis
Educational integration
Development Journey
Creating the MFCV involved overcoming numerous challenges:
Mathematical Complexity: Implementing Hurst exponent calculations efficiently
Visual Clarity: Displaying multiple indicators without cluttering
Performance Optimization: Managing array operations and calculations
Signal Quality: Balancing sensitivity with reliability
User Experience: Making complex theory accessible
The result is an indicator that brings PhD-level mathematics to practical trading while maintaining visual elegance and usability.
Best Practices and Guidelines
Start Simple: Use default settings initially
Match Timeframe: Adjust parameters to your trading style
Confirm Signals: Never trade MFCV signals in isolation
Respect Regimes: Adapt strategy to market state
Manage Risk: Use fractal dimension for position sizing
Color Themes
Six professional themes included:
Fractal: Balanced blue/purple palette
Golden: Warm Fibonacci-inspired colors
Plasma: Vibrant modern aesthetics
Cosmic: Dark mode optimized
Matrix: Classic green terminal
Fire: Heat map visualization
Disclaimer
This indicator is for educational and research purposes only. It does not constitute financial advice. While the MFCV reveals deep market structure through advanced mathematics, markets remain inherently unpredictable. Past performance does not guarantee future results.
The integration of Mandelbrot's fractal theory with Fibonacci vortex dynamics provides unique market insights, but should be used as part of a comprehensive trading strategy. Always use proper risk management and never risk more than you can afford to lose.
Acknowledgments
Special thanks to Benoit Mandelbrot for revolutionizing our understanding of markets through fractal geometry, and to the ancient mathematicians who discovered the golden ratio's universal significance.
"The geometry of nature is fractal... Markets are fractal too." - Benoit Mandelbrot
Revealing the Hidden Order in Market Chaos Trade with Mathematical Precision. Trade with MFCV.
— Created with passion for the TradingView community
Trade with insight. Trade with anticipation.
— Dskyz , for DAFE Trading Systems
Multifractal Forecast [ScorsoneEnterprises]
Multifractal Forecast Indicator
The Multifractal Forecast is an indicator designed to model and forecast asset price movements using a multifractal framework. It uses concepts from fractal geometry and stochastic processes, specifically the Multifractal Model of Asset Returns (MMAR) and fractional Brownian motion (fBm), to generate price forecasts based on historical price data. The indicator visualizes potential future price paths as colored lines, providing traders with a probabilistic view of price trends over a specified trading time scale. Below is a detailed breakdown of the indicator’s functionality, inputs, calculations, and visualization.
Overview
Purpose: The indicator forecasts future price movements by simulating multiple price paths based on a multifractal model, which accounts for the complex, non-linear behavior of financial markets.
Key Concepts:
Multifractal Model of Asset Returns (MMAR): Models price movements as a multifractal process, capturing varying degrees of volatility and self-similarity across different time scales.
Fractional Brownian Motion (fBm): A generalization of Brownian motion that incorporates long-range dependence and self-similarity, controlled by the Hurst exponent.
Binomial Cascade: Used to model trading time, introducing heterogeneity in time scales to reflect market activity bursts.
Hurst Exponent: Measures the degree of long-term memory in the price series (persistence, randomness, or mean-reversion).
Rescaled Range (R/S) Analysis: Estimates the Hurst exponent to quantify the fractal nature of the price series.
Inputs
The indicator allows users to customize its behavior through several input parameters, each influencing the multifractal model and forecast generation:
Maximum Lag (max_lag):
Type: Integer
Default: 50
Minimum: 5
Purpose: Determines the maximum lag used in the rescaled range (R/S) analysis to calculate the Hurst exponent. A higher lag increases the sample size for Hurst estimation but may smooth out short-term dynamics.
2 to the n values in the Multifractal Model (n):
Type: Integer
Default: 4
Purpose: Defines the resolution of the multifractal model by setting the size of arrays used in calculations (N = 2^n). For example, n=4 results in N=16 data points. Larger n increases computational complexity and detail but may exceed Pine Script’s array size limits (capped at 100,000).
Multiplier for Binomial Cascade (m):
Type: Float
Default: 0.8
Purpose: Controls the asymmetry in the binomial cascade, which models trading time. The multiplier m (and its complement 2.0 - m) determines how mass is distributed across time scales. Values closer to 1 create more balanced cascades, while values further from 1 introduce more variability.
Length Scale for fBm (L):
Type: Float
Default: 100,000.0
Purpose: Scales the fractional Brownian motion output, affecting the amplitude of simulated price paths. Larger values increase the magnitude of forecasted price movements.
Cumulative Sum (cum):
Type: Integer (0 or 1)
Default: 1
Purpose: Toggles whether the fBm output is cumulatively summed (1=On, 0=Off). When enabled, the fBm series is accumulated to simulate a price path with memory, resembling a random walk with long-range dependence.
Trading Time Scale (T):
Type: Integer
Default: 5
Purpose: Defines the forecast horizon in bars (for example, a value of 20 projects 20 bars into the future). It also scales the binomial cascade’s output to align with the desired trading time frame.
Number of Simulations (num_simulations):
Type: Integer
Default: 5
Minimum: 1
Purpose: Specifies how many forecast paths are simulated and plotted. More simulations provide a broader range of possible price outcomes but increase computational load.
Core Calculations
The indicator combines several mathematical and statistical techniques to generate price forecasts. Below is a step-by-step explanation of its calculations:
Log Returns (lgr):
The indicator calculates log returns as math.log(close / close[1]) when both the current and previous close prices are positive. This measures the relative price change on a logarithmic scale, which is standard in financial time series analysis because it stabilizes variance.
Hurst Exponent Estimation (get_hurst_exponent):
Purpose: Estimates the Hurst exponent (H) to quantify the degree of long-term memory in the price series.
Method: Uses rescaled range (R/S) analysis:
For each lag from 2 to max_lag, the function calc_rescaled_range computes the rescaled range:
Calculate the mean of the log returns over the lag period.
Compute the cumulative deviation from the mean.
Find the range (max - min) of the cumulative deviation.
Divide the range by the standard deviation of the log returns to get the rescaled range.
The log of the rescaled range (log(R/S)) is regressed against the log of the lag (log(lag)) using the polyfit_slope function.
The slope of this regression is the Hurst exponent (H).
Interpretation:
H = 0.5: Random walk (no memory, like standard Brownian motion).
H > 0.5: Persistent behavior (trends tend to continue).
H < 0.5: Mean-reverting behavior (price tends to revert to the mean).
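The regression step alone can be sketched as follows; computing the per-lag rescaled ranges that feed it follows the bullet list above. This is a generic least-squares slope helper, not the script's own polyfit_slope verbatim.
```pine
//@version=5
indicator("Log-Log Slope Sketch")

// Least-squares slope of y on x — the step that turns log(R/S) vs. log(lag) pairs into a Hurst estimate
polyfitSlope(array<float> xs, array<float> ys) =>
    int n = array.size(xs)
    float sumX  = array.sum(xs)
    float sumY  = array.sum(ys)
    float sumXY = 0.0
    float sumX2 = 0.0
    for i = 0 to n - 1
        float x = array.get(xs, i)
        float y = array.get(ys, i)
        sumXY += x * y
        sumX2 += x * x
    float denominator = n * sumX2 - sumX * sumX
    math.abs(denominator) > 1e-10 ? (n * sumXY - sumX * sumY) / denominator : 0.0

// Trivial check: the slope of y = 2x is 2
var float[] xs = array.from(1.0, 2.0, 3.0, 4.0)
var float[] ys = array.from(2.0, 4.0, 6.0, 8.0)
plot(polyfitSlope(xs, ys), "Slope (expect 2.0)")
```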
Fractional Brownian Motion (get_fbm):
Purpose: Generates a fractional Brownian motion series to model price movements with long-range dependence.
Inputs: n (array size 2^n), H (Hurst exponent), L (length scale), cum (cumulative sum toggle).
Method:
Computes covariance for fBm using the formula: 0.5 * (|i+1|^(2H) - 2 * |i|^(2H) + |i-1|^(2H)).
Uses Hosking’s method (referenced from Columbia University’s implementation) to generate fBm:
Initializes arrays for covariance (cov), intermediate calculations (phi, psi), and output.
Iteratively computes the fBm series by incorporating a random term scaled by the variance (v) and covariance structure.
Applies scaling based on L / N^H to adjust the amplitude.
Optionally applies cumulative summation if cum = 1 to produce a path with memory.
Output: An array of 2^n values representing the fBm series.
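The covariance formula above translates directly into a small helper. This is a sketch of that single step, not the full Hosking recursion.
```pine
//@version=5
indicator("fGn Covariance Sketch")

// Autocovariance of fractional Gaussian noise at lag k for Hurst exponent h,
// per the formula quoted above: 0.5 * (|k+1|^(2H) - 2|k|^(2H) + |k-1|^(2H))
fgnCov(int k, float h) =>
    0.5 * (math.pow(math.abs(k + 1), 2 * h) - 2.0 * math.pow(math.abs(k), 2 * h) + math.pow(math.abs(k - 1), 2 * h))

// For H = 0.5 (ordinary Brownian increments) the lag-1 covariance is ~0;
// for H > 0.5 it turns positive (long-range persistence)
plot(fgnCov(1, 0.5), "Lag-1 cov, H = 0.5")
plot(fgnCov(1, 0.7), "Lag-1 cov, H = 0.7")
```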
Binomial Cascade (get_binomial_cascade):
Purpose: Models trading time (theta) to account for non-uniform market activity (e.g., bursts of volatility).
Inputs: n (array size 2^n), m (multiplier), T (trading time scale).
Method:
Initializes an array of size 2^n with values of 1.0.
Iteratively applies a binomial cascade:
For each block (from 0 to n-1), splits the array into segments.
Randomly assigns a multiplier (m or 2.0 - m) to each segment, redistributing mass.
Normalizes the array by dividing by its sum and scales by T.
Checks for array size limits to prevent Pine Script errors.
Output: An array (theta) representing the trading time, which warps the fBm to reflect market activity.
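A compact sketch of the cascade steps above is given below. The random split, normalization, and scaling by T follow the description; details such as the random draw and the array-size guards in the published get_binomial_cascade may differ.
```pine
//@version=5
indicator("Binomial Cascade Sketch")

binomialCascade(simple int n, float m, float t) =>
    int size = int(math.pow(2, n))
    float[] theta = array.new_float(size, 1.0)
    // Split the series into progressively smaller blocks, redistributing mass with m and 2 - m
    for level = 0 to n - 1
        int blocks    = int(math.pow(2, level))
        int blockSize = int(size / math.pow(2, level + 1))
        for b = 0 to blocks - 1
            float mult = math.random() < 0.5 ? m : 2.0 - m
            int start  = b * blockSize * 2
            for j = 0 to blockSize - 1
                array.set(theta, start + j, array.get(theta, start + j) * mult)
                array.set(theta, start + blockSize + j, array.get(theta, start + blockSize + j) * (2.0 - mult))
    // Normalize so the mass sums to 1, then scale by the trading time T
    float total = array.sum(theta)
    for i = 0 to size - 1
        array.set(theta, i, array.get(theta, i) / total * t)
    theta

plot(array.sum(binomialCascade(4, 0.8, 5.0)), "Sum of theta (expect ≈ T)")
```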
Interpolation (interpolate_fbm):
Purpose: Maps the fBm series to the trading time scale to produce a forecast.
Method:
Computes the cumulative sum of theta and normalizes it to the [0, 1] interval.
Interpolates the fBm series linearly based on the normalized trading time.
Ensures the output aligns with the trading time scale (T).
Output: An array of interpolated fBm values representing log returns over the forecast horizon.
Price Path Generation:
For each simulation (up to num_simulations):
Generates an fBm series using get_fbm.
Interpolates it with the trading time (theta) using interpolate_fbm.
Converts log returns to price levels:
Starts with the current close price.
For each step i in the forecast horizon (T), computes the price as prev_price * exp(log_return).
Output: An array of price levels for each simulation.
Visualization:
Trigger: Updates every T bars when the bar state is confirmed (barstate.isconfirmed).
Process:
Clears previous lines from line_array.
For each simulation, plots a line from the current bar’s close price to the forecasted price at bar_index + T.
Colors the line using a gradient (color.from_gradient) based on the final forecasted price relative to the minimum and maximum forecasted prices across all simulations (red for lower prices, teal for higher prices).
Output: Multiple colored lines on the chart, each representing a possible price path over the next T bars.
How It Works on the Chart
Initialization: On each bar, the indicator calculates the Hurst exponent (H) using historical log returns and prepares the trading time (theta) using the binomial cascade.
Forecast Generation: Every T bars, it generates num_simulations price paths:
Each path starts at the current close price.
Uses fBm to model log returns, warped by the trading time.
Converts log returns to price levels.
Plotting: Draws lines from the current bar to the forecasted price T bars ahead, with colors indicating relative price levels.
Dynamic Updates: The forecast updates every T bars, replacing old lines with new ones based on the latest price data and calculations.
Key Features
Multifractal Modeling: Captures complex market dynamics by combining fBm (long-range dependence) with a binomial cascade (non-uniform time).
Customizable Parameters: Allows users to adjust the forecast horizon, model resolution, scaling, and number of simulations.
Probabilistic Forecast: Multiple simulations provide a range of possible price outcomes, helping traders assess uncertainty.
Visual Clarity: Gradient-colored lines make it easy to distinguish bullish (teal) and bearish (red) forecasts.
Potential Use Cases
Trend Analysis: Identify potential price trends or reversals based on the direction and spread of forecast lines.
Risk Assessment: Evaluate the range of possible price outcomes to gauge market uncertainty.
Volatility Analysis: The Hurst exponent and binomial cascade provide insights into market persistence and volatility clustering.
Limitations
Computational Intensity: Large values of n or num_simulations may slow down execution or hit Pine Script’s array size limits.
Randomness: The binomial cascade and fBm rely on random terms (math.random), which may lead to variability between runs.
Assumptions: The model assumes log-normal price movements and fractal behavior, which may not always hold in extreme market conditions.
Adjusting Inputs:
Set max_lag based on the desired depth of historical analysis.
Adjust n for model resolution (start with 4–6 to avoid performance issues).
Tune m to control trading time variability (0.5–1.5 is typical).
Set L to scale the forecast amplitude (experiment with values like 10,000–1,000,000).
Choose T based on your trading horizon (20 for short-term, 50 for longer-term for example).
Select num_simulations for the number of forecast paths (5–10 is reasonable for visualization).
Interpret Output:
Teal lines suggest bullish scenarios, red lines suggest bearish scenarios.
A wide spread of lines indicates high uncertainty; convergence suggests a stronger trend.
Monitor Updates: Forecasts update every T bars, so check the chart periodically for new projections.
Chart Examples
This is a daily AMEX:SPY chart with default settings. Simulations are generated every T bars and provide a range to analyze, with several paths remaining inside the realized range.
On this intraday PEPPERSTONE:COCOA chart I reduced the Length Scale for fBm (L) from 100,000 to 1,000. Adjusting this parameter as you switch between timeframes can give you more contextual simulations.
On BITSTAMP:ETHUSD I raised L to 1,000,000 to get a more contextual set of simulations given crypto's volatile nature.
With L at 100,000, the range for NASDAQ:TLT is simulated well: the recent pop stays within the bounds of the highest simulation. Note this is a cherry-picked example chosen to show the power and potential of these simulations.
Technical Notes
Error Handling: The script includes checks for array size limits and division by zero (math.abs(denominator) > 1e-10, v := math.max(v, 1e-10)).
External Reference: The fBm implementation is based on Hosking’s method (www.columbia.edu), ensuring a robust algorithm.
Conclusion
The Multifractal Forecast is a powerful tool for traders seeking to model complex market dynamics using a multifractal framework. By combining fBm, binomial cascades, and Hurst exponent analysis, it generates probabilistic price forecasts that account for long-range dependence and non-uniform market activity. Its customizable inputs and clear visualizations make it suitable for both technical analysis and strategy development, though users should be mindful of its computational demands and parameter sensitivity. For optimal use, experiment with input settings and validate forecasts against other technical indicators or market conditions.
Advanced Multi-Timeframe Trading System (Risk Managed)
Description:
This strategy is an original approach that combines two main analytical components to identify potential trade opportunities while simulating realistic trading conditions:
1. Market Trend Analysis via an Approximate Hurst Exponent
• What It Does:
The strategy computes a rough measure of market trending using an approximate Hurst exponent. A value above 0.5 suggests persistent, trending behavior, while a value below 0.5 indicates a tendency toward mean-reversion.
• How It’s Used:
The Hurst exponent is calculated on both the chart’s current timeframe and a higher timeframe (default: Daily) to capture both local and broader market dynamics.
2. Fibonacci Retracement Levels
• What It Does:
Using daily high and low data from a selected timeframe (default: Daily), the script computes key Fibonacci retracement levels.
• How It’s Used:
• The 61.8% level (Golden Ratio) serves as a key threshold:
• A long entry is signaled when the price crosses above this level if the daily Hurst exponent confirms a trending market.
• The 38.2% level is used to identify short-entry opportunities when the price crosses below it and the daily Hurst indicates non-trending conditions.
Signal Logic:
• Long Entry:
When the price crosses above the 61.8% Fibonacci level (Golden Ratio) and the daily Hurst exponent is greater than 0.5, suggesting a trending market.
• Short Entry:
When the price crosses below the 38.2% Fibonacci level and the daily Hurst exponent is less than 0.5, indicating a less trending or potentially reversing market.
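A minimal Pine Script sketch of these two entry conditions is shown below. The previous completed daily bar and a fixed placeholder for the daily Hurst reading are assumptions made to keep the example self-contained; the strategy's own approximate Hurst calculation replaces that placeholder.
```pine
//@version=5
strategy("MTF Fib + Hurst Entries (sketch)", overlay=true, default_qty_type=strategy.percent_of_equity, default_qty_value=10)

// Previous completed daily bar, used here to avoid looking into the forming daily candle
float dHigh = request.security(syminfo.tickerid, "D", high[1], lookahead=barmerge.lookahead_off)
float dLow  = request.security(syminfo.tickerid, "D", low[1],  lookahead=barmerge.lookahead_off)

// Retracement levels measured from the daily low (direction handling is simplified here)
float fib618 = dLow + (dHigh - dLow) * 0.618
float fib382 = dLow + (dHigh - dLow) * 0.382

// Placeholder for the strategy's approximate daily Hurst exponent
float dailyHurst = input.float(0.55, "Daily Hurst (placeholder)")

if ta.crossover(close, fib618) and dailyHurst > 0.5
    strategy.entry("Long", strategy.long)
if ta.crossunder(close, fib382) and dailyHurst < 0.5
    strategy.entry("Short", strategy.short)
```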
Risk Management & Trade Execution:
• Stop-Loss:
Each trade is risk-managed with a stop-loss set at 2% below (for longs) or above (for shorts) the entry price. This ensures that no single trade risks more than a small, sustainable portion of the account.
• Take Profit:
A take profit order targets a risk-reward ratio of 1:2 (i.e., the target profit is twice the amount risked).
• Position Sizing:
Trades are executed with a fixed position size equal to 10% of account equity.
• Trade Frequency Limits:
• Daily Limit: A maximum of 5 trades per day
• Overall Limit: No more than 510 trades during the backtesting period (e.g., since 2019)
These limits are imposed to simulate realistic trading frequency and to avoid overtrading in backtest results.
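The stop, target, and daily trade cap can be sketched as below. The moving-average crossover is only a stand-in entry condition so the example compiles on its own; the real gate is the Fibonacci/Hurst logic described earlier.
```pine
//@version=5
strategy("Risk-Managed Exits (sketch)", overlay=true, initial_capital=10000, default_qty_type=strategy.percent_of_equity, default_qty_value=10, commission_type=strategy.commission.percent, commission_value=0.1, slippage=1)

// Stand-in entry condition so the example compiles on its own
bool longSetup = ta.crossover(ta.sma(close, 20), ta.sma(close, 50))

// Reset the daily trade counter at the start of each new session
var int tradesToday = 0
if ta.change(time("D")) != 0
    tradesToday := 0

if longSetup and tradesToday < 5
    strategy.entry("Long", strategy.long)
    // 2% stop below the signal price, take profit at twice the risked distance (1:2)
    strategy.exit("Long exit", from_entry="Long", stop=close * 0.98, limit=close * 1.04)
    tradesToday += 1
```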
Backtesting Parameters:
• Initial Capital: $10,000
• Commission: 0.1% per trade
• Slippage: 1 tick per bar
These settings aim to reflect the conditions faced by the average trader and help ensure that the backtesting results are realistic and not misleading.
Chart Overlays & Visual Aids:
• Fibonacci Levels:
The key Fibonacci retracement levels are plotted on the chart, and the zone between the 61.8% and 38.2% levels is highlighted to show a key retracement area.
• Market Trend Background:
The chart background is tinted green when the daily Hurst exponent indicates a trending market (value > 0.5) and red otherwise.
• Information Table:
An on-chart table displays key parameters such as the current Hurst exponent, daily Hurst value, the number of trades executed today, and the global trade count.
Disclaimer:
Past performance is not indicative of future results. This strategy is experimental and provided solely for educational purposes. It is essential that you backtest and paper trade using your own settings before considering any live deployment. The Hurst exponent calculation is an approximation and should be interpreted as a rough gauge of market behavior. Adjust the parameters and risk management settings according to your personal risk tolerance and market conditions.
Additional Notes:
• Originality & Usefulness:
This script is an original mashup that combines trend analysis with Fibonacci retracement methods. The description above explains how these components work together to provide trading signals.
• Realistic Results:
The strategy uses realistic account sizes, commission rates, slippage, and risk management rules to generate backtesting results that are representative of real-world trading.
• Educational Purpose:
This script is intended to support the TradingView community by offering insights into combining multiple analysis techniques in one strategy. It is not a “get-rich-quick” system but rather an educational tool to help traders understand risk management and trade signal logic.
By using this script, you acknowledge that trading involves risk and that you are responsible for testing and adjusting the strategy to fit your own trading environment. This publication is fully open source, and any modifications should include proper attribution if significant portions of the code are reused.
Entropy Balance Oscillator [JOAT]
Entropy Balance Oscillator - Chaos Theory Edition
Overview
Entropy Balance Oscillator is an open-source oscillator indicator that applies chaos theory concepts to market analysis. It calculates market entropy (disorder/randomness), balance (price position within range), and various chaos metrics to identify whether the market is in an ordered, chaotic, or balanced state. This helps traders understand market regime and adjust their strategies accordingly.
What This Indicator Does
The indicator calculates and displays:
Entropy - Measures market disorder using return distribution analysis
Balance - Price position within the high-low range, normalized to -1 to +1
Lyapunov Exponent - Estimates sensitivity to initial conditions (chaos indicator)
Hurst Exponent - Measures long-term memory in price series (trend persistence)
Strange Attractor - Simulated attractor points for visualization
Bifurcation Detection - Identifies potential regime change points
Chaos Index - Combined entropy and volatility score
Market Phase - Classification as CHAOS, ORDER, or BALANCED
How It Works
Entropy is calculated using return distribution:
calculateEntropy(series float price, simple int period) =>
    // Collect bar-to-bar changes over the lookback and sum their absolute values for normalization
    // (assumed return definition; the published snippet omitted this step)
    float[] returns = array.new_float()
    float sumAbs = 0.0
    for i = 0 to period - 1
        float ret = price[i] - price[i + 1]
        array.push(returns, ret)
        sumAbs += math.abs(ret)
    // Apply Shannon entropy formula: -sum(p * log(p))
    float entropy = 0.0
    if sumAbs > 0
        for i = 0 to array.size(returns) - 1
            float prob = math.abs(array.get(returns, i)) / sumAbs
            if prob > 0
                entropy -= prob * math.log(prob)
    entropy
Balance measures price position within range:
calculateBalance(series float high, series float low, series float close, simple int period) =>
    float range = high - low
    float position = (close - low) / (range > 0 ? range : 1)
    float balance = ta.ema(position, period)
    (balance - 0.5) * 2 // Normalize to -1 to +1
Lyapunov Exponent estimates chaos sensitivity:
lyapunovExponent(series float price, simple int period) =>
    float sumLog = 0.0
    for i = 1 to period
        // Assumed indexing: successive-bar price ratio (index brackets were stripped in the published text)
        float ratio = price[i] > 0 ? math.abs(price[i - 1] / price[i]) : 1.0
        if ratio > 0
            sumLog += math.log(ratio)
    float lyapunov = sumLog / period
    lyapunov
Hurst Exponent measures trend persistence:
H > 0.5: Trending/persistent behavior
H = 0.5: Random walk
H < 0.5: Mean-reverting behavior
Signal Generation
Phase changes and extreme conditions generate signals:
Chaos Phase: Normalized entropy exceeds chaos threshold (default 0.7)
Order Phase: Normalized entropy falls below order threshold (default 0.3)
Extreme Chaos: Entropy exceeds 1.5x chaos threshold
Extreme Order: Entropy falls below 0.5x order threshold
Bifurcation: Variance exceeds 2x average variance
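The bifurcation proxy mentioned above can be sketched in a few lines; the 2x threshold follows the description, while the choice of close as the variance source is illustrative.
```pine
//@version=5
indicator("Bifurcation Proxy (sketch)")

int period = input.int(20, "Entropy Period")

// Rolling variance spiking above twice its own average flags a potential regime change
float variance    = ta.variance(close, period)
float avgVariance = ta.sma(variance, period)
bool  bifurcation = variance > 2.0 * avgVariance

plot(variance, "Variance", color=color.aqua)
plot(avgVariance * 2.0, "2 × Average Variance", color=color.gray)
plotshape(bifurcation, "Bifurcation", style=shape.circle, location=location.top, color=color.yellow)
```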
Dashboard Panel (Top-Right)
Market Phase - Current phase (CHAOS/ORDER/BALANCED)
Entropy Level - Normalized entropy value
Balance - Current balance reading (-1 to +1)
Chaos Index - Combined chaos score percentage
Volatility - Current price volatility
Lyapunov Exp - Lyapunov exponent value
Hurst Exponent - Hurst exponent value
Chaos Score - Overall chaos assessment
Status - Current market status
Visual Elements
Entropy Line - Main oscillator showing normalized entropy
Entropy EMA - Smoothed entropy for trend reference
Balance Area - Filled area showing balance direction
Chaos/Order Thresholds - Horizontal dashed lines
Lyapunov Line - Step line showing Lyapunov exponent
Strange Attractor - Circle plots showing attractor points
Phase Space - Line showing phase space reconstruction
Phase Background - Background color based on current phase
Extreme Markers - X-cross for extreme chaos, diamond for extreme order
Bifurcation Markers - Circles at potential regime changes
Input Parameters
Entropy Period (default: 20) - Period for entropy calculation
Balance Period (default: 14) - Period for balance calculation
Chaos Threshold (default: 0.7) - Threshold for chaos phase
Order Threshold (default: 0.3) - Threshold for order phase
Lyapunov Exponent (default: true) - Enable Lyapunov calculation
Hurst Exponent (default: true) - Enable Hurst calculation
Strange Attractor (default: true) - Enable attractor visualization
Bifurcation Detection (default: true) - Enable bifurcation detection
Suggested Use Cases
Identify market regime for strategy selection (trend-following vs mean-reversion)
Watch for phase changes as potential trading environment shifts
Use Hurst exponent to assess trend persistence
Monitor chaos index for volatility regime awareness
Avoid trading during extreme chaos phases
Timeframe Recommendations
Best on 1H to Daily charts. Chaos metrics require sufficient data for meaningful calculations.
Limitations
Chaos theory concepts are applied as analogies, not rigorous mathematical implementations
Lyapunov and Hurst calculations are simplified approximations
Strange attractor visualization is conceptual
Bifurcation detection uses variance as proxy
Open-Source and Disclaimer
This script is published as open-source under the Mozilla Public License 2.0 for educational purposes. It does not constitute financial advice. Past performance does not guarantee future results. Always use proper risk management.
- Made with passion by officialjackofalltrades
Wave Dynamics - Neural Adaptive Engine
🌊 WAVE DYNAMICS - NEURAL ADAPTIVE ENGINE
The Official Reference Manual & Trading Protocol
═════════════════════════════════════════════════════════════
📖 PREFACE: THE END OF STATIC ANALYSIS
The financial markets are not linear; they are fractal. They do not move in straight lines; they breathe. They expand in trending volatility and contract in chopping noise.
The fundamental failure of traditional technical analysis is Static Sensitivity .
• A 14-period RSI works beautifully in a range but fails in a trend.
• A 12,26 MACD captures trends but destroys capital in chop.
Wave Dynamics solves this by treating the market as a living organism. At its core is a Neural Adaptive Engine that calculates the Hurst Exponent (Fractal Dimension) in real-time. It measures the "roughness" of price action and automatically adjusts the lookback periods of every subsystem—Waves, Ribbons, and Oscillators—to match the current market regime.
This manual is your guide to navigating this adaptive framework.
PART 1: THEORY & MARKET PHYSICS
To use this tool, you must understand the three pillars of its logic:
1. The Hurst Exponent (Chaos Theory)
The engine continuously calculates H (Hurst) on a rolling window.
• Persistent Regime (H > 0.5): "What is happening now is likely to continue." The market is trending. The Engine Tightens sensitivity to catch fast pullbacks.
• Anti-Persistent Regime (H < 0.5): "What is happening now is likely to reverse." The market is chopping/ranging. The Engine Widens sensitivity to filter out noise and stop runs.
2. The Elliott Wave Cycle (Crowd Psychology)
Price moves in 5-wave motive sequences followed by corrections.
• Waves 1 & 3: Institutional Accumulation/Mark-up.
• Waves 2 & 4: Profit Taking (The Pullback). These are the only safe entry points.
• Wave 5: Retail FOMO (The Trap). Identified by Momentum Divergence .
3. Smart Money Concepts (Liquidity)
Price moves from liquidity to liquidity.
• Order Blocks: Where institutions initiated the move.
• Breakers: Where institutions trapped traders (Support flips to Resistance).
• Fair Value Gaps: Where price moved too fast, leaving inefficiency.
PART 2: VISUAL INTELLIGENCE (COLOR THEORY)
The chart communicates instantly through a strict color-coded language.
🎨 THE RIBBON (Adaptive Equilibrium)
The background "Cloud" is an Adaptive EMA ribbon.
• Neon Green (#00FF88): Bullish Trend. Only look for Longs. Price is above the equilibrium mean.
• Neon Red (#FF3366): Bearish Trend. Only look for Shorts. Price is below the equilibrium mean.
• Grey/Narrow: Compression. The market is deciding. Do not trade inside a grey ribbon.
🎨 INSTITUTIONAL ZONES
• Green/Red Boxes (Order Blocks): Standard Support/Resistance. Valid entry zones, but lower probability.
• Vivid Purple Boxes (#9C27B0) - THE BREAKER: CRITICAL. This appears when a Green Order Block is smashed through by price. It turns Purple to signify it has flipped from Support to Resistance (or vice versa). A retest of a Purple Zone is the highest probability setup in the system.
• Dotted Outlines (FVG): Magnets. Do not place stops inside these; price will likely travel through them.
🎨 WAVE ANATOMY
• Cyan Lines: Valid Impulse Waves (1, 3, 5).
• Orange Lines/Dots: EXHAUSTION. If a wave line turns Orange, Angular Momentum is decaying. The trend is dying.
• Diamonds (◆): DIVERGENCE. Price made a Higher High, but the internal oscillator (MPI) made a Lower Low. Immediate reversal warning.
🎨 SIGNALS
• Triangles: Confirmed Entries. (Green = Long, Red = Short).
• Labels (e.g., A+): The Grade of the trade based on Confluence.
• A+: Perfect Confluence (Trend + Structure + Zone + Momentum).
• C: Counter-trend or Weak.
PART 3: THE DASHBOARD ECOSYSTEM
Three panels provide Total Situational Awareness. You must read them in order: Top Right → Bottom Left → Bottom Right.
1. MISSION CONTROL (Top Right)
This panel tells you the "Weather Report."
• Neural Status:
• 🧠 TREND: Safe to trade breakout and trend-following strategies.
• 🧠 CHOP: Danger. Use mean-reversion or stay out.
• 🧠 RND (Random): No clear edge.
• Phase: Displays the Bias (Bull/Bear) and Strength. "WEAK BEARISH" usually signals a bottom is forming.
• Score Bar: A live visual meter of the Confluence Score (0-100%).
2. THE ASSISTANT (Bottom Left)
This panel acts as your co-pilot, translating data into English.
• Situation:
• "💎 BULL GEM": You are in a range, at the bottom, showing exhaustion. Buy immediately.
• "🔥 COMPRESSION": Volatility squeeze. A violent move is imminent.
• Action: Tells you exactly what to do (e.g., "Wait for confluence," "Trail Stop," "Let it develop").
• Pro Metrics (Simulated):
• Win Rate: The percentage of signals on the current visible chart that hit Target 1.
• Profit Factor: Gross Win / Gross Loss. If this is < 1.0, stop trading this asset immediately.
• Buckets: Shows the win rate of A-Grade signals vs. C-Grade signals.
3. WAVE INTELLIGENCE (Bottom Right)
This panel provides structural context.
• Channel Gauge (0-100%):
• 0-20%: Oversold / Channel Bottom.
• 80-100%: Overbought / Channel Top.
• 50%: Equilibrium.
• W3/W1 Ratio: The "Health Check" of the trend.
• < 1.0: Weak. Wave 3 is shorter than Wave 1. The trend is struggling.
• > 1.618: Extended. The move is parabolic. Expect a snap-back.
• Trend Health (0-100): Composite score of sub-wave physics. If Health < 30, the trend is effectively dead.
PART 4: PARAMETER OPTIMIZATION (THE INPUTS)
Every input allows you to tune the engine. Here is the deep dive:
🧠 NEURAL ADAPTIVE ENGINE
• Enable Neural Adaptive Engine: Master switch for the Hurst calculation.
• Hurst Period (100):
• Adjustment: Increase to 200 for Crypto/Alts (too much noise). Decrease to 50 for Forex/Indices (need speed).
• How to tell: If the dashboard says "TREND" but the chart is sideways, INCREASE this value.
• Min/Max Lookback: Defines the constraints. Only adjust if you are an advanced user creating a custom scalping setup (e.g., Min 3 / Max 10).
🌊 WAVE & STRUCTURE
• Base Swing Detection (8): The "Anchor."
• Scalpers (1m-5m): Set to 5-8.
• Swing Traders (1H-4H): Set to 15-20.
• Min Wave Size (ATR): Prevents the script from labeling tiny wicks as waves. Increase this during high-volatility news events.
🔗 MTF STRUCTURE MAPPING
• Require Macro Align: Strict Mode. If enabled, the script checks the Higher Timeframe (e.g., 4H). If 4H is Bearish, it BLOCKS all Long signals on the 5m chart. Use this to prevent counter-trend losses.
🏦 SMART MONEY CONCEPTS
• Enable Breakers: ALWAYS ON. This turns failed Order Blocks into Breaker Zones (Purple).
• Institutional Mode: ULTRA STRICT. If enabled, signals will ONLY fire if price is physically touching an Order Block, FVG, or Breaker. This creates very few, very high-quality signals.
🎯 SIGNAL ENGINE
• Signal Mode:
• Strict: Grades A+ and A only.
• Balanced: Grades B and above.
• Aggressive: Includes counter-trend scalps (Grade C).
• Min Confluence Score (5-35): The raw points needed to trigger. 5 is standard. 10 is conservative.
PART 5: TRADE EXECUTION PLAYBOOKS
PLAYBOOK A: THE "BREAKER RETEST" (Highest Probability)
1. Context: Ribbon is Green.
2. Event: Price creates a Red Order Block, then smashes upward through it.
3. Change: The Red Block turns Purple (Bullish Breaker).
4. Trigger: Price pulls back down to touch the top of the Purple Box.
5. Signal: Green Triangle appears.
6. Action: Max Size Entry. Stop Loss below the Purple Box. Target Wave 3 Projection.
PLAYBOOK B: THE "WAVE 4 DIP" (Trend Following)
1. Context: Wave count shows "3". Ribbon is Green.
2. Event: Price pulls back towards the Ribbon.
3. Wave Panel: Wave count flips to "4".
4. Trigger: Price touches Ribbon, prints Green Triangle.
5. Action: Standard Size Entry. Stop Loss at Swing Low. Target New High (Wave 5).
PLAYBOOK C: THE "HIDDEN GEM" (Range Reversal)
1. Context: Ribbon is Grey (Consolidation). Neural Status is CHOP.
2. Wave Panel: Channel Gauge is < 10% (Extreme Bottom).
3. Visuals: Orange Exhaustion Dot + Divergence Diamond (◆).
4. Assistant: Reads "💎 BULL GEM".
5. Action: Half Size Entry. This is a counter-trend trade. Target the middle of the range (50% Channel).
PLAYBOOK D: THE "BULL TRAP" (When to Fold)
1. Context: Wave Count is "5".
2. Wave Panel: Trend Health < 30. W3/W1 Ratio > 1.618 (Extended).
3. Visuals: Orange Line appears on price high.
4. Signal: Green Triangle appears (Grade C).
5. Action: NO TRADE. The system is warning you that even though a signal fired, the structural physics indicate exhaustion.
PART 6: GRADING & SCORING MATRIX
Every signal is graded on a 35-point scale. Know what you are buying.
• Trend Alignment (5 pts): Ribbon & HTF agreement.
• Structure (5 pts): BOS (Break of Structure) & Higher Highs.
• Physics (5 pts): MPI (Volume Flow) & Angular Velocity.
• Institutional Location (10 pts):
• Inside Order Block: +3 pts
• Inside Breaker: +4 pts
• Wave 2/4 Pullback: +3 pts
• Penalty: Wave 5 Extension (-3 pts).
Grade Scale:
• A+ (Score ≥ 70%): "All In" Setup.
• A (Score 55-69%): Strong Setup.
• B (Score 40-54%): Standard Setup.
• C (Score < 40%): Dangerous.
PART 7: RISK DISCLOSURE & LIMITATIONS
1. The Reality of Adaptation (Redrawing):
The Neural Engine is dynamic. As new data arrives, the calculation of "Chaos" changes. This means historical channel lines or wave labels may shift to fit the matured trend. HOWEVER: Entry Signals (Triangles) NEVER repaint once the bar is closed.
2. Simulation vs. Reality:
The Dashboard metrics (Win Rate, Profit Factor) are Simulations run on the historical data visible on your chart. They do not account for spread, slippage, or liquidity. They are a tool to gauge the current market personality, not a promise of future returns.
3. No Financial Advice:
Wave Dynamics is a tool for structural analysis. It helps you see the market, but it cannot trade for you. You are responsible for your own risk management.
CLOSING THOUGHTS
Wave Dynamics is not just an indicator; it is a lens. It allows you to see the market not as a random walk of candles, but as a structured, breathing entity.
Trust the Neural Status. Respect the Breakers. Fear the Exhaustion.
Taking you to school. — Dskyz, Trade with insight. Trade with anticipation.
Adaptive Fusion ADX Vortex
Introduction
The Adaptive Fusion ADX DI Vortex Indicator is a powerful tool designed to help traders identify trend strength and potential trend reversals in the market. This indicator uses a combination of technical analysis (TA) and mathematical concepts to provide accurate and reliable signals.
Features
The Adaptive Fusion ADX DI Vortex Indicator has several features that make it a powerful tool for traders. The Fusion Mode combines the Vortex Indicator and the ADX DI indicator to provide a more accurate picture of the market. The Hurst Exponent Filter helps to filter out choppy markets (inspired by balipour). Additionally, the indicator can be customized with various inputs and settings to suit individual trading strategies.
Signals
The enterLong signal is generated when the algorithm detects that it is a good time to buy a stock or other asset. The signal is based on conditions drawn from the ADX, Vortex, and Fusion values: for example, if the ADX reading is above a certain threshold and the plus line crosses over the minus line of the ADX DI indicator, the algorithm generates an enterLong signal.
Similarly, the enterShort signal is generated when the algorithm detects that it is a good time to sell. Using the same inputs, if the ADX reading is above the threshold and the plus line crosses under the minus line, the algorithm generates an enterShort signal.
The exitLong and exitShort signals are generated when the algorithm detects that it is a good time to close a long or short position, respectively. They rely on the same indicator values: for example, if the ADX value crosses above a certain threshold or the minus line crosses over the plus line, the algorithm generates an exitLong signal.
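As a rough illustration of these conditions, here is a minimal Python sketch, assuming generic ADX, +DI, and -DI series and an illustrative threshold. The function names and values are not the script's actual inputs.
```python
# Minimal sketch of the entry logic described above. Thresholds and
# crossover rules follow the text; names are illustrative only.

def crossover(a, b, i):
    """True when series a crosses above series b at index i."""
    return a[i - 1] <= b[i - 1] and a[i] > b[i]

def signals(adx, plus_di, minus_di, adx_threshold=20.0):
    enter_long, enter_short = [], []
    for i in range(1, len(adx)):
        trending = adx[i] > adx_threshold
        if trending and crossover(plus_di, minus_di, i):
            enter_long.append(i)          # +DI crosses above -DI in a trend
        if trending and crossover(minus_di, plus_di, i):
            enter_short.append(i)         # -DI crosses above +DI in a trend
    return enter_long, enter_short

adx      = [15, 18, 22, 25, 27, 26]
plus_di  = [10, 12, 14, 20, 25, 24]
minus_di = [20, 19, 18, 17, 15, 16]
print(signals(adx, plus_di, minus_di))    # long signal fires at index 3
```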
Usage
Traders can use this indicator in a variety of ways, depending on their trading strategy and style. Short-term traders may use it to identify short-term trends and potential trade opportunities, while long-term traders may use it to identify long-term trends and potential investment opportunities. The indicator can also be used to confirm other technical indicators or trading signals. Personally, I prefer to use it for short-term trades.
Strengths
One of the strengths of the Adaptive Fusion ADX DI Vortex Indicator is its accuracy and reliability. The indicator uses a combination of TA and mathematical concepts to provide accurate and reliable signals, helping traders make informed trading decisions. It is also versatile and can be used in a variety of trading strategies.
Weaknesses
While this indicator has many strengths, it also has some weaknesses. One of the weaknesses is that it can generate false signals in choppy or sideways markets. Additionally, the indicator may lag behind the market, making it less effective in fast-moving markets. That's a reason why I included the Hurst Exponent Filter and special smoothing.
Concepts
The Adaptive ADX DI Vortex Indicator with Fusion Mode and Hurst Filter is based on several key concepts. The Average Directional Index (ADX) is used to measure trend strength, while the Vortex Indicator is used to identify trend reversals. The Hurst Exponent is used to filter out noise and provide a more accurate picture of the market.
In conclusion, the Adaptive Fusion ADX DI Vortex Indicator is a versatile and powerful tool for traders. By combining technical analysis and mathematical concepts, this indicator provides accurate and reliable signals for identifying trend strength and potential trend reversals. While it has some weaknesses, its many strengths and features make it a valuable addition to any trader's toolbox.
---
Credits to:
▪️@cheatcountry – Hann Window Smoothing
▪️@loxx – VHF and T3
▪️@balipour – Hurst Exponent Filter
CEF (Chaos Theory Regime Oscillator)
This script is open to the community.
What is it?
The CEF (Chaos Entropy Fusion) Oscillator is a next-generation "Regime Analysis" tool designed to replace traditional, static momentum indicators like RSI or MACD. Unlike standard oscillators that only look at price changes, CEF analyzes the "character" of the market using concepts from Chaos Theory and Information Theory.
It combines advanced mathematical engines (Hurst Exponent, Entropy, VHF) to determine whether a price movement is a real trend or just random noise. It uses a novel "Adaptive Normalization" technique to solve scaling problems common in advanced indicators, ensuring the oscillator remains sensitive yet stable across all assets (Crypto, Forex, Stocks).
What It Promises:
Intelligent Filtering: Filters out false signals in sideways (volatile) markets using the Hurst Base to measure trend continuity.
Dynamic Adaptation: Automatically adapts to volatility. Thanks to trend memory, it doesn't get stuck at the top during uptrends or at the bottom during downtrends.
No Repainting: All signals are confirmed at the close of the bar. They don't repaint or disappear.
What It Doesn't Promise:
Magic Wand: It's a powerful analytical tool, not a crystal ball. It determines the regime, but risk management is up to the investor.
Lag-Free Holy Grail: It deliberately uses correction and smoothing algorithms (WMA/SMA) to provide stability and filter out noise. Speed is sacrificed for accuracy.
Which Concepts Are Used for Which Purpose?
CEF is built on proven mathematical concepts while creating a unique "Fusion" mechanism. These are not used in their standard forms, but are remixed to create a consensus engine:
Hurst Exponent: Used to measure the "memory" of the time series. Tells the oscillator whether there is a probability of the trend continuing or reversing to the mean.
Vertical Horizontal Filter (VHF): Determines whether the market is in a trend phase or a congestion phase.
Shannon Entropy: Measures the "irregularity" or "unpredictability" of market data to adjust signal sensitivity.
Adaptive Normalization (Key Innovation): Instead of fixed limits, the oscillator dynamically scales itself based on recent historical performance, solving the "flat line" problem seen in other advanced scripts.
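To make the Adaptive Normalization idea concrete, the following is a minimal Python sketch of one plausible implementation: a rolling min/max rescaling of the raw fusion value. The window length and the 0-100 output range are assumptions, not the script's published defaults.
```python
# Minimal sketch of an "adaptive normalization" step as described above:
# instead of fixed oscillator bounds, the raw value is rescaled by its own
# recent min/max so the output stays usable on any asset.

def adaptive_normalize(series, window=100):
    out = []
    for i, x in enumerate(series):
        recent = series[max(0, i - window + 1):i + 1]
        lo, hi = min(recent), max(recent)
        if hi == lo:
            out.append(50.0)                      # flat window -> mid value
        else:
            out.append(100.0 * (x - lo) / (hi - lo))
    return out

raw = [0.002, 0.004, -0.001, 0.010, 0.006]
print([round(v, 1) for v in adaptive_normalize(raw, window=3)])
# [50.0, 100.0, 0.0, 100.0, 63.6]
```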
Original Methodology and Community Contribution
This algorithm is a custom synthesis of public domain mathematical theories. The author's unique contribution lies in the "Adaptive Normalization Logic" and the custom weighting of Chaos components to filter momentum.
Why Public Domain? Standard indicators (RSI, MACD) were developed for the markets of the 1970s. Modern markets require modern mathematics. This script is presented to the community to demonstrate how Regime Analysis can improve trading decisions compared to static tools.
What Problems Does It Solve?
Problem 1: The "Stagnant Market" Trap
CEF Solution: While the RSI gives false signals in a sideways market, CEF's Hurst/VHF filter suppresses the signal, essentially making the histogram "off" (or weak) during noise.
Problem 2: The "Overbought" Fallacy
CEF Solution: In a strong trend (Pump/Dump), traditional oscillators get stuck at 100 or 0. CEF uses "Trend Memory" to understand that an overbought price is not a reversal signal but a sign of trend strength, and keeps the signal green/red instead of reversing it prematurely.
Problem 3: Visual Confusion
CEF Solution: Instead of multiple lines, it presents a single, color-coded histogram featuring only prominent "Smart Circles" at high-probability reversal points.
Automation Ready: Custom Alerts
CEF is designed for both manual trading and automation.
Smart Buy/Sell Circles: Visual signals that only appear when trend filters are aligned with momentum reversals.
Deviation Labels: Automatically detects and labels structural divergences between price and entropy.
Disclaimer: This indicator is for educational purposes only. Past performance does not guarantee future results. Always practice appropriate risk management.
Cycle VTLs – with Scaled Channels
"Cycle VTLs – with Scaled Channels" for TradingView plots Valid Trend Lines (VTLs) based on Hurst's Cyclic Theory, connecting consecutive price peaks (downward VTLs) or troughs (upward VTLs) for specific cycles. It uses up to eight Simple Moving Averages (SMAs) (default lengths: 25, 50, 100, 200, 400, 800, 1600, 1600 bars) with customizable envelope bands to detect pivots and draw VTLs, enhanced by optional parallel channels scaled to envelope widths.
Key Features:
Valid Trend Lines (VTLs):
Upward VTLs: Connect consecutive cycle troughs, sloping upward.
Downward VTLs: Connect consecutive cycle peaks, sloping downward.
Hurst’s Rules:
Connects consecutive cycle peaks/troughs.
Must not cross price between points.
Downward VTLs:
No longer-cycle trough between peaks.
Invalid if slope is incorrect (upward VTL not up, downward VTL not down).
Expired VTLs: Historical VTLs (crossed by price) from up to three prior cycle waves.
SMA Cycles:
Eight customizable SMAs with envelope bands (offset × multiplier) for pivot detection.
Channels:
Optional parallel lines around VTLs, width set by channelFactor × envelope half-width (a sketch follows this feature list).
Pivot Detection:
Fractal-based (pivotPeriod) on envelopes or price (usePriceFallback).
Customization:
Toggle cycles, VTLs, and channels.
Adjust SMA lengths, offsets, colors, line styles, and widths.
Enable centered envelopes, slope filtering, and limit stored lines (maxStoredLines).
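The sketch below illustrates, in Python, how an SMA envelope band and the channelFactor-scaled channel width mentioned in the feature list could be computed. Parameter names mirror the inputs referenced above, but the code is an illustrative approximation, not the indicator's Pine Script source.
```python
# Minimal sketch: SMA envelope bands for pivot detection, plus the channel
# half-width rule (channelFactor x envelope half-width). Default percentages
# are assumptions; the published script's inputs may differ.

def sma(values, length):
    return [sum(values[i - length + 1:i + 1]) / length if i >= length - 1 else None
            for i in range(len(values))]

def envelope(closes, length=50, offset_pct=2.0, multiplier=1.0):
    """Return (basis, upper, lower) bands around an SMA of `length` bars."""
    half_width_pct = offset_pct * multiplier / 100.0
    basis = sma(closes, length)
    upper = [b * (1 + half_width_pct) if b is not None else None for b in basis]
    lower = [b * (1 - half_width_pct) if b is not None else None for b in basis]
    return basis, upper, lower

def channel_half_width(basis_value, offset_pct=2.0, multiplier=1.0, channel_factor=0.5):
    """Width of the parallel lines drawn around a VTL, per the text."""
    envelope_half_width = basis_value * offset_pct * multiplier / 100.0
    return channel_factor * envelope_half_width
```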
Usage in Hurst’s Cyclic Theory
Analysis:
VTLs identify cycle trends; upward VTLs suggest bullish momentum, downward VTLs bearish.
Price crossing below an upward VTL confirms a peak in the next longer cycle; crossing above a downward VTL confirms a trough.
Trading:
Buy: Price bounces off upward VTL or breaks above downward VTL.
Sell: Price rejects downward VTL or breaks below upward VTL.
Use channels for support/resistance, breakouts, or stop-loss/take-profit levels.
Workflow:
Add indicator on TradingView.
Enable desired cycles (e.g., 50-bar, 1600-bar), adjust pivotPeriod, channelFactor, and showOnlyCorrectSlope.
Monitor VTL crossings and channels for trade signals.
Notes
Optimized for performance with line limits.
Ideal for cycle-based trend analysis across markets (stocks, forex, crypto).
Debug labels show pivot counts and VTL status.
This indicator supports Hurst’s Cyclic Theory for trend identification and trading decisions with flexible, cycle-based VTLs and channels.
Use the global scale variable to fit the indicator to your chart. Best results come from factors of 2, doubling each time: try 2, 4, 8, 16 ... 128, 256, etc., until price action fits about 95% within the smallest cycle.
Small Business Economic Conditions - Statistical Analysis Model
The Small Business Economic Conditions Statistical Analysis Model (SBO-SAM) represents an econometric approach to measuring and analyzing the economic health of small business enterprises through multi-dimensional factor analysis and statistical methodologies. This indicator synthesizes eight fundamental economic components into a composite index that provides real-time assessment of small business operating conditions with statistical rigor. The model employs Z-score standardization, variance-weighted aggregation, higher-order moment analysis, and regime-switching detection to deliver comprehensive insights into small business economic conditions with statistical confidence intervals and multi-language accessibility.
1. Introduction and Theoretical Foundation
The development of quantitative models for assessing small business economic conditions has gained significant importance in contemporary financial analysis, particularly given the critical role small enterprises play in economic development and employment generation. Small businesses, typically defined as enterprises with fewer than 500 employees according to the U.S. Small Business Administration, constitute approximately 99.9% of all businesses in the United States and employ nearly half of the private workforce (U.S. Small Business Administration, 2024).
The theoretical framework underlying the SBO-SAM model draws extensively from established academic research in small business economics and quantitative finance. The foundational understanding of key drivers affecting small business performance builds upon the seminal work of Dunkelberg and Wade (2023) in their analysis of small business economic trends through the National Federation of Independent Business (NFIB) Small Business Economic Trends survey. Their research established the critical importance of optimism, hiring plans, capital expenditure intentions, and credit availability as primary determinants of small business performance.
The model incorporates insights from Federal Reserve Board research, particularly the Senior Loan Officer Opinion Survey (Federal Reserve Board, 2024), which demonstrates the critical importance of credit market conditions in small business operations. This research consistently shows that small businesses face disproportionate challenges during periods of credit tightening, as they typically lack access to capital markets and rely heavily on bank financing.
The statistical methodology employed in this model follows the econometric principles established by Hamilton (1989) in his work on regime-switching models and time series analysis. Hamilton's framework provides the theoretical foundation for identifying different economic regimes and understanding how economic relationships may vary across different market conditions. The variance-weighted aggregation technique draws from modern portfolio theory as developed by Markowitz (1952) and later refined by Sharpe (1964), applying these concepts to economic indicator construction rather than traditional asset allocation.
Additional theoretical support comes from the work of Engle and Granger (1987) on cointegration analysis, which provides the statistical framework for combining multiple time series while maintaining long-term equilibrium relationships. The model also incorporates insights from behavioral economics research by Kahneman and Tversky (1979) on prospect theory, recognizing that small business decision-making may exhibit systematic biases that affect economic outcomes.
2. Model Architecture and Component Structure
The SBO-SAM model employs eight orthogonalized economic factors that collectively capture the multifaceted nature of small business operating conditions. Each component is normalized using Z-score standardization with a rolling 252-day window, representing approximately one business year of trading data. This approach ensures statistical consistency across different market regimes and economic cycles, following the methodology established by Tsay (2010) in his treatment of financial time series analysis.
2.1 Small Cap Relative Performance Component
The first component measures the performance of the Russell 2000 index relative to the S&P 500, capturing the market-based assessment of small business equity valuations. This component reflects investor sentiment toward smaller enterprises and provides a forward-looking perspective on small business prospects. The theoretical justification for this component stems from the efficient market hypothesis as formulated by Fama (1970), which suggests that stock prices incorporate all available information about future prospects.
The calculation employs a 20-day rate of change with exponential smoothing to reduce noise while preserving signal integrity. The mathematical formulation is:
Small_Cap_Performance = (Russell_2000_t / S&P_500_t) / (Russell_2000_{t-20} / S&P_500_{t-20}) - 1
This relative performance measure eliminates market-wide effects and isolates the specific performance differential between small and large capitalization stocks, providing a pure measure of small business market sentiment.
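A minimal Python sketch of this component follows, assuming two aligned price series and an illustrative smoothing constant; the text specifies only that a 20-day rate of change with exponential smoothing is applied. The demo uses a 2-bar lag purely to keep the example short.
```python
# Minimal sketch of the Small Cap Relative Performance component: the rate
# of change of the Russell 2000 / S&P 500 ratio, followed by exponential
# smoothing. The alpha value is an assumption.

def relative_roc(small_cap, large_cap, lag=20):
    ratio = [s / l for s, l in zip(small_cap, large_cap)]
    return [ratio[i] / ratio[i - lag] - 1 if i >= lag else None
            for i in range(len(ratio))]

def ema(values, alpha=0.2):
    out, prev = [], None
    for v in values:
        if v is None:
            out.append(None)
            continue
        prev = v if prev is None else alpha * v + (1 - alpha) * prev
        out.append(prev)
    return out

small = [100, 102, 101, 105, 107, 110]   # hypothetical small-cap index
large = [400, 402, 401, 404, 406, 407]   # hypothetical large-cap index
print(ema(relative_roc(small, large, lag=2)))
```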
2.2 Credit Market Conditions Component
Credit Market Conditions constitute the second component, incorporating commercial lending volumes and credit spread dynamics. This factor recognizes that small businesses are particularly sensitive to credit availability and borrowing costs, as established in numerous Federal Reserve studies (Bernanke and Gertler, 1995). Small businesses typically face higher borrowing costs and more stringent lending standards compared to larger enterprises, making credit conditions a critical determinant of their operating environment.
The model calculates credit spreads using high-yield bond ETFs relative to Treasury securities, providing a market-based measure of credit risk premiums that directly affect small business borrowing costs. The component also incorporates commercial and industrial loan growth data from the Federal Reserve's H.8 statistical release, which provides direct evidence of lending activity to businesses.
The mathematical specification combines these elements as:
Credit_Conditions = α₁ × (HYG_t / TLT_t) + α₂ × C&I_Loan_Growth_t
where HYG represents high-yield corporate bond ETF prices, TLT represents long-term Treasury ETF prices, and C&I_Loan_Growth represents the rate of change in commercial and industrial loans outstanding.
2.3 Labor Market Dynamics Component
The Labor Market Dynamics component captures employment cost pressures and labor availability metrics through the relationship between job openings and unemployment claims. This factor acknowledges that labor market tightness significantly impacts small business operations, as these enterprises typically have less flexibility in wage negotiations and face greater challenges in attracting and retaining talent during periods of low unemployment.
The theoretical foundation for this component draws from search and matching theory as developed by Mortensen and Pissarides (1994), which explains how labor market frictions affect employment dynamics. Small businesses often face higher search costs and longer hiring processes, making them particularly sensitive to labor market conditions.
The component is calculated as:
Labor_Tightness = Job_Openings_t / (Unemployment_Claims_t × 52)
This ratio provides a measure of labor market tightness, with higher values indicating greater difficulty in finding workers and potential wage pressures.
2.4 Consumer Demand Strength Component
Consumer Demand Strength represents the fourth component, combining consumer sentiment data with retail sales growth rates. Small businesses are disproportionately affected by consumer spending patterns, making this component crucial for assessing their operating environment. The theoretical justification comes from the permanent income hypothesis developed by Friedman (1957), which explains how consumer spending responds to both current conditions and future expectations.
The model weights consumer confidence and actual spending data to provide both forward-looking sentiment and contemporaneous demand indicators. The specification is:
Demand_Strength = β₁ × Consumer_Sentiment_t + β₂ × Retail_Sales_Growth_t
where β₁ and β₂ are determined through principal component analysis to maximize the explanatory power of the combined measure.
2.5 Input Cost Pressures Component
Input Cost Pressures form the fifth component, utilizing producer price index data to capture inflationary pressures on small business operations. This component is inversely weighted, recognizing that rising input costs negatively impact small business profitability and operating conditions. Small businesses typically have limited pricing power and face challenges in passing through cost increases to customers, making them particularly vulnerable to input cost inflation.
The theoretical foundation draws from cost-push inflation theory as described by Gordon (1988), which explains how supply-side price pressures affect business operations. The model employs a 90-day rate of change to capture medium-term cost trends while filtering out short-term volatility:
Cost_Pressure = -1 × (PPI_t / PPI_{t-90} - 1)
The negative weighting reflects the inverse relationship between input costs and business conditions.
2.6 Monetary Policy Impact Component
Monetary Policy Impact represents the sixth component, incorporating federal funds rates and yield curve dynamics. Small businesses are particularly sensitive to interest rate changes due to their higher reliance on variable-rate financing and limited access to capital markets. The theoretical foundation comes from monetary transmission mechanism theory as developed by Bernanke and Blinder (1992), which explains how monetary policy affects different segments of the economy.
The model calculates the absolute deviation of federal funds rates from a neutral 2% level, recognizing that both extremely low and high rates can create operational challenges for small enterprises. The yield curve component captures the shape of the term structure, which affects both borrowing costs and economic expectations:
Monetary_Impact = γ₁ × |Fed_Funds_Rate_t - 2.0| + γ₂ × (10Y_Yield_t - 2Y_Yield_t)
2.7 Currency Valuation Effects Component
Currency Valuation Effects constitute the seventh component, measuring the impact of US Dollar strength on small business competitiveness. A stronger dollar can benefit businesses with significant import components while disadvantaging exporters. The model employs Dollar Index volatility as a proxy for currency-related uncertainty that affects small business planning and operations.
The theoretical foundation draws from international trade theory and the work of Krugman (1987) on exchange rate effects on different business segments. Small businesses often lack hedging capabilities, making them more vulnerable to currency fluctuations:
Currency_Impact = -1 × DXY_Volatility_t
2.8 Regional Banking Health Component
The eighth and final component, Regional Banking Health, assesses the relative performance of regional banks compared to large financial institutions. Regional banks traditionally serve as primary lenders to small businesses, making their health a critical factor in small business credit availability and overall operating conditions.
This component draws from the literature on relationship banking as developed by Boot (2000), which demonstrates the importance of bank-borrower relationships, particularly for small enterprises. The calculation compares regional bank performance to large financial institutions:
Banking_Health = (Regional_Banks_Index_t / Large_Banks_Index_t) - 1
3. Statistical Methodology and Advanced Analytics
The model employs statistical techniques to ensure robustness and reliability. Z-score normalization is applied to each component using rolling 252-day windows, providing standardized measures that remain consistent across different time periods and market conditions. This approach follows the methodology established by Engle and Granger (1987) in their cointegration analysis framework.
3.1 Variance-Weighted Aggregation
The composite index calculation utilizes variance-weighted aggregation, where component weights are determined by the inverse of their historical variance. This approach, derived from modern portfolio theory, ensures that more stable components receive higher weights while reducing the impact of highly volatile factors. The mathematical formulation follows the principle that optimal weights are inversely proportional to variance, maximizing the signal-to-noise ratio of the composite indicator.
The weight for component i is calculated as:
w_i = (1/σᵢ²) / Σⱼ(1/σⱼ²)
where σᵢ² represents the variance of component i over the lookback period.
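The following short Python sketch demonstrates the inverse-variance weighting with placeholder component histories; it mirrors the formula above rather than the model's exact implementation.
```python
# Minimal sketch of variance-weighted aggregation: each component's weight
# is the inverse of its historical variance, normalized to sum to 1.

def inverse_variance_weights(component_histories):
    """component_histories: list of equal-length lists of component z-scores."""
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    inv = [1.0 / max(variance(h), 1e-12) for h in component_histories]
    total = sum(inv)
    return [w / total for w in inv]

def composite(current_z_scores, weights):
    return sum(z * w for z, w in zip(current_z_scores, weights))

histories = [[0.1, -0.2, 0.3, 0.0], [1.5, -1.8, 2.0, -1.2]]  # stable vs. noisy
w = inverse_variance_weights(histories)
print([round(x, 3) for x in w])   # the stable component receives the larger weight
```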
3.2 Higher-Order Moment Analysis
Higher-order moment analysis extends beyond traditional mean and variance calculations to include skewness and kurtosis measurements. Skewness provides insight into the asymmetry of the sentiment distribution, while kurtosis measures the tail behavior and potential for extreme events. These metrics offer valuable information about the underlying distribution characteristics and potential regime changes.
Skewness is calculated as:
Skewness = E[(X - μ)³] / σ³
Kurtosis is calculated as:
Kurtosis = E[(X - μ)⁴] / σ⁴ - 3
where μ represents the mean and σ represents the standard deviation of the distribution.
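A compact Python sketch of these population-moment calculations is shown below; the sample window values are illustrative.
```python
# Minimal sketch of the higher-order moment calculations given above,
# applied to a window of sentiment readings (population moments).

import math

def skewness_kurtosis(xs):
    n = len(xs)
    mu = sum(xs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    if sigma == 0:
        return 0.0, 0.0
    skew = sum((x - mu) ** 3 for x in xs) / n / sigma ** 3
    kurt = sum((x - mu) ** 4 for x in xs) / n / sigma ** 4 - 3.0   # excess kurtosis
    return skew, kurt

print(skewness_kurtosis([48, 50, 51, 49, 55, 62, 47, 50]))
```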
3.3 Regime-Switching Detection
The model incorporates regime-switching detection capabilities based on the Hamilton (1989) framework. This allows for identification of different economic regimes characterized by distinct statistical properties. The regime classification employs percentile-based thresholds:
- Regime 3 (Very High): Percentile rank > 80
- Regime 2 (High): Percentile rank 60-80
- Regime 1 (Moderate High): Percentile rank 50-60
- Regime 0 (Neutral): Percentile rank 40-50
- Regime -1 (Moderate Low): Percentile rank 30-40
- Regime -2 (Low): Percentile rank 20-30
- Regime -3 (Very Low): Percentile rank < 20
3.4 Information Theory Applications
The model incorporates information theory concepts, specifically Shannon entropy measurement, to assess the information content of the sentiment distribution. Shannon entropy, as developed by Shannon (1948), provides a measure of the uncertainty or information content in a probability distribution:
H(X) = -Σᵢ p(xᵢ) log₂ p(xᵢ)
Higher entropy values indicate greater unpredictability and information content in the sentiment series.
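For illustration, a minimal Python sketch of binned Shannon entropy on normalized returns follows; the bin count and input values are assumptions rather than the model's defaults.
```python
# Minimal sketch of the Shannon entropy measurement described above:
# ATR-normalized returns are bucketed into a fixed number of bins and the
# entropy of the resulting distribution is computed in bits.

import math

def shannon_entropy(normalized_returns, bins=10):
    lo, hi = min(normalized_returns), max(normalized_returns)
    if hi == lo:
        return 0.0                                   # no dispersion, no uncertainty
    counts = [0] * bins
    for r in normalized_returns:
        idx = min(int((r - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(normalized_returns)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

print(shannon_entropy([0.1, -0.2, 0.15, 0.3, -0.05, 0.12, -0.25, 0.08]))
```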
3.5 Long-Term Memory Analysis
The Hurst exponent calculation provides insight into the long-term memory characteristics of the sentiment series. Originally developed by Hurst (1951) for analyzing Nile River flow patterns, this measure has found extensive application in financial time series analysis. The Hurst exponent H is calculated using the rescaled range statistic:
H = log(R/S) / log(T)
where R/S represents the rescaled range and T represents the time period. Values of H > 0.5 indicate long-term positive autocorrelation (persistence), while H < 0.5 indicates mean-reverting behavior.
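The sketch below estimates H in Python by computing R/S over several window sizes and fitting the slope of log(R/S) against log(window); the window choices are illustrative and no Anis-Lloyd correction is applied.
```python
# Minimal sketch of a rescaled-range (R/S) Hurst estimate consistent with
# H = log(R/S) / log(T).

import math
import random

def rescaled_range(xs):
    n = len(xs)
    mu = sum(xs) / n
    dev = [x - mu for x in xs]
    cum, s = [], 0.0
    for d in dev:
        s += d
        cum.append(s)
    r = max(cum) - min(cum)
    sd = math.sqrt(sum(d * d for d in dev) / n)
    return r / sd if sd > 0 else 0.0

def hurst(series, windows=(8, 16, 32, 64)):
    pts = []
    for w in windows:
        chunks = [series[i:i + w] for i in range(0, len(series) - w + 1, w)]
        rs = [rescaled_range(c) for c in chunks if len(c) == w]
        rs = [v for v in rs if v > 0]
        if rs:
            pts.append((math.log(w), math.log(sum(rs) / len(rs))))
    # ordinary least-squares slope of log(R/S) on log(window)
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

random.seed(1)
returns = [random.gauss(0, 1) for _ in range(512)]
print(round(hurst(returns), 2))   # near 0.5 for an i.i.d. series (small-sample bias aside)
```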
3.6 Structural Break Detection
The model employs Chow test approximation for structural break detection, based on the methodology developed by Chow (1960). This technique identifies potential structural changes in the underlying relationships by comparing the stability of regression parameters across different time periods:
Chow_Statistic = (RSS_restricted - RSS_unrestricted) / RSS_unrestricted × (n-2k)/k
where RSS represents residual sum of squares, n represents sample size, and k represents the number of parameters.
4. Implementation Parameters and Configuration
4.1 Language Selection Parameters
The model provides comprehensive multi-language support across five languages: English, German (Deutsch), Spanish (Español), French (Français), and Japanese (日本語). This feature enhances accessibility for international users and ensures cultural appropriateness in terminology usage. The language selection affects all internal displays, statistical classifications, and alert messages while maintaining consistency in underlying calculations.
4.2 Model Configuration Parameters
Calculation Method: Users can select from four aggregation methodologies:
- Equal-Weighted: All components receive identical weights
- Variance-Weighted: Components weighted inversely to their historical variance
- Principal Component: Weights determined through principal component analysis
- Dynamic: Adaptive weighting based on recent performance
Sector Specification: The model allows for sector-specific calibration:
- General: Broad-based small business assessment
- Retail: Emphasis on consumer demand and seasonal factors
- Manufacturing: Enhanced weighting of input costs and currency effects
- Services: Focus on labor market dynamics and consumer demand
- Construction: Emphasis on credit conditions and monetary policy
Lookback Period: Statistical analysis window ranging from 126 to 504 trading days, with 252 days (one business year) as the optimal default based on academic research.
Smoothing Period: Exponential moving average period from 1 to 21 days, with 5 days providing optimal noise reduction while preserving signal integrity.
4.3 Statistical Threshold Parameters
Upper Statistical Boundary: Configurable threshold between 60-80 (default 70) representing the upper significance level for regime classification.
Lower Statistical Boundary: Configurable threshold between 20-40 (default 30) representing the lower significance level for regime classification.
Statistical Significance Level (α): Alpha level for statistical tests, configurable between 0.01-0.10 with 0.05 as the standard academic default.
4.4 Display and Visualization Parameters
Color Theme Selection: Eight professional color schemes optimized for different user preferences and accessibility requirements:
- Gold: Traditional financial industry colors
- EdgeTools: Professional blue-gray scheme
- Behavioral: Psychology-based color mapping
- Quant: Value-based quantitative color scheme
- Ocean: Blue-green maritime theme
- Fire: Warm red-orange theme
- Matrix: Green-black technology theme
- Arctic: Cool blue-white theme
Dark Mode Optimization: Automatic color adjustment for dark chart backgrounds, ensuring optimal readability across different viewing conditions.
Line Width Configuration: Main index line thickness adjustable from 1-5 pixels for optimal visibility.
Background Intensity: Transparency control for statistical regime backgrounds, adjustable from 90-99% for subtle visual enhancement without distraction.
4.5 Alert System Configuration
Alert Frequency Options: Three frequency settings to match different trading styles:
- Once Per Bar: Single alert per bar formation
- Once Per Bar Close: Alert only on confirmed bar close
- All: Continuous alerts for real-time monitoring
Statistical Extreme Alerts: Notifications when the index reaches 99% confidence levels (Z-score > 2.576 or < -2.576).
Regime Transition Alerts: Notifications when statistical boundaries are crossed, indicating potential regime changes.
5. Practical Application and Interpretation Guidelines
5.1 Index Interpretation Framework
The SBO-SAM index operates on a 0-100 scale with statistical normalization ensuring consistent interpretation across different time periods and market conditions. Values above 70 indicate statistically elevated small business conditions, suggesting favorable operating environment with potential for expansion and growth. Values below 30 indicate statistically reduced conditions, suggesting challenging operating environment with potential constraints on business activity.
The median reference line at 50 represents the long-term equilibrium level, with deviations providing insight into cyclical conditions relative to historical norms. The statistical confidence bands at 95% levels (approximately ±2 standard deviations) help identify when conditions reach statistically significant extremes.
5.2 Regime Classification System
The model employs a seven-level regime classification system based on percentile rankings:
Very High Regime (P80+): Exceptional small business conditions, typically associated with strong economic growth, easy credit availability, and favorable regulatory environment. Historical analysis suggests these periods often precede economic peaks and may warrant caution regarding sustainability.
High Regime (P60-80): Above-average conditions supporting business expansion and investment. These periods typically feature moderate growth, stable credit conditions, and positive consumer sentiment.
Moderate High Regime (P50-60): Slightly above-normal conditions with mixed signals. Careful monitoring of individual components helps identify emerging trends.
Neutral Regime (P40-50): Balanced conditions near long-term equilibrium. These periods often represent transition phases between different economic cycles.
Moderate Low Regime (P30-40): Slightly below-normal conditions with emerging headwinds. Early warning signals may appear in credit conditions or consumer demand.
Low Regime (P20-30): Below-average conditions suggesting challenging operating environment. Businesses may face constraints on growth and expansion.
Very Low Regime (P0-20): Severely constrained conditions, typically associated with economic recessions or financial crises. These periods often present opportunities for contrarian positioning.
5.3 Component Analysis and Diagnostics
Individual component analysis provides valuable diagnostic information about the underlying drivers of overall conditions. Divergences between components can signal emerging trends or structural changes in the economy.
Credit-Labor Divergence: When credit conditions improve while labor markets tighten, this may indicate early-stage economic acceleration with potential wage pressures.
Demand-Cost Divergence: Strong consumer demand coupled with rising input costs suggests inflationary pressures that may constrain small business margins.
Market-Fundamental Divergence: Disconnection between small-cap equity performance and fundamental conditions may indicate market inefficiencies or changing investor sentiment.
5.4 Temporal Analysis and Trend Identification
The model provides multiple temporal perspectives through momentum analysis, rate of change calculations, and trend decomposition. The 20-day momentum indicator helps identify short-term directional changes, while the Hodrick-Prescott filter approximation separates cyclical components from long-term trends.
Acceleration analysis through second-order momentum calculations provides early warning signals for potential trend reversals. Positive acceleration during declining conditions may indicate approaching inflection points, while negative acceleration during improving conditions may suggest momentum loss.
5.5 Statistical Confidence and Uncertainty Quantification
The model provides comprehensive uncertainty quantification through confidence intervals, volatility measures, and regime stability analysis. The 95% confidence bands help users understand the statistical significance of current readings and identify when conditions reach historically extreme levels.
Volatility analysis provides insight into the stability of current conditions, with higher volatility indicating greater uncertainty and potential for rapid changes. The regime stability measure, calculated as the inverse of volatility, helps assess the sustainability of current conditions.
6. Risk Management and Limitations
6.1 Model Limitations and Assumptions
The SBO-SAM model operates under several important assumptions that users must understand for proper interpretation. The model assumes that historical relationships between economic variables remain stable over time, though the regime-switching framework helps accommodate some structural changes. The 252-day lookback period provides reasonable statistical power while maintaining sensitivity to changing conditions, but may not capture longer-term structural shifts.
The model's reliance on publicly available economic data introduces inherent lags in some components, particularly those based on government statistics. Users should consider these timing differences when interpreting real-time conditions. Additionally, the model's focus on quantitative factors may not fully capture qualitative factors such as regulatory changes, geopolitical events, or technological disruptions that could significantly impact small business conditions.
The model's timeframe restrictions ensure statistical validity by preventing application to intraday periods where the underlying economic relationships may be distorted by market microstructure effects, trading noise, and temporal misalignment with the fundamental data sources. Users must utilize daily or longer timeframes to ensure the model's statistical foundations remain valid and interpretable.
6.2 Data Quality and Reliability Considerations
The model's accuracy depends heavily on the quality and availability of underlying economic data. Market-based components such as equity indices and bond prices provide real-time information but may be subject to short-term volatility unrelated to fundamental conditions. Economic statistics provide more stable fundamental information but may be subject to revisions and reporting delays.
Users should be aware that extreme market conditions may temporarily distort some components, particularly those based on financial market data. The model's statistical normalization helps mitigate these effects, but users should exercise additional caution during periods of market stress or unusual volatility.
6.3 Interpretation Caveats and Best Practices
The SBO-SAM model provides statistical analysis and should not be interpreted as investment advice or predictive forecasting. The model's output represents an assessment of current conditions based on historical relationships and may not accurately predict future outcomes. Users should combine the model's insights with other analytical tools and fundamental analysis for comprehensive decision-making.
The model's regime classifications are based on historical percentile rankings and may not fully capture the unique characteristics of current economic conditions. Users should consider the broader economic context and potential structural changes when interpreting regime classifications.
7. Academic References and Bibliography
Bernanke, B. S., & Blinder, A. S. (1992). The Federal Funds Rate and the Channels of Monetary Transmission. American Economic Review, 82(4), 901-921.
Bernanke, B. S., & Gertler, M. (1995). Inside the Black Box: The Credit Channel of Monetary Policy Transmission. Journal of Economic Perspectives, 9(4), 27-48.
Boot, A. W. A. (2000). Relationship Banking: What Do We Know? Journal of Financial Intermediation, 9(1), 7-25.
Chow, G. C. (1960). Tests of Equality Between Sets of Coefficients in Two Linear Regressions. Econometrica, 28(3), 591-605.
Dunkelberg, W. C., & Wade, H. (2023). NFIB Small Business Economic Trends. National Federation of Independent Business Research Foundation, Washington, D.C.
Engle, R. F., & Granger, C. W. J. (1987). Co-integration and Error Correction: Representation, Estimation, and Testing. Econometrica, 55(2), 251-276.
Fama, E. F. (1970). Efficient Capital Markets: A Review of Theory and Empirical Work. Journal of Finance, 25(2), 383-417.
Federal Reserve Board. (2024). Senior Loan Officer Opinion Survey on Bank Lending Practices. Board of Governors of the Federal Reserve System, Washington, D.C.
Friedman, M. (1957). A Theory of the Consumption Function. Princeton University Press, Princeton, NJ.
Gordon, R. J. (1988). The Role of Wages in the Inflation Process. American Economic Review, 78(2), 276-283.
Hamilton, J. D. (1989). A New Approach to the Economic Analysis of Nonstationary Time Series and the Business Cycle. Econometrica, 57(2), 357-384.
Hurst, H. E. (1951). Long-term Storage Capacity of Reservoirs. Transactions of the American Society of Civil Engineers, 116(1), 770-799.
Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263-291.
Krugman, P. (1987). Pricing to Market When the Exchange Rate Changes. In S. W. Arndt & J. D. Richardson (Eds.), Real-Financial Linkages among Open Economies (pp. 49-70). MIT Press, Cambridge, MA.
Markowitz, H. (1952). Portfolio Selection. Journal of Finance, 7(1), 77-91.
Mortensen, D. T., & Pissarides, C. A. (1994). Job Creation and Job Destruction in the Theory of Unemployment. Review of Economic Studies, 61(3), 397-415.
Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27(3), 379-423.
Sharpe, W. F. (1964). Capital Asset Prices: A Theory of Market Equilibrium under Conditions of Risk. Journal of Finance, 19(3), 425-442.
Tsay, R. S. (2010). Analysis of Financial Time Series (3rd ed.). John Wiley & Sons, Hoboken, NJ.
U.S. Small Business Administration. (2024). Small Business Profile. Office of Advocacy, Washington, D.C.
8. Technical Implementation Notes
The SBO-SAM model is implemented in Pine Script version 6 for the TradingView platform, ensuring compatibility with modern charting and analysis tools. The implementation follows best practices for financial indicator development, including proper error handling, data validation, and performance optimization.
The model includes comprehensive timeframe validation to ensure statistical accuracy and reliability. The indicator operates exclusively on daily (1D) timeframes or higher, including weekly (1W), monthly (1M), and longer periods. This restriction ensures that the statistical analysis maintains appropriate temporal resolution for the underlying economic data sources, which are primarily reported on daily or longer intervals.
When users attempt to apply the model to intraday timeframes (such as 1-minute, 5-minute, 15-minute, 30-minute, 1-hour, 2-hour, 4-hour, 6-hour, 8-hour, or 12-hour charts), the system displays a comprehensive error message in the user's selected language and prevents execution. This safeguard protects users from potentially misleading results that could occur when applying daily-based economic analysis to shorter timeframes where the underlying data relationships may not hold.
The model's statistical calculations are performed using vectorized operations where possible to ensure computational efficiency. The multi-language support system employs Unicode character encoding to ensure proper display of international characters across different platforms and devices.
The alert system utilizes TradingView's native alert functionality, providing users with flexible notification options including email, SMS, and webhook integrations. The alert messages include comprehensive statistical information to support informed decision-making.
The model's visualization system employs professional color schemes designed for optimal readability across different chart backgrounds and display devices. The system includes dynamic color transitions based on momentum and volatility, professional glow effects for enhanced line visibility, and transparency controls that allow users to customize the visual intensity to match their preferences and analytical requirements. The clean confidence band implementation provides clear statistical boundaries without visual distractions, maintaining focus on the analytical content.
Information-Geometric Market Dynamics
The Information Field: A Geometric Approach to Market Dynamics
By: DskyzInvestments
Foreword: Beyond the Shadows on the Wall
If you have traded for any length of time, you know " the feeling ." It is the frustration of a perfect setup that fails, the whipsaw that stops you out just before the real move, the nagging sense that the chart is telling you only half the story. For decades, technical analysis has relied on interpreting the shadows—the patterns left behind by price. We draw lines on these shadows, apply indicators to them, and hope they reveal the future.
But what if we could stop looking at the shadows and, instead, analyze the object casting them?
This script introduces a new paradigm for market analysis: Information-Geometric Market Dynamics (IGMD) . The core premise of IGMD is that the price chart is merely a one-dimensional projection of a much richer, higher-dimensional reality—an " information field " generated by the collective actions and beliefs of all market participants.
This is not just another collection of indicators. It is a unified framework for measuring the geometry of the market's information field—its memory, its complexity, its uncertainty, its causal flows—and making high-probability decisions based on that deeper reality. By fusing advanced mathematical and informational concepts, IGMD provides a multi-faceted lens through which to view market behavior, moving beyond simple price action into the very structure of market information itself.
Prepare to move beyond the flatland of the price chart. Welcome to the information field.
The IGMD Framework: A Multi-Kernel Approach
What is a Kernel? The Heart of Transformation
In mathematics and data science, a kernel is a powerful and elegant concept. At its core, a kernel is a function that takes complex, often inscrutable data and transforms it into a more useful format. Think of it as a specialized lens or a mathematical "probe." You cannot directly measure abstract concepts like "market memory" or "trend quality" by looking at a price number. First, you must process the raw price data through a specific mathematical machine—a kernel—that is designed to output a measurement of that specific property. Kernels operate by performing a sort of "similarity test," projecting data into a higher-dimensional space where hidden patterns and relationships become visible and measurable.
Why do creators use them? We use kernels to extract features —meaningful pieces of information—that are not explicitly present in the raw data. They are the essential tools for moving beyond surface-level analysis into the very DNA of market behavior. A simple moving average can tell you the average price; a suite of well-chosen kernels can tell you about the character of the price action itself.
The Alchemist's Challenge: The Art of Fusion
Using a single kernel is a challenge. Using five distinct, computationally demanding mathematical engines in unison is an immense undertaking. The true difficulty—and artistry—lies not just in using one kernel, but in fusing the outputs of many . Each kernel provides a different perspective, and they can often give conflicting signals. One kernel might detect a strong trend, while another signals rising chaos and uncertainty. The IGMD script's greatest strength is its ability to act as this alchemist, synthesizing these disparate viewpoints through a weighted fusion process to produce a single, coherent picture of the market's state. It required countless hours of testing and calibration to balance the influence of these five distinct analytical engines so they work in harmony rather than cacophony.
The Five Kernels of Market Dynamics
The IGMD script is built upon a foundation of five distinct kernels, each chosen to probe a unique and critical dimension of the market's information field.
1. The Wavelet Kernel (The "Microscope")
What it is: The Wavelet Kernel is a signal processing function designed to decompose a signal into different frequency scales. Unlike a Fourier Transform that analyzes the entire signal at once, the wavelet slides across the data, providing information about both what frequencies are present and when they occurred.
The Kernels I Use:
Haar Kernel: The simplest wavelet, a square-wave shape built from a single pair of averaging and differencing coefficients. It excels at detecting sharp, sudden changes.
Daubechies 2 (db2) Kernel: A more complex and smoother wavelet shape that provides a better balance for analyzing the nuanced ebb and flow of typical market trends.
How it Works in the Script: This kernel is applied iteratively. It first separates the finest "noise" (detail d1) from the first level of trend (approximation a1). It then takes the trend a1 and repeats the process, extracting the next level of cycle (d2) and trend (a2), and so on. This hierarchical decomposition allows us to separate short-term noise from the long-term market "thesis."
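As an illustration of this hierarchical decomposition, here is a minimal Python sketch using the textbook Haar averaging/differencing pair; the script's exact filter taps, normalization, and boundary handling are not reproduced.
```python
# Minimal sketch of an iterative (multi-level) Haar decomposition: each pass
# splits the current trend into a smoother approximation and a detail
# (noise/cycle) component, following the a1/d1, a2/d2 ... scheme.

def haar_step(signal):
    """One level: pairwise averages (approximation) and differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_decompose(signal, levels=3):
    details, approx = [], list(signal)
    for _ in range(levels):
        if len(approx) < 2:
            break
        approx, d = haar_step(approx)
        details.append(d)
    return details, approx

prices = [100, 101, 103, 102, 105, 107, 106, 108]
details, trend = haar_decompose(prices, levels=2)
print(details)   # [d1 (finest noise), d2 (next cycle)]
print(trend)     # a2: the remaining low-frequency trend
```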
2. The Hurst Exponent Kernel (The "Memory Gauge")
What it is: The Hurst Exponent is derived from a statistical analysis kernel that measures the "long-term memory" or persistence of a time series. It is the definitive measure of whether a series is trending (H > 0.5), mean-reverting (H < 0.5), or random (H = 0.5).
How it Works in the Script: The script employs a method based on Rescaled Range (R/S) analysis. It calculates the average range of price movements over increasingly larger time lags (m1, m2, m4, m8...). The slope of the line plotting log(range) vs. log(lag) is the Hurst Exponent. Applying this complex statistical analysis not to the raw price, but to the clean, wavelet-decomposed trend lines, is a key innovation of IGMD.
3. The Fractal Dimension Kernel (The "Complexity Compass")
What it is: This kernel measures the geometric complexity or "jaggedness" of a price path, based on the principles of fractal geometry. A straight line has a dimension of 1; a chaotic, space-filling line approaches a dimension of 2.
How it Works in the Script: We use a version based on Ehlers' Fractal Dimension Index (FDI). It calculates the rate of price change over a full lookback period (N3) and compares it to the sum of the rates of change over the two halves of that period (N1 + N2). The formula d = (log(N1 + N2) - log(N3)) / log(2) quantifies how much "longer" and more convoluted the price path was than a simple straight line. This kernel is our primary filter for tradeable (low complexity) vs. untradeable (high complexity) conditions.
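A short Python sketch of this calculation follows, using range-per-bar as the rate-of-change measure; the window length is an assumption and the script's exact normalization may differ.
```python
# Minimal sketch of an Ehlers-style Fractal Dimension Index: compare the
# normalized path length over the full window (N3) with the sum over its
# two halves (N1 + N2), per d = (log(N1 + N2) - log(N3)) / log(2).

import math

def fdi(prices, window=32):
    seg = prices[-window:]
    half = window // 2
    def n(chunk, length):
        return (max(chunk) - min(chunk)) / length
    n1, n2, n3 = n(seg[:half], half), n(seg[half:], half), n(seg, window)
    if n1 + n2 <= 0 or n3 <= 0:
        return 1.5                                    # fall back to the neutral value
    return (math.log(n1 + n2) - math.log(n3)) / math.log(2)

trending = [float(i) for i in range(32)]              # straight line
choppy   = [(-1) ** i * 1.0 for i in range(32)]       # saw-tooth
print(round(fdi(trending), 2), round(fdi(choppy), 2)) # ~1 (smooth trend) vs. 2 (maximal choppiness)
```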
4. The Shannon Entropy Kernel (The "Uncertainty Meter")
What it is: This kernel comes from Information Theory and provides the purest mathematical measure of information, surprise, or uncertainty within a system. It is not a measure of volatility; a market moving predictably up by 10 points every bar has high volatility but zero entropy .
How it Works in the Script: The script normalizes price returns by the ATR, categorizes them into a discrete number of "bins" over a lookback window, and forms a probability distribution. The Shannon Entropy H = -Σ(p_i * log(p_i)) is calculated from this distribution. A low H means returns are predictable. A high H means returns are chaotic. This kernel is our ultimate gauge of market conviction.
5. The Transfer Entropy Kernel (The "Causality Probe")
What it is: This is by far the most advanced and computationally intensive kernel in the script. Transfer Entropy is a non-parametric measure of directed information flow between two time series. It moves beyond correlation to ask: "Does knowing the past of Volume genuinely reduce our uncertainty about the future of Price?"
How it Works in the Script: To make this work, the script discretizes both price returns and the chosen "driver" (e.g., OBV) into three states: "up," "down," or "neutral." It then builds complex conditional probability tables to measure the flow of information in both directions. The Net Transfer Entropy (TE Driver→Price minus TE Price→Driver) gives us a direct measure of causality . A positive score means the driver is leading price, confirming the validity of the move. This is a profound leap beyond traditional indicator analysis.
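To show the mechanics, here is a minimal Python sketch of a lag-1 transfer entropy estimate on three-state discretized series, with the net flow computed as TE(driver to price) minus TE(price to driver); the discretization threshold and sample data are assumptions.
```python
# Minimal sketch of lag-1 transfer entropy on discretized series: both price
# returns and the driver are mapped to three states (up / down / neutral),
# conditional probability tables are built by counting, and TE is measured
# in both directions.

import math
from collections import Counter

def discretize(xs, threshold=0.0):
    return [1 if x > threshold else (-1 if x < -threshold else 0) for x in xs]

def transfer_entropy(source, target):
    """TE(source -> target) in bits, lag 1, for two equal-length state series."""
    triples = Counter()   # (target_next, target_now, source_now)
    pairs_ts = Counter()  # (target_now, source_now)
    pairs_tt = Counter()  # (target_next, target_now)
    singles = Counter()   # (target_now,)
    n = 0
    for t in range(len(target) - 1):
        x_next, x_now, y_now = target[t + 1], target[t], source[t]
        triples[(x_next, x_now, y_now)] += 1
        pairs_ts[(x_now, y_now)] += 1
        pairs_tt[(x_next, x_now)] += 1
        singles[x_now] += 1
        n += 1
    te = 0.0
    for (x_next, x_now, y_now), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_ts[(x_now, y_now)]
        p_cond_hist = pairs_tt[(x_next, x_now)] / singles[x_now]
        te += p_joint * math.log2(p_cond_full / p_cond_hist)
    return te

price_states  = discretize([0.2, -0.1, 0.3, 0.4, -0.2, 0.1, -0.3, 0.2])
driver_states = discretize([0.5, 0.1, 0.4, -0.1, 0.3, -0.2, 0.4, 0.1])
net = transfer_entropy(driver_states, price_states) - transfer_entropy(price_states, driver_states)
print(round(net, 4))   # > 0 would suggest the driver leads price in this tiny sample
```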
Chapter 3: Fusion & Interpretation - The Field Score & Dashboard
Each kernel is a specialist providing a piece of the puzzle. The Field Score is where they are fused into a single, comprehensive reading. It's a weighted sum of the normalized scores from all five kernels, producing a single number from -1 (maximum bearish information field) to +1 (maximum bullish information field). This is the ultimate "at-a-glance" metric for the market's net state, and it is interpreted through the dashboard.
The Dashboard: Your Mission Control
Field Score & Regime: The master metric and its plain-English interpretation ("Uptrend Field", "Downtrend Field", "Transitional").
Kernel Readouts (Wave Align, H(w), FDI, etc.): The live scores of each individual kernel. This allows you to see why the Field Score is what it is. A high Field Score with all components in agreement (all green or red) is a state of High Coherence and represents a high-quality setup.
Market Context: Standard metrics like RSI and Volume for additional confluence.
Signals: The raw and adjusted confluence counts and the final, calculated probability scores for potential long and short entries.
Pattern: Shows the dominant candlestick pattern detected within the currently forming APEX range box and its calculated confidence percentage.
Chapter 4: Mastering the Controls - The Inputs Menu
Every parameter is a lever to fine-tune the IGMD engine.
📊 Wavelet Transform: Kernel ( Haar for sharp moves, db2 for smooth trends) and Scales (depth of analysis) let you tune the script's core microscope to your asset's personality.
📈 Hurst Exponent: The Window determines if you're assessing short-term or long-term market memory.
🔍 Fractal Dimension & ⚡ Entropy Volatility: Adjust the lookback windows to make these kernels more or less sensitive to recent price action. Always keep "Normalize by ATR" enabled for Entropy for consistent results.
🔄 Transfer Entropy: Driver lets you choose what causal force to measure (e.g., OBV, Volume, or even an external symbol like VIX). The throttle setting is a crucial performance tool, allowing you to balance precision with script speed.
⚡ Field Fusion • Weights: This is where you can customize the model's "brain." Increase the weights for the kernels that best align with your trading philosophy (e.g., w_hurst for trend followers, w_fdi for chop avoiders).
📊 Signal Engine: Mode offers presets from Conservative to Aggressive . Min Confluence sets your evidence threshold. Dynamic Confluence is a powerful feature that automatically adapts this threshold to the market regime.
🎨 Visuals & 📏 Support/Resistance: These inputs give you full control over the chart's appearance, allowing you to toggle every visual element for a setup that is as clean or as data-rich as you desire.
Chapter 5: Reading the Battlefield - On-Chart Visuals
Pattern Boxes (The Large Rectangles): These are not simple range boxes. They appear when the Field Score crosses a significance threshold, signaling a potential ignition point.
Color: The color reflects the dominant candlestick pattern that has occurred within that box's duration (e.g., green for Bull Engulf).
Label: Displays the dominant pattern, its duration in bars, and a calculated Confidence % based on field strength and pattern clarity.
Bar Pattern Boxes (The Small Boxes): If enabled, these highlight individual, significant candlestick patterns ( BE for Bull Engulf, H for Hammer) on a bar-by-bar basis.
Signal Markers (▲ and ▼): These appear only when the Signal Engine's criteria are all met. The number is the calculated Probability Score .
RR Rails (Dashed Lines): When a signal appears, these lines automatically plot the Entry, Stop Loss (based on ATR), and two Take Profit targets (based on Risk/Reward ratios). They dynamically break and disappear as price touches each level.
Support & Resistance Lines: Plots of the highest high ( Resistance ) and lowest low ( Support ) over a lookback, providing key structural levels.
Chapter 6: Development Philosophy & A Final Word
IGMD was built to answer one single question: "What is the market really doing?" It represents a triumph of complexity, blending concepts from signal processing, chaos theory, and information theory into a cohesive framework. It is offered for educational and analytical purposes and does not constitute financial advice. Its goal is to elevate your analysis from interpreting flat shadows to measuring the rich, geometric reality of the market's information field.
As the great mathematician Benoit Mandelbrot , father of fractal geometry, noted:
"Clouds are not spheres, mountains are not cones, coastlines are not circles, and bark is not smooth, nor does lightning travel in a straight line."
Neither does the market. IGMD is a tool designed to navigate that beautiful, complex, and fractal reality.
— Dskyz, Trade with insight. Trade with anticipation.