☸HH/LL & Support/Resistance Strategy [NHP]
🔶This script finds pivot highs and pivot lows, then derives higher highs (HH), higher lows (HL), lower lows (LL), and lower highs (LH). It also calculates support/resistance levels from these HH-HL-LL-LH points.
🔶Generally, HH and HL points indicate an up-trend, while LL and LH points indicate a down-trend.
🔶If price breaks a resistance level, the trend is considered up; if price breaks a support level, the trend is considered down. The script colors bars accordingly: blue during an up-trend, black during a down-trend. As you can see, the support and resistance levels also update dynamically.
🔶Smaller values for the left/right bars inputs make the pivot detection more sensitive.
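For illustration, the core pivot-classification logic can be sketched in Python (the published script itself is Pine Script; function names and defaults here are placeholders, not the script's source):

```python
def pivot_points(highs, lows, left=5, right=5):
    """Return confirmed pivots as (index, price, kind) tuples.
    A pivot needs `left` lower bars before and `right` after it."""
    pivots = []
    for i in range(left, len(highs) - right):
        if highs[i] == max(highs[i - left:i + right + 1]):
            pivots.append((i, highs[i], "high"))
        if lows[i] == min(lows[i - left:i + right + 1]):
            pivots.append((i, lows[i], "low"))
    return pivots

def classify(pivots):
    """Label each pivot HH/LH (highs) or HL/LL (lows) vs. the previous pivot
    of the same kind -- the HH-HL-LL-LH points the description refers to."""
    last = {"high": None, "low": None}
    labels = []
    for i, price, kind in pivots:
        prev = last[kind]
        if prev is not None:
            if kind == "high":
                labels.append((i, "HH" if price > prev else "LH"))
            else:
                labels.append((i, "HL" if price > prev else "LL"))
        last[kind] = price
    return labels
```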
🔶All content provided is for informational & educational purposes only. Past performance does not guarantee future results.
Educational
EMA with Vol
An EMA 9/20 setup with volume, for educational purposes, to help identify moves.
Smooth Cloud + ZigZag VPOC CORE v6
📌 Description
The Smooth Cloud + ZigZag VPOC indicator is designed to help traders visualize market structure and potential confluence zones.
Smooth Cloud: Built from smoothed moving averages (EMA, RMA, or HMA), this cloud highlights the underlying short-term trend by shading bullish and bearish phases.
Pivots (ZigZag style): Marks confirmed swing highs and lows, helping to identify support/resistance and breakout areas without repainting.
VPOC (Volume Point of Control): Plots the price level with the highest traded volume, either from a rolling lookback or anchored to a custom date. This often acts as a magnet or reaction level (a rough sketch of the approximation follows below).
ATR Bands: Optional dynamic bands based on volatility to frame potential extension zones.
Signals & Alerts: Generates long/short labels when price breaks pivot levels in line with trend filters, with optional confluence from HTF trend, VPOC, and ATR.
This tool combines trend context, structure, and volume confluence in a single view to support decision-making.
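As a rough illustration of the rolling VPOC approximation referenced above, here is a minimal Python sketch (the indicator is written in Pine Script; the lookback and bin count shown are illustrative assumptions):

```python
import numpy as np

def rolling_vpoc(close, volume, lookback=200, bins=24):
    """Approximate the Volume Point of Control: the price bin with the
    highest traded volume over the last `lookback` bars.
    `lookback` and `bins` are illustrative defaults, not the script's."""
    c = np.asarray(close[-lookback:], dtype=float)
    v = np.asarray(volume[-lookback:], dtype=float)
    edges = np.linspace(c.min(), c.max(), bins + 1)
    idx = np.clip(np.digitize(c, edges) - 1, 0, bins - 1)
    vol_at_price = np.bincount(idx, weights=v, minlength=bins)
    k = int(vol_at_price.argmax())
    return 0.5 * (edges[k] + edges[k + 1])  # bin midpoint = VPOC estimate
```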
✅ Notes
This script is intended for technical analysis and educational use only.
It does not provide financial advice or guaranteed outcomes.
Signals are purely analytical and should be combined with independent risk management.
Smooth Cloud + ZigZag VPOC
📝 Indicator Description
The Smooth Cloud + ZigZag VPOC Indicator is a custom tool that combines three well-known concepts into one study:
Smooth Cloud Trend Filter – built from two smoothed EMAs, this visual “cloud” highlights the prevailing trend direction.
When the fast line is above the slow line, the background cloud shades teal (bullish bias).
When the fast line is below the slow line, the cloud shades red (bearish bias).
Confirmed ZigZag Pivots – plots non-repainting swing highs and swing lows using pivot confirmation. This helps traders see important structural turning points and potential breakout zones.
VPOC Approximation (Volume Point of Control) – within a lookback window, the indicator marks the price level with the highest traded volume. This level often acts as a magnet for price or an area of confluence.
Signals & Alerts
A long signal appears when price is trending up, breaks above the last confirmed pivot high, and (optionally) is above the VPOC line.
A short signal appears when price is trending down, breaks below the last confirmed pivot low, and (optionally) is below the VPOC line.
Alerts can be enabled to notify when these conditions occur.
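A minimal Python sketch of the long condition just described (illustrative only; the actual indicator is Pine Script, and `fast_ema`, `slow_ema`, `last_pivot_high`, and `vpoc` are assumed, precomputed inputs):

```python
def long_signal(close, fast_ema, slow_ema, last_pivot_high, vpoc=None):
    """Sketch of the described long condition: up-trend cloud (fast above
    slow), breakout above the last confirmed pivot high, optional VPOC
    filter. The short condition mirrors this with the comparisons flipped."""
    trending_up = fast_ema > slow_ema
    breakout = close > last_pivot_high
    above_vpoc = True if vpoc is None else close > vpoc
    return trending_up and breakout and above_vpoc
```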
Customization
Inputs allow adjusting the EMA lengths, smoothing factor, pivot sensitivity, and VPOC lookback.
Users can toggle on/off the cloud fill, pivot markers, bar coloring, and VPOC line to match their charting style.
✅ Notes (for compliance)
This script is for technical analysis and educational purposes only.
It does not provide financial advice or guaranteed results.
Signals are intended to highlight trend direction and breakout areas — traders should always confirm with their own risk management and strategy.
TTM Squeeze Range Lines (with Forward Extension) By Gautam Kumar
This TTM Squeeze Range Lines script helps visualize breakout levels by marking the recent squeeze’s high and low, making it easier to identify potential trade setups. Each signal line is extended for visibility, showing possible entry levels after a squeeze.
Interpreting the Lines
Light blue background marks periods when the TTM squeeze is active (tight volatility).
Green line is drawn at the highest price during the squeeze, extended forward—this is commonly used as the breakout level for long entries.
Red line shows the lowest price during the squeeze, indicating the bottom of the range—potential stop loss positioning or an invalidation level.
When the squeeze background disappears, the horizontal lines appear and extend forward for several bars, marking the range that was set while the squeeze was active.
If the price breaks above the green line (the squeeze high), it signals a possible momentum breakout, which traders often use as a long entry.
The red line can be used for placing stop losses or monitoring failed breakouts if price falls below this level.
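For readers who want the mechanics, here is a hedged Python sketch of a standard TTM squeeze test (Bollinger Bands inside Keltner Channels) and the resulting range lines; the parameters follow common TTM defaults and may differ from this script's inputs:

```python
import numpy as np

def ttm_squeeze_on(close, high, low, length=20, bb_mult=2.0, kc_mult=1.5):
    """Squeeze is 'on' when Bollinger Bands sit inside Keltner Channels.
    Standard TTM parameters shown; the script's inputs may differ."""
    c = np.asarray(close, float)[-length:]
    h = np.asarray(high, float)[-length:]
    l = np.asarray(low, float)[-length:]
    bb_dev = bb_mult * c.std(ddof=0)
    tr = np.maximum(h - l, np.maximum(np.abs(h - np.roll(c, 1)),
                                      np.abs(l - np.roll(c, 1))))[1:]
    kc_dev = kc_mult * tr.mean()      # ATR-style channel half-width
    return bb_dev < kc_dev            # BB inside KC -> volatility compression

def squeeze_range(high, low, squeeze_mask):
    """High/low of the most recent contiguous squeeze run: these become
    the green (squeeze high) and red (squeeze low) lines."""
    end = len(squeeze_mask)
    while end > 0 and not squeeze_mask[end - 1]:
        end -= 1
    start = end
    while start > 0 and squeeze_mask[start - 1]:
        start -= 1
    if start == end:
        return None
    return max(high[start:end]), min(low[start:end])
```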
Best Practices
Combine these levels with volume and momentum confirmation for strong entries.
Adjust the extension length (number of bars forward) from the settings menu to fit your preference.
For systematic trading, use these breakout signals alongside chart pattern or histogram confirmation.
This makes it easy to visualize strong entry zones based on the end of squeeze compression, supporting both discretionary and automated swing trading approaches.
Square Root Price Calculator By AB
A Pine Script to calculate the square root of price, useful for Gann traders.
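The description does not publish the exact formula, but one common square-root construction used by Gann practitioners looks like the following Python sketch (an assumption, not necessarily this calculator's method):

```python
import math

def sqrt_levels(price, steps=(0.25, 0.5, 1.0)):
    """One common Gann-style construction (an assumption here, not
    necessarily this script's formula): move in square-root space,
    then square back to get upper/lower price levels."""
    r = math.sqrt(price)
    return {s: ((r + s) ** 2, (r - s) ** 2) for s in steps}

# Example: sqrt_levels(100)
# -> {0.25: (105.0625, 95.0625), 0.5: (110.25, 90.25), 1.0: (121.0, 81.0)}
```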
IDX Utility Set [zidaniee]
Purpose
This indicator is not a technical analysis tool. It’s a companion overlay designed to guide your analysis of the uniquely structured Indonesia Stock Exchange (IDX).
Core Features
Centered Ticker Display – Clean, readable ticker shown at the center of the chart.
Company Name – Displays the listed company’s full name.
Active Timeframe – Shows the currently selected timeframe.
Additional Features
ATH & ATL Markers – Labels the All-Time High (ATH) and All-Time Low (ATL) and shows the percentage distance from the latest price to each level, so you can quickly gauge upside/downside room.
IDX Fraction (Tick) Levels – Visualizes Indonesia’s price-fraction (tick) brackets. This matters because tick size changes by price range—very useful for scalpers and fast traders (see the sketch after this section).
ARA/ARB Levels (Realtime) – Plots Auto-Reject Upper (ARA) and Auto-Reject Lower (ARB) levels in real time. Levels refresh in line with IDX trading hours 09:00–16:00 WIB (UTC+7), so your view stays consistent both during and outside market hours. This feature already complies with the latest rules and adjustments set by the Indonesia Stock Exchange (IDX).
Suspension Status – Shows SUSPENDED if the stock is halted/suspended, helping you avoid unnecessary analysis. The suspension check compares today’s date with the last available candle date and accounts for weekends.
Note: WIB = Western Indonesia Time (UTC+7).
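As referenced above, the tick-bracket logic can be sketched as follows (the bracket boundaries reflect commonly cited IDX rules; verify against current exchange regulations before relying on them):

```python
def idx_tick_size(price):
    """IDX price-fraction (tick) brackets, in rupiah, per commonly cited
    exchange rules (an assumption here; rules can change over time)."""
    if price < 200:
        return 1
    if price < 500:
        return 2
    if price < 2000:
        return 5
    if price < 5000:
        return 10
    return 25
```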
Opening Range Gaps [TakingProphets]
What is an Opening Range Gap (ORG)?
In ICT, the Opening Range Gap is defined as the price difference between the previous session’s close (e.g., 4:00 PM EST in U.S. indices) and the current day’s open (9:30 AM EST).
That gap is a liquidity void—an area where no trading occurred during regular hours.
Why ICT Traders Care About ORG
Liquidity Void (Gap Fill Logic)
-Because the gap is an untraded area, it naturally acts as a draw on liquidity.
-Price often seeks to rebalance by retracing into or fully filling this void.
Premium/Discount Sensitivity
-Once the ORG is defined, ICT treats it as a mini dealing range.
-Above EQ (Consequent Encroachment) = algorithmic premium (sell-sensitive).
-Below EQ = algorithmic discount (buy-sensitive).
-Price reaction at these levels gives a precise read on institutional intent intraday.
Support/Resistance from ORG
-If the session opens above prior close, the gap often acts as support until violated.
-If the session opens below prior close, the gap often acts as resistance until reclaimed.
Key ICT Concepts Anchored to ORG
Consequent Encroachment (CE): The midpoint of the gap. The algo is highly sensitive to CE as a decision point: reject → continuation; reclaim → reversal.
Draw on Liquidity (DoL): Price is algorithmically “pulled” toward gap fills, CE, or the opposite side of the ORG.
Order Flow Confirmation: If price ignores the gap and runs away from it, this signals strong institutional order flow in that direction.
Confluence with Other Tools: FVGs, OBs, and HTF PD arrays often overlap with ORG levels, strengthening setups.
Practical Application for Traders
Bias Formation:
Use ORG EQ as a line in the sand for intraday bias.
If price trades below ORG EQ after the open → look for short setups into the prior day’s low or external liquidity.
If price trades above ORG EQ → favor longs into highs/liquidity pools.
Execution Framework:
Wait for liquidity raids or market structure shifts at ORG edges (.00, .25, .50, .75).
Target: EQ, opposite quarter, or full gap fill.
Precision Reads:
ORG lines let traders anticipate where algorithms are likely to respond, providing mechanical invalidation and clear targets without clutter.
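A small Python sketch of the ORG quarter levels described above (the .00/.25/.50/.75 grid between the prior close and today's open; illustrative, not the indicator's source):

```python
def org_levels(prev_close, today_open):
    """Opening Range Gap quarter levels between the prior session close
    (e.g., 4:00 PM EST for U.S. indices) and today's 9:30 AM EST open."""
    lo, hi = sorted((prev_close, today_open))
    span = hi - lo
    return {
        "low": lo,
        "q25": lo + 0.25 * span,
        "ce":  lo + 0.50 * span,  # Consequent Encroachment (midpoint / EQ)
        "q75": lo + 0.75 * span,
        "high": hi,
    }
```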
New York Session
NY session highlighter: this indicator highlights the New York session to make it easier to differentiate moves between the different sessions.
Impulse Range Compression & Expansion (IRCE)
📌 Impulse Range Compression & Expansion (IRCE) – Visualizing Price Traps Before Breakouts
📖 Overview
The IRCE Indicator is a precision breakout detection tool designed to identify consolidation traps and price coil zones before expansion moves occur. Unlike traditional volatility indicators that rely solely on statistical thresholds (e.g., Bollinger Bands or ATR), IRCE focuses on behavioral price compression, detecting tight-range candle clusters and validating breakouts through body expansion and/or volume surges.
This makes it ideal for traders looking to:
• Catch breakouts from range traps
• Avoid choppy and premature signals
• Spot early-stage momentum moves based on clean price behavior
⸻
⚙️ How It Works
1. Impulse Range Compression Detection
• Measures the high-low range of each candle
• Compares it to a user-defined average range (default 7 bars)
• Flags candles where the range is significantly smaller (e.g., <60% of average)
• Groups these into tight clusters, indicating compression zones or potential “trap ranges”
2. Cluster Box Construction
• When a valid cluster (e.g., 3 or more tight candles) is detected, the indicator:
• Marks the high and low of the cluster
• Draws a shaded box over this “trap zone”
• This helps visually track where price has coiled before a breakout
3. Breakout Confirmation Logic
A breakout from the trap zone is only validated when:
• Price closes above the cluster high (bullish) or below the cluster low (bearish)
• One or both of the following confirm strength:
• Body Expansion: Current candle body is 120%+ of recent average
• Volume Expansion: Volume exceeds recent volume average
4. Optional Trend Filter
• An optional EMA filter (default: 50 EMA) ensures breakout signals align with trend direction
• Helps filter out countertrend noise in ranging markets
5. Signal Cooldown
• Prevents repeated signals by enforcing a cooldown period (e.g., 10 bars) between entries
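Putting the pieces together, a hedged Python sketch of this detection-and-confirmation flow might look as follows (the indicator itself is built in Pine Script v6; names and the exact windowing here are illustrative):

```python
import numpy as np

def irce_signal(o, h, l, c, v, avg_len=7, tight_ratio=0.6,
                cluster_min=3, body_exp=1.2):
    """Sketch of the IRCE flow described above, using the quoted defaults
    (7-bar average range, <60% tightness, 3-candle cluster, 120% body
    expansion). Names and windowing are illustrative, not the script's."""
    o, h, l, c, v = (np.asarray(x, float) for x in (o, h, l, c, v))
    rng = h - l
    avg_rng = rng[-avg_len:].mean()            # average candle range
    cluster = slice(-cluster_min - 1, -1)      # bars before the breakout bar
    if not (rng[cluster] < tight_ratio * avg_rng).all():
        return None                            # no valid compression cluster
    box_hi, box_lo = h[cluster].max(), l[cluster].min()
    body = abs(c[-1] - o[-1])
    avg_body = np.abs(c - o)[-avg_len:].mean()
    confirmed = body > body_exp * avg_body or v[-1] > v[-avg_len:].mean()
    if confirmed and c[-1] > box_hi:
        return "long"
    if confirmed and c[-1] < box_lo:
        return "short"
    return None
```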
⸻
🖥️ Visual Elements
• 📦 Yellow compression boxes represent tight price traps
• 🟢 Buy labels appear when price breaks above the trap with confirmation
• 🔴 Sell labels appear when price breaks below with confirmation
• All visuals are non-repainting and updated in real-time
🧠 How to Use
1. Wait for a yellow trap box to appear
2. Watch for a confirmed breakout from the trap zone
3. Take the trade in the direction of the breakout:
• Only if it satisfies body or volume confirmation
• And if trend alignment is enabled, it must match EMA direction
4. Place stops just outside the opposite end of the trap zone
5. Use risk/reward ratios or structure levels for exits
This logic works great on:
• Lower timeframes (scalping breakouts)
• Higher timeframes (detecting price coiling before major moves)
• Any market: Stocks, Crypto, FX, Commodities
⸻
🔒 Technical Notes
• ✅ No repainting
• ✅ No future-looking logic
• ✅ Suitable for both discretionary and systematic traders
• ✅ Built in Pine Script v6
Dynamic Equity Allocation Model
"Cash is Trash"? Not Always. Here's Why Science Beats Guesswork.
Every retail trader knows the frustration: you draw support and resistance lines, you spot patterns, you follow market gurus on social media—and still, when the next bear market hits, your portfolio bleeds red. Meanwhile, institutional investors seem to navigate market turbulence with ease, preserving capital when markets crash and participating when they rally. What's their secret?
The answer isn't insider information or access to exotic derivatives. It's systematic, scientifically validated decision-making. While most retail traders rely on subjective chart analysis and emotional reactions, professional portfolio managers use quantitative models that remove emotion from the equation and process multiple streams of market information simultaneously.
This document presents exactly such a system—not a proprietary black box available only to hedge funds, but a fully transparent, academically grounded framework that any serious investor can understand and apply. The Dynamic Equity Allocation Model (DEAM) synthesizes decades of financial research from Nobel laureates and leading academics into a practical tool for tactical asset allocation.
Stop drawing colorful lines on your chart and start thinking like a quant. This isn't about predicting where the market goes next week—it's about systematically adjusting your risk exposure based on what the data actually tells you. When valuations scream danger, when volatility spikes, when credit markets freeze, when multiple warning signals align—that's when cash isn't trash. That's when cash saves your portfolio.
The irony of "cash is trash" rhetoric is that it ignores timing. Yes, being 100% cash for decades would be disastrous. But being 100% equities through every crisis is equally foolish. The sophisticated approach is dynamic: aggressive when conditions favor risk-taking, defensive when they don't. This model shows you how to make that decision systematically, not emotionally.
Whether you're managing your own retirement portfolio or seeking to understand how institutional allocation strategies work, this comprehensive analysis provides the theoretical foundation, mathematical implementation, and practical guidance to elevate your investment approach from amateur to professional.
The choice is yours: keep hoping your chart patterns work out, or start using the same quantitative methods that professionals rely on. The tools are here. The research is cited. The methodology is explained. All you need to do is read, understand, and apply.
The Dynamic Equity Allocation Model (DEAM) is a quantitative framework for systematic allocation between equities and cash, grounded in modern portfolio theory and empirical market research. The model integrates five scientifically validated dimensions of market analysis—market regime, risk metrics, valuation, sentiment, and macroeconomic conditions—to generate dynamic allocation recommendations ranging from 0% to 100% equity exposure. This work documents the theoretical foundations, mathematical implementation, and practical application of this multi-factor approach.
1. Introduction and Theoretical Background
1.1 The Limitations of Static Portfolio Allocation
Traditional portfolio theory, as formulated by Markowitz (1952) in his seminal work "Portfolio Selection," assumes an optimal static allocation where investors distribute their wealth across asset classes according to their risk aversion. This approach rests on the assumption that returns and risks remain constant over time. However, empirical research demonstrates that this assumption does not hold in reality. Fama and French (1989) showed that expected returns vary over time and correlate with macroeconomic variables such as the spread between long-term and short-term interest rates. Campbell and Shiller (1988) demonstrated that the price-earnings ratio possesses predictive power for future stock returns, providing a foundation for dynamic allocation strategies.
The academic literature on tactical asset allocation has evolved considerably over recent decades. Ilmanen (2011) argues in "Expected Returns" that investors can improve their risk-adjusted returns by considering valuation levels, business cycles, and market sentiment. The Dynamic Equity Allocation Model presented here builds on this research tradition and operationalizes these insights into a practically applicable allocation framework.
1.2 Multi-Factor Approaches in Asset Allocation
Modern financial research has shown that different factors capture distinct aspects of market dynamics and together provide a more robust picture of market conditions than individual indicators. Ross (1976) developed the Arbitrage Pricing Theory, a model that employs multiple factors to explain security returns. Following this multi-factor philosophy, DEAM integrates five complementary analytical dimensions, each tapping different information sources and collectively enabling comprehensive market understanding.
2. Data Foundation and Data Quality
2.1 Data Sources Used
The model draws its data exclusively from publicly available market data via the TradingView platform. This transparency and accessibility is a significant advantage over proprietary models that rely on non-public data. The data foundation encompasses several categories of market information, each capturing specific aspects of market dynamics.
First, price data for the S&P 500 Index is obtained through the SPDR S&P 500 ETF (ticker: SPY). The use of a highly liquid ETF instead of the index itself has practical reasons, as ETF data is available in real-time and reflects actual tradability. In addition to closing prices, high, low, and volume data are captured, which are required for calculating advanced volatility measures.
Fundamental corporate metrics are retrieved via TradingView's Financial Data API. These include earnings per share, price-to-earnings ratio, return on equity, debt-to-equity ratio, dividend yield, and share buyback yield. Cochrane (2011) emphasizes in "Presidential Address: Discount Rates" the central importance of valuation metrics for forecasting future returns, making these fundamental data a cornerstone of the model.
Volatility indicators are represented by the CBOE Volatility Index (VIX) and related metrics. The VIX, often referred to as the market's "fear gauge," measures the implied volatility of S&P 500 index options and serves as a proxy for market participants' risk perception. Whaley (2000) describes in "The Investor Fear Gauge" the construction and interpretation of the VIX and its use as a sentiment indicator.
Macroeconomic data includes yield curve information through US Treasury bonds of various maturities and credit risk premiums through the spread between high-yield bonds and risk-free government bonds. These variables capture the macroeconomic conditions and financing conditions relevant for equity valuation. Estrella and Hardouvelis (1991) showed that the shape of the yield curve has predictive power for future economic activity, justifying the inclusion of these data.
2.2 Handling Missing Data
A practical problem when working with financial data is dealing with missing or unavailable values. The model implements a fallback system where a plausible historical average value is stored for each fundamental metric. When current data is unavailable for a specific point in time, this fallback value is used. This approach ensures that the model remains functional even during temporary data outages and avoids systematic biases from missing data. The use of average values as fallback is conservative, as it generates neither overly optimistic nor pessimistic signals.
3. Component 1: Market Regime Detection
3.1 The Concept of Market Regimes
The idea that financial markets exist in different "regimes" or states that differ in their statistical properties has a long tradition in financial science. Hamilton (1989) developed regime-switching models that allow distinguishing between different market states with different return and volatility characteristics. The practical application of this theory consists of identifying the current market state and adjusting portfolio allocation accordingly.
DEAM classifies market regimes using a scoring system that considers three main dimensions: trend strength, volatility level, and drawdown depth. This multidimensional view is more robust than focusing on individual indicators, as it captures various facets of market dynamics. Classification occurs into six distinct regimes: Strong Bull, Bull Market, Neutral, Correction, Bear Market, and Crisis.
3.2 Trend Analysis Through Moving Averages
Moving averages are among the oldest and most widely used technical indicators and have also received attention in academic literature. Brock, Lakonishok, and LeBaron (1992) examined in "Simple Technical Trading Rules and the Stochastic Properties of Stock Returns" the profitability of trading rules based on moving averages and found evidence for their predictive power, although later studies questioned the robustness of these results when considering transaction costs.
The model calculates three moving averages with different time windows: a 20-day average (approximately one trading month), a 50-day average (approximately one quarter), and a 200-day average (approximately one trading year). The relationship of the current price to these averages and the relationship of the averages to each other provide information about trend strength and direction. When the price trades above all three averages and the short-term average is above the long-term, this indicates an established uptrend. The model assigns points based on these constellations, with longer-term trends weighted more heavily as they are considered more persistent.
3.3 Volatility Regimes
Volatility, understood as the standard deviation of returns, is a central concept of financial theory and serves as the primary risk measure. However, research has shown that volatility is not constant but changes over time and occurs in clusters—a phenomenon first documented by Mandelbrot (1963) and later formalized through ARCH and GARCH models (Engle, 1982; Bollerslev, 1986).
DEAM calculates volatility not only through the classic method of return standard deviation but also uses more advanced estimators such as the Parkinson estimator and the Garman-Klass estimator. These methods utilize intraday information (high and low prices) and are more efficient than simple close-to-close volatility estimators. The Parkinson estimator (Parkinson, 1980) uses the range between high and low of a trading day and is based on the recognition that this information reveals more about true volatility than just the closing price difference. The Garman-Klass estimator (Garman and Klass, 1980) extends this approach by additionally considering opening and closing prices.
The calculated volatility is annualized by multiplying it by the square root of 252 (the average number of trading days per year), enabling standardized comparability. The model compares current volatility with the VIX, the implied volatility from option prices. A low VIX (below 15) signals market comfort and increases the regime score, while a high VIX (above 35) indicates market stress and reduces the score. This interpretation follows the empirical observation that elevated volatility is typically associated with falling markets (Schwert, 1989).
3.4 Drawdown Analysis
A drawdown refers to the percentage decline from the highest point (peak) to the lowest point (trough) during a specific period. This metric is psychologically significant for investors as it represents the maximum loss experienced. Calmar (1991) developed the Calmar Ratio, which relates return to maximum drawdown, underscoring the practical relevance of this metric.
The model calculates current drawdown as the percentage distance from the highest price of the last 252 trading days (one year). A drawdown below 3% is considered negligible and maximally increases the regime score. As drawdown increases, the score decreases progressively, with drawdowns above 20% classified as severe and indicating a crisis or bear market regime. These thresholds are empirically motivated by historical market cycles, in which corrections typically encompassed 5-10% drawdowns, bear markets 20-30%, and crises over 30%.
3.5 Regime Classification
Final regime classification occurs through aggregation of scores from trend (40% weight), volatility (30%), and drawdown (30%). The higher weighting of trend reflects the empirical observation that trend-following strategies have historically delivered robust results (Moskowitz, Ooi, and Pedersen, 2012). A total score above 80 signals a strong bull market with established uptrend, low volatility, and minimal losses. At a score below 10, a crisis situation exists requiring defensive positioning. The six regime categories enable a differentiated allocation strategy that not only distinguishes binarily between bullish and bearish but allows gradual gradations.
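As a sketch, the regime aggregation described above reduces to a weighted sum and a threshold map (the >80 and <10 cut-offs are quoted in the text; the intermediate thresholds below are assumptions):

```python
def regime(trend_score, vol_score, dd_score):
    """Aggregate sub-scores (each 0-100) with the 40/30/30 weights from
    the text and map to the six regimes. Only the >80 (Strong Bull) and
    <10 (Crisis) cut-offs are quoted; the rest are assumed for illustration."""
    total = 0.40 * trend_score + 0.30 * vol_score + 0.30 * dd_score
    if total > 80: return "Strong Bull"
    if total > 60: return "Bull Market"   # assumed cut-off
    if total > 40: return "Neutral"       # assumed cut-off
    if total > 25: return "Correction"    # assumed cut-off
    if total > 10: return "Bear Market"
    return "Crisis"
```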
4. Component 2: Risk-Based Allocation
4.1 Volatility Targeting as Risk Management Approach
The concept of volatility targeting is based on the idea that investors should maximize not returns but risk-adjusted returns. Sharpe (1966, 1994) defined with the Sharpe Ratio the fundamental concept of return per unit of risk, measured as volatility. Volatility targeting goes a step further and adjusts portfolio allocation to achieve constant target volatility. This means that in times of low market volatility, equity allocation is increased, and in times of high volatility, it is reduced.
Moreira and Muir (2017) showed in "Volatility-Managed Portfolios" that strategies that adjust their exposure based on volatility forecasts achieve higher Sharpe Ratios than passive buy-and-hold strategies. DEAM implements this principle by defining a target portfolio volatility (default 12% annualized) and adjusting equity allocation to achieve it. The mathematical foundation is simple: if market volatility is 20% and target volatility is 12%, equity allocation should be 60% (12/20 = 0.6), with the remaining 40% held in cash with zero volatility.
4.2 Market Volatility Calculation
Estimating current market volatility is central to the risk-based allocation approach. The model uses several volatility estimators in parallel and selects the higher value between traditional close-to-close volatility and the Parkinson estimator. This conservative choice ensures the model does not underestimate true volatility, which could lead to excessive risk exposure.
Traditional volatility calculation uses logarithmic returns, as these have mathematically advantageous properties (additive linkage over multiple periods). The logarithmic return is calculated as ln(P_t / P_{t-1}), where P_t is the price at time t. The standard deviation of these returns over a rolling 20-trading-day window is then multiplied by √252 to obtain annualized volatility. This annualization is based on the assumption of independently identically distributed returns, which is an idealization but widely accepted in practice.
The Parkinson estimator uses additional information from the trading range (High minus Low) of each day. The formula is: σ_P = (1/√(4ln2)) × √(1/n × Σln²(H_i/L_i)) × √252, where H_i and L_i are high and low prices. Under ideal conditions, this estimator is approximately five times more efficient than the close-to-close estimator (Parkinson, 1980), as it uses more information per observation.
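Both estimators translate directly into code; a minimal Python version of the two formulas above:

```python
import numpy as np

def close_to_close_vol(close, window=20):
    """Annualized close-to-close volatility from log returns ln(P_t/P_{t-1})
    over a rolling 20-day window, scaled by sqrt(252)."""
    r = np.diff(np.log(np.asarray(close, float)))[-window:]
    return r.std(ddof=1) * np.sqrt(252)

def parkinson_vol(high, low, window=20):
    """Annualized Parkinson (1980) range estimator, matching the formula
    in the text: sqrt(mean(ln^2(H/L)) / (4 ln 2)) * sqrt(252)."""
    h = np.asarray(high, float)[-window:]
    l = np.asarray(low, float)[-window:]
    return np.sqrt(np.mean(np.log(h / l) ** 2) / (4 * np.log(2))) * np.sqrt(252)

# As described, the model conservatively takes the higher of the two values.
```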
4.3 Drawdown-Based Position Size Adjustment
In addition to volatility targeting, the model implements drawdown-based risk control. The logic is that deep market declines often signal further losses and therefore justify exposure reduction. This behavior corresponds with the concept of path-dependent risk tolerance: investors who have already suffered losses are typically less willing to take additional risk (Kahneman and Tversky, 1979).
The model defines a maximum portfolio drawdown as a target parameter (default 15%). Since portfolio volatility and portfolio drawdown are proportional to equity allocation (assuming cash has neither volatility nor drawdown), allocation-based control is possible. For example, if the market exhibits a 25% drawdown and target portfolio drawdown is 15%, equity allocation should be at most 60% (15/25).
4.4 Dynamic Risk Adjustment
An advanced feature of DEAM is dynamic adjustment of risk-based allocation through a feedback mechanism. The model continuously estimates what actual portfolio volatility and portfolio drawdown would result at the current allocation. If risk utilization (ratio of actual to target risk) exceeds 1.0, allocation is reduced by an adjustment factor that grows exponentially with overutilization. This implements a form of dynamic feedback that avoids overexposure.
Mathematically, a risk adjustment factor r_adjust is calculated: if risk utilization u > 1, then r_adjust = exp(-0.5 × (u - 1)). This exponential function ensures that moderate overutilization is gently corrected, while strong overutilization triggers drastic reductions. The factor 0.5 in the exponent was empirically calibrated to achieve a balanced ratio between sensitivity and stability.
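A compact sketch of the risk-based allocation pipeline: volatility targeting, the drawdown cap, and the exponential feedback factor exp(-0.5 × (u - 1)). Inputs are decimals (0.12 for 12%), and the helper names are illustrative, not the model's code:

```python
import math

def base_weight(market_vol, market_dd, target_vol=0.12, target_dd=0.15):
    """Volatility targeting plus drawdown cap, as in the text's examples:
    12%/20% vol -> 60% equity; a 25% market drawdown caps it at 15/25 = 60%."""
    w_vol = target_vol / max(market_vol, 1e-9)
    w_dd = target_dd / max(abs(market_dd), 1e-9)
    return max(0.0, min(1.0, w_vol, w_dd))

def risk_adjust(current_weight, market_vol, target_vol=0.12):
    """Feedback step: if estimated portfolio volatility at the current
    weight exceeds target (utilization u > 1), shrink exposure by
    exp(-0.5 * (u - 1)); moderate overshoot is corrected gently."""
    u = current_weight * market_vol / target_vol
    return current_weight * math.exp(-0.5 * (u - 1)) if u > 1 else current_weight
```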
5. Component 3: Valuation Analysis
5.1 Theoretical Foundations of Fundamental Valuation
DEAM's valuation component is based on the fundamental premise that the intrinsic value of a security is determined by its future cash flows and that deviations between market price and intrinsic value are eventually corrected. Graham and Dodd (1934) established in "Security Analysis" the basic principles of fundamental analysis that remain relevant today. Translated into modern portfolio context, this means that markets with high valuation metrics (high price-earnings ratios) should have lower expected returns than cheaply valued markets.
Campbell and Shiller (1988) developed the Cyclically Adjusted P/E Ratio (CAPE), which smooths earnings over a full business cycle. Their empirical analysis showed that this ratio has significant predictive power for 10-year returns. Asness, Moskowitz, and Pedersen (2013) demonstrated in "Value and Momentum Everywhere" that value effects exist not only in individual stocks but also in asset classes and markets.
5.2 Equity Risk Premium as Central Valuation Metric
The Equity Risk Premium (ERP) is defined as the expected excess return of stocks over risk-free government bonds. It is the theoretical heart of valuation analysis, as it represents the compensation investors demand for bearing equity risk. Damodaran (2012) discusses in "Equity Risk Premiums: Determinants, Estimation and Implications" various methods for ERP estimation.
DEAM calculates ERP not through a single method but combines four complementary approaches with different weights. This multi-method strategy increases estimation robustness and avoids dependence on single, potentially erroneous inputs.
The first method (35% weight) uses earnings yield, calculated as 1/P/E or directly from operating earnings data, and subtracts the 10-year Treasury yield. This method follows Fed Model logic (Yardeni, 2003), although this model has theoretical weaknesses as it does not consistently treat inflation (Asness, 2003).
The second method (30% weight) extends earnings yield by share buyback yield. Share buybacks are a form of capital return to shareholders and increase value per share. Boudoukh et al. (2007) showed in "The Total Shareholder Yield" that the sum of dividend yield and buyback yield is a better predictor of future returns than dividend yield alone.
The third method (20% weight) implements the Gordon Growth Model (Gordon, 1962), which models stock value as the sum of discounted future dividends. Under constant growth g assumption: Expected Return = Dividend Yield + g. The model estimates sustainable growth as g = ROE × (1 - Payout Ratio), where ROE is return on equity and payout ratio is the ratio of dividends to earnings. This formula follows from equity theory: unretained earnings are reinvested at ROE and generate additional earnings growth.
The fourth method (15% weight) combines total shareholder yield (Dividend + Buybacks) with implied growth derived from revenue growth. This method considers that companies with strong revenue growth should generate higher future earnings, even if current valuations do not yet fully reflect this.
The final ERP is the weighted average of these four methods. A high ERP (above 4%) signals attractive valuations and increases the valuation score to 95 out of 100 possible points. A negative ERP, where stocks have lower expected returns than bonds, results in a minimal score of 10.
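The four-method blend can be written out as follows (a sketch under the stated 35/30/20/15 weights; subtracting the 10-year Treasury yield from each method's expected return reflects the ERP definition used in the text, and all inputs are decimals, e.g. 0.04 for 4%):

```python
def equity_risk_premium(earnings_yield, buyback_yield, dividend_yield,
                        roe, payout_ratio, implied_growth, treasury_10y):
    """Weighted blend of the four ERP methods described above.
    Input names are illustrative; the model's internals may differ."""
    m1 = earnings_yield - treasury_10y                    # Fed-model style
    m2 = (earnings_yield + buyback_yield) - treasury_10y  # + buyback yield
    g = roe * (1 - payout_ratio)                          # sustainable growth
    m3 = (dividend_yield + g) - treasury_10y              # Gordon growth
    m4 = (dividend_yield + buyback_yield + implied_growth) - treasury_10y
    return 0.35 * m1 + 0.30 * m2 + 0.20 * m3 + 0.15 * m4
```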
5.3 Quality Adjustments to Valuation
Valuation metrics alone can be misleading if not interpreted in the context of company quality. A company with a low P/E may be cheap or fundamentally problematic. The model therefore implements quality adjustments based on growth, profitability, and capital structure.
Revenue growth above 10% annually adds 10 points to the valuation score, moderate growth above 5% adds 5 points. This adjustment reflects that growth has independent value (Modigliani and Miller, 1961, extended by later growth theory). Net margin above 15% signals pricing power and operational efficiency and increases the score by 5 points, while low margins below 8% indicate competitive pressure and subtract 5 points.
Return on equity (ROE) above 20% characterizes outstanding capital efficiency and increases the score by 5 points. Piotroski (2000) showed in "Value Investing: The Use of Historical Financial Statement Information" that fundamental quality signals such as high ROE can improve the performance of value strategies.
Capital structure is evaluated through the debt-to-equity ratio. A conservative ratio below 1.0 multiplies the valuation score by 1.2, while high leverage above 2.0 applies a multiplier of 0.8. This adjustment reflects that high debt constrains financial flexibility and can become problematic in crisis times (Korteweg, 2010).
6. Component 4: Sentiment Analysis
6.1 The Role of Sentiment in Financial Markets
Investor sentiment, defined as the collective psychological attitude of market participants, influences asset prices independently of fundamental data. Baker and Wurgler (2006, 2007) developed a sentiment index and showed that periods of high sentiment are followed by overvaluations that later correct. This insight justifies integrating a sentiment component into allocation decisions.
Sentiment is difficult to measure directly but can be proxied through market indicators. The VIX is the most widely used sentiment indicator, as it aggregates implied volatility from option prices. High VIX values reflect elevated uncertainty and risk aversion, while low values signal market comfort. Whaley (2009) refers to the VIX as the "Investor Fear Gauge" and documents its role as a contrarian indicator: extremely high values typically occur at market bottoms, while low values occur at tops.
6.2 VIX-Based Sentiment Assessment
DEAM uses statistical normalization of the VIX by calculating the Z-score: z = (VIX_current - VIX_average) / VIX_standard_deviation. The Z-score indicates how many standard deviations the current VIX is from the historical average. This approach is more robust than absolute thresholds, as it adapts to the average volatility level, which can vary over longer periods.
A Z-score below -1.5 (VIX is 1.5 standard deviations below average) signals exceptionally low risk perception and adds 40 points to the sentiment score. This may seem counterintuitive—shouldn't low fear be bullish? However, the logic follows the contrarian principle: when no one is afraid, everyone is already invested, and there is limited further upside potential (Zweig, 1973). Conversely, a Z-score above 1.5 (extreme fear) adds -40 points, reflecting market panic but simultaneously suggesting potential buying opportunities.
6.3 VIX Term Structure as Sentiment Signal
The VIX term structure provides additional sentiment information. Normally, the VIX trades in contango, meaning longer-term VIX futures have higher prices than short-term. This reflects that short-term volatility is currently known, while long-term volatility is more uncertain and carries a risk premium. The model compares the VIX with VIX9D (9-day volatility) and identifies backwardation (VIX > 1.05 × VIX9D) and steep backwardation (VIX > 1.15 × VIX9D).
Backwardation occurs when short-term implied volatility is higher than longer-term, which typically happens during market stress. Investors anticipate immediate turbulence but expect calming. Psychologically, this reflects acute fear. The model subtracts 15 points for backwardation and 30 for steep backwardation, as these constellations signal elevated risk. Simon and Wiggins (2001) analyzed the VIX futures curve and showed that backwardation is associated with market declines.
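A sketch of the VIX-based sentiment scoring, combining the z-score and term-structure contributions with the point values quoted above (the base level of 50 and the 0-100 clamping are assumptions, and the safe-haven component of Section 6.4 is omitted):

```python
import numpy as np

def sentiment_score(vix_series, vix9d, base=50):
    """VIX z-score and term-structure points as quoted in the text;
    `base` is an assumed neutral starting level, not from the model."""
    v = np.asarray(vix_series, float)
    z = (v[-1] - v.mean()) / v.std(ddof=1)
    score = base
    if z < -1.5:
        score += 40            # extreme complacency (contrarian read)
    elif z > 1.5:
        score -= 40            # extreme fear
    vix = v[-1]
    if vix > 1.15 * vix9d:
        score -= 30            # steep backwardation
    elif vix > 1.05 * vix9d:
        score -= 15            # backwardation
    return max(0, min(100, score))
```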
6.4 Safe-Haven Flows
During crisis times, investors flee from risky assets into safe havens: gold, US dollar, and Japanese yen. This "flight to quality" is a sentiment signal. The model calculates the performance of these assets relative to stocks over the last 20 trading days. When gold or the dollar strongly rise while stocks fall, this indicates elevated risk aversion.
The safe-haven component is calculated as the difference between safe-haven performance and stock performance. Positive values (safe havens outperform) subtract up to 20 points from the sentiment score, negative values (stocks outperform) add up to 10 points. The asymmetric treatment (larger deduction for risk-off than bonus for risk-on) reflects that risk-off movements are typically sharper and more informative than risk-on phases.
Baur and Lucey (2010) examined safe-haven properties of gold and showed that gold indeed exhibits negative correlation with stocks during extreme market movements, confirming its role as crisis protection.
7. Component 5: Macroeconomic Analysis
7.1 The Yield Curve as Economic Indicator
The yield curve, represented as yields of government bonds of various maturities, contains aggregated expectations about future interest rates, inflation, and economic growth. The slope of the yield curve has remarkable predictive power for recessions. Estrella and Mishkin (1998) showed that an inverted yield curve (short-term rates higher than long-term) predicts recessions with high reliability. This is because inverted curves reflect restrictive monetary policy: the central bank raises short-term rates to combat inflation, dampening economic activity.
DEAM calculates two spread measures: the 2-year-minus-10-year spread and the 3-month-minus-10-year spread. A steep, positive curve (spreads above 1.5% and 2% respectively) signals healthy growth expectations and generates the maximum yield curve score of 40 points. A flat curve (spreads near zero) reduces the score to 20 points. An inverted curve (negative spreads) is particularly alarming and results in only 10 points.
The choice of two different spreads increases analysis robustness. The 2-10 spread is most established in academic literature, while the 3M-10Y spread is often considered more sensitive, as the 3-month rate directly reflects current monetary policy (Ang, Piazzesi, and Wei, 2006).
7.2 Credit Conditions and Spreads
Credit spreads—the yield difference between risky corporate bonds and safe government bonds—reflect risk perception in the credit market. Gilchrist and Zakrajšek (2012) constructed an "Excess Bond Premium" that measures the component of credit spreads not explained by fundamentals and showed this is a predictor of future economic activity and stock returns.
The model approximates credit spread by comparing the yield of high-yield bond ETFs (HYG) with investment-grade bond ETFs (LQD). A narrow spread below 200 basis points signals healthy credit conditions and risk appetite, contributing 30 points to the macro score. Very wide spreads above 1000 basis points (as during the 2008 financial crisis) signal credit crunch and generate zero points.
Additionally, the model evaluates whether "flight to quality" is occurring, identified through strong performance of Treasury bonds (TLT) with simultaneous weakness in high-yield bonds. This constellation indicates elevated risk aversion and reduces the credit conditions score.
7.3 Financial Stability at Corporate Level
While the yield curve and credit spreads reflect macroeconomic conditions, financial stability evaluates the health of companies themselves. The model uses the aggregated debt-to-equity ratio and return on equity of the S&P 500 as proxies for corporate health.
A low leverage level below 0.5 combined with high ROE above 15% signals robust corporate balance sheets and generates 20 points. This combination is particularly valuable as it represents both defensive strength (low debt means crisis resistance) and offensive strength (high ROE means earnings power). High leverage above 1.5 generates only 5 points, as it implies vulnerability to interest rate increases and recessions.
Korteweg (2010) showed in "The Net Benefits to Leverage" that optimal debt maximizes firm value, but excessive debt increases distress costs. At the aggregated market level, high debt indicates fragilities that can become problematic during stress phases.
8. Component 6: Crisis Detection
8.1 The Need for Systematic Crisis Detection
Financial crises are rare but extremely impactful events that suspend normal statistical relationships. During normal market volatility, diversified portfolios and traditional risk management approaches function, but during systemic crises, seemingly independent assets suddenly correlate strongly, and losses exceed historical expectations (Longin and Solnik, 2001). This justifies a separate crisis detection mechanism that operates independently of regular allocation components.
Reinhart and Rogoff (2009) documented in "This Time Is Different: Eight Centuries of Financial Folly" recurring patterns in financial crises: extreme volatility, massive drawdowns, credit market dysfunction, and asset price collapse. DEAM operationalizes these patterns into quantifiable crisis indicators.
8.2 Multi-Signal Crisis Identification
The model uses a counter-based approach where various stress signals are identified and aggregated. This methodology is more robust than relying on a single indicator, as true crises typically occur simultaneously across multiple dimensions. A single signal may be a false alarm, but the simultaneous presence of multiple signals increases confidence.
The first indicator is a VIX above the crisis threshold (default 40), adding one point. A VIX above 60 (as in 2008 and March 2020) adds two additional points, as such extreme values are historically very rare. This tiered approach captures the intensity of volatility.
The second indicator is market drawdown. A drawdown above 15% adds one point, as corrections of this magnitude can be potential harbingers of larger crises. A drawdown above 25% adds another point, as historical bear markets typically encompass 25-40% drawdowns.
The third indicator is credit market spreads above 500 basis points, adding one point. Such wide spreads occur only during significant credit market disruptions, as in 2008 during the Lehman crisis.
The fourth indicator identifies simultaneous losses in stocks and bonds. Normally, Treasury bonds act as a hedge against equity risk (negative correlation), but when both fall simultaneously, this indicates systemic liquidity problems or inflation/stagflation fears. The model checks whether both SPY and TLT have fallen more than 10% and 5% respectively over 5 trading days, adding two points.
The fifth indicator is a volume spike combined with negative returns. Extreme trading volumes (above twice the 20-day average) with falling prices signal panic selling. This adds one point.
A crisis situation is diagnosed when at least 3 indicators trigger, a severe crisis at 5 or more indicators. These thresholds were calibrated through historical backtesting to identify true crises (2008, 2020) without generating excessive false alarms.
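The counter logic maps directly to code (thresholds exactly as quoted above):

```python
def crisis_signals(vix, drawdown, credit_spread_bps,
                   spy_5d_ret, tlt_5d_ret, volume_ratio, bar_ret):
    """Counter-based crisis detection with the thresholds from the text.
    Inputs are decimals (0.15 for a 15% drawdown); names are illustrative."""
    n = 0
    if vix > 40: n += 1
    if vix > 60: n += 2                       # extreme volatility tier
    if drawdown > 0.15: n += 1
    if drawdown > 0.25: n += 1                # bear-market-depth tier
    if credit_spread_bps > 500: n += 1
    if spy_5d_ret < -0.10 and tlt_5d_ret < -0.05:
        n += 2                                # stocks and bonds fall together
    if volume_ratio > 2.0 and bar_ret < 0:
        n += 1                                # panic-volume spike
    return n >= 3, n >= 5                     # (crisis, severe crisis)
```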
8.3 Crisis-Based Allocation Override
When a crisis is detected, the system overrides the normal allocation recommendation and caps equity allocation at maximum 25%. In a severe crisis, the cap is set at 10%. This drastic defensive posture follows the empirical observation that crises typically require time to develop and that early reduction can avoid substantial losses (Faber, 2007).
This override logic implements a "safety first" principle: in situations of existential danger to the portfolio, capital preservation becomes the top priority. Roy (1952) formalized this approach in "Safety First and the Holding of Assets," arguing that investors should primarily minimize ruin probability.
9. Integration and Final Allocation Calculation
9.1 Component Weighting
The final allocation recommendation emerges through weighted aggregation of the five components. The standard weighting is: Market Regime 35%, Risk Management 25%, Valuation 20%, Sentiment 15%, Macro 5%. These weights reflect both theoretical considerations and empirical backtesting results.
The highest weighting of market regime is based on evidence that trend-following and momentum strategies have delivered robust results across various asset classes and time periods (Moskowitz, Ooi, and Pedersen, 2012). Current market momentum is highly informative for the near future, although it provides no information about long-term expectations.
The substantial weighting of risk management (25%) follows from the central importance of risk control. Wealth preservation is the foundation of long-term wealth creation, and systematic risk management is demonstrably value-creating (Moreira and Muir, 2017).
The valuation component receives 20% weight, based on the long-term mean reversion of valuation metrics. While valuation has limited short-term predictive power (bull and bear markets can begin at any valuation), the long-term relationship between valuation and returns is robustly documented (Campbell and Shiller, 1988).
Sentiment (15%) and Macro (5%) receive lower weights, as these factors are subtler and harder to measure. Sentiment is valuable as a contrarian indicator at extremes but less informative in normal ranges. Macro variables such as the yield curve have strong predictive power for recessions, but the transmission from recessions to stock market performance is complex and temporally variable.
9.2 Model Type Adjustments
DEAM allows users to choose between four model types: Conservative, Balanced, Aggressive, and Adaptive. This choice modifies the final allocation through additive adjustments.
Conservative mode subtracts 10 percentage points from allocation, resulting in consistently more cautious positioning. This is suitable for risk-averse investors or those with limited investment horizons. Aggressive mode adds 10 percentage points, suitable for risk-tolerant investors with long horizons.
Adaptive mode implements procyclical adjustment based on short-term momentum: if the market has risen more than 5% in the last 20 days, 5 percentage points are added; if it has declined more than 5%, 5 points are subtracted. This logic follows the observation that short-term momentum persists (Jegadeesh and Titman, 1993), but the moderate size of adjustment avoids excessive timing bets.
Balanced mode makes no adjustment and uses raw model output. This neutral setting is suitable for investors who wish to trust model recommendations unchanged.
9.3 Smoothing and Stability
The allocation resulting from aggregation undergoes final smoothing through a simple moving average over 3 periods. This smoothing is crucial for model practicality, as it reduces frequent trading and thus transaction costs. Without smoothing, the model could fluctuate between adjacent allocations with every small input change.
The choice of 3 periods as smoothing window is a compromise between responsiveness and stability. Longer smoothing would excessively delay signals and impede response to true regime changes. Shorter or no smoothing would allow too much noise. Empirical tests showed that 3-period smoothing offers an optimal ratio between these goals.
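A condensed sketch of the final pipeline (weighted aggregation, model-type offsets, and 3-period smoothing); the scores sit on a 0-100 scale, and the function signature is illustrative:

```python
import numpy as np

def final_allocation(scores, mode="Balanced", mom_20d=0.0, history=None):
    """Weighted aggregation (35/25/20/15/5), model-type adjustments, and
    the 3-period SMA smoothing described above. `scores` maps component
    names to 0-100 values; `history` collects past raw allocations."""
    w = {"regime": 0.35, "risk": 0.25, "valuation": 0.20,
         "sentiment": 0.15, "macro": 0.05}
    alloc = sum(w[k] * scores[k] for k in w)          # 0-100 equity %
    if mode == "Conservative":
        alloc -= 10
    elif mode == "Aggressive":
        alloc += 10
    elif mode == "Adaptive":                          # procyclical tilt
        alloc += 5 if mom_20d > 0.05 else (-5 if mom_20d < -0.05 else 0)
    alloc = max(0.0, min(100.0, alloc))
    if history is not None:                           # 3-period smoothing
        history.append(alloc)
        alloc = float(np.mean(history[-3:]))
    return alloc
```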
10. Visualization and Interpretation
10.1 Main Output: Equity Allocation
DEAM's primary output is a time series from 0 to 100 representing the recommended percentage allocation to equities. This representation is intuitive: 100% means full investment in stocks (specifically: an S&P 500 ETF), 0% means complete cash position, and intermediate values correspond to mixed portfolios. A value of 60% means, for example: invest 60% of wealth in SPY, hold 40% in money market instruments or cash.
The time series is color-coded to enable quick visual interpretation. Green shades represent high allocations (above 80%, bullish), red shades low allocations (below 20%, bearish), and neutral colors middle allocations. The chart background is dynamically colored based on the signal, enhancing readability in different market phases.
10.2 Dashboard Metrics
A tabular dashboard presents key metrics compactly. This includes current allocation, cash allocation (complement), an aggregated signal (BULLISH/NEUTRAL/BEARISH), current market regime, VIX level, market drawdown, and crisis status.
Additionally, fundamental metrics are displayed: P/E Ratio, Equity Risk Premium, Return on Equity, Debt-to-Equity Ratio, and Total Shareholder Yield. This transparency allows users to understand model decisions and form their own assessments.
Component scores (Regime, Risk, Valuation, Sentiment, Macro) are also displayed, each normalized on a 0-100 scale. This shows which factors primarily drive the current recommendation. If, for example, the Risk score is very low (20) while other scores are moderate (50-60), this indicates that risk management considerations are pulling allocation down.
10.3 Component Breakdown (Optional)
Advanced users can display individual components as separate lines in the chart. This enables analysis of component dynamics: do all components move synchronously, or are there divergences? Divergences can be particularly informative. If, for example, the market regime is bullish (high score) but the valuation component is very negative, this signals an overbought market not fundamentally supported—a classic "bubble warning."
This feature is disabled by default to keep the chart clean but can be activated for deeper analysis.
10.4 Confidence Bands
The model optionally displays uncertainty bands around the main allocation line. These are calculated as ±1 standard deviation of allocation over a rolling 20-period window. Wide bands indicate high volatility of model recommendations, suggesting uncertain market conditions. Narrow bands indicate stable recommendations.
This visualization implements a concept of epistemic uncertainty—uncertainty about the model estimate itself, not just market volatility. In phases where various indicators send conflicting signals, the allocation recommendation becomes more volatile, manifesting in wider bands. Users can understand this as a warning to act more cautiously or consult alternative information sources.
11. Alert System
11.1 Allocation Alerts
DEAM implements an alert system that notifies users of significant events. Allocation alerts trigger when smoothed allocation crosses certain thresholds. An alert is generated when allocation reaches 80% (from below), signaling strong bullish conditions. Another alert triggers when allocation falls to 20%, indicating defensive positioning.
These thresholds are not arbitrary but correspond with boundaries between model regimes. An allocation of 80% roughly corresponds to a clear bull market regime, while 20% corresponds to a bear market regime. Alerts at these points are therefore informative about fundamental regime shifts.
11.2 Crisis Alerts
Separate alerts trigger upon detection of crisis and severe crisis. These alerts have highest priority as they signal large risks. A crisis alert should prompt investors to review their portfolio and potentially take defensive measures beyond the automatic model recommendation (e.g., hedging through put options, rebalancing to more defensive sectors).
11.3 Regime Change Alerts
An alert triggers upon change of market regime (e.g., from Neutral to Correction, or from Bull Market to Strong Bull). Regime changes are highly informative events that typically entail substantial allocation changes. These alerts enable investors to proactively respond to changes in market dynamics.
11.4 Risk Breach Alerts
A specialized alert triggers when actual portfolio risk utilization exceeds target parameters by 20%. This is a warning signal that the risk management system is reaching its limits, possibly because market volatility is rising faster than allocation can be reduced. In such situations, investors should consider manual interventions.
12. Practical Application and Limitations
12.1 Portfolio Implementation
DEAM generates a recommendation for allocation between equities (S&P 500) and cash. Implementation by an investor can take various forms. The most direct method is using an S&P 500 ETF (e.g., SPY, VOO) for equity allocation and a money market fund or savings account for cash allocation.
A rebalancing strategy is required to synchronize actual allocation with model recommendation. Two approaches are possible: (1) rule-based rebalancing at every 10% deviation between actual and target, or (2) time-based monthly rebalancing. Both have trade-offs between responsiveness and transaction costs. Empirical evidence (Jaconetti, Kinniry, and Zilbering, 2010) suggests rebalancing frequency has moderate impact on performance, and investors should optimize based on their transaction costs.
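The rule-based variant reduces to a simple band check (a sketch; the 10% band is the figure quoted above):

```python
def needs_rebalance(actual_weight, target_weight, band=0.10):
    """Trigger a trade only when actual allocation drifts more than the
    band (10 percentage points, as decimals) from the model's target."""
    return abs(actual_weight - target_weight) > band
```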
12.2 Adaptation to Individual Preferences
The model offers numerous adjustment parameters. Component weights can be modified if investors place more or less belief in certain factors. A fundamentally-oriented investor might increase valuation weight, while a technical trader might increase regime weight.
Risk target parameters (target volatility, max drawdown) should be adapted to individual risk tolerance. Younger investors with long investment horizons can choose higher target volatility (15-18%), while retirees may prefer lower volatility (8-10%). This adjustment systematically shifts average equity allocation.
Crisis thresholds can be adjusted based on preference for sensitivity versus specificity of crisis detection. Lower thresholds (e.g., VIX > 35 instead of 40) increase sensitivity (more crises are detected) but reduce specificity (more false alarms). Higher thresholds have the reverse effect.
12.3 Limitations and Disclaimers
DEAM is based on historical relationships between indicators and market performance. There is no guarantee these relationships will persist in the future. Structural changes in markets (e.g., through regulation, technology, or central bank policy) can break established patterns. This is the fundamental problem of induction in financial science (Taleb, 2007).
The model is optimized for US equities (S&P 500). Application to other markets (international stocks, bonds, commodities) would require recalibration. The indicators and thresholds are specific to the statistical properties of the US equity market.
The model cannot eliminate losses. Even with perfect crisis prediction, an investor following the model would lose money in bear markets—just less than a buy-and-hold investor. The goal is risk-adjusted performance improvement, not risk elimination.
Transaction costs are not modeled. In practice, spreads, commissions, and taxes reduce net returns. Frequent trading can cause substantial costs. Model smoothing helps minimize this, but users should consider their specific cost situation.
The model reacts to information; it does not anticipate it. During sudden shocks (e.g., 9/11, COVID-19 lockdowns), the model can only react after price movements, not before. This limitation is inherent to all reactive systems.
12.4 Relationship to Other Strategies
DEAM is a tactical asset allocation approach and should be viewed as a complement, not replacement, for strategic asset allocation. Brinson, Hood, and Beebower (1986) showed in their influential study "Determinants of Portfolio Performance" that strategic asset allocation (long-term policy allocation) explains the majority of portfolio performance, but this leaves room for tactical adjustments based on market timing.
The model can be combined with value and momentum strategies at the individual stock level. While DEAM controls overall market exposure, within-equity decisions can be optimized through stock-picking models. This separation between strategic (market exposure) and tactical (stock selection) levels follows classical portfolio theory.
The model does not replace diversification across asset classes. A complete portfolio should also include bonds, international stocks, real estate, and alternative investments. DEAM addresses only the US equity allocation decision within a broader portfolio.
13. Scientific Foundation and Evaluation
13.1 Theoretical Consistency
DEAM's components are based on established financial theory and empirical evidence. The market regime component follows from regime-switching models (Hamilton, 1989) and trend-following literature. The risk management component implements volatility targeting (Moreira and Muir, 2017) and modern portfolio theory (Markowitz, 1952). The valuation component is based on discounted cash flow theory and empirical value research (Campbell and Shiller, 1988; Fama and French, 1992). The sentiment component integrates behavioral finance (Baker and Wurgler, 2006). The macro component uses established business cycle indicators (Estrella and Mishkin, 1998).
This theoretical grounding distinguishes DEAM from purely data-mining-based approaches that identify patterns without causal theory. Theory-guided models have greater probability of functioning out-of-sample, as they are based on fundamental mechanisms, not random correlations (Lo and MacKinlay, 1990).
13.2 Empirical Validation
While this document does not present detailed backtest analysis, it should be noted that rigorous validation of a tactical asset allocation model should include several elements:
In-sample testing establishes whether the model functions at all in the data on which it was calibrated. Out-of-sample testing is crucial: the model should be tested in time periods not used for development. Walk-forward analysis, where the model is successively trained on rolling windows and tested in the next window, approximates real implementation.
Performance metrics should be risk-adjusted. Pure return consideration is misleading, as higher returns often only compensate for higher risk. Sharpe Ratio, Sortino Ratio, Calmar Ratio, and Maximum Drawdown are relevant metrics. Comparison with benchmarks (Buy-and-Hold S&P 500, 60/40 Stock/Bond portfolio) contextualizes performance.
Robustness checks test sensitivity to parameter variation. If the model only functions at specific parameter settings, this indicates overfitting. Robust models show consistent performance over a range of plausible parameters.
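For illustration, a Pine sketch of two of the named metrics, annualized Sharpe Ratio and Maximum Drawdown, computed on a daily series (a simplified stand-in, not DEAM's evaluation code; the constant risk-free rate is an assumption):

//@version=5
indicator("Risk-Adjusted Metrics (sketch)")

// Hypothetical constant risk-free rate; a daily chart is assumed.
rfAnnual = input.float(0.02, "Risk-free rate (annual)")
len      = input.int(252, "Window (days)")

// Daily excess returns of whatever series the chart shows (e.g. an equity curve).
ret    = close / close[1] - 1.0
excess = ret - rfAnnual / 252

// Annualized Sharpe Ratio over the rolling window.
sharpe = ta.sma(excess, len) / ta.stdev(excess, len) * math.sqrt(252)

// Maximum drawdown from the running peak.
var float peak  = 0.0
var float maxDD = 0.0
peak  := math.max(peak, close)
maxDD := math.min(maxDD, close / peak - 1.0)

plot(sharpe, "Sharpe (annualized)", color=color.blue)
plot(maxDD * 100, "Max drawdown %", color=color.red)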
13.3 Comparison with Existing Literature
DEAM fits into the broader literature on tactical asset allocation. Faber (2007) presented a simple momentum-based timing system that goes long when the market is above its 10-month average, otherwise cash. This simple system avoided large drawdowns in bear markets. DEAM can be understood as a sophistication of this approach that integrates multiple information sources.
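Faber's rule is simple enough to state in a few lines of Pine (a sketch assuming a monthly chart; Faber's original test used monthly total-return data):

//@version=5
indicator("Faber 10-Month Timing (sketch)", overlay=true)

// Assumes a monthly chart; Faber used monthly closes of total-return data.
len = input.int(10, "SMA length (months)")
sma = ta.sma(close, len)

// Long when price closes above its 10-month average, otherwise cash.
invested = close > sma

plot(sma, "10-month SMA", color=color.orange)
bgcolor(invested ? color.new(color.green, 85) : color.new(color.gray, 85))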
Ilmanen (2011) discusses various timing factors in "Expected Returns" and argues for multi-factor approaches. DEAM operationalizes this philosophy. Asness, Moskowitz, and Pedersen (2013) showed that value and momentum effects work across asset classes, justifying cross-asset application of regime and valuation signals.
Ang (2014) emphasizes in "Asset Management: A Systematic Approach to Factor Investing" the importance of systematic, rule-based approaches over discretionary decisions. DEAM is fully systematic and eliminates emotional biases that plague individual investors (overconfidence, hindsight bias, loss aversion).
References
Ang, A. (2014) *Asset Management: A Systematic Approach to Factor Investing*. Oxford: Oxford University Press.
Ang, A., Piazzesi, M. and Wei, M. (2006) 'What does the yield curve tell us about GDP growth?', *Journal of Econometrics*, 131(1-2), pp. 359-403.
Asness, C.S. (2003) 'Fight the Fed Model', *The Journal of Portfolio Management*, 30(1), pp. 11-24.
Asness, C.S., Moskowitz, T.J. and Pedersen, L.H. (2013) 'Value and Momentum Everywhere', *The Journal of Finance*, 68(3), pp. 929-985.
Baker, M. and Wurgler, J. (2006) 'Investor Sentiment and the Cross-Section of Stock Returns', *The Journal of Finance*, 61(4), pp. 1645-1680.
Baker, M. and Wurgler, J. (2007) 'Investor Sentiment in the Stock Market', *Journal of Economic Perspectives*, 21(2), pp. 129-152.
Baur, D.G. and Lucey, B.M. (2010) 'Is Gold a Hedge or a Safe Haven? An Analysis of Stocks, Bonds and Gold', *Financial Review*, 45(2), pp. 217-229.
Bollerslev, T. (1986) 'Generalized Autoregressive Conditional Heteroskedasticity', *Journal of Econometrics*, 31(3), pp. 307-327.
Boudoukh, J., Michaely, R., Richardson, M. and Roberts, M.R. (2007) 'On the Importance of Measuring Payout Yield: Implications for Empirical Asset Pricing', *The Journal of Finance*, 62(2), pp. 877-915.
Brinson, G.P., Hood, L.R. and Beebower, G.L. (1986) 'Determinants of Portfolio Performance', *Financial Analysts Journal*, 42(4), pp. 39-44.
Brock, W., Lakonishok, J. and LeBaron, B. (1992) 'Simple Technical Trading Rules and the Stochastic Properties of Stock Returns', *The Journal of Finance*, 47(5), pp. 1731-1764.
Campbell, J.Y. and Shiller, R.J. (1988) 'The Dividend-Price Ratio and Expectations of Future Dividends and Discount Factors', *Review of Financial Studies*, 1(3), pp. 195-228.
Cochrane, J.H. (2011) 'Presidential Address: Discount Rates', *The Journal of Finance*, 66(4), pp. 1047-1108.
Damodaran, A. (2012) *Equity Risk Premiums: Determinants, Estimation and Implications*. Working Paper, Stern School of Business.
Engle, R.F. (1982) 'Autoregressive Conditional Heteroskedasticity with Estimates of the Variance of United Kingdom Inflation', *Econometrica*, 50(4), pp. 987-1007.
Estrella, A. and Hardouvelis, G.A. (1991) 'The Term Structure as a Predictor of Real Economic Activity', *The Journal of Finance*, 46(2), pp. 555-576.
Estrella, A. and Mishkin, F.S. (1998) 'Predicting U.S. Recessions: Financial Variables as Leading Indicators', *Review of Economics and Statistics*, 80(1), pp. 45-61.
Faber, M.T. (2007) 'A Quantitative Approach to Tactical Asset Allocation', *The Journal of Wealth Management*, 9(4), pp. 69-79.
Fama, E.F. and French, K.R. (1989) 'Business Conditions and Expected Returns on Stocks and Bonds', *Journal of Financial Economics*, 25(1), pp. 23-49.
Fama, E.F. and French, K.R. (1992) 'The Cross-Section of Expected Stock Returns', *The Journal of Finance*, 47(2), pp. 427-465.
Garman, M.B. and Klass, M.J. (1980) 'On the Estimation of Security Price Volatilities from Historical Data', *Journal of Business*, 53(1), pp. 67-78.
Gilchrist, S. and Zakrajšek, E. (2012) 'Credit Spreads and Business Cycle Fluctuations', *American Economic Review*, 102(4), pp. 1692-1720.
Gordon, M.J. (1962) *The Investment, Financing, and Valuation of the Corporation*. Homewood: Irwin.
Graham, B. and Dodd, D.L. (1934) *Security Analysis*. New York: McGraw-Hill.
Hamilton, J.D. (1989) 'A New Approach to the Economic Analysis of Nonstationary Time Series and the Business Cycle', *Econometrica*, 57(2), pp. 357-384.
Ilmanen, A. (2011) *Expected Returns: An Investor's Guide to Harvesting Market Rewards*. Chichester: Wiley.
Jaconetti, C.M., Kinniry, F.M. and Zilbering, Y. (2010) 'Best Practices for Portfolio Rebalancing', *Vanguard Research Paper*.
Jegadeesh, N. and Titman, S. (1993) 'Returns to Buying Winners and Selling Losers: Implications for Stock Market Efficiency', *The Journal of Finance*, 48(1), pp. 65-91.
Kahneman, D. and Tversky, A. (1979) 'Prospect Theory: An Analysis of Decision under Risk', *Econometrica*, 47(2), pp. 263-292.
Korteweg, A. (2010) 'The Net Benefits to Leverage', *The Journal of Finance*, 65(6), pp. 2137-2170.
Lo, A.W. and MacKinlay, A.C. (1990) 'Data-Snooping Biases in Tests of Financial Asset Pricing Models', *Review of Financial Studies*, 3(3), pp. 431-467.
Longin, F. and Solnik, B. (2001) 'Extreme Correlation of International Equity Markets', *The Journal of Finance*, 56(2), pp. 649-676.
Mandelbrot, B. (1963) 'The Variation of Certain Speculative Prices', *The Journal of Business*, 36(4), pp. 394-419.
Markowitz, H. (1952) 'Portfolio Selection', *The Journal of Finance*, 7(1), pp. 77-91.
Miller, M.H. and Modigliani, F. (1961) 'Dividend Policy, Growth, and the Valuation of Shares', *The Journal of Business*, 34(4), pp. 411-433.
Moreira, A. and Muir, T. (2017) 'Volatility-Managed Portfolios', *The Journal of Finance*, 72(4), pp. 1611-1644.
Moskowitz, T.J., Ooi, Y.H. and Pedersen, L.H. (2012) 'Time Series Momentum', *Journal of Financial Economics*, 104(2), pp. 228-250.
Parkinson, M. (1980) 'The Extreme Value Method for Estimating the Variance of the Rate of Return', *Journal of Business*, 53(1), pp. 61-65.
Piotroski, J.D. (2000) 'Value Investing: The Use of Historical Financial Statement Information to Separate Winners from Losers', *Journal of Accounting Research*, 38, pp. 1-41.
Reinhart, C.M. and Rogoff, K.S. (2009) *This Time Is Different: Eight Centuries of Financial Folly*. Princeton: Princeton University Press.
Ross, S.A. (1976) 'The Arbitrage Theory of Capital Asset Pricing', *Journal of Economic Theory*, 13(3), pp. 341-360.
Roy, A.D. (1952) 'Safety First and the Holding of Assets', *Econometrica*, 20(3), pp. 431-449.
Schwert, G.W. (1989) 'Why Does Stock Market Volatility Change Over Time?', *The Journal of Finance*, 44(5), pp. 1115-1153.
Sharpe, W.F. (1966) 'Mutual Fund Performance', *The Journal of Business*, 39(1), pp. 119-138.
Sharpe, W.F. (1994) 'The Sharpe Ratio', *The Journal of Portfolio Management*, 21(1), pp. 49-58.
Simon, D.P. and Wiggins, R.A. (2001) 'S&P Futures Returns and Contrary Sentiment Indicators', *Journal of Futures Markets*, 21(5), pp. 447-462.
Taleb, N.N. (2007) *The Black Swan: The Impact of the Highly Improbable*. New York: Random House.
Whaley, R.E. (2000) 'The Investor Fear Gauge', *The Journal of Portfolio Management*, 26(3), pp. 12-17.
Whaley, R.E. (2009) 'Understanding the VIX', *The Journal of Portfolio Management*, 35(3), pp. 98-105.
Yardeni, E. (2003) 'Stock Valuation Models', *Topical Study*, 51, Yardeni Research.
Young, T.W. (1991) 'Calmar Ratio: A Smoother Tool', *Futures*, 20(1), October.
Zweig, M.E. (1973) 'An Investor Expectations Stock Price Predictive Model Using Closed-End Fund Premiums', *The Journal of Finance*, 28(1), pp. 67-78.
PSP [ANAY]PSP and TPD with ES, NQ, and YM. When NQ closes up and ES closes down, that marks out a TPD.
RSI: chart overlay
This indicator maps RSI thresholds directly onto price. Since the EMA of price aligns with RSI’s 50-line, it draws a volatility-based band around the EMA to reveal levels such as 70 and 30.
By converting RSI values into visible price bands, the overlay lets you see exactly where price would have to move to hit traditional RSI boundaries. These bands adapt in real time to both price movement and market volatility, keeping the classic RSI logic intact while presenting it in the context of price action. This approach helps traders interpret RSI signals without leaving the main chart window.
The calculation uses the same components as the "RSI: alternative derivation" script: Wilder's EMA for smoothing, a volatility-based unit for scaling, and a normalization factor. The result is a dynamic band structure on the chart, representing RSI boundary levels in actual price terms.
Key components and calculation breakdown:
Wilder’s EMA
Used as the anchor point for measuring price position.
myEMA = ta.rma(close, Length)
Volatility Unit
Derived from the EMA of absolute close-to-close price changes.
CC_vol = ta.rma(math.abs(close - close[1]), Length)
Normalization Factor
Scales the volatility unit to align with the RSI formula’s structure.
normalization_factor = 1 / (Length - 1)
Upper and Lower Boundaries
Defines price bands corresponding to selected RSI threshold values.
up_b = myEMA + ((upper - 50) / 50) * (CC_vol / normalization_factor)
down_b = myEMA - ((50 - lower) / 50) * (CC_vol / normalization_factor)
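Assembled into one runnable sketch (reconstructed from the snippets above; the close[1] reference and the 1.0 literal, which forces float division, are small corrections, and the input names are assumptions):

//@version=5
indicator("RSI: chart overlay (sketch)", overlay=true)

Length = input.int(14, "RSI length", minval=2)
upper  = input.float(70, "Upper boundary (RSI level above 50)")
lower  = input.float(30, "Lower boundary (RSI level below 50)")

// Wilder's EMA: the price-level counterpart of RSI's 50-line.
myEMA = ta.rma(close, Length)

// Volatility unit from absolute close-to-close changes.
CC_vol = ta.rma(math.abs(close - close[1]), Length)

// 1.0 forces float division (a plain 1 / (Length - 1) would truncate to 0).
normalization_factor = 1.0 / (Length - 1)

// Price bands corresponding to the chosen RSI thresholds.
up_b   = myEMA + ((upper - 50) / 50) * (CC_vol / normalization_factor)
down_b = myEMA - ((50 - lower) / 50) * (CC_vol / normalization_factor)

plot(myEMA, "RSI 50-line (EMA)", color=color.gray)
plot(up_b, "Upper RSI band", color=color.red)
plot(down_b, "Lower RSI band", color=color.green)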
Inputs
RSI length
Upper boundary – RSI level above 50
Lower boundary – RSI level below 50
ON/OFF toggle for 50-point line (EMA of close prices)
ON/OFF toggle for overbought/oversold coloring (use with line chart)
Interpretation:
Each band on the chart represents a chosen RSI level.
When price touches a band, RSI is at that threshold.
The distance between moving average and bands adjusts automatically with volatility and your selected RSI length.
All calculations remain fully consistent with standard RSI values.
Feedback and code suggestions are welcome, especially regarding implementation efficiency and customization.
Zero Lag + Momentum Bias StrategyZero Lag + Momentum Bias Strategy (MTF + Strong MBI + R:R + Partial TP + Alerts)
Smart Money Concept v1Smart Money Concept Indicator – Visual Interpretation Guide
What Happens When Liquidity Lines Are Broken
🟩 Green Line Broken (Buy-Side Liquidity Pool Swept)
- Indicates price has dipped below a previous swing low where sell stops are likely placed.
- Market Makers may be triggering these stops to accumulate long positions.
- Often followed by a bullish reversal.
- Trader Actions:
• Look for a bullish candle close after the sweep.
• Confirm with nearby Bullish Order Block or Fair Value Gap.
• Consider entering a Buy trade (SLH entry).
- If price continues falling: Indicates trend continuation and invalidation of the buy-side liquidity zone.
🟥 Red Line Broken (Sell-Side Liquidity Pool Swept)
- Indicates price has moved above a previous swing high where buy stops are likely placed.
- Market Makers may be triggering these stops to accumulate short positions.
- Often followed by a bearish reversal.
- Trader Actions:
• Look for a bearish candle close after the sweep.
• Confirm with nearby Bearish Order Block or Fair Value Gap.
• Consider entering a Sell trade (SLH entry).
- If price continues rising: Indicates trend continuation and invalidation of the sell-side liquidity zone.
Chart-Based Interpretation of Green Line Breaks
In the provided DOGE/USD 15-minute chart image:
- Green lines represent buy-side liquidity zones.
- If these lines are broken:
• It may be a stop hunt before a bullish continuation.
• Or a false Break of Structure (BOS) leading to deeper retracement.
- Confirmation is needed from candle structure and nearby OB/FVG zones.
Is the Pink Zone a Valid Bullish Order Block?
To validate the pink zone as a Bullish OB:
- It should be formed by a strong down-close candle followed by a bullish move.
- Price should have rallied from this zone previously.
- If price is now retesting it and showing bullish reaction, it confirms validity.
- If formed during low volume or price never rallied from it, it may not be valid.
📌 Additional Notes
• - If price continues beyond the liquidity line without reversal, it may indicate a trend continuation rather than a stop hunt.
• - Always confirm with Higher Time Frame bias, Institutional Order Flow, and price reaction at the zone.
SPX Year-End 2025 Targets by AnalystsJust year-end analyst targets for SPX as of 02 October 2025, as answered by Grok.
VWAP / ORB / VP & POCThis is an all-in-one technical analysis tool designed to give you a comprehensive view of the market on a single chart. It combines three powerful indicators—VWAP, Opening Range, and Volume Profile—to help you identify key price levels, understand intraday trends, and spot areas of high liquidity.
What It Does
The indicator plots three distinct components on your chart:
Volume-Weighted Average Price (VWAP): A benchmark that shows the average price a security has traded at throughout the day, based on both price and volume. It's often used by institutional traders to gauge whether they are getting a good price. The script also plots standard deviation or percentage-based bands around the VWAP line, which can act as dynamic support and resistance.
Opening Range Breakout (ORB): A tool that highlights the high and low of the initial trading period of a session (e.g., the first 15 minutes). The script draws lines for the opening price, range high, and range low for the rest of the session. It also colors the chart with zones to visually separate price action above, below, and within this critical opening range.
Volume Profile (VP): A powerful study that shows trading activity over a set number of bars at specific price levels. Unlike traditional volume that is plotted over time, this is plotted on the price axis. It helps you instantly see where the most and least trading has occurred, identifying significant levels like the Point of Control (POC)—the single price with the most volume—and the Value Area (VA), where the majority of trading took place.
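A minimal Pine sketch of the first two components (illustrative, not the published script; the session string, time zone, and the ta.vwap band overload are configuration assumptions, and the volume profile is omitted for brevity):

//@version=5
indicator("Session VWAP + Opening Range (sketch)", overlay=true)

// --- VWAP with one standard-deviation band, reset each session (intraday chart assumed) ---
[vwapVal, upperBand, lowerBand] = ta.vwap(hlc3, timeframe.change("1D"), 1)
plot(vwapVal, "VWAP", color=color.blue)
plot(upperBand, "VWAP +1 stdev", color=color.new(color.blue, 60))
plot(lowerBand, "VWAP -1 stdev", color=color.new(color.blue, 60))

// --- Opening range: high/low of the first 15 minutes (NY session assumed) ---
inOR = not na(time(timeframe.period, "0930-0945", "America/New_York"))
var float orHigh = na
var float orLow  = na
if timeframe.change("1D")
    orHigh := na
    orLow  := na
if inOR
    orHigh := na(orHigh) ? high : math.max(orHigh, high)
    orLow  := na(orLow)  ? low  : math.min(orLow, low)
plot(orHigh, "OR High", color=color.green, style=plot.style_linebr)
plot(orLow,  "OR Low",  color=color.red,   style=plot.style_linebr)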
How to Use It for Trading
The real strength of this indicator comes from finding confluence, where two or more of its components signal the same key level.
Identifying Support & Resistance: The POC, VWAP bands, Opening Range high/low, and session open price are all powerful levels to watch. When price approaches one of these levels, you can anticipate a potential reaction (a bounce or a breakout).
Gauging Intraday Trend: A simple rule of thumb is to consider the intraday trend bullish when the price is trading above the VWAP and bearish when it is trading below the VWAP.
Finding High-Value Zones: The Volume Profile’s Value Area (VA) shows you where the market has accepted a price. Trading within the VA is considered "fair value," while prices outside of it are "unfair." Reversals often happen when the price tries to re-enter the Value Area from the outside.
Settings:
Here’s a breakdown of all the settings you can change to customize the indicator to your liking.
Volume Profile Settings:
Number of Bars: How many of the most recent bars to use for the calculation. A higher number gives a broader profile.
Row Size: The number of price levels (rows) in the profile. Higher numbers give a more detailed, granular view.
Value Area Volume %: The percentage of total volume to include in the Value Area (standard is 70%).
Horizontal Offset: Moves the Volume Profile further to the right to avoid overlapping with recent price action.
Colors & Styles: Customize the colors for the POC line, Value Area, and the up/down volume bars.
VWAP Settings:
Anchor Period: Resets the VWAP calculation at the start of a new Session, Week, Month, Year, etc. You can even anchor it to corporate events like Earnings or Splits.
Source: The price source used in the calculation (default is hlc3, the average of the high, low, and close).
Bands Calculation Mode:
Standard Deviation: The bands are based on statistical volatility.
Percentage: The bands are a fixed percentage away from the VWAP line.
Bands Multiplier: Sets the distance of the bands from the VWAP. You can enable and configure up to three sets of bands.
ORB Settings (Opening Range)
Opening Range Timeframe: The duration of the opening range (e.g., 15 for 15 minutes, 60 for the first hour).
Market Session & Time Zone: Crucial for ensuring the range is calculated at the correct time for the asset you're trading.
Line & Zone Styles: Full customization for the colors, thickness, and style (Solid, Dashed, Dotted) of the High, Low, and Opening Price lines, as well as the background colors for the zones above, below, and within the range.
Total Points Moved by exp3rtsThis lightweight utility tracks the total intraday range of price movement, giving you real-time insight into market activity.
It calculates:
🟩 Bullish Points – Total range from bullish candles (close > open)
🟥 Bearish Points – Total range from bearish candles (close < open)
🔁 Total Points Moved (TPM) – Sum of all high–low ranges for the day
Values are pulled from the 1-second chart for high precision and displayed in a compact tag in the top-right corner.
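A sketch of the same bookkeeping on the chart's own timeframe (the published tool samples the 1-second chart for precision; that part is omitted here):

//@version=5
indicator("Total Points Moved (sketch)")

// Reset the running totals at each new day.
var float bullPts = 0.0
var float bearPts = 0.0
var float tpm     = 0.0
if timeframe.change("1D")
    bullPts := 0.0
    bearPts := 0.0
    tpm     := 0.0

// Accumulate each bar's high-low range into the matching bucket.
rng = high - low
tpm := tpm + rng
if close > open
    bullPts := bullPts + rng
else if close < open
    bearPts := bearPts + rng

plot(tpm, "Total Points Moved", color=color.gray)
plot(bullPts, "Bullish Points", color=color.green)
plot(bearPts, "Bearish Points", color=color.red)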
EQ + Bandas Pro 📊 EQ + Bands Pro is an advanced indicator built on OHLC analysis. It calculates a synthetic equilibrium price and plots dynamic, robust bands that adapt to volatility while filtering outliers. The tool highlights zones of overvaluation and undervaluation, helping traders identify key imbalances, potential reversals, and trend confirmations.
ICT Venom Trading Model [TradingFinder] SMC NY Session 2025 Setup
🔵 Introduction
The ICT Venom Model is one of the most advanced strategies in the ICT framework, designed for intraday trading on major US indices such as US100, US30, and US500. This model is rooted in liquidity theory, time and price dynamics, and institutional order flow.
The Venom Model focuses on detecting Liquidity Sweeps, identifying Fair Value Gaps (FVG), and analyzing Market Structure Shifts (MSS). By combining these ICT core concepts, traders can filter false breakouts, capture sharp reversals, and align their entries with the real institutional liquidity flow during the New York Session.
Key Highlights of ICT Venom Model :
Intraday focus : Optimized for US indices (US100, US30, US500).
Time element : Critical window is 08:00–09:30 AM (Venom Box).
Liquidity sweep logic : Price grabs liquidity at 09:30 AM open.
Confirmation tools : MSS, CISD, FVG, and Order Blocks.
Dual setups : Works in both Bullish Venom and Bearish Venom conditions.
At its core, the ICT Venom Strategy is a framework that explains how institutional players manipulate liquidity pools by engineering false breakouts around the initial range of the market. Between 08:00 and 09:30 AM New York time, a range called the “Venom Box” is formed.
This range acts as a trap for retail traders, and once the 09:30 AM market open occurs, price usually sweeps either the high or the low of this box to collect stop-loss liquidity. After this liquidity grab, the market often reverses sharply, giving birth to a classic Bullish Venom Setup or Bearish Venom Setup.
The Venom Model (ICT Venom Trading Strategy) is not just a pattern recognition tool but a precise institutional trading model based on time, liquidity, and market structure. By understanding the Initial Balance Range, watching for Liquidity Sweeps, and entering trades from FVG zones or Order Blocks, traders can anticipate market reversals with high accuracy. This strategy is widely respected among ICT followers because it offers both risk management discipline and clear entry/exit conditions. In short, the Venom Model transforms liquidity manipulation into actionable trading opportunities.
Bullish Setup :
Bearish Setup :
🔵 How to Use
The ICT Venom Model is applied by observing price behavior during the early hours of the New York session. The first step is to define the Initial Range, also called the Venom Box, which is formed between 08:00 and 09:30 AM EST. This range marks the high and low points where institutional traders often create traps for retail participants. Once the official market opens at 09:30 AM, price usually sweeps either the top or bottom of this box to collect liquidity.
After this liquidity grab, the market tends to reverse in alignment with the true directional bias. To confirm the setup, traders look for signals such as a Market Structure Shift (MSS), Change in State of Delivery (CISD), or the appearance of a Fair Value Gap (FVG). These elements validate the reversal and provide precise levels for trade execution.
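A hedged Pine sketch of the time mechanics only: building the 08:00-09:30 box and flagging a post-open sweep that closes back inside it. MSS, CISD, and FVG confirmation are beyond a few lines, and the session strings are assumptions:

//@version=5
indicator("Venom Box (sketch)", overlay=true)

// Build the 08:00-09:30 initial range in New York time.
inBox = not na(time(timeframe.period, "0800-0930", "America/New_York"))
var float boxHi = na
var float boxLo = na
if timeframe.change("1D")
    boxHi := na
    boxLo := na
if inBox
    boxHi := na(boxHi) ? high : math.max(boxHi, high)
    boxLo := na(boxLo) ? low  : math.min(boxLo, low)

// After the 09:30 open, flag a sweep that closes back inside the box.
afterOpen = not na(time(timeframe.period, "0930-1600", "America/New_York"))
bullSweep = afterOpen and low  < boxLo and close > boxLo  // possible Bullish Venom
bearSweep = afterOpen and high > boxHi and close < boxHi  // possible Bearish Venom

plot(boxHi, "Box High", color=color.red,   style=plot.style_linebr)
plot(boxLo, "Box Low",  color=color.green, style=plot.style_linebr)
plotshape(bullSweep, "Bullish sweep", style=shape.triangleup,   location=location.belowbar, color=color.green)
plotshape(bearSweep, "Bearish sweep", style=shape.triangledown, location=location.abovebar, color=color.red)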
🟣 Bullish Setup
In a Bullish Venom Setup, the market first sweeps the low of the Venom Box after 09:30 AM, triggering sell-side liquidity collection. This downward move is often sharp and deceptive, designed to stop out retail long positions and attract new sellers. Once liquidity is taken, the market typically shifts direction, forming an MSS or CISD that signals a reversal to the upside.
Traders then wait for price to retrace into a Fair Value Gap or a demand-side Order Block created during the reversal leg. This retracement offers the ideal entry point for long positions. Stop-loss placement should be just below the liquidity sweep low, while profit targets are set at the Venom Box high and, if momentum continues, at higher session or daily highs.
🟣 Bearish Setup
In a Bearish Venom Setup, the process is similar but reversed. After the Initial Range is defined, if price breaks above the Venom Box high following the 09:30 AM open, it signals a false breakout designed to collect buy-side liquidity. This move usually traps eager buyers and clears out stop-losses above the high.
After the liquidity sweep, confirmation comes through an MSS or CISD pointing to a reversal downward. At this stage, traders anticipate a retracement into a Fair Value Gap or a supply-side Order Block formed during the reversal. Short entries are taken within this zone, with stop-loss positioned just above the liquidity sweep high. The logical profit targets include the Venom Box low and, in stronger bearish momentum, deeper session or daily lows.
🔵 Settings
Refine Order Block : Enables finer adjustments to Order Block levels for more accurate price responses.
Mitigation Level OB : Allows users to set specific reaction points within an Order Block, including: Proximal: Closest level to the current price. 50% OB: Midpoint of the Order Block. Distal: Farthest level from the current price.
FVG Filter : The indicator includes a filter for Fair Value Gaps (FVG), allowing different filtering based on FVG width: FVG Filter Type: Can be set to "Very Aggressive," "Aggressive," "Defensive," or "Very Defensive." Higher defensiveness narrows the FVG width, focusing on narrower gaps.
Mitigation Level FVG : As with the Order Block, you can set price reaction levels for the FVG with options such as Proximal, 50%, and Distal.
CISD : The Bar Back Check option enables traders to specify the number of past candles checked for identifying the CISD Level, enhancing CISD Level accuracy on the chart.
🔵 Conclusion
The ICT Venom Model is more than just a reversal setup; it is a complete intraday trading framework that blends liquidity theory, time precision, and market structure analysis. By focusing on the Initial Range between 08:00 and 09:30 AM New York time and observing how price reacts at the 09:30 AM open, traders can identify liquidity sweeps that reveal institutional intentions.
Whether in a Bullish Venom Setup or a Bearish Venom Setup, the model allows for precise entries through Fair Value Gaps (FVGs) and Order Blocks, while maintaining clear risk management with well-defined stop-loss and target levels.
Ultimately, the ICT Venom Model provides traders with a structured way to filter false moves and align their trades with institutional order flow. Its strength lies in transforming liquidity manipulation into actionable opportunities, giving intraday traders an edge in timing, accuracy, and consistency. For those who master its logic, the Venom Model becomes not only a strategy for entry and exit, but also a deeper framework for understanding how liquidity truly drives price in the New York session.
Quantile-Based Adaptive Detection🙏🏻 Dedicated to John Tukey. He invented the boxplot, and I finalized it.
QBAD (Quantile-Based Adaptive Detection) is ‘the’ adaptive (also optionally weighted = ready for timeseries) boxplot with more sensible fences. Instead of hardcoded multipliers for the outer fences, I base em on a set of quantile-based asymmetry metrics (you can view it as an ‘algorithmic’ counterpart of central & standardized moments). So the outer bands are Not hardcoded, not optimized, not cross-validated etc, simply calculated at O(n log n).
You can use it literally everywhere, in any context, with any continuous data, in any task that requires statistical control, novelty || outlier detection, without worrying and doubting the sense in arbitrarily chosen thresholds. Obviously, given the robust nature of quantiles, it fits best in cases where the data has problems.
The thresholds are:
Basis: the model of the data (median in our case);
Deviations: represent typical spread around basis, together form “value” in general sense;
Extensions: estimate data’s extremums via combination of quantile-based asymmetry metrics without relying on actual blunt min and max, together form “range” / ”frame”. Datapoints outside the frame/range are novelties or outliers;
Limits: based also on quantile asymmetry metrics, estimate the bounds within which values can ‘ever’ emerge given the current data generating process stays the same, together form “field”. Datapoints outside the field are very rare, happen when a significant change/structural break happens in current data-generating process, or when a corrupt datapoint emerges.
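To make the layering concrete, here is a plain rolling-quantile sketch: a median basis, percentile deviations, and extensions/limits extrapolated by naively repeating the inner quantile distance. This is emphatically not the QBAD asymmetry-metric construction (that lives in the script's code); it only illustrates the basis, deviations, extensions, limits structure:

//@version=5
indicator("Quantile Layers (illustrative sketch)", overlay=true)

len = input.int(100, "Window", minval=10)

// Plain rolling quantiles; the real QBAD fences use quantile-based
// asymmetry metrics instead of this naive extrapolation.
q(p) => ta.percentile_linear_interpolation(close, len, p)

basis = q(50)                    // median: the model of the data
devUp = q(75)                    // typical spread above the basis
devDn = q(25)                    // typical spread below the basis
extUp = devUp + (devUp - basis)  // crude stand-in for "extensions"
extDn = devDn - (basis - devDn)
limUp = extUp + (devUp - basis)  // crude stand-in for "limits"
limDn = extDn - (basis - devDn)

plot(basis, "Basis (median)", color=color.gray)
plot(devUp, "Deviation+", color=color.blue)
plot(devDn, "Deviation-", color=color.blue)
plot(extUp, "Extension+", color=color.orange)
plot(extDn, "Extension-", color=color.orange)
plot(limUp, "Limit+", color=color.red)
plot(limDn, "Limit-", color=color.red)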
…
The first part of the post is for locals xd, the second is for the wanderers/wizards/creators:
First part:
In terms of markets, mostly u gotta worry about dem instruments that represent crypto & FX assets: either activity (and hence data sources) there is decentralized, or the data is fishy.
For a higher algo-complexity cost of O(n log n), unlike MBAD which is O(n), this thing (a control system, in fact) works better with fishy data (contaminated with wrong values, incomplete, missing values etc). Read about the “breakdown point of an estimator” if you wanna understand it.
Even with good data, in cases when you have multiple instruments that represent the same asset, e.g. CL and BRN futures, and for some reason you wanna skip constructing a proper index of em (while you should), QBAD is better put on each instrument individually.
Another reason to use this algo-based rather than math-based tool might be cases where data quality is all good, but the actual causal processes that generate the data are a bit inconsistent and/or possess ‘increased’ activity in a way. So in high-volatility periods, this tool should perform better.
In terms of built-ins you got 2 weightings: by sequence and by inferred volume delta. The former should be ‘On’ all the time when you work with timeseries, unless you consciously want to turn it off for a reason. The latter you gotta keep ‘On’, unless you apply the tool to another dataset that ain’t got that particular additional dimension.
However you gonna use it (moving windows, cumulative windows with or without anchors, that’s your freedom of will), some stuff stays the same:
Basis and deviations are “value” levels. From a process-control perspective, if you please, it makes sense to Not only fade or push based on these levels, but to also do nothing when things are ambiguous and/or don’t require your intervention.
Extensions and limits are extreme levels. Here you either push or fade; doing nothing is not an option, these are decisive points in all the meanings.
Another important thing: lately I started to see one kind of trend, here on TradingView as well and in general in near-quant sources, of applying averages, percentiles etc ‘on’ other stationary metrics, so-called “indicators”. And I mean not for diagnostic or development reasons, but for decision making xd
This is not the evil crime ofc, but hillbilly af, cuz since the metrics are stationary you can model em, fit a distribution, like do smth sharper. Worst case you have Bayesian statistics armed with highest-density intervals and equal-tailed intervals, and even some others. All this stuff is not hard to do; if u ain’t doing it, it’s on you.
So what I’m saying is it makes sense to apply QBAD on returns ‘of your strategy’, on volume delta, but Not on other metrics that already do calculations over their own moving windows.
...
Second part:
Looks like some finna start to have lil suspicions that ‘maybe’ after all, math entities in reality are more like blueprints, while actual representations are physical/mechanical/algorithmic. Central & standardized moments are a math entity that represents location, scale & asymmetry info, and we can use it no problem, when things are legit and consistent especially. Real-world stuff tho sometimes deviates from that ideal, so we need smth more handy and real. Add to the mix the algo counterpart of means: quantiles.
Unlike the legacy quantile-based asymmetry metrics from the previous century (check quantile skewness & kurtosis), I don’t use arbitrary sets of quantiles; instead we get a binary pattern that is totally geometric & natural (check the code if interested, I made it very damn explicit). In the spirit of math-based central & standardized moments, each consequent pair is wider, emphasizing tail info more and more for each higher-order metric.
Unlike the classic box plot, where the inner thresholds are quartiles and the rest are based on em, here the basis is the median (minimizes L1), I base the inner thresholds on it, and we continue the pattern by basing each further set of levels on the previous set. So unlike the classic box plot, here we have coherency in construction, symmetry.
Another thing to pay attention to, tho for some reason ain’t many talk about it: it’s not conceptually right to think that “you got data and you apply std moments on it”. No, you apply em to data ‘centered around smth’. That ‘smth’ should minimize L2 error in case of math, L1 error in case of algo, and L0 error in case of learning/MLish/optimizational/whatever-you-call-it stuff. So in the case of L0, that’s actually the ‘mode’ of KDE, but that’s for another time. Anyways, in case of L2 it’s the mean, so we center data around the mean and apply std moments on the residuals. That’s the precise way of framing it. If you understand this, suddenly very interesting details like the 0th and 1st central moments start to make sense. In case of quantiles, we center data around the median and do further processing on the residuals, same.
The 0th moment (I call it init) is always 1, tho it’s interesting to extrapolate the sequence backwards for higher-order moment construction, to understand how we actually end up with this zero.
The 1st moment (I call it bias) of residuals would be zero if you match the centering and residual-analysis methods. But if for some reason you didn’t do that (e.g. centered data around the midhinge or mean and applied QBAD on the centered data), you have to account for that bias.
Realizing stuff > understanding stuff
Learning 2981234 human invented fields < realizing the same unified principles how the Universe works
∞