MIDAS VWAP Jayy
This is just a bash together of two MIDAS VWAP scripts, particularly those of AkifTokuz and drshoe.
I added the ability to show more MIDAS curves from the same script.
The algorithm primarily uses the bar number "n", but a date can be used for the 8th VWAP.
I have not converted the script to version 3.
To find a bar number, go into "Chart Properties", select "Background", then select "Indicator Titles" and "Indicator Values". When you place your cursor over a bar, the first number you see adjacent to the script title is the bar number; put that in the dialogue box. The midline is a MIDAS VWAP, the resistance is a MIDAS VWAP using bar highs, and the support is a MIDAS VWAP using bar lows.
In most cases, using N will suffice. However, if you are flipping around charts, inputting a specific date can be handy. In this way, you can compare the same point in time across multiple instruments, e.g. the first trading day of the year or an election date.
Adding dates into the dialogue box is a bit cumbersome, so in this version it is enabled for only one curve. I have called it VWAP and it follows the typical VWAP algorithm. (Does that make a difference? Read below for my opinion on the difference between MIDAS VWAP and VWAP.)
I have added the ability to start from the bottom or top of the initiating bar.
In theory, in a probable uptrend, pick the low of a pivot bar and start the MIDAS VWAP there using the support option.
For a downtrend, use the high of the pivot bar and select resistance. The best way to get a feel for this is to play with these values.
Difference between MIDAS VWAP and the regular VWAP
MIDAS itself, as described by Levine, uses a time-anchored On-Balance Volume (OBV) plotted on a graph where the horizontal (abscissa) arm of the graph is cumulative volume, not time. He called his VWAP curves Support/Resistance VWAP or S/R curves. These S/R curves are often referred to as "MIDAS curves".
These are the main components of the MIDAS chart. A third algorithm called the Top-Bottom Finder was also described. (Separate script).
Additional tools have been described in "MIDAS_Technical_Analysis"
Midas Technical Analysis: A VWAP Approach to Trading and Investing in Today’s Markets by Andrew Coles, David G. Hawkins
Copyright © 2011 by Andrew Coles and David G. Hawkins.
Denoting the different way in which Levine approached the calculation.
The difference between "MIDAS" VWAP and VWAP is, in my opinion, much ado about nothing. The algorithms generate identical curves, albeit the MIDAS algorithm launches the curve one bar later than the VWAP algorithm, which can be a pain in the neck. All of the algorithms that I looked at on TradingView step back one bar in time to initiate the MIDAS curve. As such, the plotted curves are identical to traditional VWAP, assuming the initiation is from the candle/bar midpoint.
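To make the mechanics concrete, here is a minimal Pine v5 sketch of a bar-anchored VWAP (an illustration only, not the published script; the input names and source options are simplified assumptions):
//@version=5
indicator("Anchored VWAP sketch", overlay = true)
anchorN = input.int(100, "Anchor bar number (N)")
srcOpt  = input.string("mid", "Launch from", options = ["mid", "high", "low"])
price   = srcOpt == "high" ? high : srcOpt == "low" ? low : hl2
var float pv  = na   // cumulative price * volume from the anchor
var float vol = na   // cumulative volume from the anchor
if bar_index == anchorN   // MIDAS-style scripts typically step back and anchor at N - 1
    pv  := price * volume
    vol := volume
else if bar_index > anchorN
    pv  += price * volume
    vol += volume
plot(pv / vol, "Anchored VWAP", color.orange)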
How did Levine intend the curves to be drawn?
On a reversal, he suggested that the Support and Resistance VWAP (S/R curve) be initiated after the reversal.
It is clear in his examples that this happens occasionally, but in many cases he initiates the so-called MIDAS S/R VWAP right at the reversal point. In any case, the algorithm is problematic if you wish to start a curve on the first bar of an IPO: you will get nothing, which is a pain. Also, in Levine's writings he describes simply clicking on the point from which an S/R VWAP is to be drawn.
As such, the generally accepted method of initiating the curve at N-1 is practical and sensible. The only issue is that you cannot draw the curve from the first bar of any security, as mentioned, without resorting to the typical VWAP algorithm. There is another difference: VWAP is launched from the middle of the bar (as per AlphaTrends), but you can also launch from the top or bottom of the bar (or anywhere for that matter). The calculation then proceeds using the top or bottom for each new bar.
The potential applications are discussed in the MIDAS Technical Analysis book.
Polynomial Regression Bands + Channel [DW]
This is an experimental study designed to calculate polynomial regression for any order polynomial that TV is able to support.
This study aims to educate users on polynomial curve fitting, and the derivation process of Least Squares Moving Averages (LSMAs).
I also designed this study with the intent of showcasing some of the capabilities and potential applications of TV's fantastic new array functions.
Polynomial regression is a form of regression analysis in which the relationship between the independent variable x and the dependent variable y is modeled as a polynomial of nth degree (order).
For clarification, linear regression can also be described as a first order polynomial regression. The process of deriving linear, quadratic, cubic, and higher order polynomial relationships is all the same.
In addition, although deriving a polynomial regression equation results in a nonlinear output, the process of solving for polynomials by least squares is actually a special case of multiple linear regression.
So, just like in multiple linear regression, polynomial regression can be solved in essentially the same way through a system of linear equations.
In this study, you are first given the option to smooth the input data using the 2 pole Super Smoother Filter from John Ehlers.
I chose this specific filter because I find it provides superior smoothing with low lag and fairly clean cutoff. You can, of course, implement your own filter functions to see how they compare if you feel like experimenting.
Filtering noise prior to regression calculation can be useful for providing a more stable estimation since least squares regression can be rather sensitive to noise.
This is especially true on lower sampling lengths and higher degree polynomials since the regression output becomes more "overfit" to the sample data.
Next, data arrays are populated for the x-axis and y-axis values. These are the main datasets utilized in the rest of the calculations.
To keep the calculations more numerically stable for higher periods and orders, the x array is filled with integers 1 through the sampling period rather than using current bar numbers.
This process can be thought of as shifting the origin of the x-axis as new data emerges.
This keeps the axis values significantly lower than the 10k+ bar values, thus maintaining more numerical stability at higher orders and sample lengths.
The data arrays are then used to create a pseudo 2D matrix of x power sums, and a vector of x power*y sums.
These matrices are a representation of the system of equations that need to be solved in order to find the regression coefficients.
Below, you'll see some examples of the pattern of equations used to solve for our coefficients represented in augmented matrix form.
For example, the augmented matrix for the system of equations required to solve a second order (quadratic) polynomial regression by least squares is formed like this:
(∑x^0 ∑x^1 ∑x^2 | ∑(x^0)y)
(∑x^1 ∑x^2 ∑x^3 | ∑(x^1)y)
(∑x^2 ∑x^3 ∑x^4 | ∑(x^2)y)
The augmented matrix for the third order (cubic) system is formed like this:
(∑x^0 ∑x^1 ∑x^2 ∑x^3 | ∑(x^0)y)
(∑x^1 ∑x^2 ∑x^3 ∑x^4 | ∑(x^1)y)
(∑x^2 ∑x^3 ∑x^4 ∑x^5 | ∑(x^2)y)
(∑x^3 ∑x^4 ∑x^5 ∑x^6 | ∑(x^3)y)
This pattern continues for any nth-order polynomial regression, in which the coefficient matrix is an (n+1)-wide square matrix with the last term being ∑x^2n, and the last term of the result vector being ∑(x^n)y.
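As a sketch of that pattern (using Pine v5's built-in matrix type for brevity; the study itself predates matrices and uses arrays), the augmented matrix for an order-n fit can be filled like this, where f_augmented is a hypothetical helper:
// Build the (n+1) x (n+2) augmented matrix of power sums from the x/y arrays.
f_augmented(array<float> xs, array<float> ys, int n) =>
    m = matrix.new<float>(n + 1, n + 2, 0.0)
    for k = 0 to xs.size() - 1
        float x = xs.get(k)
        float y = ys.get(k)
        for r = 0 to n
            for c = 0 to n
                m.set(r, c, m.get(r, c) + math.pow(x, r + c))      // sum of x^(r+c)
            m.set(r, n + 1, m.get(r, n + 1) + math.pow(x, r) * y)  // sum of (x^r)y
    m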
Thanks to this pattern, it's rather convenient to solve for our regression coefficients of any nth degree polynomial by a number of different methods.
In this script, I utilize a process known as LU Decomposition to solve for the regression coefficients.
Lower-upper (LU) Decomposition is a neat form of matrix manipulation that expresses a 2D matrix as the product of lower and upper triangular matrices.
This decomposition method is incredibly handy for solving systems of equations, calculating determinants, and inverting matrices.
For a linear system Ax=b, where A is our coefficient matrix, x is our vector of unknowns, and b is our vector of results, LU Decomposition turns our system into LUx=b.
We can then factor this into two separate matrix equations and solve the system using these two simple steps:
1. Solve Ly=b for y, where y is a new vector of unknowns that satisfies the equation, using forward substitution.
2. Solve Ux=y for x using backward substitution. This gives us the values of our original unknowns - in this case, the coefficients for our regression equation.
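A minimal sketch of those two passes (assuming the decomposition has already produced L and U as Pine v5 matrices; the study itself stores its system differently, so the names here are hypothetical):
f_lu_solve(matrix<float> L, matrix<float> U, array<float> b) =>
    int n = b.size()
    y = array.new<float>(n, 0.0)
    x = array.new<float>(n, 0.0)
    // Step 1: forward substitution, solving Ly = b from the top row down.
    for i = 0 to n - 1
        float s = b.get(i)
        if i > 0
            for j = 0 to i - 1
                s -= L.get(i, j) * y.get(j)
        y.set(i, s / L.get(i, i))
    // Step 2: backward substitution, solving Ux = y from the bottom row up.
    for i = n - 1 to 0
        float s = y.get(i)
        if i < n - 1
            for j = i + 1 to n - 1
                s -= U.get(i, j) * x.get(j)
        x.set(i, s / U.get(i, i))
    x // the regression coefficients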
After solving for the regression coefficients, the values are then plugged into our regression equation:
Y = a0 + a1*x + a2*x^2 + ... + an*x^n, where a() is the ()th coefficient in ascending order and n is the polynomial degree.
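A compact way to evaluate that equation from the ascending coefficient array is Horner's rule; f_poly_eval below is a hypothetical helper for illustration, not the script's own code:
f_poly_eval(array<float> coefs, float x) =>
    float y = 0.0
    if coefs.size() > 0
        for i = coefs.size() - 1 to 0   // walk coefficients from an down to a0
            y := y * x + coefs.get(i)
    y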
From here, an array of curve values for the period based on the current equation is populated, and standard deviation is added to and subtracted from the equation to calculate the channel high and low levels.
The calculated curve values can also be shifted to the left or right using the "Regression Offset" input.
Changing the offset parameter will move the curve left for negative values, and right for positive values.
This offset parameter shifts the curve points within our window while using the same equation, allowing you to use offset datapoints on the regression curve to calculate the LSMA and bands.
The curve and channel's appearance is optionally approximated using Pine's v4 line tools to draw segments.
Since there is a limitation on how many lines can be displayed per script, each curve consists of 10 segments with lengths determined by a user defined step size. In total, there are 30 lines displayed at once when active.
By default, the step size is 10, meaning each segment is 10 bars long. This is because the default sampling period is 100, so this step size will show the approximate curve for the entire period.
When adjusting your sampling period, be sure to adjust your step size accordingly when curve drawing is active if you want to see the full approximate curve for the period.
Note that when you have a larger step size, you will see more seemingly "sharp" turning points on the polynomial curve, especially on higher degree polynomials.
The polynomial functions that are calculated are continuous and differentiable across all points. The perceived sharpness is simply due to our limitation on available lines to draw them.
The approximate channel drawings also come equipped with style inputs, so you can control the type, color, and width of the regression, channel high, and channel low curves.
I also included an input to determine if the curves are updated continuously, or only upon the closing of a bar for reduced runtime demands. More about why this is important in the notes below.
For additional reference, I also included the option to display the current regression equation.
This allows you to easily track the polynomial function you're using, and to confirm that the polynomial is properly supported within Pine.
There are some cases that aren't supported properly due to Pine's limitations. More about this in the notes on the bottom.
In addition, I included a line of text beneath the equation to indicate how many bars left or right the calculated curve data is currently shifted.
The display label comes equipped with style editing inputs, so you can control the size, background color, and text color of the equation display.
The Polynomial LSMA, high band, and low band in this script are generated by tracking the current endpoints of the regression, channel high, and channel low curves respectively.
The output of these bands is similar in nature to Bollinger Bands, but with an obviously different derivation process.
By displaying the LSMA and bands in tandem with the polynomial channel, it's easy to visualize how LSMAs are derived, and how the process that goes into them is drastically different from a typical moving average.
The main difference between LSMA and other MAs is that LSMA is showing the value of the regression curve on the current bar, which is the result of a modelled relationship between x and the expected value of y.
With other MA / filter types, they are typically just averaging or frequency filtering the samples. This is an important distinction in interpretation. However, both can be applied similarly when trading.
An important distinction with the LSMA in this script is that since we can model higher degree polynomial relationships, the LSMA here is not limited to only linear as it is in TV's built in LSMA.
Bar colors are also included in this script. The color scheme is based on disparity between source and the LSMA.
This script is a great study for educating yourself on the process that goes into polynomial regression, as well as one of the many processes computers utilize to solve systems of equations.
Also, the Polynomial LSMA and bands are great components to try implementing into your own analysis setup.
I hope you all enjoy it!
--------------------------------------------------------
NOTES:
- Even though the algorithm used in this script can be implemented to find any order polynomial relationship, TV has a limit on the significant figures for its floating point outputs.
This means that as you increase your sampling period and / or polynomial order, some higher order coefficients will be output as 0 due to floating point round-off.
There is currently no viable workaround for this issue since there isn't a way to calculate more significant figures than the limit.
However, in my humble opinion, fitting a polynomial higher than cubic to most time series data is "overkill" due to bias-variance tradeoff.
Although, this tradeoff is also dependent on the sampling period. Keep that in mind. A good rule of thumb is to aim for a nice "middle ground" between bias and variance.
If TV ever chooses to expand its significant figure limits, then it will be possible to accurately calculate even higher order polynomials and periods if you feel the desire to do so.
To test if your polynomial is properly supported within Pine's constraints, check the equation label.
If you see a coefficient value of 0 in front of any of the x values, reduce your period and / or polynomial order.
- Although this algorithm has less computational complexity than most other linear system solving methods, this script itself can still be rather demanding on runtime resources - especially when drawing the curves.
In the event you find your current configuration is throwing back an error saying that the calculation takes too long, there are a few things you can try:
-> Refresh your chart or hide and unhide the indicator.
The runtime environment on TV is very dynamic and the allocation of available memory varies with collective server usage.
By refreshing, you can often get it to process since you're basically just waiting for your allotment to increase. This method works well in a lot of cases.
-> Change the curve update frequency to "Close Only".
If you've tried refreshing multiple times and still have the error, your configuration may simply be too demanding of resources.
v4 drawing objects, most notably lines, can be highly taxing on the servers. That's why Pine has a limit on how many can be displayed in the first place.
By limiting the curve updates to only bar closes, this will significantly reduce the runtime needs of the lines since they will only be calculated once per bar.
Note that doing this will only limit the visual output of the curve segments. It has no impact on regression calculation, equation display, or LSMA and band displays.
-> Uncheck the display boxes for the drawing objects.
If you still have troubles after trying the above options, then simply stop displaying the curve - unless it's important to you.
As I mentioned, v4 drawing objects can be rather resource intensive. So a simple fix that often works when other things fail is to just stop them from being displayed.
-> Reduce sampling period, polynomial order, or curve drawing step size.
If you're having runtime errors and don't want to sacrifice the curve drawings, then you'll need to reduce the calculation complexity.
If you're using a large sampling period, or high order polynomial, the operational complexity becomes significantly higher than lower periods and orders.
When you have larger step sizes, more historical referencing is used for x-axis locations, which does have an impact as well.
By reducing these parameters, the runtime issue will often be solved.
Another important detail to note with this is that you may have configurations that work just fine in real time, but struggle to load properly in replay mode.
This is because the replay framework also requires its own allotment of runtime, so that must be taken into consideration as well.
- Please note that the line and label objects are reprinted as new data emerges. That's simply the nature of drawing objects vs standard plots.
I do not recommend or endorse basing your trading decisions on the drawn curve. That component is merely to serve as a visual reference of the current polynomial relationship.
No repainting occurs with the Polynomial LSMA and bands though. Once the bar is closed, that bar's calculated values are set.
So when using the LSMA and bands for trading purposes, you can rest easy knowing that history won't change on you when you come back to view them.
- For those who intend on utilizing or modifying the functions and calculations in this script for their own scripts, I included debug dialogues in the script for all of the arrays to make the process easier.
To use the debugs, see the "Debugs" section at the bottom. All dialogues are commented out by default.
The debugs are displayed using label objects. By default, I have them all located to the right of current price.
If you wish to display multiple debugs at once, it will be up to you to decide on display locations at your leisure.
When using the debugs, I recommend commenting out the other drawing objects (or even all plots) in the script to prevent runtime issues and overlapping displays.
Linear Regression Angle
There are several Linear Regression indicators in the Public Library, but I don't think there is one that converts the Linear Regression (LR) curve into an angle in degrees, relative to a set reference frame. Due to the large price range between tickers, creating this indicator isn't as straightforward as I originally thought. For example, given the same time period, a stock that fluctuates in the 10's will have a true linear regression angle dramatically different from a penny stock's. Even changing the scale on your chart will affect the "apparent" angle you see on the chart. Hence, this indicator DOES NOT provide the true linear regression angle, but only a relative one based on a defined number of historical bars.
Originality and usefulness
This indicator provides the Linear Regression (LR) Angle in degrees, which may be more easily interpreted by some traders, as we are more accustomed to line angles in degrees and know how to visualize them.
This script also provides the option to overlay up to four LR curves of different periods, as well as an average curve of the enabled curves. This allows traders to analyze short- to long-term trends.
Furthermore, the slope (rate of change) of each LR curve can be toggled. The slope plot can help traders visualize accelerations and decelerations of the LR curves, which may help in spotting trend reversals.
The data table provides real-time data for each curve.
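For illustration, a minimal Pine v5 sketch of the idea (the normalization below is one plausible reference frame and is my assumption, not necessarily the published script's exact method):
//@version=5
indicator("LR Angle sketch")
len   = input.int(30, "LR length")
frame = input.int(200, "Reference frame (bars)")
// Slope of the LR curve in price units per bar.
slope = ta.linreg(close, len, 0) - ta.linreg(close, len, 1)
// Normalize by the average per-bar range of the reference window so the
// angle is comparable across tickers with very different price scales.
scale = (ta.highest(high, frame) - ta.lowest(low, frame)) / frame
angle = math.todegrees(math.atan(slope / scale))
plot(angle, "Relative LR angle (deg)", color.teal)
hline(0)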
Example of using the slope plot with a 30-bar Linear Regression Angle:
Recession Warning Traffic Light
This is an indicator that uses 6 different metrics to determine the combined probability of a recession and compares the high probability warning periods against actual historical periods of recession.
GREEN tells us that the referenced recession indicators are not exhibiting any warning. Observe the long stretches of “all-green” in between recessionary periods in the chart above.
RED will show a full-on warning level for that particular recession indicator, signaling that monitoring of this sector is clearly showing a problem – which has in the past, reliably exhibited itself as a forewarning of recessions.
Adding green and red together can help determine a combined probability of recession.
IMPORTANT: Your chart should be on 1D and set to the SPX, DJI, or NDQ indices.
Precious metals: This indicator calculates the relative prices of gold & rhodium. Gold is a flight-to-quality asset. Rhodium is the rarest of the precious industrial metals, and its price spikes when the economy is heating up. Ahead of a recession, rhodium's relative upward movement precedes gold's.
Stock markets: This indicator compares closing prices to growth rate curves of the SPX. It is the noisiest of the indicators but tells us very well when a recession has ended. Stock market indices respond to "smart money" moving out of markets when the other indicators begin to warn of recession, or when markets become overheated and rise to historically unsustainable levels.
Yield curve: This indicator compares the 3m & 10y treasuries and detects yield curve inversions. Interest rates are controlled by the Federal Reserve and by the purchasers in the Federal Treasury auction markets, which together create the treasury yield curve. This inversion is the most reliable recession indicator. These happen during a flight to quality.
Federal Reserve: This indicator measures GDP and detects contraction which is technically a recession. This is usually one of the last indicators to enter a Warning state, and it could be 6 months delayed simply confirming what may have already been projected.
Money supply: This indicator measures the M2 money supply, which typically grows about 1% per calendar quarter. When this shrinks, it's tapping the brakes on the economy. This can also lead to yield curve inversion. This is also a measure of inflation and its effects on the aggregate money supply (liquid capital) available for short-term economic activity, or which can be directed into the purchase of long-term, less liquid assets.
Leading Economic factors: There is a whole basket of leading economic indicators that, as collections, reflect overall growth or contraction of economic activity. These indicators include measures of level and growth in productivity, employment, housing, consumer confidence, industrial purchasing confidence, and much more. These indicators may or may not be detached from the broader economy, and often provide up to 6 months of foresight. For more information please visit www.conference-board.org
Actual Recession: Central Bank indicators are published by the Federal Reserve and reflect their own analysis of national and regional economic health, as well as their calculations of the likelihood of a recession. The Federal Reserve has a recession ticker which is used to plot periods of actual recessions on this indicator for comparison.
Gann Box (Zeiierman)
█ Overview
The Gann Box (Zeiierman) is an indicator that provides visual insights using the principles of W.D. Gann's trading methods. Gann's techniques are based on geometry, astronomy, and astrology, and are used to predict important price levels and market trends. This indicator helps traders identify potential support and resistance levels, and forecast future price movements.
Gann used angles and various geometric constructions to divide time and price into proportionate parts. Gann indicators are often used to predict areas of support and resistance, key tops and bottoms, and future price moves.
█ How It Works
The indicator operates by identifying high and low points within a visible range on the chart and drawing a Gann Box between these points. The box is divided into segments based on selected percentages, which represent key levels for observing market reactions. It includes options to display labels, a Gann fan, and Gann angles for analysis. Advanced features allow extending the box into the future for predictive analysis and reversing its orientation for alternative viewpoints.
High and Low Points Identification: It starts by locating the highest and lowest price points visible on the chart.
Gann Box Construction: Draws a box from these points and divides it according to specified percentages, highlighting potential support and resistance levels.
█ How to Use
Support and Resistance Levels
Using Gann angles to forecast support and resistance is probably the most popular way they are used. This technique frames the market, allowing the analyst to read the movement of the market inside this framework.
The lines within the Gann Box, drawn at the key percentages, create a grid of potential support and resistance levels. As prices fluctuate, these lines can act as barriers to price movement, with the price often pausing or reversing at these intervals.
Forecasting with the 'Extend' Feature: The indicator's ability to extend lines and boxes into the future provides traders with a forward-looking tool to anticipate potential market movements and prepare for them.
Gann Fan: This feature draws lines at a significant price angle, helping traders identify potential support and resistance levels based on the theory that prices move in predictable patterns.
Gann Curves: Gann Curves display dynamic support and resistance levels, aiding in the analysis of momentum and trend strength.
█ Settings
The indicator includes several settings that allow customization of its appearance and functionality:
⚪ General Settings
Reverse: This setting changes the orientation of labels and calculations within the Gann Box, providing alternative analytical perspectives. It essentially flips the Gann Box's direction, which can be useful in different market conditions or analysis scenarios.
Extend: Extends the drawing of Gann lines or boxes into the future beyond the current last bar. This feature is essential for forecasting future price movements and identifying potential support or resistance levels that lie outside the current price action.
⚪ Gann Box
Show Box: Toggles the visibility of the Gann Box on the chart. The Gann Box is a fundamental tool in Gann analysis, highlighting key levels based on selected high and low points to identify potential support and resistance areas.
Show Fibonacci Labels: Controls the display of Fibonacci labels within the Gann Box. These labels mark specific Fibonacci retracement levels, aiding traders in recognizing significant levels for potential reversals.
Box Visibility: Allows users to enable or disable individual boxes within the Gann Box, providing flexibility in focusing on specific levels of interest.
Percentage Levels: Defines the Fibonacci levels within the Gann Box. Traders can adjust these levels to customize the Gann Box according to their specific analysis needs.
Coloring: Customizes the color of each level within the Gann Box, enhancing visual clarity and differentiation between levels.
⚪ Gann Fan
Show Fan: Enables the Gann Fan, which draws lines at significant Gann angles from a particular point on the chart, helping identify potential support and resistance levels.
Fan Percentages and Coloring: Similar to the Gann Box, these settings allow traders to customize which Gann angles are displayed and how they are colored.
⚪ Gann Curves
Show Curves: When enabled, this setting draws Gann Curves on the chart. These curves are based on Gann percentages and provide a dynamic view of support and resistance levels as they adapt to changing market conditions.
Curve Percentages and Coloring: Define which curves are displayed and their colors, allowing for a tailored analysis experience.
⚪ Gann Angles
Show Angles: Toggles the display of Gann Angles, which are crucial for understanding the market's price and time dynamics, offering insights into future support and resistance levels.
Coloring: Customizes the color of the Gann Angles, making it easier to differentiate between various angles on the chart.
█ Alerts
The indicator includes several alert conditions for price breakouts from the Gann Box and specific levels, enabling traders to be notified of significant market movements.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Finite Difference - Backward (mcbw_)
In calculus there exists a 'derivative', which simply measures the difference between two points on a curve. For well-behaved mathematical functions there are infinitely many points, and so there exists a derivative at every point. A curve with infinitely many points is called 'continuous'. Continuous curves are very nice to deal with since each point on one exists almost exactly where its neighbors are. However, if the curve does not have infinitely many points on it, but instead a finite number of points, that curve is called 'discrete' instead of continuous. Taking the derivative of discrete curves is much trickier business since there are none of the mathematical conveniences that a continuous curve offers. In the real world everything we measure is a discrete curve, including Price (since we measure it a finite number of times, aka once per candlestick)!
The branch of Discrete Mathematics has found an approach to measure the derivative along a discrete curve; that approach is aptly called "Finite Difference". To get a more accurate approximation of a discrete derivative, the finite difference approach uses weighted combinations of neighboring points. The most common type of finite difference is a 'central' difference, which uses a combination of points before and after the point of interest to approximate the discrete derivative. This is great for historical analysis but is not of much use for trading algorithms since it technically means using future prices to calculate the derivative of the current point. Instead we can use a less common variant called a 'Backwards Difference' that only uses a combination of points before the current one to help approximate the current derivative.
In this script you can choose the "Order" of your derivative and the "Accuracy" of its approximation. This script is for educational purposes for folks building trading algorithms. Many trading algorithms often have an element of seeing how much price has changed from the previous candle to the current candle. This approach is the lowest-accuracy derivative possible, and using the backwards finite differences, made available for the first time on TradingView (!!), algorithms that use derivatives can now have higher orders of accuracy!
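For illustration, a minimal Pine v5 sketch of two accuracy levels for the first derivative (these are the standard backward-difference weights, not code from this script):
//@version=5
indicator("Backward difference sketch")
// Lowest-accuracy version most algorithms use: the one-bar change.
d1_o1 = close - close[1]
// Second-order-accurate backward difference of the first derivative:
// f'(x) is approximately (3*f[0] - 4*f[1] + f[2]) / 2
d1_o2 = (3 * close - 4 * close[1] + close[2]) / 2
plot(d1_o1, "Accuracy 1", color.gray)
plot(d1_o2, "Accuracy 2", color.orange)
hline(0)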
Happy Trading/Developing!
[MAD] Acceleration based dampened SMA projections
This indicator utilizes concepts of arrays inside arrays to calculate and display projections of multiple Smoothed Moving Average (SMA) lines via polylines.
This is partly an experiment and an educational post on how to work with multidimensional arrays by using User-Defined Types.
------------------
Input Controls for User Interaction:
The indicator provides several input controls, allowing users to adjust parameters like the SMA window, acceleration window, and dampening factors.
This flexibility lets users customize the behavior and appearance of the indicator to fit their analysis needs.
sma length:
Defines the length of the simple moving average (SMA).
acceleration window:
Sets the window size for calculating the acceleration of the SMA.
Input Series:
Selects the input source for calculating the SMA (typically the closing price).
Offset:
Determines the offset for the input source, affecting the positioning of the SMA. Here it's possible to add external indicators like Bollinger Bands; in that case, as a double SMA, this SMA should be very short.
(Thanks Fikira for that idea)
Startfactor dampening:
Initial dampening factor for the polynomial curve projections, influencing their starting curvature.
Growfactor dampening:
Growth rate of the dampening factor, affecting how the curvature of the projections changes over time.
Prediction length:
Sets the length of the projected polylines, extending beyond the current bar.
cleanup history:
Boolean input to control whether to clear the previous polyline projections before drawing new ones.
Key technologies used in this indicator include:
User-Defined Types (UDT) :
This indicator uses UDT to create a custom type named type_polypaths.
This type is designed to store information for each polyline, including an array of points (array), a color for the polyline, and a dampening factor.
UDTs in Pine Script enable the creation of complex data structures, which are essential for organizing and manipulating data efficiently.
type type_polypaths
    array<chart.point> polyline_points = na
    color polyline_color = na
    float dampening_factor = na
Arrays and Nested Arrays:
The script heavily utilizes arrays.
For example, it uses a color array (colorpreset) to store different colors for the polyline.
Moreover, an array of type_polypaths (polypaths) is used, which is an array consisting of user-defined types. Each element of this array contains another array (polyline_points), demonstrating nested array usage.
This structure is essential for handling multiple polylines, each with its set of points and attributes.
var array<type_polypaths> polypaths = array.new<type_polypaths>()
Polyline Creation and Manipulation:
The core visual aspect of the indicator is the creation of polylines.
Polyline points are calculated based on a dampened polynomial curve, which is influenced by the SMA's slope and acceleration.
Filling initial dampening data
array_size = 9
middle_index = math.floor(array_size / 2)
for i = 0 to array_size - 1
    damp_factor = f_calculate_damp_factor(i, middle_index, Startfactor, Growfactor)
    polyline_color = colorpreset.get(i)
    polypaths.push(type_polypaths.new(array.new<chart.point>(0, na), polyline_color, damp_factor))
The script dynamically generates these polyline points and stores them in the polyline_points array of each type_polypaths instance based on those prefilled dampening factors
if barstate.islast or cleanup == false
    for damp_factor_index = 0 to polypaths.size() - 1
        GET_RW = polypaths.get(damp_factor_index)
        GET_RW.polyline_points.clear()
        for i = 0 to predictionlength
            y = f_dampened_poly_curve(bar_index + i, src_input, sma_slope, sma_acceleration, GET_RW.dampening_factor)
            p = chart.point.from_index(bar_index + i - src_off, y)
            GET_RW.polyline_points.push(p)
        polypaths.set(damp_factor_index, GET_RW)
Polyline Drawout
The polyline is then drawn on the chart using the polyline.new() function, which uses these points and additional attributes like color and width.
for pl_s = 0 to polypaths.size() - 1
    GET_RO = polypaths.get(pl_s)
    polyline.new(points = GET_RO.polyline_points, line_width = 1, line_color = GET_RO.polyline_color, xloc = xloc.bar_index)
If the cleanup input is enabled, existing polylines are deleted before new ones are drawn, maintaining clarity and accuracy in the visualization.
if cleanup
    for pl_delete in polyline.all
        pl_delete.delete()
------------------
The mathematics
in the (ABDP) indicator primarily focuses on projecting the behavior of a Smoothed Moving Average (SMA) based on its current trend and acceleration.
SMA Calculation:
The indicator computes a simple moving average (SMA) over a specified window (sma_window). This SMA serves as the baseline for further calculations.
Slope and Acceleration Analysis:
It calculates the slope of the SMA by subtracting the current SMA value from its previous value. Additionally, it computes the SMA's acceleration by evaluating the sum of differences between consecutive SMA values over an acceleration window (acceleration_window). This acceleration represents the rate of change of the SMA's slope.
sma_slope = src_input - src_input[1]
sma_acceleration = sma_acceleration_sum_calc(src_input, acceleration_window) / acceleration_window
sma_acceleration_sum_calc(src, window) =>
    sum = 0.0
    for i = 0 to window - 1
        if not na(src[i + 2])
            sum := sum + src[i] - 2 * src[i + 1] + src[i + 2]
    sum
Dampening Factors:
Custom dampening factors for each polyline, which are based on the user-defined starting and growth factors (Startfactor, Growfactor).
These factors adjust the curvature of the projected polylines, simulating various future scenarios of SMA movement.
f_calculate_damp_factor(index, middle, start_factor, growth_factor) =>
    start_factor + (index - middle) * growth_factor
Polynomial Curve Projection:
Using the SMA value, its slope, acceleration, and dampening factors, the script calculates points for polynomial curves. These curves represent potential future paths of the SMA, factoring in its current direction and rate of change.
f_dampened_poly_curve(index, initial_value, initial_slope, acceleration, damp_factor) =>
    delta = index - bar_index
    initial_value + initial_slope * delta + 0.5 * damp_factor * acceleration * delta * delta
damp_factor = f_calculate_damp_factor(i, middle_index, Startfactor, Growfactor)
Have fun trading :-)
Fourier Extrapolator of 'Caterpillar' SSA of Price [Loxx]
Fourier Extrapolator of 'Caterpillar' SSA of Price is a forecasting indicator that applies Singular Spectrum Analysis to input price and then injects that transformed value into the Quinn-Fernandes Fourier Transform algorithm to generate a price forecast. The indicator plots two curves: the green/red curve indicates modeled past values and the yellow/fuchsia dotted curve indicates the future extrapolated values.
What is the Fourier Transform Extrapolator of price?
Fourier Extrapolator of Price fits a multi-harmonic (or multi-tone) trigonometric model to a price series x_i, i=1..n, given by:
x_i = m + Sum( a_h*Cos(w_h*i) + b_h*Sin(w_h*i), h=1..H )
Where:
x_i - past price at i-th bar, total n past prices;
m - bias;
a_h and b_h - scaling coefficients of the h-th harmonic;
w_h - frequency of the h-th harmonic;
h - harmonic number;
H - total number of fitted harmonics.
Fitting this model means finding m, a_h, b_h, and w_h that make the modeled values close to the real values. Finding the harmonic frequencies w_h is the most difficult part of fitting a trigonometric model. In the case of a Fourier series, these frequencies are set at 2*pi*h/n. But the Fourier series extrapolation means simply repeating the n past prices into the future.
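Once m, a_h, b_h, and w_h are found, evaluating the model at bar i is straightforward; a hypothetical Pine sketch (the fitting itself is the Quinn-Fernandes step described next):
f_model(float m, array<float> a, array<float> b, array<float> w, int i) =>
    float xi = m
    if a.size() > 0
        for h = 0 to a.size() - 1
            xi += a.get(h) * math.cos(w.get(h) * i) + b.get(h) * math.sin(w.get(h) * i)
    xi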
The Quinn-Fernandes algorithm finds the harmonic frequencies. It fits harmonics of the trigonometric series one by one until the specified total number of harmonics H is reached. After fitting a new harmonic, the coded algorithm computes the residue between the updated model and the real values and fits a new harmonic to the residue.
see: B. G. Quinn and J. M. Fernandes, "A Fast Efficient Technique for the Estimation of Frequency," Biometrika, Vol. 78, No. 3 (Sep. 1991), pp. 489-497, Oxford University Press.
Fourier Transform Extrapolator of Price inputs are as follows:
npast - number of past bars to which the trigonometric series is fitted;
nharm - total number of harmonics in model;
frqtol - tolerance of frequency calculations.
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA. This methodology was developed in the former Soviet Union independently of the mainstream SSA (the 'iron curtain effect'). The main difference between the mainstream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA, one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be 'red'). In the "Caterpillar" SSA, the main methodological stress is on separability (of one component of the series from another), and neither the assumption of stationarity nor the model in the form "signal plus noise" is required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
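A sketch of step 1, the delay embedding, assuming a window array xs with at least `lag` elements; f_trajectory is a hypothetical helper using Pine v5 matrices, not the indicator's exact code:
f_trajectory(array<float> xs, int lag) =>
    int k = xs.size() - lag + 1            // number of lagged windows
    m = matrix.new<float>(lag, k, 0.0)
    for i = 0 to lag - 1
        for j = 0 to k - 1
            m.set(i, j, xs.get(i + j))     // Hankel structure: constant anti-diagonals
    m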
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
"Caterpillar" SSA inputs are as follows:
lag - How much lag to introduce into the SSA algorithm; the higher this number, the slower the process and the smoother the signal
ncomp - Number of computations or cycles of the SSA algorithm; the higher, the slower
ssapernorm - SSA Period Normalization
numbars - number of past bars to which SSA is fitted
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
Related Fourier Transform Indicators
Real-Fast Fourier Transform of Price w/ Linear Regression
Fourier Extrapolator of Variety RSI w/ Bollinger Bands
Fourier Extrapolator of Price w/ Projection Forecast
Related Projection Forecast Indicators
Itakura-Saito Autoregressive Extrapolation of Price
Helme-Nikias Weighted Burg AR-SE Extra. of Price
Related SSA Indicators
End-pointed SSA of FDASMA
End-pointed SSA of Williams %R
Fourier Extrapolator of Price w/ Projection Forecast [Loxx]
Due to popular demand, I'm publishing Fourier Extrapolator of Price w/ Projection Forecast. As stated in its twin indicator, this one also fits a multi-harmonic (or multi-tone) trigonometric model to a price series x_i, i=1..n, given by:
x_i = m + Sum( a_h*Cos(w_h*i) + b_h*Sin(w_h*i), h=1..H )
Where:
x_i - past price at i-th bar, total n past prices;
m - bias;
a_h and b_h - scaling coefficients of the h-th harmonic;
w_h - frequency of the h-th harmonic;
h - harmonic number;
H - total number of fitted harmonics.
Fitting this model means finding m, a_h, b_h, and w_h that make the modeled values close to the real values. Finding the harmonic frequencies w_h is the most difficult part of fitting a trigonometric model. In the case of a Fourier series, these frequencies are set at 2*pi*h/n. But the Fourier series extrapolation means simply repeating the n past prices into the future.
This indicator uses the Quinn-Fernandes algorithm to find the harmonic frequencies. It fits harmonics of the trigonometric series one by one until the specified total number of harmonics H is reached. After fitting a new harmonic, the coded algorithm computes the residue between the updated model and the real values and fits a new harmonic to the residue.
see: B. G. Quinn and J. M. Fernandes, "A Fast Efficient Technique for the Estimation of Frequency," Biometrika, Vol. 78, No. 3 (Sep. 1991), pp. 489-497, Oxford University Press.
The indicator has the following input parameters:
src - input source
npast - number of past bars to which the trigonometric series is fitted;
Nfut - number of predicted future bars;
nharm - total number of harmonics in model;
frqtol - tolerance of frequency calculations.
The indicator plots two curves: the green/red curve indicates modeled past values and the yellow/fuchsia curve indicates the modeled future values.
The purpose of this indicator is to showcase the Fourier Extrapolator method to be used in future indicators.
Volume Profile with a few polylines
The base of "Volume Profile with a few polylines" is another script of mine, Volume Profile (Maps).
The structure of maps is used to gather the data; however, the drawing is done with polylines.
This enables coders to draw an entire volume profile, over a broader range, with just a few polylines.
The result is the ability to draw more "lines" than with line.new() / box.new() alone.
🔶 CONCEPTS
🔹 Polylines
polyline.new creates a new polyline instance and displays it on the chart, sequentially connecting all of the points in the `points` array with line segments.
The segments in the drawing can be straight or curved depending on the `curved` parameter.
In this script, points are connected starting from the bottom. The line moves up until it reaches a price level where a volume value needs to be displayed;
there it goes left to the corresponding volume value, comes back along the same price level to its initial x-axis,
and then continues to rise until all values are displayed.
A polyline can contain a maximum of 10,000 points (10K).
Since the line has to go back and forth, each price/volume line takes 3 points.
In the case that 20K bars all have a different price, we would need 60K points, or just 6 polylines. A maximum of 100 polylines can be displayed.
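A sketch of those 3 points per level (a hypothetical helper for illustration; x0 is the profile's axis bar index):
f_add_level(array<chart.point> pts, int x0, float price, int volWidth) =>
    pts.push(chart.point.from_index(x0, price))             // on the axis
    pts.push(chart.point.from_index(x0 - volWidth, price))  // left, out to the volume value
    pts.push(chart.point.from_index(x0, price))             // back to the axis, ready to rise
    pts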
The 3 highest volume values are displayed with line.new(), each with their own colour.
🔹 Maps
A map object is a collection that consists of key-value pairs.
Each key is unique and can only appear once. When adding a new value with a key that the map already contains, that value replaces the old value associated with the key.
You can, however, change the value of a particular key, for example by adding volume (value) at the same price (key); the latter technique is used in this script.
Volume is added to the map, associated with a particular price (default close, can be set at high, low, open,...)
When the map already contains the same price (key), the value (volume) is added to the existing volume at the associated price.
A map can contain maximum 50K values, which is more than enough to hold 20K bars (Basic 5K - Premium plan 20K), so the whole history can be put into a map.
🔹 Rounding function
This publication contains 2 round functions, which can be used to widen the Volume Profile
Round
• "Round" set at zero -> nothing changes to the source number
• "Round" set below zero -> x digit(s) after the decimal point, starting from the right side, and rounded.
• "Round" set above zero -> x digit(s) before the decimal point, starting from the right side, and rounded.
Example: 123456.789
0->123456.789
1->123456.79
2->123456.8
3->123457
-1->123460
-2->123500
Step
Another option is custom steps.
After setting "Round" to "Step", choose the desired step in price (a sketch of this snapping follows the examples below).
Examples
• 2 -> 1234.00, 1236.00, 1238.00, 1240.00
• 5 -> 1230.00, 1235.00, 1240.00, 1245.00
• 100 -> 1200.00, 1300.00, 1400.00, 1500.00
• 0.05 -> 1234.00, 1234.05, 1234.10, 1234.15
•••
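A minimal sketch of the snapping behind "Step" (my assumption of the mechanics, not the script's literal code):
f_step(float price, float step) =>
    math.round(price / step) * step   // e.g. step = 5 maps 1234.00 to 1235.00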
🔶 FEATURES
🔹 Volume * currency
Let's take BTCUSD as an example. Relative to USD, 10 volume at a price of 100 is very different from 10 volume at a price of 30000 (1K vs. 300K).
If you want volume to be associated with USD, enable Volume * currency . Volume will then be multiplied by the price:
• 10 volume, 1 BTC = 100 -> 1000
• 10 volume, 1 BTC = 30K -> 300K
Polylines have the attributes curved & closed.
When "curved" is enabled the drawing will connect all points from the `points` array using curved line segments.
When "closed" is enabled the drawing will also connect the first point to the last point from the `points` array, resulting in a closed polyline.
They are default disabled, but can be enabled:
🔶 DETAILS
🔹 Put
When the map doesn't contain a price, it will be added, using map.put(id, key, value)
In our code:
map.put(originalMap, price, volume)
or
originalMap.put(price, volume)
A key (price) is now associated with a value (volume) -> key : value
Since all keys are unique, we don't have to know its position to extract the value, we just need to know the key -> map.get(id, key)
We use map.get() when a certain key already exists in the map, and we want to add volume with that value.
if originalMap.contains(price)
    originalMap.put(price, originalMap.get(price) + volume)
-> At the last bar, all prices (source) are now associated with volume.
🔶 SETTINGS
Source: Set source of choice; default close, can be set to high, low, open, ...
Volume * currency: Enable to multiply volume by price (see Features)
Amount of bars: Set the number of bars you want to include in the Volume Profile
🔹 Round -> 'Round/Step'
Round -> see Concepts
Step -> see Concepts
🔹 Display Volume Profile
Offset: shifts the Volume Profile (max. 500 bars to the right of the last bar, see Features)
Max width Volume Profile: the largest volume will be x bars wide; the rest is displayed as a ratio against the largest volume (see Features)
Colours
Curved: make lines curved
Closed: connect last with first point
🔶 LIMITATIONS
• Lines won't go further than first bar (coded).
• The Volume Profile can be placed a maximum of 500 bars to the right of the last price.
PLR-Z For Loop
🧠 Overview
PLR-Z For Loop is a trend-following indicator built on the Power Law Residual Z-score model of Bitcoin price behavior. By measuring how far price deviates from a long-term power law regression and applying a custom scoring loop, this tool identifies consistent directional pressure in market structure. Designed for BTC, this indicator helps traders align with macro trends.
🧩 Key Features
Power Law Residual Model: Tracks deviations of BTC price from its long-term logarithmic growth curve.
Z-Score Normalization: Applies long-horizon statistical normalization (400/1460 bars) to smooth residual deviations into a usable trend signal.
Loop-Based Trend Filter: Iteratively scores how often the current Z-score exceeds prior values, emphasizing trend persistence over volatility.
Optional Smoothing: Toggleable exponential smoothing helps filter noise in choppier market conditions.
Directional Regime Coloring: Aqua (bullish) and Red (bearish) visuals reinforce trend alignment across plots and candles.
🔍 How It Works
Power Law Curve: Price is compared against a logarithmic regression model fitted to historical BTC price evolution (starting July 2010), defining structural support, resistance, and centerline levels.
Residual Z-Score: The residual is calculated as the log-difference between price and the power law center.
This residual is then normalized using a rolling mean (400 days) and standard deviation (1460 days) to create a long-term Z-score.
Loop Scoring Logic:
A loop compares the current Z-score to a configurable number of past bars.
Each higher comparison adds +1, and each lower one subtracts -1.
The result is a trend persistence score (z_loop) that grows with consistent directional momentum; a minimal version of the loop is sketched after this list.
Smoothing Option: A user-defined EMA smooths the score, if enabled, to reduce short-term signal noise.
Signal Logic:
Long signal when trend score exceeds long_threshold.
Short signal when score drops below short_threshold.
Directional State (CD): Internally manages the current market regime (1 = long, -1 = short), controlling all visual output.
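A minimal sketch of that scoring loop (the threshold names below come from the description; the sketch is simplified and illustrative only):
f_loop_score(float z, int lookback) =>
    int score = 0
    for i = 1 to lookback
        score += z > z[i] ? 1 : -1   // +1 per past bar we exceed, -1 otherwise
    score
// usage: long when f_loop_score(zscore, 40) > long_threshold,
//        short when it drops below short_threshold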
🔁 Use Cases & Applications
Macro Trend Alignment: Ideal for traders and analysts tracking Bitcoin’s structural momentum over long timeframes.
Trend Persistence Filter: Helps confirm whether the current move is part of a sustained trend or short-lived volatility.
Best Suited for BTC: Built specifically on the BNC BLX price history and Bitcoin’s power law behavior. Not designed for use with other assets.
✅ Conclusion
PLR-Z For Loop reframes Bitcoin’s long-term power law model into a trend-following tool by scoring the persistence of deviations above or below fair value. It shifts the focus from valuation-based mean reversion to directional momentum, making it a valuable signal for traders seeking high-conviction participation in BTC’s broader market cycles.
⚠️ Disclaimer
The content provided by this indicator is for educational and informational purposes only. Nothing herein constitutes financial or investment advice. Trading and investing involve risk, including the potential loss of capital. Always backtest and apply risk management suited to your strategy.
Hurst Exponent Oscillator [PhenLabs]
📊 Hurst Exponent Oscillator
Version: PineScript™ v5
📌 Description
The Hurst Exponent Oscillator (HEO) by PhenLabs is a powerful tool developed for traders who want to distinguish between trending, mean-reverting, and random market behaviors with clarity and precision. By estimating the Hurst Exponent—a statistical measure of long-term memory in financial time series—this indicator helps users make sense of underlying market dynamics that are often not visible through traditional moving averages or oscillators.
Traders can quickly know if the market is likely to continue its current direction (trending), revert to the mean, or behave randomly, allowing for more strategic timing of entries and exits. With customizable smoothing and clear visual cues, the HEO enhances decision-making in a wide range of trading environments.
🚀 Points of Innovation
Integrates advanced Hurst Exponent calculation via Rescaled Range (R/S) analysis, providing unique market character insights.
Offers real-time visual cues for trending, mean-reverting, or random price action zones.
User-controllable EMA smoothing reduces noise for clearer interpretation.
Dynamic coloring and fill for immediate visual categorization of market regime.
Configurable visual thresholds for critical Hurst levels (e.g., 0.4, 0.5, 0.6).
Fully customizable appearance settings to fit different charting preferences.
🔧 Core Components
Log Returns Calculation: Computes log returns of the selected price source to feed into the Hurst calculation, ensuring robust and scale-independent analysis.
Rescaled Range (R/S) Analysis: Assesses the dispersion and cumulative deviation over a rolling window, forming the core statistical basis for the Hurst exponent estimate.
Smoothing Engine: Applies Exponential Moving Average (EMA) smoothing to the raw Hurst value for enhanced clarity.
Dynamic Rolling Windows: Utilizes arrays to maintain efficient, real-time calculations over user-defined lengths.
Adaptive Color Logic: Assigns different highlight and fill colors based on the current Hurst value zone.
🔥 Key Features
Visually differentiates between trending, mean-reverting, and random market modes.
User-adjustable lookback and smoothing periods for tailored sensitivity.
Distinct fill and line styles for each regime to avoid ambiguity.
On-chart reference lines for strong trending and mean-reverting thresholds.
Works with any price series (close, open, HL2, etc.) for versatile application.
🎨 Visualization
Hurst Exponent Curve: Primary plotted line (smoothed if EMA is used) reflects the ongoing estimate of the Hurst exponent.
Colored Zone Filling: The area between the Hurst line and the 0.5 reference line is filled, with color and opacity dynamically indicating the current market regime.
Reference Lines: Dash/dot lines mark standard Hurst thresholds (0.4, 0.5, 0.6) to contextualize the current regime.
All visual elements can be customized for thickness, color intensity, and opacity for user preference.
📖 Usage Guidelines
Data Settings
Hurst Calculation Length
Default: 100
Range: 10-300
Description: Number of bars used in Hurst calculation; higher values mean longer-term analysis, lower values for quicker reaction.
Data Source
Default: close
Description: Select which data series to analyze (e.g., Close, Open, HL2).
Smoothing Length (EMA)
Default: 5
Range: 1-50
Description: Length for smoothing the Hurst value; higher settings yield smoother but less responsive results.
Style Settings
Trending Color (Hurst > 0.5)
Default: Blue tone
Description: Color used when trending regime is detected.
Mean-Reverting Color (Hurst < 0.5)
Default: Orange tone
Description: Color used when mean-reverting regime is detected.
Neutral/Random Color
Default: Soft blue
Description: Color when market behavior is indeterminate or shifting.
Fill Opacity
Default: 70-80
Range: 0-100
Description: Transparency of area fills—higher opacity for stronger visual effect.
Line Width
Default: 2
Range: 1-5
Description: Thickness of the main indicator curve.
✅ Best Use Cases
Identifying if a market is regime-shifting from trending to mean-reverting (or vice versa).
Filtering signals in automated or systematic trading strategies.
Spotting periods of randomness where trading signals should be deprioritized.
Enhancing mean-reversion or trend-following models with regime-awareness.
⚠️ Limitations
Not predictive: Reflects current and recent market state, not future direction.
Sensitive to input parameters—overfitting may occur if settings are changed too frequently.
Smoothing can introduce lag in regime recognition.
May not work optimally in markets with structural breaks or extreme volatility.
💡 What Makes This Unique
Employs advanced statistical market analysis (Hurst exponent) rarely found in standard toolkits.
Offers immediate regime visualization through smart dynamic coloring and zone fills.
🔬 How It Works
Rolling Log Return Calculation:
Each new price creates a log return, forming the basis for robust, non-linear analysis. This ensures all price differences are treated proportionally.
Rescaled Range Analysis:
A rolling window maintains cumulative deviations and computes the statistical “range” (max-min of deviations). This is compared against the standard deviation to estimate “memory”.
Exponent Calculation & Smoothing:
The raw Hurst value is translated from the log of the rescaled range ratio, and then optionally smoothed via EMA to dampen noise and false signals.
Regime Detection Logic:
The smoothed value is checked against 0.5. Values above = trending; below = mean-reverting; near 0.5 = random. These control plot/fill color and zone display.
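To make the mechanics concrete, here is a minimal Pine v5 sketch of the pipeline described above. It is not the PhenLabs source: the single-window R/S simplification and all names are my own assumptions, and a production version would typically average R/S across several sub-window sizes.

//@version=5
indicator("R/S Hurst sketch")
len    = input.int(100, "Hurst Calculation Length")
smooth = input.int(5, "Smoothing Length (EMA)")
r = math.log(close / close[1])        // rolling log return
max_bars_back(r, 500)                 // allow dynamic indexing into r
m = ta.sma(r, len)                    // mean return over the window
float cum = 0.0                       // cumulative deviation from the mean
float hiC = -1e10
float loC = 1e10
for k = 0 to len - 1                  // walk the window from oldest to newest
    cum += r[len - 1 - k] - m
    hiC := math.max(hiC, cum)
    loC := math.min(loC, cum)
s  = ta.stdev(r, len)
rs = s > 0 ? (hiC - loC) / s : na     // rescaled range R/S
hurst = math.log(rs) / math.log(len)  // raw Hurst estimate
plot(ta.ema(hurst, smooth), "Hurst", color.blue)
hline(0.5, "Random walk")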
💡 Note:
Use longer calculation lengths for major market character study, and shorter ones for tactical, short-term adaptation. Smoothing balances noise vs. lag—find a best fit for your trading style. Always combine regime awareness with broader technical/fundamental context for best results.
Asset Rotation System [InvestorUnknown]
Overview
This system creates a comprehensive trend "matrix" by analyzing the performance of six assets against both the US Dollar and each other. The objective is to identify and hold the asset that is currently outperforming all others, thereby focusing on maintaining an investment in the most "optimal" asset at any given time.
- - - Key Features - - -
1. Trend Classification:
The system evaluates the trend for each of the six assets, both individually against USD and in pairs (assetX/assetY), to determine which asset is currently outperforming others.
Utilizes five distinct trend indicators: RSI (50 crossover), CCI, SuperTrend, DMI, and Parabolic SAR.
Users can customize the trend analysis by selecting all indicators or choosing a single one via the "Trend Classification Method" input setting.
2. Backtesting:
Calculates an equity curve for each asset and for the system itself, which assumes holding only the asset deemed optimal at any time.
Customizable start date for backtesting; by default, it begins either 5000 bars ago (the maximum in TradingView) or at the inception of the youngest asset included, whichever is shorter. If the youngest asset's history exceeds 5000 bars, the system uses 5000 bars to prevent errors.
The equity curve is dynamically colored based on the asset held at each point, with this coloring also reflected on the chart via barcolor().
Performance metrics like returns, standard deviation of returns, Sharpe, Sortino, and Omega ratios, along with maximum drawdown, are computed for each asset and for the system's equity curve (see the sketch after this list).
3. Alerts:
Supports alerts for when a new, confirmed optimal asset is identified. However, due to TradingView limitations, the specific asset cannot be included in the alert message.
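For illustration, metrics of this kind can be derived from an equity curve in Pine roughly as follows. This is a generic sketch rather than the author's code: the names are mine, the running mean makes early values rough, and annualization is omitted.

//@version=5
indicator("Equity metrics sketch")
ret = nz(math.log(close / close[1]))  // per-bar log return, a stand-in for the system's equity curve
n   = bar_index + 1
mu  = ta.cum(ret) / n                 // mean return to date
sd  = math.sqrt(ta.cum(math.pow(ret - mu, 2)) / n)            // std of returns (approximate)
dn  = math.sqrt(ta.cum(math.pow(math.min(ret, 0.0), 2)) / n)  // downside deviation
plot(sd > 0 ? mu / sd : na, "Sharpe (per bar)")
plot(dn > 0 ? mu / dn : na, "Sortino (per bar)", color.orange)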
- - - Usage - - -
1. Select Assets/Tickers:
Choose which assets or tickers you want to include in the rotation system. Ensure that all selected tickers are denominated in USD to maintain consistency in analysis.
2. Configure Trend Classification:
Decide on the trend classification method from the available options (RSI, CCI, SuperTrend, DMI, Parabolic SAR, or All) and adjust the settings to your preferences. This customization allows you to tailor the system to different market conditions or your specific trading strategy.
3. Utilize Backtesting for Calibration:
Use the backtesting results, including equity curves and performance metrics, to fine-tune your chosen trend indicators.
Be cautious not to overemphasize performance maximization, as this can lead to overfitting. The goal is to achieve a robust system that performs well across various market conditions, rather than just optimizing for past data.
- - - Parameters - - -
Tickers:
Asset 1: Select the symbol for the first asset.
Asset 2: Select the symbol for the second asset.
Asset 3: Select the symbol for the third asset.
Asset 4: Select the symbol for the fourth asset.
Asset 5: Select the symbol for the fifth asset.
Asset 6: Select the symbol for the sixth asset.
General Settings:
Trend Classification Method: Choose from RSI, CCI, SuperTrend, DMI, PSAR, or "All" to determine how trends are analyzed.
Use Custom Starting Date for Backtest: Toggle to use a custom date for beginning the backtest.
Custom Starting Date: Set the custom start date for backtesting.
Plot Perf. Metrics Table: Option to display performance metrics in a table on the chart.
RSI (Relative Strength Index):
RSI Source: Choose the price data source for RSI calculation.
RSI Length: Set the period for the RSI calculation.
CCI (Commodity Channel Index):
CCI Source: Select the price data source for CCI calculation.
CCI Length: Determine the period for the CCI.
SuperTrend:
SuperTrend Factor: Adjust the sensitivity of the SuperTrend indicator.
SuperTrend Length: Set the period for the SuperTrend calculation.
DMI (Directional Movement Index):
DMI Length: Define the period for DMI calculations.
Parabolic SAR:
PSAR Start: Initial acceleration factor for the Parabolic SAR.
PSAR Increment: Increment value for the acceleration factor.
PSAR Max Value: Maximum value the acceleration factor can reach.
Notes/Recommendations:
While this system is operational, it's important to recognize that it relies on "basic" indicators, which may not be ideal for generating trading signals on their own. I strongly suggest that users delve into the code to grasp the underlying logic of the system. Consider customizing it by integrating more sophisticated and higher-quality trend-following indicators to enhance its performance and reliability.
Disclaimer:
This system's backtest results are historical and do not predict future performance. Use for educational purposes only; not investment advice.
Smooth First Derivative Indicator
Introducing the Smooth First Derivative indicator. For each time step, the script numerically differentiates the price data using prior datapoints from the look-back window. The resulting time derivative (the rate of price change over time) is presented as a centered oscillator.
A first derivative is a versatile tool in functional data analysis. Applied to price data, it can be used to analyze momentum, confirm trend direction, and identify pivot points.
Model Description:
The model assumes that, within the look-back window, price data can be well approximated by a smooth differentiable function. The first derivative can then be computed numerically using a noise-robust one-sided differentiator. The current version of the script employs smooth differentiators developed by P. Holoborodko (www.holoborodko.com). Note that the Indicator should not be confused with Constance Brown's Derivative Oscillator.
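As an illustration of the idea, though not necessarily the exact coefficients this script uses, the 5-point smooth noise-robust differentiator can be written in a few lines of Pine. The centered form below estimates the slope two bars back; a one-sided form like the one the indicator uses trades some smoothness for zero extra displacement.

//@version=5
indicator("Smooth derivative sketch")
// 5-point smooth noise-robust differentiator, centered form:
// f'(t-2) ≈ (f(t) + 2f(t-1) - 2f(t-3) - f(t-4)) / 8
d = (close + 2 * close[1] - 2 * close[3] - close[4]) / 8
plot(d, "First derivative", color.teal)
hline(0)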
Input parameter:
The Bandwidth parameter sets the number of points in the moving look-back window and thus determines the smoothness of the first derivative curve. Note that a smoother Indicator shows a greater lag.
Interpretation:
When using this Indicator, one should recall that the first derivative can simply be interpreted as the slope of the curve:
- The maximum (minimum) in the Indicator corresponds to the point at which the market experiences the maximum upward (downward) slope, i.e., the inflection point. The steeper the slope, the greater the Indicator value.
- The positive-to-negative zero-crossing in the Indicator suggests that the market has formed a local maximum (potential start of a downtrend or a period of consolidation). Likewise, a zero-crossing from negative to positive is a potential bullish signal.
Auto Fractal [theUltimator5]
This indicator is what I call the Auto Fractal. It is a unique algorithm that looks back in time, finds a segment on the chart that most closely matches the recent price action, then projects the price forwards. It effectively finds chart patterns and shows you what the price did the last time the same/similar chart pattern was observed.
Creating an algorithm that matches abstract curves to other abstract curves and provides a confidence score was the fundamental problem that needed to be solved to create this indicator, and the result matches curves with surprising accuracy.
The most effective method to "curve match" that I found is the Pearson Coefficient, set by a segment length and a lookback period. After the highest coefficient curve is located, the curve then gets scaled and offset to match the current price.
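A minimal sketch of that matching step in Pine v5 follows. It is not the published source: the brute-force scan, the names, and the use of raw closes (without the scaling and offset step) are my assumptions.

//@version=5
indicator("Pearson segment scan sketch")
segLen   = input.int(50, "Segment length")
lookback = input.int(300, "Lookback")
max_bars_back(close, 2000)            // allow dynamic history access

// Pearson r between the latest segment and the one ending `off` bars ago
pearson(int off) =>
    float sx  = 0.0
    float sy  = 0.0
    float sxy = 0.0
    float sxx = 0.0
    float syy = 0.0
    for i = 0 to segLen - 1
        x = close[i]
        y = close[i + off]
        sx  += x
        sy  += y
        sxy += x * y
        sxx += x * x
        syy += y * y
    den = math.sqrt(segLen * sxx - sx * sx) * math.sqrt(segLen * syy - sy * sy)
    den == 0 ? 0.0 : (segLen * sxy - sx * sy) / den

// nested-loop scan on the last bar only; this is exactly what makes the real indicator heavy
var float best = na
if barstate.islast
    best := -2.0
    for off = segLen to lookback
        rho = pearson(off)
        if rho > best
            best := rho
plot(best, "Best Pearson r")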
The past segment is drawn over the current price (orange line), giving a visualization of the two curves and how closely they match each other. The indicator then projects the price forwards in time based on the price action of the chart from the historical segment (dashed fuchsia line).
A bounding box also gets drawn around the historical segment to give you a clear visual of where the price is getting pulled from for proper analysis and ease of use.
The Pearson Coefficient % is shown in a table in the top right-hand corner of the chart and can be toggled off if desired. The values range from -100% (perfectly inverse correlation) to +100% (perfectly correlated) with 0 meaning no correlation whatsoever. The closer to +100% the value is, the better the segment match.
As with most/all of my indicators, user interface and simplicity were at the top of my priority list. I designed this to be easily readable and intuitive to both novice and veteran traders, without cluttering the chart.
Note:
This indicator is extremely heavy in terms of memory usage due to nested for loops, and takes several seconds to initially load the chart overlay. If the lookback period is increased too high (>600) then the indicator may time out and fail to load anything. If nothing loads on the chart, try reducing the lookback length and wait up to 10 seconds for lines to appear.
Functionally Weighted Moving Average
OVERVIEW
An anchor-able moving average that weights historical prices with mathematical curves (shaping functions) such as Smoothstep, Ease In / Out, or even a Cubic Bézier. This level of configurability lends itself to more versatile price modeling than conventional moving averages allow.
SESSION ANCHORS
Aside from VWAP, conventional moving averages do not allow you to use the first bar of each session as an anchor. This can make averages less useful near the open when price differs sufficiently from yesterday's close. For example, in this screenshot the EMA (blue) lags behind the sessionally anchored FWMA (yellow) at the open, making it slower to indicate a pivot higher.
An incrementing length is what makes a moving average anchor-able. VWAP is designed to do this, indefinitely growing until a new anchor resets the average (which is why it doesn't have a length parameter). But conventional MA's are designed to have a set length (they do not increment). Combining these features, the FWMA treats the length like a maximum rather than a set length, incrementing up to it from the anchor (when enabled).
Quick aside: If you code and want to anchor a conventional MA, the length() function in my UtilityLibrary will help you do this.
Incrementing an average's length introduces near-anchor volatility. For this reason, the FWMA also includes an option to saturate the anchor with the source, making values near the anchor more resistant to change. The following screenshot illustrates how saturation affects the average near the anchor when disabled (aqua) and enabled (fuchsia).
AVERAGING MATH
While there's nothing special about the math, it's worth documenting exactly how the average is affected by the anchor.
Average = Dot Product / Sum of Weights
Dot Product
This is the sum of element-wise multiplication between the Price and Weight arrays.
Dot Product = Price1 × Weight1 + Price2 × Weight2 + Price3 × Weight3 ...
When the Price and Weight arrays are equally sized (aka. the length is no longer incrementing from the anchor), there's a 1-1 mapping between Price and Weight indices. Anchoring, however, purges historical data from the Price array, making it temporarily smaller. When this happens, a dot product is synthesized by linearly interpolating for proportional indices (rather than a 1-1 mapping) to maintain the intended shape of weights.
Synthetic Dot Product = FirstPrice × FirstWeight + ... MidPrice × MidWeight ... + LastPrice × LastWeight
Sum of Weights
Exactly what it sounds like, the sum of weights used by the dot product operation. The sum of used weights may be less than the sum of all weights when the dot product is synthesized.
Sum of Weights = Weight1 + Weight2 + Weight3 ...
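As a minimal sketch of that math (my names, plain linear weights, and without the interpolated synthetic dot product):

//@version=5
indicator("FWMA core math sketch", overlay=true)
len = input.int(20, "Length")
float dot  = 0.0
float sumW = 0.0
for i = 0 to len - 1
    w = (len - i) * 1.0 / len     // linear weight: the newest bar (i = 0) weighs most
    dot  += close[i] * w          // dot product of prices and weights
    sumW += w
plot(dot / sumW, "FWMA (linear)", color.yellow)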
CALCULATING WEIGHTS
Shaping functions are mathematical curves used for interpolation. They are what give the Functionally Weighted Moving Average its name, and define how each historical price in the look back period is weighted.
The included shaping functions are:
Linear (conventional WMA)
Smoothstep (S curve)
Ease In Out (adjustable S curve)
Ease In (first half of Ease In Out)
Ease Out (second half of Ease In Out)
Ease Out In (eases out and then back in)
Cubic Bézier (aka. any curve you want)
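For reference, here is one plausible Pine formulation of these kernels (my own; the indicator's exact formulas may differ). Each maps a window position t in [0, 1] to a weight in [0, 1]:

//@version=5
indicator("Weight kernels sketch")
smoothstep(float t) => t * t * (3.0 - 2.0 * t)
easeIn(float t, float d) => math.pow(t, d)
easeOut(float t, float d) => 1.0 - math.pow(1.0 - t, d)
easeInOut(float t, float d) => t < 0.5 ? 0.5 * math.pow(2.0 * t, d) : 1.0 - 0.5 * math.pow(2.0 * (1.0 - t), d)
t = (bar_index % 100) / 100.0     // sweep t across [0, 1) to visualize the curves
plot(smoothstep(t), "Smoothstep")
plot(easeInOut(t, 3.0), "Ease In Out, degree 3", color.orange)

In the FWMA loop sketched earlier, one of these kernels would simply replace the linear weight, e.g. w = smoothstep((len - i) * 1.0 / len).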
In the following screenshot, the only difference between the three FWMA's is the shaping function (Ease In, Ease In Out, and Ease Out) illustrating how different curves can influence the responsiveness of an average.
And here is the same example, but with anchor saturation disabled.
ADJUSTING WEIGHTS
Each function outputs a range of values between 0 and 1. While you can't expand or shrink the range, you can nudge it higher or lower using the Scalar. For example, setting the scalar to -0.2 remaps the range to [-0.2, 0.8], and +0.2 remaps it to [0.2, 1.2]. The following screenshot illustrates how -0.2 (lightest blue) and +0.2 (darkest blue) affect the average.
Easing functions can be further adjusted with the Degree (how much the shaping function curves). The following illustrates how degrees 0, 1, and 20 (dark orange, orange, and light orange) affect the average.
This level of configurability completely changes how a moving average models price for a given length, making the FWMA extremely versatile.
INPUTS
You can configure:
Length (how many historical bars to average)
Source (the bar value to average)
Offset (horizontal offset of the plot)
Weight (the shaping function)
Scalar (how much to adjust each weight)
Degree (how much to ease in / out)
Bézier Points (controls shape of Bézier)
Divisor & Anchor parameters
Style of the plot
BUT ... WHY?
We use moving averages to anticipate trend initialization, continuation, and termination. For a given look back period (length) we want the average to represent the data as accurately and smoothly as possible. The better it does this, the better it is at modeling price.
In this screenshot, both the FWMA (yellow) and EMA (blue) have a length of 9. They are both smooth, but one of them more accurately models price.
You wouldn't necessarily want to trade with these FWMA parameters, but knowing it does a better job of modeling price allows you to confidently expand the model to larger timeframes for bigger moves. Here, both the FWMA (yellow) and EMA (blue) have a length of 195 (aka. 50% of NYSE market hours).
INSPIRATION
I predominantly trade ETF derivatives and hold the position that markets are chaotic, not random. The salient difference being that randomness is entirely unpredictable, and chaotic systems can be modeled. The kind of analysis I value requires a very good pricing model.
The term "model" sounds more intimidating than it is. Math terms do that sometimes. It's just a mathematical estimation. That's it. For example, a regression is an "average regressing" model (aka. mean reversion), and LOWESS (Locally Weighted Scatterplot Smoothing) is a statistically rigorous local regression.
LOWESS is excellent for modeling data. Also, it's not practical for trading. It's computationally expensive and uses data to the right of the point it's averaging, which is impossible in realtime (everything to the right is in the future). But many techniques used within LOWESS are still valuable.
My goal was to create an efficient real time emulation of LOWESS. Specifically I wanted something that was weighted non-linearly, was efficient, left-side only, and data faithful. Incorporate trading paradigms (like anchoring) and you get a Functionally Weighted Moving Average.
The formulas for determining the weights in LOWESS are typically chosen just because they seem to work well. Meaning ... they can be anything, and there's no justification other than "looks about right". So having a variety of functions (aka. kernels) for the FWMA, and being able to slide the weight range higher or lower, allows you to also make it "look about right".
William Cleveland, prominent figure in statistics known for his contributions to LOWESS, preferred using a tri-cube weighting function. Using Weight = Ease Out In with the Degrees = 3 is comparable to this. Enjoy!
Distribution Histogram [SS]
This is the frequency histogram indicator. It does just that—creates a frequency histogram distribution based on your desired lookback period. It then uses Pine's new Polyline function to plot a normal curve of the expected results for a normal distribution. This allows you to see quite a few things:
🎯 Firstly, it allows you to see where the accumulation rests in terms of a bell curve. The histogram represents a bell curve, and you can visually observe what the curve would look like.
🎯 Secondly, it will assess the normal distribution and the degree of skewness based on the curve itself. The indicator imports the SPTS statistics library to assess the distribution using Kurtosis and Skewness (see the sketch after this list). However, it also adds functionality in this regard by making a qualitative assessment of the data. For example, if there are heavy left tails or heavier right tails present in the histogram, the indicator will alert you that a heavier left or right tail has been observed.
🎯 Thirdly, it provides you with the kurtosis and skewness of the dataset.
🎯 Fourthly, it provides the mean, median, and mode of the dataset, as well as the maximum and minimum values within the dataset.
🎯 Lastly, it provides you with the ability to toggle on tips/explanations of the curve itself. Simply toggle on "Show Distribution Explanation" in the settings menu:
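For the statistically curious, the moment formulas behind that readout look roughly like this in Pine (a generic sketch with my own names, not the SPTS library code):

//@version=5
indicator("Skew and kurtosis sketch")
len = input.int(252, "Lookback")
m = ta.sma(close, len)
s = ta.stdev(close, len)
float m3 = 0.0
float m4 = 0.0
for i = 0 to len - 1
    d = close[i] - m
    m3 += math.pow(d, 3)
    m4 += math.pow(d, 4)
skew = m3 / len / math.pow(s, 3)        // > 0 suggests a heavier right tail
kurt = m4 / len / math.pow(s, 4) - 3.0  // excess kurtosis; > 0 suggests fat tails
plot(skew, "Skewness")
plot(kurt, "Excess kurtosis", color.red)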
How is the indicator helpful for trading?
If you are a mean reversion trader, this helps you identify the areas and price ranges of high and low accumulation. It also allows you to ascertain the probability by looking at the standard deviation of the bell curve. Remember, the majority of values should fall between -1 and 1 standard deviation of the mean (68%).
If it is revealed that the distribution has a heavier right or left tail, you will know that the stock is more likely to experience sudden drops and shifts in the curve in one direction or the other. Heavier left tails will tend to shift values toward the far left, and vice versa for right tails.
Customization
You can turn off and on the following:
👉 The normal curve,
👉 The standard deviation levels, and
👉 The distribution explanations and tips.
Conclusion: And that is the indicator! Hope you enjoy it!
Orion Algo Strategy v2.0
Hi everyone.
I decided to make the latest Orion Algo open to everyone. I don't have enough time to work on it lately, so I figured it would be best for everyone to have it and work on it. I took out some stuff from the original, but it should give an idea of how things work. I made two strategies with this so far, so you can use them to come up with your own. I recommend the DCA strategy because it gives you the most bang for Orion Algo's buck. It's pretty good at finding long entries.
Overall I hope you guys like this one. Also, Banano is the best crypto currency :)
-INFO-
Orion Algo is a trading algorithm designed to help traders find the highs and lows of the market before, during, and after they happen. We wanted to give an indicator to people that was simple to use. In fact we created the algorithm in such a way that it currently only needs a single input from the user. Since no indicator can predict the market perfectly, Orion should be used as just another tool (although quite a sharp one) for you to trade with. Fundamental knowledge of price action and TA should be used with Orion Algo.
Being an oscillator, Orion currently has a bias towards market volatility, so you will want to be trading markets over 30% volatility. We have plans to develop future versions that take this into account and adjust automatically for dead conditions. Also, while there are some similarities across all oscillators, what sets ours apart is the prediction curve. The prediction curve looks at the current signal values and gives them a relative score to approximate tops and bottoms 1-2 bars ahead of the signal curve. We also designed a velocity curve that attempts to predict the signal curve 2+ bars ahead. You can find the relative change in velocity in the Info panel. The bottom momentum wave is based on the signal curve and helps find the overall market direction of higher time-frames while in a lower one.
Settings and How to Use them:
User Agreement – Orion Algo is a tool for you to use while trading. We aren’t responsible for losses OR the gains you make with it. By clicking the checkbox on the left you are agreeing to the terms.
Super Smooth – Smooths the main signal line based on the value inside the box. Lower values shift the pivot points to the left but also make things more noisy. Higher values move things to the right making it lag a bit more while creating a smoother signal. 8 is a good value to start with.
Theme – Changes the color scheme of Orion.
Dashboard – Turns on a dashboard with useful stats, such as Delta v, Volatility, RSI, etc. Changing the value box will move the dashboard left and right.
Prediction – A secondary prediction model that attempts to predict a reversal before it happens (0-2 bars). This can be noisy sometimes, so make your best judgement. Curve will toggle a curve view of the prediction. Pivots will toggle bull/bear dots.
∆v – Delta v (change in velocity). This shows the momentum of the signal. Crossing 0 signals a reversal. If you see the delta v changing direction, it may signify a reversal within the next several bars, depending on the overall momentum of the market.
Momentum Wave – Uses the signal as a macro trend indicator. Changes in direction of the wave can signify macro changes in the market. Average toggles an averaging algorithm for the momentum waves, making them easier to read.
-STRATEGIES-
Simple - Just buy and sell on the dots
DCA - Uses the settings in the script for entries. If a buy dot appears, it will buy; if the price goes below the percentage, it will wait for another dot before entering. This drastically improves DCA potential.
Support and Resistance Levels
Detecting Support and Resistance Levels
Description:
Support & Resistance levels are essential for every trader to define the decision points of the markets. If you are long and the market falls below the previous support level, you have most probably got the wrong position and had better exit.
This script uses the first and second derivative of a curve to find the turning points and extremes of the price curve.
The derivative of a curve is nothing else than the momentum of the curve: it defines its slope. If the slope of a curve is zero, you have found a local extreme. The curve will change from rising to falling or the other way round.
The second derivative, or the momentum of momentum, shows you the turning points of the first derivative. This is important, as at this point the original curve switches from acceleration to braking.
Using the logic laid out above, the support & resistance indicator shows the turning points of the market in a timely manner. Depending on the level of market smoothing, it will show long-term or short-term turning points.
This script first calculates the first and second derivative of the smoothed market, and in a second step runs the turning-point detection.
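A stripped-down sketch of that two-step logic in Pine v5 (the smoothing choice and names are mine; the published script's details may differ):

//@version=5
indicator("Derivative turning points sketch", overlay=true)
len = input.int(20, "Market smoothing")
sm = ta.ema(close, len)       // smoothed market
d1 = sm - sm[1]               // first derivative: slope
d2 = d1 - d1[1]               // second derivative: acceleration
top = d1 < 0 and d1[1] >= 0   // slope crosses zero downward: local maximum
bot = d1 > 0 and d1[1] <= 0   // slope crosses zero upward: local minimum
plotshape(top, "Resistance", shape.triangledown, location.abovebar, color.red)
plotshape(bot, "Support", shape.triangleup, location.belowbar, color.green)
plot(d2, "Acceleration", display=display.none)  // sign flips mark acceleration-to-braking points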
Style tags: Trend Following, Trend Analysis
Asset class: Equities, Futures, ETFs, Currencies and Commodities
Dataset: FX Minutes/Hours/Days
ErrorFunctions
Library "ErrorFunctions"
A collection of functions used to approximate the area beneath a Gaussian curve.
Because an ERF (Error Function) is an integral, there is no closed-form solution to calculating the area beneath the curve. Meaning all ERFs are approximations; precisely wrong, but mostly accurate. How close you need to get to the actual area depends entirely on your use case, with more precision being less efficient.
The internal precision of floats in Pine Script is 1e-16 (16 decimals, aka. double precision). This library adapts well known algorithms designed to efficiently reach double precision. Single precision alternates are also included. All of them were made free to use, modify, and distribute by their original authors.
HASTINGS
Adaptation of a single precision ERF by Cecil Hastings Jr, published through Princeton University in 1955. It was later documented by Abramowitz and Stegun as equation 7.1.26 in their 1972 Handbook of Mathematical Functions. Fast, efficient, and ideal when precision beyond a few decimals is unnecessary.
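Equation 7.1.26 is compact enough to quote. The following is my own Pine transcription of the published constants, shown for reference rather than as this library's exported code; its maximum absolute error is about 1.5e-7.

//@version=5
indicator("Hastings erf sketch")
erfHastings(float x) =>
    p  = 0.3275911
    a1 = 0.254829592
    a2 = -0.284496736
    a3 = 1.421413741
    a4 = -1.453152027
    a5 = 1.061405429
    s  = x < 0 ? -1.0 : 1.0
    ax = math.abs(x)
    t  = 1.0 / (1.0 + p * ax)
    y  = 1.0 - ((((a5 * t + a4) * t + a3) * t + a2) * t + a1) * t * math.exp(-ax * ax)
    s * y
plot(erfHastings(1.0))  // ≈ 0.8427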
GILES
Adaptation of a single precision Inverse ERF by Michael Giles, published through the University of Oxford in 2012. It reverses the ERF, estimating an X coordinate from an area. It too is fast, efficient, and ideal when precision beyond a few decimals is unnecessary.
LIBC
Adaptation of the double precision ERF & ERFC in the standard C library (aka. libc). It is also the same ERF & ERFC that SciPy uses. While not quite as efficient as the Hastings approximation, it's still very fast and fully maximizes Pines precision.
BOOST
Adaptation of the double precision Inverse ERF & Inverse ERFC in the Boost Math C++ library. SciPy uses these as well. These reverse the ERF & ERFC, estimating an X coordinate from an area. It too isn't quite as efficient as the Giles approximation, but still fast and fully maximizes Pines precision.
While these algorithms are not exported directly, they are available through their exported counterparts.
- - -
ERROR FUNCTIONS
erf(x, precise)
An Error Function estimates the theoretical error of a measurement.
Parameters:
x (float) : (float) Upper limit of the integration.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between -1 and 1.
erfc(x, precise)
A Complementary Error Function estimates the difference between a theoretical error and infinity.
Parameters:
x (float) : (float) Lower limit of the integration.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and 2.
erfinv(x, precise)
An Inverse Error Function reverses the erf() by estimating the original measurement from the theoretical error.
Parameters:
x (float) : (float) Theoretical error.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and ± infinity.
erfcinv(x, precise)
An Inverse Complementary Error Function reverses the erfc() by estimating the original measurement from the difference between the theoretical error and infinity.
Parameters:
x (float) : (float) Difference between the theoretical error and infinity.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and ± infinity.
- - -
DISTRIBUTION FUNCTIONS
pdf(x, m, s)
A Probability Density Function estimates the probability density. For clarity, density is not a probability.
Parameters:
x (float) : (float) X coordinate for which a density will be estimated.
m (float) : (float) Mean
s (float) : (float) Sigma
Returns: (float) Between 0 and ∞.
cdf(z, precise)
A Cumulative Distribution Function estimates the area under a Gaussian curve between negative infinity and the Z Score.
Parameters:
z (float) : (float) Z Score.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and 1.
cdfinv(a, precise)
An Inverse Cumulative Distribution Function reverses the cdf() by estimating the Z Score from an area.
Parameters:
a (float) : (float) Area between 0 and 1.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between -∞ and +∞
cdfab(z1, z2, precise)
A Cumulative Distribution Function from A to B estimates the area under a Gaussian curve between two Z Scores (A and B).
Parameters:
z1 (float) : (float) First Z Score.
z2 (float) : (float) Second Z Score.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and 1.
ttt(z, precise)
A Two-Tailed Test estimates the area under a Gaussian curve between symmetrical ± Z scores and ± infinity.
Parameters:
z (float) : (float) One of the symmetrical Z Scores.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and 1.
tttinv(a, precise)
An Inverse Two-Tailed Test reverses the ttt() by estimating the absolute Z Score from an area.
Parameters:
a (float) : (float) Area between 0 and 1.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and ∞.
ott(z, precise)
A One-Tailed Test estimates the area under a Gaussian curve between an absolute Z Score and infinity.
Parameters:
z (float) : (float) Z Score.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and 1.
ottinv(a, precise)
An Inverse One-Tailed Test reverses the ott() by estimating the Z Score from an area.
Parameters:
a (float) : (float) Area between 0 and 1.
precise (bool) : Double precision (true) or single precision (false).
Returns: (float) Between 0 and ∞.
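A typical consumer script might look like the sketch below. The import path is a placeholder (substitute the library's published username and version), and the Z Score construction is only an example.

//@version=5
indicator("ErrorFunctions usage sketch")
import YourName/ErrorFunctions/1 as ef   // placeholder path
z = (close - ta.sma(close, 100)) / ta.stdev(close, 100)  // Z Score of price
p = ef.cdf(z, true)                      // area from -∞ to z, double precision
plot(p, "Gaussian percentile", color.purple)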
Trailing Management (Zeiierman)
█ Overview
The Trailing Management (Zeiierman) indicator is designed for traders who seek an automated and dynamic approach to managing trailing stops. It helps traders make systematic decisions regarding when to enter and exit trades based on the calculated risk-reward ratio. By providing a clear visual representation of trailing stop levels and risk-reward metrics, the indicator is an essential tool for both novice and experienced traders aiming to enhance their trading discipline.
The Trailing Management (Zeiierman) indicator integrates a Break-Even Curve feature to enhance its utility in trailing stop management and risk-reward optimization. The Break-Even Curve illuminates the precise point at which a trade neither gains nor loses value, offering clarity on the risk-reward landscape. This point is calculated from the required win rate and the risk/reward ratio, so traders can, at any given moment, assess the kind of strategy they need to employ to be profitable, depending on the price's position within the risk/reward box.
█ How It Works
The indicator operates by computing the highest high and the lowest low over a user-defined period and then applying this information to determine optimal trailing stop levels for both long and short positions.
Directional Bias:
It establishes the direction of the market trend by comparing the index of the highest high and the lowest low within the lookback period.
Bullish
Bearish
Trailing Stop Adjustment:
The trailing stops are adjusted using one of three methods: an automatic calculation based on the median of recent peak differences, pivot points, or a fixed percentage defined by the user.
The Break-Even Curve:
The Break-Even Curve, along with the risk/reward ratio, is determined through the trailing method. This approach utilizes the current closing price as a hypothetical entry point for trades. All calculations, including those for the curve, are based on this current closing price, ensuring real-time accuracy and relevance. As market conditions fluctuate, the curve dynamically adjusts, offering traders a visual benchmark that signifies the break-even point. This real-time adjustment provides traders with an invaluable tool, allowing them to visually track how shifts in the market could impact the point at which their trades neither gain nor lose value.
Example:
Let's say the price is at the midpoint of the risk/reward box; this means that the risk/reward ratio should be 1:1, and the minimum win rate is 50% to break even.
In this example, we can see that the price is near the stop-loss level. If you are about to take a trade in this area and respect your stop, you only need a win rate of 11% to earn money, given the risk/reward ratio and assuming that you hold the trade to the target.
In other words, traders can, at any given point, assess the kind of strategy they need to employ to make money based on the price's position within the risk/reward box.
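The arithmetic behind these examples is the standard break-even identity. With R defined as the reward/risk ratio measured from the current price (distance to target divided by distance to stop):

Required win rate = Risk / (Risk + Reward) = 1 / (1 + R)

At the midpoint of the box R = 1, giving a break-even win rate of 1 / 2 = 50%. Near the stop the remaining risk is small relative to the remaining reward; with R ≈ 8 the break-even win rate is 1 / 9 ≈ 11%, matching the example above.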
█ How to Use
Market Bias:
When using the Auto Bias feature, the indicator calculates the underlying market bias and displays it as either bullish or bearish. This helps traders align their trades with the underlying market trend.
Risk Management:
By observing the plotted trailing stops and the risk-reward ratios, traders can make strategic decisions to enter or exit positions, effectively managing the risk.
Strategy selection:
The Break-Even Curve is a powerful tool for managing risk, allowing traders to visualize the relationship between their trailing stops and the market's price movements. By understanding where the break-even point lies, traders can adjust their strategies to either lock in profits or cut losses.
Based on the plotted risk/reward box and the location of the price within this box, traders can easily see the win rate required by their strategy to make money in the long run, given the risk/reward ratio.
Consider this example: The market is bullish, as indicated by the bias, and the indicator suggests looking into long trades. The price is near the top of the risk/reward box, which means entering the market right now carries a huge risk, and the potential reward is very low. To take this trade, traders must have a strategy with a win rate of at least 90%.
█ Settings
Trailing Method:
Auto: The indicator calculates the trailing stop dynamically based on market conditions.
Pivot: The trailing stop is adjusted to the highest high (long positions) or lowest low (short positions) identified within a specified lookback period. This method uses the pivotal points of the market to set the trailing stop.
Percentage: The trailing stop is set at a fixed percentage away from the peak high or low.
Trailing Size (prd):
This setting defines the lookback period for the highest high and lowest low, which affects the sensitivity of the trailing stop to price movements.
Percentage Step (perc):
If the 'Percentage' method is selected, this setting determines the fixed percentage for the trailing stop distance.
Set Bias (bias):
Allows users to set a market bias which can be Bullish, Bearish, or Auto, affecting how the trailing stop is adjusted in relation to the market trend.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
[blackcat] L2 Ehlers Truncated BP Filter
Level: 2
Background
John F. Ehlers introduced the Truncated BandPass (BP) Filter in July 2020.
Function
In Dr. Ehlers' article “Truncated Indicators” (July 2020), he introduces a method that can be used to modify some indicators, improving how accurately they are able to track and respond to price action. By limiting the data range, that is, truncating the data, indicators may be able to better handle extreme price events. A reasonable goal, especially during times of high volatility. John Ehlers shows how to improve a bandpass filter's ability to reflect price by limiting the data range. Filtering out the temporary spikes and price extremes should positively affect the indicator's stability. Enter a new indicator: the Truncated BandPass (BP) filter.
Cumulative indicators, such as the EMA or MACD, are affected not only by previous candles, but by a theoretically infinite history of candles. Although this effect is often assumed to be negligible, John Ehlers demonstrates in his article that it is not so. Or at least not for a narrow-band bandpass filter.
Bandpass filters are normally used for detecting cycles in price curves. But they do not work well with steep edges in the price curve. Sudden price jumps cause a narrow-band filter to “ring like a bell” and generate artificial cycles that can cause false triggers. As a solution, Ehlers proposes to truncate the candle history of the filter. Limiting the history to 10 bars effectively dampened the filter output and produced a better representation of the cycles in the price curve. For limiting the history of a cumulative indicator, John Ehlers proposes “Truncated Indicators,” John Ehlers takes us aside to look at the impact of sharp price movements on two fundamentally different types of filters: finite impulse response, and infinite impulse response filters. Given recent market conditions, this is a very well timed subject.
As demonstrated in this script, Ehlers suggests “truncation” as an approach to the way the trader calculates filters. He explains why truncation is not appropriate for finite impulse response filters but why truncation can be beneficial to infinite impulse response filters. He then explains how to apply truncation to infinite impulse response filters, using his bandpass filter as an example.
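The sketch below is my Pine v5 adaptation of that idea, re-running Ehlers' bandpass recursion over a short window and seeding the values beyond it with zeros; treat his published EasyLanguage as authoritative, and note that the Trigger here is simply a one-bar delay of BPT.

//@version=5
indicator("Truncated BandPass sketch")
period = input.int(20, "Period")
bw     = input.float(0.1, "Bandwidth")
trunc  = input.int(10, "Truncation length")
max_bars_back(close, 500)
l1 = math.cos(2 * math.pi / period)
g1 = math.cos(bw * 2 * math.pi / period)
s1 = 1 / g1 - math.sqrt(1 / (g1 * g1) - 1)
// recompute the recursion over the last `trunc` bars only; the two
// seed values beyond the window are forced to zero (the truncation)
f_bpt() =>
    t = array.new_float(trunc + 3, 0.0)
    for k = 0 to trunc - 1
        count = trunc - k             // fill the oldest slot first
        a = 0.5 * (1 - s1) * (close[count - 1] - close[count + 1])
        b = l1 * (1 + s1) * array.get(t, count + 1)
        c = s1 * array.get(t, count + 2)
        array.set(t, count, a + b - c)
    array.get(t, 1)
bpt = f_bpt()
plot(bpt, "BPT", color.aqua)
plot(bpt[1], "Trigger", color.orange)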
Key Signal
BPT --> Truncated BandPass (BP) Filter fast line
Trigger --> Truncated BandPass (BP) Filter slow line
Pros and Cons
100% John F. Ehlers definition translation; even variable names are the same. This helps readers who would like to use Pine to read his book.
Remarks
The 98th script of Blackcat1402's John F. Ehlers Week publication.
Readme
In real life, I am a prolific inventor. I have successfully applied for more than 60 international and regional patents in the past 12 years. But in the past two years or so, I have tried to transfer my creativity to the development of trading strategies. TradingView is the ideal platform for me. I am selecting and contributing some of my hundreds of scripts to publish in the TradingView community. Welcome everyone to interact with me to discuss these interesting Pine scripts.
The scripts posted are categorized into 5 levels according to my efforts or manhours put into these works.
Level 1 : interesting script snippets or distinctive improvements on classic indicators or strategies. Level 1 scripts can usually appear in more complex indicators as a function module or element.
Level 2 : composite indicator/strategy. By selecting or combining several independent or dependent functions or sub-indicators in a proper way, the composite script exhibits a resonance phenomenon which can filter out noise or fake trading signals to enhance the trading confidence level.
Level 3 : comprehensive indicator/strategy. These are simple trading systems based on my strategies. They commonly contain several or all of the following: entry signal, close signal, stop loss, take profit, re-entry, risk management, and position sizing techniques. Even some interesting fundamental and mass-psychological aspects are incorporated.
Level 4 : script snippets or functions that do not disclose source code. Interesting elements that can reveal market laws and work as raw material for indicators and strategies. If you find Level 1~2 scripts helpful, Level 4 is a private version that took me far more effort to develop.
Level 5 : indicator/strategy that does not disclose source code. A private version of a Level 3 script with my accumulated script-processing skills or a large number of custom functions. I built a private function library over the past two years; Level 5 scripts use many of those functions to achieve a private trading strategy.
Neural Pulse System [Alpha Extract]
Neural Pulse System (NPS)
The Neural Pulse System (NPS) is a custom technical indicator that analyzes price action through a probabilistic lens, offering a dynamic view of bullish and bearish tendencies.
Unlike traditional binary classification models, NPS employs Ordinary Least Squares (OLS) regression with dynamically computed coefficients to produce a smooth probability output ranging from -1 to 1.
Paired with ATR-based bands, this indicator provides an intuitive and volatility-aware approach to trend analysis.
🔶 CALCULATION
The Neural Pulse System utilizes OLS regression to compute probabilities of bullish or bearish price action while incorporating ATR-based bands for volatility context:
Dynamic Coefficients: Coefficients are recalculated in real-time and scaled up to ensure the regression adapts to evolving market conditions.
Ordinary Least Squares (OLS): Uses OLS regression instead of gradient descent for more precise and efficient coefficient estimation.
ATR Bands: Smoothed Average True Range (ATR) bands serve as dynamic boundaries, framing the regression within market volatility.
Probability Output: Instead of a binary result, the output is a continuous probability curve (-1 to 1), helping traders gauge the strength of bullish or bearish momentum.
Formula:
OLS Regression = Line of best fit minimizing squared errors
Probability Signal = Transformed regression output scaled to -1 (bearish) to 1 (bullish)
ATR Bands = Smoothed Average True Range (ATR) to frame price movements within market volatility
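One way to realize this pipeline in Pine v5 is sketched below. It is not Alpha Extract's code: the slope-based OLS output, the tanh squash into (-1, 1), and every name here are my assumptions.

//@version=5
indicator("Probability curve sketch")
len = input.int(50, "Regression Length")
k   = input.float(200.0, "Scaling Factor")
tanh(float x) =>
    xc = math.max(math.min(x, 20.0), -20.0)  // clamp to avoid overflow
    e = math.exp(2.0 * xc)
    (e - 1.0) / (e + 1.0)
// per-bar OLS slope: difference of consecutive least-squares endpoints
slope = ta.linreg(close, len, 0) - ta.linreg(close, len, 1)
prob = tanh(k * slope / close)   // price-normalized, squashed into (-1, 1)
plot(prob, "Probability", prob >= 0 ? color.green : color.red)
hline(0)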
🔶 DETAILS
📊 Visual Features:
Probability Curve: Smooth probability signal ranging from -1 (bearish) to 1 (bullish)
ATR Bands: Price action is constrained within volatility bands, preventing extreme deviations
Color-Coded Signals:
Blue to Green: Increasing probability of bullish momentum
Orange to Red: Increasing probability of bearish momentum
Interpretation:
Bullish Bias: Probability output consistently above 0 suggests a bullish trend.
Bearish Bias: Probability output consistently below 0 indicates bearish pressure.
Reversals: Extreme values near -1 or 1, followed by a move toward 0, may signal potential trend reversals.
🔶 EXAMPLES
📌 Trend Identification: Use the probability output to gauge trend direction.
📌 Example: On a 1-hour chart, NPS moves from -0.5 to 0.8 as price breaks resistance, signaling a bullish trend.
Reversal Signals: Watch for probability extremes near -1 or 1 followed by a reversal toward 0.
Example: NPS hits 0.9, price touches the upper ATR band, then both retreat—indicating a potential pullback.
📌 Example snapshots:
Volatility Context: ATR bands help assess whether price action aligns with typical market conditions.
Example: During low volatility, the probability signal hovers near 0, and ATR bands tighten, suggesting a potential breakout.
🔶 SETTINGS
Customization Options:
ATR Period – Defines lookback length for ATR calculation (shorter = more responsive, longer = smoother).
ATR Multiplier – Adjusts band width for better volatility capture.
Regression Length – Controls how many bars feed into the coefficient calculation (longer = smoother, shorter = more reactive).
Scaling Factor – Adjusts the strength of regression coefficients.
Output Smoothing – Option to apply a moving average for a cleaner probability curve