Titan Investments|Quantitative THEMIS|Demo|BINANCE:BTCUSDTP:4h | Investment Strategy (Quantitative Trading)
| 🛑 | Watch "LIVE" and 'COPY' this strategy in real time:
🔗 Link: www.tradingview.com
Hello, welcome, feel free 🌹💐
From the Stone Age to the most technological age, one thing has not changed: what continues to impress human beings the most is other human beings!
Deep down, it's all very simple or very complicated, depending on how you look at it.
I believe that everyone was born to do something very well in life.
But few are those who have, let's use the word, 'luck'.
Few are those who have the 'luck' to discover this thing.
That is why few are happy and successful in their jobs and professions.
Thank God I had this 'luck', and discovered what I was born to do well.
And I was born to program. 👨💻
📋 Summary : Project Titan
0️⃣ : 🦄 Project Titan
1️⃣ : ⚖️ Quantitative THEMIS
2️⃣ : 🏛️ Titan Community
3️⃣ : 👨💻 Who am I ❔
4️⃣ : ❓ What is Statistical/Probabilistic Trading ❓
5️⃣ : ❓ How Statistical/Probabilistic Trading works ❓
6️⃣ : ❓ Why use a Statistical/Probabilistic system ❓
7️⃣ : ❓ Why the human brain is not prepared to do Trading ❓
8️⃣ : ❓ What is Backtest ❓
9️⃣ : ❓ How to build a Consistent system ❓
🔟 : ❓ What is a Quantitative Trading system ❓
1️⃣1️⃣ : ❓ How to build a Quantitative Trading system ❓
1️⃣2️⃣ : ❓ How to Exploit Market Anomalies ❓
1️⃣3️⃣ : ❓ What Defines a Robust, Profitable and Consistent System ❓
1️⃣4️⃣ : 🔧 Fixed Technical
1️⃣5️⃣ : ❌ Fixed Exits : 🎯 TP(%) & 🛑SL(%)
1️⃣6️⃣ : ⚠️ Risk Profile
1️⃣7️⃣ : ⭕ Moving Exits : (Indicators)
1️⃣8️⃣ : 💸 Initial Capital
1️⃣9️⃣ : ⚙️ Entry Options
2️⃣0️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : 'Third-Party Services'
2️⃣1️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : 'Exchanges'
2️⃣2️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : 'Messaging Services'
2️⃣3️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : '🧲🤖Copy-Trading'
2️⃣4️⃣ : ❔ Why be a Titan Pro 👽❔
2️⃣5️⃣ : ❔ Why be a Titan Aff 🛸❔
2️⃣6️⃣ : 📋 Summary : ⚖️ Strategy: Titan Investments|Quantitative THEMIS|Demo|BINANCE:BTCUSDTP:4h
2️⃣7️⃣ : 📊 PERFORMANCE : 🆑 Conservative
2️⃣8️⃣ : 📊 PERFORMANCE : Ⓜ️ Moderate
2️⃣9️⃣ : 📊 PERFORMANCE : 🅰 Aggressive
3️⃣0️⃣ : 🛠️ Roadmap
3️⃣1️⃣ : 🧻 Notes ❕
3️⃣2️⃣ : 🚨 Disclaimer ❕❗
3️⃣3️⃣ : ♻️ ® No Repaint
3️⃣4️⃣ : 🔒 Copyright ©️
3️⃣5️⃣ : 👏 Acknowledgments
3️⃣6️⃣ : 👮 House Rules : 📺 TradingView
3️⃣7️⃣ : 🏛️ Become a Titan Pro member 👽
3️⃣8️⃣ : 🏛️ Be a member Titan Aff 🛸
0️⃣ : 🦄 Project Titan
This is the first real, 100% automated Quantitative Strategy made available to the public and the pinescript community for TradingView.
You will be able to automate all signals of this strategy to your broker, centralized or decentralized, and also to messaging services: Discord, Telegram or Twitter.
This is the first strategy of a larger project: over the course of 2023, I will provide a total of six 100% automated 'Quantitative' strategies to the pinescript community for TradingView.
The future strategies to be shared here will also be unique, never-before-seen, real 'Quantitative' bots with real, validated results in real operation.
Just like the 'Quantitative THEMIS' strategy, they will be something exceptional within the pinescript/tradingview community: truly unique tools for building mutual wealth consistently and continuously for our community.
1️⃣ : ⚖️ Quantitative THEMIS : Titan Investments|Quantitative THEMIS|Demo|BINANCE:BTCUSDTP:4h
This is a truly unique, exceptional strategy for BTC/USD.
A truly real strategy, with real, validated results and in real operation.
A unique tool for building mutual wealth, consistently and continuously for the members of the Titan community.
Initially we will operate on a monthly, quarterly, annual or biennial subscription service.
Our goal here is to build a great community, in exchange for an extremely fair price for the use of our truly unique tools, which bring and will bring real results to our community members.
With this business model we can offer all Titan users and community members the highest degree of sophistication available in pinescript for tradingview, through unique and truly profitable strategies.
My goal here is to offer the best to our members!
The best 'pinescript' tradingview service in the world!
We are the only start-up in the world that will decentralize full, real access to genuinely 'quantitative' tools that bring, and will keep bringing, real results for mutual and ongoing wealth building in our community.
2️⃣ : 🏛️ Titan Community : 👽 Pro 🔁 Aff 🛸
Become a Titan Pro 👽
To get access to the strategy: "Quantitative THEMIS" , and future Titan strategies in a 100% automated way, along with all tutorials for automation.
Pro Plans: 30 Days, 90 Days, 12 Months, 24 Months.
👽 Pro 🅼 Monthly
👽 Pro 🆀 Quarterly
👽 Pro🅰 Annual
👽 Pro👾Two Years
You will have access to a truly unique, exceptional system.
A 100% real, 100% automated strategy: tested, validated, profitable, and in live operation.
Become a Titan Affiliate 🛸
By becoming a Titan Affiliate 🛸, you will automatically receive 50% of the value of each new subscription you refer.
You will receive 50% for any of the above plans that you refer.
This way we will encourage our community to grow in a fair and healthy way, because we know what we have in our hands and that we deliver real value to our users.
We are at the highest level of sophistication in the market; the consistency and the results here speak for themselves.
So growing our community means growing mutual wealth and raising the collective consciousness.
Wealth must be created, not divided.
And here we are creating mutual wealth on all ends and in all ways.
A non-zero sum system, where everybody wins.
3️⃣ : 👨💻 Who am I ❔
My name is FilipeSoh. I am 26 years old: Technical Analyst, Trader, Computer Engineer, pinescript Specialist, with extensive experience in several languages and technologies.
For the last 4 years I have been focused on developing, editing and creating pinescript indicators and strategies for Tradingview, for other people and for myself.
A full-time, passionately workaholic pinescript developer with over 10,000 hours of pinescript development.
• Pinescript expert ▬Tradingview.
• Specialist in Automated Trading
• Specialist in Quantitative Trading.
• Statistical/Probabilistic Trading Specialist - Mark Douglas school.
• Inventor of the 'Classic Forecast' Indicators.
• Inventor of the 'Backtest Table'.
4️⃣ : ❓ What is Statistical/Probabilistic Trading ❓
Statistical/probabilistic trading is the only way to obtain a positive mathematical expectation against the market, and consequently the only way to make money from it consistently.
Below I present more details about the Quantitative THEMIS strategy. It is a real strategy, tested, validated and in live operation, 'Skin in the Game': a consistent way to make money with statistical/probabilistic trading in a 100% automated way.
I am a Technical Analyst; I used to be a Discretionary Trader; today I am 100% a Statistical Trader.
I've gotten rich and made a lot of money, and I've also lost a lot with 'leverage'.
That was a few years ago.
The book that changed everything for me was "Trading in The Zone" by Mark Douglas.
That's when I understood that the market is just a game of statistics and probability, like a casino!
It was then that I understood that the human brain is not prepared for trading, because trading involves emotional triggers.
And emotions do not mix with trading decisions, not in the long run, because you always carry the burden of being wrong about the outcome of that particular position.
But remember: the market is just a statistical game!
5️⃣ : ❓ How Statistical/Probabilistic Trading works ❓
Let's use a 'coin' as an example:
If we toss a 'coin' up 10 times.
Do you agree that it is impossible for us to know exactly the result of the tosses before they actually happen?
Likewise, would you agree that we cannot 'guess' the outcome of a position before it actually happens?
Just as we cannot 'guess' whether the coin will land heads or tails on each flip.
We can, however, analyze the 'backtest' of the 10 tosses made with that coin:
If we analyze the 10 tosses and count the number of times the coin landed heads or tails, we then have the percentage of heads and tails, that is, a 'backtest' of those tosses.
Then, on the next flip, we can assume a favorable position for one side: the side with the highest probability.
In a nutshell, this is more or less how statistical/probabilistic trading works.
As Statistical Traders we can never say whether a given Trade/Position we take will be a winner or a loser.
But we can still have a positive and consistent result over a 'sequence' of trades, because before we even open a position the backtests have already been performed: we identify an anomaly and build a system with a positive statistical edge over the market.
The edge is not in any single trade, but in the 'sequence' of trades as a whole!
Because our system will work like a casino, holding a positive mathematical expectation relative to the players/market.
Design, develop, test models and systems that can take advantage of market anomalies, until they change.
Be the casino! - Mark Douglas
6️⃣ : ❓ Why use a Statistical/Probabilistic system ❓
In recent years I have focused and specialized in developing 100% automated trading systems, essentially for the cryptocurrency market.
I have developed many extremely robust and efficient systems, with positive mathematical expectation towards the market.
These are not complex systems per se , because here we want to avoid 'over-optimization' as much as possible.
As Da Vinci said: "Simplicity is the highest degree of sophistication".
I say this because I have tested, tried and developed hundreds of systems/strategies.
I believe I have programmed more than 10,000 unique indicators/strategies, because this is my passion and purpose in life.
I am passionate about what I do, completely!
I love statistical trading because it is the only way to get consistency in the long run!
This is why I have studied, applied, developed, and specialized in 100% automated cryptocurrency trading systems.
The reason our systems are extremely 'simple' is that, as I mentioned before, in statistical trading we want to exploit a market anomaly to the maximum. Anomalies change from time to time: usually a fixed 'scalper' system can exploit one efficiently for about 6 to 12 months, sometimes for a few years.
Because at some point these anomalies will be identified, and from the moment they are identified they will be exploited and will stop being anomalies.
With the system presented here, you can even copy the indicators and input values shared here.
However, what I really offer you is me, our team, and our community!
That is, we will constantly monitor this system, for life, because our goal here is to create a unique, perpetual, profitable and consistent system for our community.
Myself, our team and our community will keep this script periodically updated, to preserve its positive mathematical expectation.
So we don't mind sharing the current parameters and values, because the real value is also in the future updates this system will receive from me and our team, guided by our culture and our community of real users!
As we are hosted on 'tradingview', all future updates to this strategy will be applied automatically to your tradingview account.
What we want here is: to make sure you get gains from our system, because if you get gains , our ecosystem will grow as a whole in a healthy and scalable way, so we will be generating continuous mutual wealth and raising the collective consciousness .
People Need People: 3️⃣🅿
7️⃣ : ❓ Why the human brain is not prepared to do Trading ❓
Today my greatest skill is to develop statistically profitable and 100% automated strategies for 'pinescript' tradingview.
Note that I said: 'profitable' because in fact statistical trading is the only way to make money in a 'consistent' way from the market.
And consequently have a positive equity curve in every cycle, because we base ourselves on mathematics, not on feelings and news.
The human brain is not prepared for trading.
Trading depends on decision-making in the cerebral cortex.
And decision-making is automatically linked to emotions, and emotions do not mix with trading decisions: in those moments we can feel the best and also the worst sensations, and this certainly affects us and makes us commit gross mistakes!
That's why the human brain is not prepared for trading.
If you want to participate in a fully automated, profitable and consistent trading system; be a Titan Pro 👽
I believe we are walking an extremely enriching path here, not only in terms of financial returns for our community, but also in terms of knowledge about probabilistic and automated statistical trading.
You will have access to an extremely robust system, which was built upon very strong concepts and foundations, and upon the world's main asset in a few years: Bitcoin .
We are at the cutting edge of the cryptocurrency market when it comes to probabilistic, automated statistical trading.
Result is result! Me being dressed or naked.
This is just the beginning!
But there is a way to consistently make money from the market.
Being the Casino! - Mark Douglas
8️⃣ : ❓ What is Backtest ❓
Imagine the market as a purely random system, but even in 'randomness' there are patterns.
So now imagine the market and statistical trading as follows:
Repeating the 'coin' example above, let's think of it as follows:
If we again toss a coin up 10 times.
It is impossible to know which flips will come up heads or tails, correct?
But if we analyze these 10 tosses, we will have a statistic of the past results; for example, 70% of the tosses came up 'heads'.
That is:
7 tosses came up "heads".
3 tosses came up "tails".
So, based on these conditions and on the generic backtest presented here, we could adopt "heads" as our side for the next moves, to have a statistical and probabilistic advantage over the next toss to be made.
That is, if you define a system, based on backtests , that has a robust positive mathematical expectation in relation to the market you will have a profitable system.
For every move you make you will have a positive statistical advantage in your favor over the market before you even make the move.
Like a casino in relation to all its players!
The casino does not have an advantage over one specific player, but over all players, because it has a positive mathematical expectation about all the moves that night.
The casino will always have a positive statistical advantage over its players.
Note that there will always be real players who make real, million-dollar winnings that night, but this condition is already built into the casino's 'strategy', which holds a predetermined positive statistical edge over that night as a whole.
Statistical trading is the same thing: as long as you don't understand this, you will keep losing money, consistently.
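To put rough numbers on the casino analogy (illustrative arithmetic only, not this strategy's statistics): if the backtest shows 70% heads and each move risks and wins the same amount R, then always betting heads has an expectancy per move of E = 0.70 × R - 0.30 × R = +0.40 × R. More generally, a system is statistically profitable when its expectancy, E = (hit rate × average win) - (loss rate × average loss), is greater than zero; the casino's edge, like the edge of a statistical trading system, is simply this positive E applied over a long sequence of moves.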
9️⃣ : ❓ How to build a Consistent system ❓
See, most traders around the world open trades believing that that specific position will make them filthy rich, faithfully convinced it will be an undoubted winner, based on the methodology of 'trading a trade': not analyzing the whole context, and using only 'empirical' aspects in their system.
But if you think of trading, as a sequence of moves.
You see, 'a sequence' !
When we think statistically, what matters is not the result of this or the next specific trade, but the final sequence of trades as a whole.
Since the market distributes results randomly, if your system has a positive statistical edge over the market, then at the end of that sequence you will have the highest probability of ending with a winning bankroll.
That's how you do real trading!
And with consistency!
Trading is a long-term game, but once you flip that mental switch you realize it is a simple game for making money consistently from the market; all you need is patience.
Even more so when we build on Bitcoin, with its 'Halving' effect whereby, in theory, we never lose money over 3-to-4-year intervals, thanks to its scarcity. Bitcoin is the 'discovery of digital scarcity', which makes it digital gold; we believe in this thesis and we follow Satoshi's legacy.
So align Bitcoin with a 100% automated statistical/probabilistic trading system that has a positive mathematical expectation against the market, think long term, and all you need is patience: you will become rich.
In fact, Bitcoin by itself is already a path: buy, wait for each halving, and your wealth will be preserved.
With no inflation, unlike fiat currencies.
This is a complete and extremely robust strategy, with the most current possible and 'not possible' techniques involved and applied here.
Today I am at another level in developing 100% automated 'quantitative' strategies.
I was born for this!
🔟 : ❓ What is a Quantitative Trading system ❓
In addition to having access to a revolutionary strategy, you will have access to disruptive, 100% multifunctional tables capable of running 'backtests', for better tracking and monitoring of your system on a customized basis.
I would like to emphasize one thing, and ask that you keep it in mind.
Today my greatest skill in 'pinescript' is building indicators, but mainly strategies, based on statistical and probabilistic trading, with a positive mathematical expectation against the market, in a 100% automated way.
This with the goal of building a consistent and continuous positive equity curve through mathematics using data, converting it into statistical / probabilistic parameters and applying them to a Quantitative model.
Before becoming a Quantitative Trader , I was a Technical Analyst and a Discretionary Trader .
First as a position trader and then as a day trader.
Before becoming a Trader, I trained myself as a Technical Analyst, to master, in theory, the shape and workings of the market.
But everything changed when I discovered 'Mark Douglas'. When I got to know his work my head exploded 🤯, and I finally began to understand the market!
The market is nothing more than a 'random' system of distributing results.
See that I said: 'random' .
Do yourself a mental exercise.
Is there really such a thing as randomness?
I believe not; as far as we know, perhaps only the 'singularity'.
So thinking this way, to translate, the market is nothing more than a game of probability, statistics and pure mathematics.
Like a casino!
What happens is that most traders, whenever they take a position, take it with full empirical certainty that the position will win or lose, and do not take the total sequence of results into consideration to understand their place in the market.
Understanding your place in the market gives you the ability to create and design systems that can exploit the present market anomaly, and thus make money statistically, consistently, and 100% automated.
Thinking of it this way, it is easy to make money from the market.
There are many ways to make money from the market, but the only consistent way I know of is through 'probabilistic and automated statistical trading'.
1️⃣1️⃣ : ❓ How to build a Quantitative Trading system ❓
There are some fundamental points that must be addressed here in order to understand what makes up a system based on statistics and probability applied to a quantitative model.
When we talk about 'discretionary' trading, it is a trading system based on human decisions after the defined 'empirical' conditions are met.
It is quite another thing to build a fully automated system without any human interference/interaction .
That said:
Building a statistically profitable system is perfectly possible, but it is a high-level task, with potentially high rewards and consistent gains.
Here you will find a real "Skin In The Game" strategy.
With all due respect, the vast majority of traders who post strategies on TradingView do not understand what they are doing.
Most of them do not understand even the minimal complexity involved in the main variable for the construction of a real strategy, the mother variable: "strategy".
I say this from my own experience, because I have analyzed practically every existing TradingView publication: more than 200,000 indicators and strategies.
I breathe pinescript, I eat pinescript, I sleep pinescript, I bathe pinescript, I live TradingView.
But the main advantage for TradingView users is that all entry and exit orders made by this strategy can be checked and analyzed thoroughly, to validate and prove its veracity, because this is a 100% real strategy.
Here there is a huge world of possibilities, but only one way to build a 'pinescript strategy' that will work correctly aligned to the real world with real results .
There are some fundamental points to take into consideration when building a profitable trading system:
The most important of these for me is: 'DrawDown' .
Followed by: 'Hit Rate' .
And only after that we use the parameter: 'Profit'.
See, this is because here, we are dealing with the 'imponderable' , and anything can happen in this scenario.
But there is one thing that makes us sleep peacefully at night, and that is: controlling losses .
That is, in other words: controlling the DrawDown .
The amateur is concerned with 'winning', the professional is concerned with conserving capital.
If we have the losses under control, then we can move on to the other two parameters: hit rate and profit.
See, the second most important factor in building a system is the hit rate.
I say this from my own experience.
I have worked with many systems with a 'low hit rate', but extremely profitable.
For example: systems with hit rates of 40 to 50%.
But even when the profit is statistically and mathematically rewarding, operating a system with a low hit rate is always psychologically very stressful.
That's why, when I build an automated trading system, I focus on a high hit rate, for two big reasons:
1 - To reduce psychological damage as much as possible.
2 - More importantly, when we create a system with a 'high hit rate' there is a huge intrinsic advantage that most statistical traders do not take into consideration.
That is: knowing more quickly when the system stops being functional.
The main advantage of a system with a high hit rate is identifying when the system has stopped being functional and has stopped exploiting the market's anomaly.
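A quick, purely illustrative calculation shows why (the hit rates below are examples, not a promise): if losses were independent, the chance of k consecutive losses would be roughly (1 - hit rate) to the power k. For a 45% hit-rate system, 5 straight losses has probability 0.55^5 ≈ 5%, ordinary noise that tells you nothing. For an 85% hit-rate system, the same streak has probability 0.15^5 ≈ 0.008%, so it is an immediate, loud signal that the anomaly being exploited may have changed.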
Look: when we are talking about trading and the random distribution of results in the market, do you agree that when we create a trading system we are focused on exploiting some anomaly of that market?
Once that anomaly is verified by the market, it will stop being functional over time.
That's why trading systems, especially 'scalpers' for cryptocurrencies, need constant monitoring: quarterly, semi-annually or annually.
Because market movements change from time to time.
Because we go through different cycles from time to time, such as congestion cycles, accumulation , distribution , volatility , uptrends and downtrends .
1️⃣2️⃣ : ❓ How to Exploit Market Anomalies ❓
You see, there is a very important point that must be stressed here.
Since we are always trying to exploit an 'anomaly' in the market, the 'number' of indicators/tools that make up the system is of paramount importance.
But most traders do not take this into consideration.
To build a professional, robust, consistent, and profitable system, you don't need to use hundreds of indicators to build your setup.
This will actually make it harder to read when the setup stops working and needs some adjustment.
So focusing on a high hit rate is very important here; it is a fundamental principle that is widely ignored, and with a high hit rate we can tell much sooner, and much more accurately, when the system is no longer functional.
As Darwin said: "It is not the strongest or the most intelligent that wins the game of life, it is the most adapted."
So simple systems, as contradictory as it may seem, are more efficient, because they help to identify inflection points in the market much more quickly.
1️⃣3️⃣ : ❓ What Defines a Robust, Profitable and Consistent System ❓
See, I have built hundreds of thousands of indicators and 'pinescript' strategies. Hundreds of thousands.
This is an extremely professional, robust and profitable system.
Based on the currency pair: BTC/USDT.
There are many ways and avenues to build a profitable trading setup/system.
And actually this is not a difficult task, taking into consideration, as the main factor here, that our trading and investment plan is for the long term, so we will consequently face scenarios with less noise.
He who is in a hurry eats raw.
As mentioned before.
Defining trends in pinescript is technically a simple task; the hardest task is determining congestion zones with low volume and volatility. It is in these moments that many false signals are generated, and consequently where most setups face their maximum DrawDown.
That's why this strategy was strictly and thoroughly planned, built on a very solid foundation, to avoid as much noise as possible and deliver a positive and consistent equity curve in each market cycle. 'Consistency' is our 'Mantra' around here.
1️⃣4️⃣ : 🔧 Fixed Technical
• Strategy: Titan Investments|Quantitative THEMIS|Demo|BINANCE:BTCUSDTP:4h
• Pair: BTC/USDTP
• Time Frame: 4 hours
• Broker: Binance (Recommended)
For a more conservative scenario, we have built the Quantitative THEMIS for the 4h time frame, with the main focus on consistency.
So we can avoid noise as much as possible!
1️⃣5️⃣ : ❌ Fixed Exits : 🎯 TP(%) & 🛑SL(%)
In order to build a 'perpetual' system specific to BTC/USDT, it took a lot of testing, more testing, and a lot of investment and research.
There is one initial and fundamental point that we can address to justify the incredible consistency presented here.
That fundamental point is our exit via Take Profit or Stop Loss percentage (%).
🎯 Take Profit (%)
🛑 Stop Loss (%)
See, these days I have been testing more advanced backtesting models for some cryptocurrency systems.
In them I perform 'backtests of backtests', i.e. we use a set of strategies, each focused on one principle and operating individually, yet all forming part of a single whole; in other words, we run 'backtests' of 'backtests' together.
What I mean is that we do a lot of backtesting around here.
I can assure you that the best exit for a trading system is always a set of fixed exit values!
In other words:
🎯 Take Profit (%)
🛑 Stop Loss (%)
This is because, statistically, fixed exit structures produce a superior result on the capital/equity curve in the vast majority of cases, throughout history and for the vast majority of setups, compared to other exit methods.
This is due to a mathematical principle of simplicity: 'avoiding more noise'.
Thus, whenever the Quantitative THEMIS strategy takes a position, it has a defined target and a defined maximum stop percentage.
1️⃣6️⃣ : ⚠️ Risk Profile
The strategy currently has 3 preset risk profiles ⚠️ for 'fixed percentage exits': Take Profit (%) and Stop Loss (%).
They are: ⚠️ Risk Profiles
✔️🆑 Conservative: 🎯 TP=2.7 % 🛑 SL=2.7 %
❌Ⓜ️ Moderate: 🎯 TP=2.8 % 🛑 SL=2.7 %
❌🅰 Aggressive: 🎯 TP=1.6 % 🛑 SL=6.9 %
You will be able to select and switch between the above options and profiles through the 'input' menu of the strategy by navigating to the "⚠️ Risk Profile" menu.
You can then select, test and apply the Risk Profile above that best suits your risk management, expectations and reality , as well as customize all the 'fixed exit' values through the TP and SL menus below.
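For readers who want to see how a risk-profile switch and fixed percentage exits are typically wired in pinescript, here is a minimal, self-contained sketch. The TP/SL percentages are the ones quoted above; the entry condition, order names and everything else are placeholders, not the Quantitative THEMIS logic:

```pine
//@version=5
strategy("Fixed % exits sketch", overlay = true)

// Risk-profile switch using the TP/SL percentages quoted above (illustrative wiring).
profile = input.string("Conservative", "⚠️ Risk Profile", options = ["Conservative", "Moderate", "Aggressive"])
tpPct = (profile == "Conservative" ? 2.7 : profile == "Moderate" ? 2.8 : 1.6) / 100.0
slPct = (profile == "Conservative" ? 2.7 : profile == "Moderate" ? 2.7 : 6.9) / 100.0

// Placeholder long entry (NOT the THEMIS entry logic).
if ta.crossover(ta.sma(close, 14), ta.sma(close, 100))
    strategy.entry("L", strategy.long)

// Fixed exits: convert the percentages into an absolute limit (TP) and stop (SL) price.
if strategy.position_size > 0
    avgPrice = strategy.position_avg_price
    strategy.exit("L exit", from_entry = "L", limit = avgPrice * (1 + tpPct), stop = avgPrice * (1 - slPct))
```

Switching the profile in the input menu changes both exit levels at once, which mirrors how the "⚠️ Risk Profile" menu described above behaves.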
1️⃣7️⃣ : ⭕ Moving Exits : (Indicators)
The strategy currently also has 'Moving Exits' based on indicator signals.
These are the Moving Exits (Indicators):
📈 LONG : (EXIT)
🧃 (MAO) Short : true
📉 SHORT : (EXIT)
🧃 (MAO) Long: false
You can select and toggle between the above options through the 'input' menu of the strategy by navigating to the "LONG : Exit" and "SHORT : Exit" menu.
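As a rough sketch of what an indicator-based 'moving exit' looks like in pinescript (the SMA type, source and period below only echo the (MAP) inputs listed further down and are assumptions, not the strategy's actual exit rule):

```pine
//@version=5
strategy("Moving exit sketch", overlay = true)

// Placeholder long entry (NOT the THEMIS entry logic).
if ta.crossover(ta.sma(close, 14), ta.sma(close, 100))
    strategy.entry("L", strategy.long)

// Moving exit: close the long when price crosses below a primary moving average,
// instead of waiting for the fixed TP/SL levels.
maPrimary = ta.sma(open, 100)
if strategy.position_size > 0 and ta.crossunder(close, maPrimary)
    strategy.close("L", comment = "MA exit")
```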
1️⃣8️⃣ : 💸 Initial Capital
By default the "Initial Capital" set for entries and backtests of this strategy is: 10000 $
You can set another value for the 'Starting Capital' through the tradingview menu under "properties" , and edit the value of the "Initial Capital" field.
This way you can set and test other 'Entry Values' for your trades, tests and backtests.
1️⃣9️⃣ : ⚙️ Entry Options
By default, the 'order size' set for this strategy is 100% of the 'initial capital' on each new trade.
You can set and test other entry options such as: contracts, cash, % of equity.
You can make these changes directly in the strategy's input menu by navigating to the "⚙️ Properties : TradingView" menu below.
⚙️ Properties : (TradingView)
📊 Strategy Type: strategy.position_size != 1
📝💲 % Order Type: % of equity
📝💲 % Order Size: 100
Leverage: 1
So you can define and test other 'Entry Options' for your trades, tests and backtests.
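In pinescript, these 'properties' live in the strategy() declaration itself. Below is a minimal sketch using the defaults described in this publication (10000 initial capital, 100% of equity per order, plus the 0.03% commission and pyramiding of 1 quoted in the performance sections); the entry is a placeholder:

```pine
//@version=5
strategy("Properties sketch",
     overlay = true,
     initial_capital = 10000,
     default_qty_type = strategy.percent_of_equity,
     default_qty_value = 100,
     commission_type = strategy.commission.percent,
     commission_value = 0.03,
     pyramiding = 1)

// Placeholder order so the script produces output (NOT the THEMIS logic).
if ta.crossover(ta.sma(close, 14), ta.sma(close, 100))
    strategy.entry("L", strategy.long)
```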
2️⃣0️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : 'Third-Party Services'
It is possible to automate the signals of this strategy for any centralized or decentralized broker, as well as for messaging services: Discord, Telegram and Twitter.
All in an extremely simple and uncomplicated way through the tutorials available in PDF /VIDEO for our Titan Pro 👽 subscriber community.
With our tutorials in PDF and video it will be possible to automate the signals of this strategy for the chosen service in an extremely simple way, in fewer than 10 steps.
TradingView does not offer native integration between the platform and brokers.
But it is possible to use 'third-party services' to handle the integration and automation between Tradingview and your centralized or decentralized broker.
Here are the standard, available and recommended 'third-party services' for automating the signals of the 'Quantitative THEMIS' strategy from tradingview to your broker:
1) Wundertrading (Recommended):
2) 3commas:
3) Zignaly:
4) Aleeert.com (Recommended):
5) Alertatron:
Note! 'Third-party services' cannot perform 'withdrawals' via their 'API' key; they can only open and close positions. So your funds will always remain 'safe' at your broker and will be traded via the 'API' whenever an entry or exit signal arrives from this strategy.
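For context on how these integrations usually receive orders: the strategy attaches a message to each order, and a TradingView alert forwards that message to the third-party service's webhook URL. A hedged pinescript sketch of that mechanism (the JSON keys simply mirror the Zignaly-style fields listed later in this description and are assumptions, not any service's documented payload):

```pine
//@version=5
strategy("Webhook alert sketch", overlay = true)

// Placeholder long entry (NOT the THEMIS entry logic). The alert_message travels
// with the order and can be forwarded by a TradingView alert to a webhook.
if ta.crossover(ta.sma(close, 14), ta.sma(close, 100))
    strategy.entry("L", strategy.long, alert_message = '{"pair":"BTCUSDTP", "exchange":"binance", "orderType":"market", "side":"long"}')

// In the alert dialog, set the message body to {{strategy.order.alert_message}}
// and paste the chosen service's webhook URL into the 'Webhook URL' field.
```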
2️⃣1️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : 'Exchanges'
You can automate this strategy for any of the brokers below, through your broker's 'API' by connecting it to the 'third party automation services' for tradingview available and mentioned in the menu above:
1) Binance (Recommended)
2) Bitmex
3) Bybit
4) KuCoin
5) Deribit
6) OKX
7) Coinbase
8) Huobi
9) Bitfinex
10) Bitget
11) Bittrex
12) Bitstamp
13) Gate.io
14) Kraken
15) Gemini
16) Ascendex
17) VCCE
2️⃣2️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : 'Messaging Services'
You can also automate and monitor the signals of this strategy much more efficiently by sending them to the following popular messaging services:
1) Discord
2) Telegram
3) Twitter
2️⃣3️⃣ : ❓ How to Automate this Strategy ❓ : 🤖 Automation : '🧲🤖Copy-Trading'
It will also be possible to copy/replicate the entries and exits of this strategy to your broker in an extremely simple and agile way, through the available copy-trading services.
This way it will be possible to replicate the signals of this strategy, at each entry and exit, to your broker through the API, connecting it to the integrated copy-trading services available via the tradingview automation services below:
1) Wundertrading:
2) Zignaly:
2️⃣4️⃣ : ❔ Why be a Titan Pro 👽❔
I believe that today I am at another level in 'pinescript' development.
I consider myself today a true unicorn as a pinescript developer, someone unique and very rare.
If you choose another tool or another pinescript service, it will be just one more tool, with no real results.
But if you join our Titan community, you will have access to a unique tool! And you will get real results!
I already earn money consistently with statistical, automated trading and as an expert pinescript developer.
I am here to evolve my skills as much as possible, and one day become a pinescript 'Wizard'.
So excellence, quality and professionalism will always be my north star here.
You will never find another developer like me, one who takes a revolutionary project like this so seriously. A Maverick! ▬ The man never stops!
Here you will find the highest degree of sophistication and development in the market for 'pinescript'.
You will get the best of me and the best of pinescript possible.
Let me show you how a professional in my field does it.
Become a Titan Pro Member 👽 and get Full Access to this strategy and all the Automation Tutorials.
Be the Titan in your life!
2️⃣5️⃣ : ❔ Why be a Titan Aff 🛸❔
Get financial return for your referrals, Decentralize the World, and raise the collective consciousness.
2️⃣6️⃣ : 📋 Summary : ⚖️ Strategy: Titan Investments|Quantitative THEMIS|Demo|BINANCE:BTCUSDTP:4h
® Titan Investimentos | Quantitative THEMIS ⚖️ | Demo 🐄 2.6 | Dev: © FilipeSoh 🧙 | 🤖 100% Automated : Discord, Telegram, Twitter, Wundertrading, 3commas, Zignaly, Aleeert, Alertatron, Uniswap-v3 | BINANCE:BTCUSDTPERP 4h
🛒 Subscribe this strategy ❗️ Be a Titan Member 🏛️
🛒 Titan Pro 👽 🔗 🏛️ Titan Pro 👽 Version with ✔️100% Integrated Automation 🤖 and 📚 Automation Tutorials ✔️100% available at: (PDF/VIDEO)
🛒 Titan Affiliate 🛸 🔗 🏛️ Titan Affiliate 🛸 (Subscription Sale) 🔥 Receive 50% commission
📋 Summary : QT THEMIS ⚖️
🕵️♂️ Check This Strategy..................................................................0
🦄 ® Titan Investimentos...............................................................1
👨💻 © Developer..........................................................................2
📚 Signal Automation Tutorials : (PDF/VIDEO).......................................3
👨🔧 Revision...............................................................................4
📊 Table : (BACKTEST)..................................................................5
📊 Table : (INFORMATIONS).............................................................6
⚙️ Properties : (TRADINGVIEW)........................................................7
📆 Backtest : (TRADINGVIEW)..........................................................8
⚠️ Risk Profile...........................................................................9
🟢 On 🔴 Off : (LONG/SHORT).......................................................10
📈 LONG : (ENTRY)....................................................................11
📉 SHORT : (ENTRY)...................................................................12
📈 LONG : (EXIT).......................................................................13
📉 SHORT : (EXIT)......................................................................14
🧩 (EI) External Indicator.............................................................15
📡 (QT) Quantitative...................................................................16
🎠 (FF) Forecast......................................................................17
🅱 (BB) Bollinger Bands................................................................18
🧃 (MAP) Moving Average Primary......................................................19
🧃 (MAP) Labels.........................................................................20
🍔 (MAQ) Moving Average Quaternary.................................................21
🍟 (MACD) Moving Average Convergence Divergence...............................22
📣 (VWAP) Volume Weighted Average Price........................................23
🪀 (HL) HILO..........................................................................24
🅾 (OBV) On Balance Volume.........................................................25
🥊 (SAR) Stop and Reverse...........................................................26
🛡️ (DSR) Dynamic Support and Resistance..........................................27
🔊 (VD) Volume Directional..........................................................28
🧰 (RSI) Relative Momentum Index.................................................29
🎯 (TP) Take Profit %..................................................................30
🛑 (SL) Stop Loss %....................................................................31
🤖 Automation Selected...............................................................32
📱💻 Discord............................................................................33
📱💻 Telegram..........................................................................34
📱💻 Twitter...........................................................................35
🤖 Wundertrading......................................................................36
🤖 3commas............................................................................37
🤖 Zignaly...............................................................................38
🤖 Aleeert...............................................................................39
🤖 Alertatron...........................................................................40
🤖 Uniswap-v3..........................................................................41
🧲🤖 Copy-Trading....................................................................42
♻️ ® No Repaint........................................................................43
🔒 Copyright ©️..........................................................................44
🏛️ Be a Titan Member..................................................................45
Nº Active Users..........................................................................46
⏱ Time Left............................................................................47
| 0 | 🕵️♂️ Check This Strategy
🕵️♂️ Version Demo: 🐄 Version with ❌non-integrated automation 🤖 and 📚 Tutorials for automation ❌not available
🕵️♂️ Version Pro: 👽 Version with ✔️100% Integrated Automation 🤖 and 📚 Automation Tutorials ✔️100% available at: (PDF/VIDEO)
| 1 | 🦄 ® Titan Investimentos
Decentralizing the World 🗺
Raising the Collective Conscience 🗺
🦄Site:
🦄TradingView: www.tradingview.com
🦄Discord:
🦄Telegram:
🦄Youtube:
🦄Twitter:
🦄Instagram:
🦄TikTok:
🦄Linkedin:
🦄E-mail:
| 2 | 👨💻 © Developer
🧠 Developer: @FilipeSoh🧙
📺 TradingView: www.tradingview.com
☑️ Linkedin:
✅ Fiverr:
✅ Upwork:
🎥 YouTube:
🐤 Twitter:
🤳 Instagram:
| 3 | 📚 Signal Automation Tutorials : (PDF/VIDEO)
📚 Discord: 🔗 Link: 🔒Titan Pro👽
📚 Telegram: 🔗 Link: 🔒Titan Pro👽
📚 Twitter: 🔗 Link: 🔒Titan Pro👽
📚 Wundertrading: 🔗 Link: 🔒Titan Pro👽
📚 3commas: 🔗 Link: 🔒Titan Pro👽
📚 Zignaly: 🔗 Link: 🔒Titan Pro👽
📚 Aleeert: 🔗 Link: 🔒Titan Pro👽
📚 Alertatron: 🔗 Link: 🔒Titan Pro👽
📚 Uniswap-v3: 🔗 Link: 🔒Titan Pro👽
📚 Copy-Trading: 🔗 Link: 🔒Titan Pro👽
| 4 | 👨🔧 Revision
👨🔧 Start Of Operations: 01 Jan 2019 21:00 -0300 💡 Start Of Operations (Skin in the game) : Revision 1.0
👨🔧 Previous Review: 01 Jan 2022 21:00 -0300 💡 Previous Review : Revision 2.0
👨🔧 Current Revision: 01 Jan 2023 21:00 -0300 💡 Current Revision : Revision 2.6
👨🔧 Next Revision: 28 May 2023 21:00 -0300 💡 Next Revision : Revision 2.7
| 5 | 📊 Table : (BACKTEST)
📊 Table: true
🖌️ Style: label.style_label_left
📐 Size: size_small
📏 Line: defval
🎨 Color: #131722
| 6 | 📊 Table : (INFORMATIONS)
📊 Table: false
🖌️ Style: label.style_label_right
📐 Size: size_small
📏 Line: defval
🎨 Color: #131722
| 7 | ⚙️ Properties : (TradingView)
📊 Strategy Type: strategy.position_size != 1
📝💲 % Order Type: % of equity
📝💲 % Order Size: 100 %
🚀 Leverage: 1
| 8 | 📆 Backtest : (TradingView)
🗓️ Mon: true
🗓️ Tue: true
🗓️ Wed: true
🗓️ Thu: true
🗓️ Fri: true
🗓️ Sat: true
🗓️ Sun: true
📆 Range: custom
📆 Start: UTC 31 Oct 2008 00:00
📆 End: UTC 31 Oct 2030 23:45
📆 Session: 0000-0000
📆 UTC: UTC
| 9 | ⚠️ Risk Profile
✔️🆑 Conservative: 🎯 TP=2.7 % 🛑 SL=2.7 %
❌Ⓜ️ Moderate: 🎯 TP=2.8 % 🛑 SL=2.7 %
❌🅰 Aggressive: 🎯 TP=1.6 % 🛑 SL=6.9 %
| 10 | 🟢 On 🔴 Off : (LONG/SHORT)
🟢📈 LONG: true
🟢📉 SHORT: true
| 11 | 📈 LONG : (ENTRY)
📡 (QT) Long: true
🧃 (MAP) Long: false
🅱 (BB) Long: false
🍟 (MACD) Long: false
🅾 (OBV) Long: false
| 12 | 📉 SHORT : (ENTRY)
📡 (QT) Short: true
🧃 (MAP) Short: false
🅱 (BB) Short: false
🍟 (MACD) Short: false
🅾 (OBV) Short: false
| 13 | 📈 LONG : (EXIT)
🧃 (MAP) Short: true
| 14 | 📉 SHORT : (EXIT)
🧃 (MAP) Long: false
| 15 | 🧩 (EI) External Indicator
🧩 (EI) Connect your external indicator/filter: false
🧩 (EI) Connect your indicator here (Study mode only): close
🧩 (EI) Connect your indicator here (Study mode only): close
| 16 | 📡 (QT) Quantitative
📡 (QT) Quantitative: true
📡 (QT) Market: BINANCE:BTCUSDTPERP
📡 (QT) Dice: openai
| 17 | 🎠 (FF) Forecast
🎠 (FF) Include current unclosed current candle: true
🎠 (FF) Forecast Type: flat
🎠 (FF) Nº of candles to use in linear regression: 3
| 18 | 🅱 (BB) Bollinger Bands
🅱 (BB) Bollinger Bands: true
🅱 (BB) Type: EMA
🅱 (BB) Period: 20
🅱 (BB) Source: close
🅱 (BB) Multiplier: 2
🅱 (BB) Linewidth: 0
🅱 (BB) Color: #131722
| 19 | 🧃 (MAP) Moving Average Primary
🧃 (MAP) Moving Average Primary: true
🧃 (MAP) BarColor: false
🧃 (MAP) Background: false
🧃 (MAP) Type: SMA
🧃 (MAP) Source: open
🧃 (MAP) Period: 100
🧃 (MAP) Multiplier: 2.0
🧃 (MAP) Linewidth: 2
🧃 (MAP) Color P: #42bda8
🧃 (MAP) Color N: #801922
| 20 | 🧃 (MAP) Labels
🧃 (MAP) Labels: true
🧃 (MAP) Style BUY ZONE: shape.labelup
🧃 (MAP) Color BUY ZONE: #42bda8
🧃 (MAP) Style SELL ZONE: shape.labeldown
🧃 (MAP) Color SELL ZONE: #801922
| 21 | 🍔 (MAQ) Moving Average Quaternary
🍔 (MAQ) Moving Average Quaternary: true
🍔 (MAQ) BarColor: false
🍔 (MAQ) Background: false
🍔 (MAQ) Type: SMA
🍔 (MAQ) Source: close
🍔 (MAQ) Primary: 14
🍔 (MAQ) Secondary: 22
🍔 (MAQ) Tertiary: 44
🍔 (MAQ) Quaternary: 16
🍔 (MAQ) Linewidth: 0
🍔 (MAQ) Color P: #42bda8
🍔 (MAQ) Color N: #801922
| 22 | 🍟 (MACD) Moving Average Convergence Divergence
🍟 (MACD) Macd Type: EMA
🍟 (MACD) Signal Type: EMA
🍟 (MACD) Source: close
🍟 (MACD) Fast: 12
🍟 (MACD) Slow: 26
🍟 (MACD) Smoothing: 9
| 23 | 📣 (VWAP) Volume Weighted Average Price
📣 (VWAP) Source: close
📣 (VWAP) Period: 340
📣 (VWAP) Momentum A: 84
📣 (VWAP) Momentum B: 150
📣 (VWAP) Average Volume: 1
📣 (VWAP) Multiplier: 1
📣 (VWAP) Diviser: 2
| 24 | 🪀 (HL) HILO
🪀 (HL) Type: SMA
🪀 (HL) Function: Maverick🧙
🪀 (HL) Source H: high
🪀 (HL) Source L: low
🪀 (HL) Period: 20
🪀 (HL) Momentum: 26
🪀 (HL) Diviser: 2
🪀 (HL) Multiplier: 1
| 25 | 🅾 (OBV) On Balance Volume
🅾 (OBV) Type: EMA
🅾 (OBV) Source: close
🅾 (OBV) Period: 16
🅾 (OBV) Diviser: 2
🅾 (OBV) Multiplier: 1
| 26 | 🥊 (SAR) Stop and Reverse
🥊 (SAR) Source: close
🥊 (SAR) High: 1.8
🥊 (SAR) Mid: 1.6
🥊 (SAR) Low: 1.6
🥊 (SAR) Diviser: 2
🥊 (SAR) Multiplier: 1
| 27 | 🛡️ (DSR) Dynamic Support and Resistance
🛡️ (DSR) Source D: close
🛡️ (DSR) Source R: high
🛡️ (DSR) Source S: low
🛡️ (DSR) Momentum R: 0
🛡️ (DSR) Momentum S: 2
🛡️ (DSR) Diviser: 2
🛡️ (DSR) Multiplier: 1
| 28 | 🔊 (VD) Volume Directional
🔊 (VD) Type: SMA
🔊 (VD) Period: 68
🔊 (VD) Momentum: 3.8
🔊 (VD) Diviser: 2
🔊 (VD) Multiplier: 1
| 29 | 🧰 (RSI) Relative Momentum Index
🧰 (RSI) Type UP: EMA
🧰 (RSI) Type DOWN: EMA
🧰 (RSI) Source: close
🧰 (RSI) Period: 29
🧰 (RSI) Smoothing: 22
🧰 (RSI) Momentum R: 64
🧰 (RSI) Momentum S: 142
🧰 (RSI) Diviser: 2
🧰 (RSI) Multiplier: 1
| 30 | 🎯 (TP) Take Profit %
🎯 (TP) Take Profit: false
🎯 (TP) %: 2.2
🎯 (TP) Color: #42bda8
🎯 (TP) Linewidth: 1
| 31 | 🛑 (SL) Stop Loss %
🛑 (SL) Stop Loss: false
🛑 (SL) %: 2.7
🛑 (SL) Color: #801922
🛑 (SL) Linewidth: 1
| 32 | 🤖 Automation : Discord | Telegram | Twitter | Wundertrading | 3commas | Zignaly | Aleeert | Alertatron | Uniswap-v3
🤖 Automation Selected : Discord
| 33 | 🤖 Discord
🔗 Link Discord:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Discord ▬ Enter Long: 🔒Titan Pro👽
📱💻 Discord ▬ Exit Long: 🔒Titan Pro👽
📱💻 Discord ▬ Enter Short: 🔒Titan Pro👽
📱💻 Discord ▬ Exit Short: 🔒Titan Pro👽
| 34 | 🤖 Telegram
🔗 Link Telegram:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Telegram ▬ Enter Long: 🔒Titan Pro👽
📱💻 Telegram ▬ Exit Long: 🔒Titan Pro👽
📱💻 Telegram ▬ Enter Short: 🔒Titan Pro👽
📱💻 Telegram ▬ Exit Short: 🔒Titan Pro👽
| 35 | 🤖 Twitter
🔗 Link Twitter:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Twitter ▬ Enter Long: 🔒Titan Pro👽
📱💻 Twitter ▬ Exit Long: 🔒Titan Pro👽
📱💻 Twitter ▬ Enter Short: 🔒Titan Pro👽
📱💻 Twitter ▬ Exit Short: 🔒Titan Pro👽
| 36 | 🤖 Wundertrading : Binance | Bitmex | Bybit | KuCoin | Deribit | OKX | Coinbase | Huobi | Bitfinex | Bitget
🔗 Link Wundertrading:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Wundertrading ▬ Enter Long: 🔒Titan Pro👽
📱💻 Wundertrading ▬ Exit Long: 🔒Titan Pro👽
📱💻 Wundertrading ▬ Enter Short: 🔒Titan Pro👽
📱💻 Wundertrading ▬ Exit Short: 🔒Titan Pro👽
| 37 | 🤖 3commas : Binance | Bybit | OKX | Bitfinex | Coinbase | Deribit | Bitmex | Bittrex | Bitstamp | Gate.io | Kraken | Gemini | Huobi | KuCoin
🔗 Link 3commas:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 3commas ▬ Enter Long: 🔒Titan Pro👽
📱💻 3commas ▬ Exit Long: 🔒Titan Pro👽
📱💻 3commas ▬ Enter Short: 🔒Titan Pro👽
📱💻 3commas ▬ Exit Short: 🔒Titan Pro👽
| 38 | 🤖 Zignaly : Binance | Ascendex | Bitmex | Kucoin | VCCE
🔗 Link Zignaly:
🔗 Link 📚 Automation: 🔒Titan Pro👽
🤖 Type Automation: Profit Sharing
🤖 Type Provider: Webhook
🔑 Key: 🔒Titan Pro👽
🤖 pair: BTCUSDTP
🤖 exchange: binance
🤖 exchangeAccountType: futures
🤖 orderType: market
🚀 leverage: 1x
% positionSizePercentage: 100 %
💸 positionSizeQuote: 10000 $
🆔 signalId: @Signal1234
| 39 | 🤖 Aleeert : Binance
🔗 Link Aleeert:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Aleeert ▬ Enter Long: 🔒Titan Pro👽
📱💻 Aleeert ▬ Exit Long: 🔒Titan Pro👽
📱💻 Aleeert ▬ Enter Short: 🔒Titan Pro👽
📱💻 Aleeert ▬ Exit Short: 🔒Titan Pro👽
| 40 | 🤖 Alertatron : Binance | Bybit | Deribit | Bitmex
🔗 Link Alertatron:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Alertatron ▬ Enter Long: 🔒Titan Pro👽
📱💻 Alertatron ▬ Exit Long: 🔒Titan Pro👽
📱💻 Alertatron ▬ Enter Short: 🔒Titan Pro👽
📱💻 Alertatron ▬ Exit Short: 🔒Titan Pro👽
| 41 | 🤖 Uniswap-v3
🔗 Link Uniswap-v3:
🔗 Link 📚 Automation: 🔒Titan Pro👽
📱💻 Uniswap-v3 ▬ Enter Long: 🔒Titan Pro👽
📱💻 Uniswap-v3 ▬ Exit Long: 🔒Titan Pro👽
📱💻 Uniswap-v3 ▬ Enter Short: 🔒Titan Pro👽
📱💻 Uniswap-v3 ▬ Exit Short: 🔒Titan Pro👽
| 42 | 🧲🤖 Copy-Trading : Zignaly | Wundertrading
🔗 Link 📚 Copy-Trading: 🔒Titan Pro👽
🧲🤖 Copy-Trading ▬ Zignaly: 🔒Titan Pro👽
🧲🤖 Copy-Trading ▬ Wundertrading: 🔒Titan Pro👽
| 43 | ♻️ ® Don't Repaint!
♻️ This Strategy does not Repaint!: ® Signals do not repaint❕
♻️ This is a Real Strategy!: Quality : ® Titan Investimentos
📋️️ Get more information about Repainting here:
| 44 | 🔒 Copyright ©️
🔒 Copyright ©️: Copyright © 2023-2024 All rights reserved, ® Titan Investimentos
🔒 Copyright ©️: ® Titan Investimentos
🔒 Copyright ©️: Unique and Exclusive Strategy. All rights reserved
| 45 | 🏛️ Be a Titan Member
🏛️ Titan Pro 👽 Version with ✔️100% Integrated Automation 🤖 and 📚 Automation Tutorials ✔️100% available at: (PDF/VIDEO)
🏛️ Titan Affiliate 🛸 (Subscription Sale) 🔥 Receive 50% commission
| 46 | ⏱ Time Left
Time Left Titan Demo 🐄: ⏱♾ | ⏱ : ♾ Titan Demo 🐄 Version with ❌non-integrated automation 🤖 and 📚 Tutorials for automation ❌not available
Time Left Titan Pro 👽: 🔒Titan Pro👽 | ⏱ : Pro Plans: 30 Days, 90 Days, 12 Months, 24 Months. (👽 Pro 🅼 Monthly, 👽 Pro 🆀 Quarterly, 👽 Pro🅰 Annual, 👽 Pro👾Two Years)
| 47 | Nº Active Users
Nº Active Subscribers Titan Pro 👽: 5️⃣6️⃣ | 1✔️ 5✔️ 10✔️ 100❌ 1K❌ 10K❌ 50K❌ 100K❌ 1M❌ 10M❌ 100M❌ : ⏱ Active Users is updated every 24 hours (Check on indicator)
Nº Active Affiliates Titan Aff 🛸: 6️⃣ | 1✔️ 5✔️ 10❌ 100❌ 1K❌ 10K❌ 50K❌ 100K❌ 1M❌ 10M❌ 100M❌ : ⏱ Active Users is updated every 24 hours (Check on indicator)
2️⃣7️⃣ : 📊 PERFORMANCE : 🆑 Conservative
📊 Exchange: Binance
📊 Pair: BINANCE: BTCUSDTPERP
📊 TimeFrame: 4h
📊 Initial Capital: 10000 $
📊 Order Type: % equity
📊 Size Per Order: 100 %
📊 Commission: 0.03 %
📊 Pyramid: 1
• ⚠️ Risk Profile: 🆑 Conservative: 🎯 TP=2.7 % | 🛑 SL=2.7 %
• 📆All years: 🆑 Conservative: 🚀 Leverage 1️⃣x
📆 Start: September 23, 2019
📆 End: January 11, 2023
📅 Days: 1221
📅 Bars: 7325
Net Profit:
🟢 + 1669.89 %
💲 + 166989.43 USD
Total Close Trades:
⚪️ 369
Percent Profitable:
🟡 64.77 %
Profit Factor:
🟢 2.314
DrawDown Maximum:
🔴 -24.82 %
💲 -10221.43 USD
Avg Trade:
💲 + 452.55 USD
✔️ Trades Winning: 239
❌ Trades Losing: 130
✔️ Average Gross Win: + 12.31 %
❌ Average Gross Loss: - 9.78 %
✔️ Maximum Consecutive Wins: 9
❌ Maximum Consecutive Losses: 6
% Average Gain Annual: 499.33 %
% Average Gain Monthly: 41.61 %
% Average Gain Weekly: 9.6 %
% Average Gain Day: 1.37 %
💲 Average Gain Annual: 49933 $
💲 Average Gain Monthly: 4161 $
💲 Average Gain Weekly: 960 $
💲 Average Gain Day: 137 $
• 📆 Year: 2020: 🆑 Conservative: 🚀 Leverage 1️⃣x
• 📆 Year: 2021: 🆑 Conservative: 🚀 Leverage 1️⃣x
• 📆 Year: 2022: 🆑 Conservative: 🚀 Leverage 1️⃣x
2️⃣8️⃣ : 📊 PERFORMANCE : Ⓜ️ Moderate
📊 Exchange: Binance
📊 Pair: BINANCE: BTCUSDTPERP
📊 TimeFrame: 4h
📊 Initial Capital: 10000 $
📊 Order Type: % equity
📊 Size Per Order: 100 %
📊 Commission: 0.03 %
📊 Pyramid: 1
• ⚠️ Risk Profile: Ⓜ️ Moderate: 🎯 TP=2.8 % | 🛑 SL=2.7 %
• 📆 All years: Ⓜ️ Moderate: 🚀 Leverage 1️⃣x
📆 Start: September 23, 2019
📆 End: January 11, 2023
📅 Days: 1221
📅 Bars: 7325
Net Profit:
🟢 + 1472.04 %
💲 + 147199.89 USD
Total Close Trades:
⚪️ 362
Percent Profitable:
🟡 63.26 %
Profit Factor:
🟢 2.192
DrawDown Maximum:
🔴 -22.69 %
💲 -9269.33 USD
Avg Trade:
💲 + 406.63 USD
✔️ Trades Winning: 229
❌ Trades Losing : 133
✔️ Average Gross Win: + 11.82 %
❌ Average Gross Loss: - 9.29 %
✔️ Maximum Consecutive Wins: 9
❌ Maximum Consecutive Losses: 8
% Average Gain Annual: 440.15 %
% Average Gain Monthly: 36.68 %
% Average Gain Weekly: 8.46 %
% Average Gain Day: 1.21 %
💲 Average Gain Annual: 44015 $
💲 Average Gain Monthly: 3668 $
💲 Average Gain Weekly: 846 $
💲 Average Gain Day: 121 $
• 📆 Year: 2020: Ⓜ️ Moderate: 🚀 Leverage 1️⃣x
• 📆 Year: 2021: Ⓜ️ Moderate: 🚀 Leverage 1️⃣x
• 📆 Year: 2022: Ⓜ️ Moderate: 🚀 Leverage 1️⃣x
2️⃣9️⃣ : 📊 PERFORMANCE : 🅰 Aggressive
📊 Exchange: Binance
📊 Pair: BINANCE: BTCUSDTPERP
📊 TimeFrame: 4h
📊 Initial Capital: 10000 $
📊 Order Type: % equity
📊 Size Per Order: 100 %
📊 Commission: 0.03 %
📊 Pyramid: 1
• ⚠️ Risk Profile: 🅰 Aggressive: 🎯 TP=1.6 % | 🛑 SL=6.9 %
• 📆 All years: 🅰 Aggressive: 🚀 Leverage 1️⃣x
📆 Start: September 23, 2019
📆 End: January 11, 2023
📅 Days: 1221
📅 Bars: 7325
Net Profit:
🟢 + 989.38 %
💲 + 98938.38 USD
Total Close Trades:
⚪️ 380
Percent Profitable:
🟢 84.47 %
Profit Factor:
🟢 2.156
DrawDown Maximum:
🔴 -17.88 %
💲 -9182.84 USD
Avg Trade:
💲 + 260.36 USD
✔️ Trades Winning: 321
❌ Trades Losing: 59
✔️ Average Gross Win: + 5.75 %
❌ Average Gross Loss: - 14.51 %
✔️ Maximum Consecutive Wins: 21
❌ Maximum Consecutive Losses: 6
% Average Gain Annual: 295.84 %
% Average Gain Monthly: 24.65 %
% Average Gain Weekly: 5.69 %
% Average Gain Day: 0.81 %
💲 Average Gain Annual: 29584 $
💲 Average Gain Monthly: 2465 $
💲 Average Gain Weekly: 569 $
💲 Average Gain Day: 81 $
• 📆 Year: 2020: 🅰 Aggressive: 🚀 Leverage 1️⃣x
• 📆 Year: 2021: 🅰 Aggressive: 🚀 Leverage 1️⃣x
• 📆 Year: 2022: 🅰 Aggressive: 🚀 Leverage 1️⃣x
3️⃣0️⃣ : 🛠️ Roadmap
🛠️• 14/01/2023 : Titan THEMIS Launch
🛠️• Updates January/2023 :
• 📚 Tutorials for Automation 🤖 already Available : ✔️
• ✔️ Discord
• ✔️ Wundertrading
• ✔️ Zignaly
• 📚 Tutorials for Automation 🤖 In Preparation : ⭕
• ⭕ Telegram
• ⭕ Twitter
• ⭕ 3commas
• ⭕ Aleeert
• ⭕ Alertatron
• ⭕ Uniswap-v3
• ⭕ Copy-Trading
🛠️• Updates February/2023 :
• 📰 Launch of advertising material for Titan Affiliates 🛸
• 🛍️🎥🖼️📊 (Sales Page/VSL/Videos/Creative/Infographics)
🛠️• 28/05/2023 : Titan THEMIS update ▬ Version 2.7
🛠️• 28/05/2023 : BOT BOB release ▬ Version 1.0
• Native Titan THEMIS Automation: through BOT BOB, a bot of our own code for automating TradingView signals, indicators and strategies (in validation).
• BOT BOB
Automation/Connection :
• API - For Centralized Brokers.
• Smart Contracts - Web Wallet - For Decentralized Brokers.
• This way users can automate any indicator or strategy of TradingView and Titan in a decentralized, secure and simplified way.
• Without having the need to use 'third party services' for automating TradingView indicators and strategies like the ones available above.
🛠️• 28/05/2023 : Release ▬ Titan Culture Guide 📝
3️⃣1️⃣ : 🧻 Notes ❕
🧻 • Note ❕ The "Demo 🐄" version ❌does not have 'integrated automation'. To automate the signals of this strategy and enjoy a fully automated system, you need access to the Pro version, with '100% integrated automation' and all the automation tutorials available. Become a Titan Pro 👽
🧻 • Note ❕ You will also need to be a 'Pro' user or higher on Tradingview to use the webhook feature, which is available only to 'paid' profiles on the platform.
With the webhook feature it is possible to send the signals of this strategy almost anywhere: in our case, to centralized or decentralized brokers, and also to popular messaging services such as Discord, Telegram or Twitter.
3️⃣2️⃣ : 🚨 Disclaimer ❕❗
🚨 • Disclaimer ❕❕ Past positive results and performance of a system do not guarantee positive results and performance in the future!
🚨 • Disclaimer ❗❗❗ When using this strategy, Titan Investments is totally exempt from any claim of liability for losses. The responsibility for managing your funds is solely yours. This is a very high risk/volatility market! Understand your place in the market.
3️⃣3️⃣ : ♻️ ® No Repaint
This Strategy does not Repaint! This is a real strategy!
3️⃣4️⃣ : 🔒 Copyright ©️
Copyright © 2022-2023 All rights reserved, ® Titan Investimentos
3️⃣5️⃣ : 👏 Acknowledgments
I want to start this message by thanking TradingView and the whole Pinescript community for all the 'magic' created here: a unique ecosystem! Rich and healthy, fertile soil, a 'new world' of possibilities for a complete deepening and improvement of our best personal skills.
I leave here my immense thanks to the whole community: Tradingview, Pinecoders, Wizards and Moderators.
I was not born rich.
Thanks to TradingView and pinescript, and all their transformation,
I was able to develop myself, the best of me and the best of my skills.
And consequently build wealth and patrimony.
Gratitude.
One more story for the infinite book !
If you were born poor you were born to be rich !
Raising🔼 the level and raising🔼 the ruler! 📏
My work is my 'debauchery'! Do better! 💐🌹
Soul of a first-timer! Creativity Exudes! 🦄
This is the manifestation of God's magic in me. This is the best of me. 🧙
You will copy me, I know. So you owe me. 💋
My mission here is to raise the consciousness and self-esteem of all Titans and Titanids! Welcome! 🧘 🏛️
The only way to accomplish great work is to do what you love ! Before I learned to program I was wasting my life!
Death is the best creation of life .
Now you are the new , but in the not so distant future you will gradually become the old . Here I stay forever!
Playing the game like an Athlete! 🖼️ Enjoy and Enjoy 🍷 🗿
In honor of: BOB ☆
1 name, 3 letters, 3 possibilities, and if read backwards it's the same thing, a palindrome. ☘
Gratitude to the oracles that have enabled me the 'luck' to get this far: Dal&Ni&Fer
3️⃣6️⃣ : 👮 House Rules : 📺 TradingView
House Rules: This publication and strategy follow all TradingView house guidelines and rules:
📺 TradingView House Rules: www.tradingview.com
📺 Script publication rules: www.tradingview.com
📺 Vendor requirements: www.tradingview.com
📺 Links/References rules: www.tradingview.com
3️⃣7️⃣ : 🏛️ Become a Titan Pro member 👽
🟩 Titan Pro 👽 🟩
3️⃣8️⃣ : 🏛️ Be a member Titan Aff 🛸
🟥 Titan Affiliate 🛸 🟥
Probability Oscillator (Zeiierman)
█ Overview
The Probability Oscillator (Zeiierman) turns price dynamics into a regime-aware probability map of continuation vs. reversal. Rather than treating momentum as a single, fixed signal, it adapts its core estimator to current market conditions—favoring trend persistence in calm regimes and oscillation/reversion in volatile regimes.
You get a fast Probability line, a slower Signal line, dynamic OB/OS bounds, midline bias, color-coded trend probability, background regime cues, and Momentum Impulse dots that reveal concentrated bursts of directional intent. Beneath the surface, the Probability line functions as a sequential Bayesian filter — continuously updating a regime-conditioned prior (trend or volatility) with new market evidence. The resulting posterior odds are then expressed as a bounded oscillator for intuitive interpretation. In stable markets, the prior favors continuity; in volatile markets, it reweights toward mean reversion.
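In general terms (this is the textbook form of a sequential Bayes update, stated here for intuition, not the indicator's exact internal computation): each new bar's evidence multiplies the prior odds of continuation by a likelihood ratio, posterior odds = prior odds × P(evidence | continuation) / P(evidence | reversal), and the bounded value the oscillator plots corresponds to probability = posterior odds / (1 + posterior odds), which always stays between 0 and 1. In a calm regime the likelihoods are weighted to favor continuation; in a volatile regime they are reweighted toward reversal and mean reversion.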
⚪ Why This One Is Unique
The Probability Oscillator operates within a self-adaptive probabilistic framework that continuously reshapes itself in response to the market’s evolving structure. Rather than relying on fixed formulas or static thresholds, it employs a context-aware Bayesian core that interprets flow dynamics through an adaptive regime model.
Its internal architecture blends state recognition, probability normalization, and dynamic envelope mapping, allowing it to adjust between conditions of directional stability and volatility-driven reversion fluidly. The result is an intelligent, self-adjusting probability field that remains stable in trends, reactive in consolidations, and contextually aware across all market states—delivering a refined sense of probabilistic direction without exposing raw computational structure.
█ Main Features
⚪ Probability Oscillator
At the core lies a probability-driven oscillator that continuously adapts its internal weighting to evolving market behavior. It translates incoming price evidence into a smooth probability curve that distinguishes between continuation and reversion phases, providing a refined view of conviction beneath price action.
The Probability Oscillator estimates the likelihood of trend continuation while dynamically adjusting to the surrounding volatility regime:
Probability Line (fast) – Captures short-term probability shifts, weighted by current market conditions — calm or volatile.
Signal Line (slow) – A smoothed probability filter that defines the prevailing bias and confirms directional persistence.
Momentum Impulse Dots – Small markers highlighting bursts of positive (green) or negative (red) momentum, indicating transitions in conviction strength.
The oscillator’s probabilistic framework automatically transitions between two self-adaptive modes:
Low-Volatility Mode – Prioritizes directional momentum and smooth trend continuity, ideal for trending markets.
High-Volatility Mode – Emphasizes oscillatory probability swings and reversals, optimized for range-bound or transitional conditions.
This dual-regime behavior allows the Probability Oscillator to remain stable in directional trends yet responsive in volatile ranges, producing a coherent probabilistic signal across any timeframe.
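As a rough illustration of the dual-regime idea, here is a toy Pine Script sketch — not the author's protected code; the lengths, the logistic mapping, and the ATR-percentile regime weight are all illustrative assumptions:
//@version=5
indicator("Regime-Blended Probability (toy sketch)")
// Trend evidence: normalized momentum squashed into a 0..100 probability-like score
mom     = ta.ema(close - close[10], 5) / ta.stdev(close, 50)
pTrend  = 100 / (1 + math.exp(-2.0 * mom))          // high = continuation more likely
// Reversion evidence: stretched price tends to snap back toward its mean
stretch = (close - ta.sma(close, 50)) / ta.stdev(close, 50)
pRevert = 100 / (1 + math.exp(2.0 * stretch))       // high = upside reversion more likely
// Regime weight: percentile rank of ATR decides how much each view matters
volRank = ta.percentrank(ta.atr(14), 200) / 100     // ~0 in calm regimes, ~1 in volatile ones
prob    = (1 - volRank) * pTrend + volRank * pRevert
sig     = ta.ema(prob, 9)
plot(prob, "Probability", color=color.teal)
plot(sig, "Signal", color=color.orange)
hline(50)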
⚪ Trend Probability Coloring
The Trend Probability Coloring system transforms the Signal Line into a live confidence gauge. Its adaptive hue reflects the underlying probabilistic bias — green for sustained bullish pressure, red for bearish control, and yellow during transitional uncertainty. Behind the scenes, it applies curvature-sensitive weighting and probabilistic smoothing to display a visually coherent measure of directional conviction.
⚪ Impulse Dots
Impulse Dots identify moments of concentrated momentum expansion — short bursts of probabilistic acceleration that often precede shifts in structure. Each impulse represents a localized jump in directional confidence, isolating meaningful change-points from background noise. The result is a precise visualization of where probability and price begin to align, revealing early cues of strength, exhaustion, or imminent rotation.
█ How to Use
⚪ Trend Following
The Signal Line acts as the long-term probabilistic trend gauge, revealing when the market is building or losing directional conviction. Its slope and color communicate both bias and transition strength:
Green → bullish probability bias (trend continuation likely).
Red → bearish probability bias (downside continuation likely).
Yellow → transitional or indecisive phase (potential regime shift).
Use the Signal Line to confirm directional alignment:
A transition from red → yellow → green signals that the market is turning bullish and probability is shifting toward continuation on the upside.
A transition from green → yellow → red signals that bullish conviction is fading and bearish control is emerging.
⚪ Overbought & Oversold
The Probability Oscillator can also be used to identify overbought and oversold conditions by observing when the Probability Line moves above its upper bound or below its lower bound. These events often signal potential market slowdowns, pullbacks, or even broader reversals depending on context and regime.
The OB/OS levels automatically adapt to the prevailing market mode:
Trend Mode (~70/30) – Optimized for riding trends and timing pullbacks within directional continuations.
Volatility Mode (~80/20) – Tailored for fading extremes and capturing fast mean-reversion moves during consolidation phases.
Signals: Reclaims from oversold zones within a bullish bias, or rejections from overbought zones in a bearish bias, represent high-probability inflection points — especially when confirmed by Impulse Dots or regime-aligned Signal Line color transitions.
⚪ Using Volatility Modes to Choose Strategy
The Probability Oscillator automatically adapts its behavior to the active volatility regime, enabling traders to align their approach with the current market state. One of the most effective ways to use the tool is to select a trading strategy that aligns with the prevailing market mode.
Trend Mode (purple fill) – Represents low-volatility, directional environments where markets move smoothly and sustain momentum over time. In these conditions, a trend-following approach is most effective. Focus on the broader direction, participate on Probability-over-Signal crossups above 50, and trail positions as long as the Signal Line remains green. These calm phases often persist before volatility expansion, making them ideal for riding steady continuation waves rather than reacting to short-term fluctuations.
Volatility Mode (blue switch bar) – Activates in high-volatility conditions, signaling increased market agitation and sharper price swings. In this regime, trading becomes more tactical. Mean-reversion and scalping strategies perform best—fade OB/OS extremes, use midline reclaims for timing, or trade Impulse confirmations to capture breakout accelerations and short-term momentum surges.
⚪ Impulse
The Momentum Impulses highlight periods when the market experiences sharp bursts of directional momentum, marking transitions in conviction strength and energy expansion.
Green top dots → Indicate strong bullish impulses, often signaling the onset or acceleration of upward momentum.
Red bottom dots → Indicate strong bearish impulses, highlighting pressure buildup or downside continuation.
These impulses are particularly useful in two contexts:
During ranging markets , they help confirm overbought and oversold conditions, signaling when reversals or exhaustion points are highly probable.
During regime transitions , they validate breakout strength, confirming that new directional phases are supported by genuine momentum rather than noise.
In essence, Impulse Dots visualize the heartbeat of market conviction—pinpointing where momentum surges align with probabilistic bias, whether to confirm a breakout or warn of exhaustion in choppy conditions.
█ How It Works
⚪ Regime Switch Engine
At the foundation lies a Bayesian regime adaptation process that treats volatility as evolving market evidence. The system continuously updates a prior belief about whether the market favors directional persistence or oscillatory reversion. In calm states, it maintains a continuity-biased belief structure that favors smoother probability propagation.
Calculation: Employs a volatility-normalized Bayesian comparator, generating a posterior distribution over regime likelihoods. This ensures the oscillator remains statistically invariant to scale and consistent across instruments and timeframes.
⚪ Trend Probability Coloring (Conviction Layer)
The Trend Probability Coloring system visualizes Bayesian posterior confidence in real time. It continuously updates the Signal Line’s color as new evidence shifts the model’s belief between bullish, neutral, and bearish states.
When the posterior probability leans strongly upward, the line turns green; as uncertainty grows, it fades to yellow; and when conviction turns negative, it transitions to red. Each color change represents a probabilistic reweighting — the model’s evolving assessment of directional dominance.
Calculation: Applies posterior-weighted smoothing and curvature-based confidence mapping to translate Bayesian belief strength into a fluid visual gradient.
⚪ Momentum Impulse Engine
The Momentum Impulse Engine detects sudden bursts in probabilistic conviction — moments when the Bayesian posterior sharply reweights toward one directional outcome. These impulses represent statistically significant shifts in belief, where new evidence rapidly alters the model’s assessment of market direction.
Green impulses highlight surges in bullish probability; red impulses mark spikes in bearish conviction. Each impulse reflects a brief phase of directional dominance, revealing where probability momentum begins to accelerate or exhaust.
Calculation: Employs nonlinear Bayesian change detection and extreme-value gating to isolate abrupt posterior inflections.
-----------------
Disclaimer
The content provided in my scripts, indicators, ideas, algorithms, and systems is for educational and informational purposes only. It does not constitute financial advice, investment recommendations, or a solicitation to buy or sell any financial instruments. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
Daily Manual KILLZONES
This indicator is to be used with "KILLSTATS", our indicator that backtests, over hundreds of days, at which time and on which day the top/low of the day and week is formed.
"Manual Killzone" lets you define your statistical killzones by day of the week manually: you define your own rules according to your interpretation of our Killstats indicator.
It integrates a daily price action filter according to the ICT concept:
Bullish probabilities (green) are displayed only if price is in discount and outside the 25/75% daily range.
The same applies to bearish probabilities (red).
The blue color is to be applied in case of a reversal with a high contradictory probability (example: use it for Tuesday from 2pm to 3pm if Tuesday is a day with a high probability of forming a top, but 2pm–3pm is a time slot with a high probability of forming both a bottom AND a top. Indecision => blue).
WARNING : Calculated according to Etc/UTC time : put "0" in the Timezone parameter of killstats.
It is necessary to use the replay mode regularly during the backtesting to update the data!
Killstats
Backtest and identify at what times/days the high/low were formed. The periods are shown on the graph along with detailed statistics.
Example with "days : 600" and "13h : top 12%": this means that over 600 days, the top of the day was formed at 13h in 12% of cases.
up to 1000+ days studied to find favorable reversal time slots: killstats! The data presented can sometimes be... surprising.
Increasing/decreasing the timeframe on chart = increase/decrease the studied period.
A period of 1000 days (timeframe: 1h) gives solid but not exact statistics.
A period of 30 days gives current statistics, but the sample is too small to know whether the data is relevant.
I recommend looking for intersections of killstats over several periods: If over 1000 days AND 30 days, 3pm was a time with a high probability of forming a top, it is interesting to look for short positions between 3pm and 4pm.
The data is displayed as a diagram that makes effective time slots easy to identify visually.
Caution: use a timeframe of 1h at most for the study of the day's high/low to be correct, and daily at most for the study of the week's high/low.
Caution2. Match the timezone with the input (by default set to GMT+1). So if you are at GMT+2, you must put "2" in timezone.
I recommend using this as part of an aggressive high frequency scalping strategy to make the most of your trading session - with the aim of quickly moving to TP1/BE and leaving your winning position open.
Pair Prowler [CR]
█ OVERVIEW
Pair Prowler v6 Enhanced is a sophisticated oscillator-based trading system designed for traders seeking high-probability setups with multiple confirmation layers. The indicator combines proprietary signal generation with institutional-grade filters to identify optimal entry and exit points while minimizing false signals.
The system features adaptive zones that dynamically adjust to market conditions, multi-timeframe support/resistance analysis, volume-weighted mean reversion filters, and real-time performance tracking. A comprehensive confluence scoring system evaluates each potential trade across eight technical dimensions, allowing traders to filter for only the highest-quality opportunities.
█ KEY FEATURES
Adaptive Dynamic Zones
Rather than using fixed overbought/oversold levels, the indicator employs statistical methods to calculate adaptive zones that adjust to recent price behavior. These zones automatically widen during high volatility and tighten during consolidation, ensuring signals remain relevant across all market conditions.
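A minimal Pine sketch of the general idea behind such adaptive zones follows (the oscillator, lookback, and multiplier are illustrative assumptions, not the script's actual parameters):
//@version=5
indicator("Adaptive Zones (sketch)")
osc   = ta.rsi(close, 14)                       // stand-in for the script's proprietary oscillator
basis = ta.sma(osc, 100)
width = ta.stdev(osc, 100) * 1.5                // widens in volatile phases, tightens in quiet ones
plot(osc, "Oscillator")
plot(basis + width, "Adaptive overbought", color=color.red)
plot(basis - width, "Adaptive oversold", color=color.green)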
VWAP Mean Reversion Filter
This filter uses volume-weighted price analysis to identify when price has moved significantly away from fair value. The system calculates statistical deviation from VWAP and only permits:
- Long entries when price is substantially below VWAP (oversold relative)
- Short entries when price is substantially above VWAP (overbought relative)
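A minimal Pine sketch of such a gate, assuming a rolling z-score of price against the built-in session VWAP (the 1.5 threshold and 50-bar window are illustrative, not the script's actual values):
//@version=5
indicator("VWAP Z-Score Gate (sketch)", overlay=true)
threshold = input.float(1.5, "Z-score threshold")
dev       = close - ta.vwap                     // distance from volume-weighted fair value
z         = dev / ta.stdev(dev, 50)             // rolling z-score of that distance
longOK    = z < -threshold                      // price stretched well below VWAP
shortOK   = z > threshold                       // price stretched well above VWAP
plotshape(longOK, style=shape.triangleup, location=location.belowbar, color=color.green)
plotshape(shortOK, style=shape.triangledown, location=location.abovebar, color=color.red)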
Higher Timeframe Support/Resistance Filter
To avoid entries near major reversal zones, the indicator analyzes pivot highs and lows from a user-selected higher timeframe. The system maintains a database of recent support and resistance levels and blocks trades that would occur too close to these critical price levels. This prevents getting stopped out by predictable institutional activity at key levels.
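One way to sketch that kind of filter in Pine (the higher timeframe, pivot width, and proximity buffer are illustrative assumptions):
//@version=5
indicator("HTF Pivot Proximity Filter (sketch)", overlay=true)
htf    = input.timeframe("240", "Higher timeframe")
buffer = input.float(0.5, "Proximity buffer (%)") / 100
// Confirmed pivots from the higher timeframe (note the confirmation lag of 5 HTF bars)
phHTF  = request.security(syminfo.tickerid, htf, ta.pivothigh(high, 5, 5), lookahead=barmerge.lookahead_off)
plHTF  = request.security(syminfo.tickerid, htf, ta.pivotlow(low, 5, 5), lookahead=barmerge.lookahead_off)
var float lastRes = na
var float lastSup = na
if not na(phHTF)
    lastRes := phHTF
if not na(plHTF)
    lastSup := plHTF
nearRes = not na(lastRes) and math.abs(close - lastRes) / close < buffer
nearSup = not na(lastSup) and math.abs(close - lastSup) / close < buffer
bgcolor(nearRes or nearSup ? color.new(color.red, 85) : na)   // zone where entries would be blocked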
Divergence Detection
The indicator automatically identifies four types of divergences between price and the oscillator.
Risk Entry Signals
For aggressive traders, the indicator provides early warning signals that fire before the main entry triggers. These risk entries offer better entry prices but come with lower probability. They are visually distinct from standard entries and can be toggled on or off.
Safe Exit Zones
In addition to standard exit signals, the system identifies optimal profit-taking zones using statistical analysis and adaptive thresholds. These safe exit zones are highlighted with background coloring to alert traders when positions have reached favorable risk-reward levels.
Performance Statistics Panel
A comprehensive real-time statistics dashboard tracks:
- Total trades executed (long and short separately)
- Win rate percentages (overall, long-only, short-only)
- Profit factor calculation
- Total and average profit/loss per trade
- Largest winning and losing trades
- Maximum consecutive wins and losses
The panel can be positioned in any corner of the chart and updates automatically as trades close. Note that statistics represent theoretical performance based on signal timing and do not account for slippage, commissions, or execution delays.
Comprehensive Alert System
The indicator includes over 20 pre-configured alert types.
█ HOW TO USE
Initial Setup
1 — Select your preferred base strategy from the Signal Settings group. Strategy 1 is recommended for most traders as it provides a balanced approach suitable for various market conditions.
2 — Configure the VWAP filter threshold based on your trading style:
Lower thresholds (1.0–1.5) for more frequent entries
Higher thresholds (2.0+) for fewer but more extreme reversals
3 — Set the HTF S/R filter timeframe to approximately 4–6 times your chart timeframe. For example, use 4H pivots when trading on 1H charts.
Reading Signals
Entry signals appear as triangles at the oscillator level:
- Green upward triangles indicate long entries
- Red downward triangles indicate short entries
- Small circles mark early risk entries
Exit signals appear as opposite-colored triangles. Background shading indicates special conditions like safe exit zones or averaging opportunities.
Interpreting Statistics
Use the performance panel to gauge strategy effectiveness:
- Win rates above 50% indicate positive edge
- Profit factor above 1.5 suggests robust performance
- Review max consecutive losses for position sizing guidance
Remember that past theoretical performance does not guarantee future results.
█ NOTES
Timeframe Considerations
This indicator works on all timeframes but performs optimally on 15-minute to 4-hour charts. Very low timeframes (1m–5m) may produce excessive signals, while daily and weekly charts may produce insufficient signals for active trading.
Market Conditions
The adaptive nature of the indicator allows it to function in both trending and ranging markets. However, extremely choppy or low-liquidity conditions may reduce signal quality. The confluence scoring system helps filter these periods automatically.
VWAP Behavior
VWAP resets at session boundaries for traditional markets (stocks) but runs continuously for 24-hour markets (crypto, forex). The z-score filter accounts for this difference automatically.
HTF Pivot Lag
Higher timeframe pivots require confirmation bars before being identified, introducing slight lag. Pivots are detected retrospectively once the full pattern completes on the selected timeframe.
Performance Tracking Limitations
The statistics panel tracks theoretical entry at close of signal bar and exit at close of exit bar. Actual trading results will differ due to:
- Slippage and spread costs
- Commission and fees
- Execution timing and delays
- Partial fills or rejections
- Overnight holding costs
Use the statistics as a comparative tool for optimization rather than a profit predictor.
Filter Interactions
All filters work sequentially. A signal must pass the VWAP filter, then the S/R filter. If any filter rejects the signal, it will not appear on the chart. This hierarchical approach ensures only fully validated setups generate alerts.
Optimization Guidelines
If receiving too many signals, tighten filter thresholds. If receiving too few signals, relax filters. Monitor the statistics panel over at least 50 trades before making significant parameter adjustments.
Multi-Panel: Trade-Volatility-Probability [Loxx]
Multi-Panel: Trade-Volatility-Probability shows user selected and volatility-based price levels and probabilities on the chart. This is useful for both options and all styles of up/down trading methods that rely on volatility.
Trading Panel: Shows trading information to take profits and stop-loss based on multiples of volatility. Also shows equity inputs by the user to calculate optimal position size
Key things to note about the Trading Panel
-Trade side: Long or short. You change this to switch the take profit and SL levels displayed in the table, to be used with up/down trading styles that rely on volatility stops
-Account size: User enters total balance available for trade
-Risk: Total % of account size you're willing to lose should the SL be hit
-Position size: Size of the position given the SL and your preferred Risk
-Take profit/Stop loss levels: Based on multipliers selected by the user in settings. These shouldn't be changed unless you really know what you're doing with volatility stops
-Entry: Source price. Can be 1 of 37 different prices. See Loxx's Expanded Source Types:
Volatility Panel: Shows information about the volatility the user selected to be used to take profit/stop-loss/range calculations. Volatility types included are:
Close-to-Close
Close-to-Close volatility is a classic and most commonly used volatility measure, sometimes referred to as historical volatility .
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a bigger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility calculated using only stock's closing prices. It is the simplest volatility estimator. But in many cases, it is not precise enough. Stock prices could jump considerably during a trading session, and return to the open value at the end. That means that a big amount of price information is not taken into account by close-to-close volatility .
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
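A minimal Pine sketch of close-to-close volatility, annualized with an assumed 252 trading days and a 30-bar lookback:
//@version=5
indicator("Close-to-Close Volatility (sketch)")
len = input.int(30, "Lookback (bars)")
r   = math.log(close / close[1])                  // log return per bar
hv  = ta.stdev(r, len) * math.sqrt(252) * 100     // annualized, in percent (assumes daily bars)
plot(hv, "Close-to-close volatility %", color=color.blue)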
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close to close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than the close-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after market close. Hence it systematically undervalues volatility. That drawback is taken into account in the Garman-Klass's volatility estimator.
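A Pine sketch of the Parkinson estimator in its standard textbook form (the lookback and annualization factor are illustrative assumptions):
//@version=5
indicator("Parkinson Volatility (sketch)")
len  = input.int(30, "Lookback (bars)")
hlSq = math.pow(math.log(high / low), 2)          // squared log of the high/low range
park = math.sqrt(math.sum(hlSq, len) / (4 * math.log(2) * len)) * math.sqrt(252) * 100
plot(park, "Parkinson volatility %", color=color.purple)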
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
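A Pine sketch of the standard Garman-Klass form (the lookback and annualization factor are illustrative assumptions):
//@version=5
indicator("Garman-Klass Volatility (sketch)")
len  = input.int(30, "Lookback (bars)")
// 0.5*ln(H/L)^2 - (2*ln2 - 1)*ln(C/O)^2, averaged over the window
term = 0.5 * math.pow(math.log(high / low), 2) - (2 * math.log(2) - 1) * math.pow(math.log(close / open), 2)
gk   = math.sqrt(math.sum(term, len) / len) * math.sqrt(252) * 100
plot(gk, "Garman-Klass volatility %", color=color.teal)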
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates drift term (mean return not equal to zero). As a result, it provides a better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. It means an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchel formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
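A Pine sketch of the standard Rogers-Satchell form (the lookback and annualization factor are illustrative assumptions):
//@version=5
indicator("Rogers-Satchell Volatility (sketch)")
len  = input.int(30, "Lookback (bars)")
// ln(H/C)*ln(H/O) + ln(L/C)*ln(L/O), averaged over the window; robust to non-zero drift
term = math.log(high / close) * math.log(high / open) + math.log(low / close) * math.log(low / open)
rs   = math.sqrt(math.sum(term, len) / len) * math.sqrt(252) * 100
plot(rs, "Rogers-Satchell volatility %", color=color.olive)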
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight (close-to-open) volatility and a weighted average of the Rogers-Satchell volatility and the day’s open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
Garman-Klass-Yang-Zhang
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
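A minimal Pine sketch of an EWMA variance update (the 0.94 lambda is the common RiskMetrics-style default, used here as an illustrative assumption):
//@version=5
indicator("EWMA Volatility (sketch)")
lam = input.float(0.94, "Lambda", minval=0.5, maxval=0.999)
r2  = math.pow(math.log(close / close[1]), 2)                  // squared log return
var float ewmaVar = na
ewmaVar := na(ewmaVar) ? r2 : lam * ewmaVar + (1 - lam) * r2   // var_t = lambda*var_{t-1} + (1-lambda)*r2_t
plot(math.sqrt(ewmaVar) * math.sqrt(252) * 100, "EWMA volatility %", color=color.maroon)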
Standard Deviation of Log Returns
This is the simplest calculation of volatility . It's the standard deviation of ln(close/close(1))
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θavg(var; M) + (1 − θ)avg(var; N) = 2θvar/(M+1-(M-1)L) + 2(1-θ)var/(N+1-(N-1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1var - avg (var; N) against avg (var; M) - avg (var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Chi-squared Confidence Interval:
Confidence interval of volatility is calculated using an inverse CDF of a Chi-Squared Distribution. You can change the volatility input used to either realized, upper confidence interval, or lower confidence interval. This is included in case you'd like to see how far price can extend if volatility hits its upper or lower confidence levels. Generally, you'd just use realized volatility, so I wouldn't change this setting.
Inverse CDF of a Chi-Squared Distribution
The chi-square distribution is a one-parameter family of curves. The parameter ν is the degrees of freedom.
The icdf of the chi-square distribution is
x=F^−1(p∣ν) = {x:F(x∣ν) = p}
where
p = F(x∣ν) = ∫_0^x ( t^((ν-2)/2) * e^(-t/2) ) / ( 2^(ν/2) * Γ(ν/2) ) dt
ν is the degrees of freedom, and Γ( · ) is the Gamma function. The result p is the probability that a single observation from the chi-square distribution with ν degrees of freedom falls in the interval [0, x].
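For reference, the resulting two-sided confidence bounds on volatility take the standard form below (the exact implementation inside the script may differ); here s is the realized volatility estimate, n the number of return observations, and alpha = 1 - confidence level:
sigma_lower = s * sqrt( (n - 1) / chi2_icdf(1 - alpha/2, n - 1) )
sigma_upper = s * sqrt( (n - 1) / chi2_icdf(alpha/2, n - 1) )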
Additional notes on Volatility Panel
-Shows both current timeframe volatility per candle at whatever date backward you select
-Shows annualized volatility based on selected days per year and per-bar volatility; this is automatically calculated no matter the timeframe used. This means that it'll calculate annualized volatility for the current candle even on the 1 second timeframe. Days per year should be 252 for everything but cryptocurrency; however, for all types of tradable assets, anything over the 3 day timeframe will calculate on 365 days.
Probability Panel
This panel shows the probability levels of a user selected upper and lower price boundary. This includes the inside range of volatility between the lower and upper price levels and the outside probability below the lower price level and above the upper price level. These values are calculated using the CDF (cumulative density function) of a normal distribution. In simpler terms, CDF returns the area under a bell curve between two points, left and right, or for our purposes, high and low. This yields the probabilities you see in the Probability Panel. See the following graphic to visualize how this works:
The red line is the entry bar; the yellow line is the "mean" but in this case just the chosen source price.
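Under the log-normal price model used elsewhere in this description, that inside-range probability has a standard closed form (stated here for reference; the script's own calculation may differ in details):
P(lower < St < upper) = Phi(d(upper)) - Phi(d(lower)), where
d(K) = ( ln(K/S0) - (mu - sigma^2/2)*T ) / ( sigma*sqrt(T) )
and Phi is the standard normal CDF.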
Other things to know
You can turn on/off all labels and levels and fills
Probability Weighted Moving Average
The Probability Weighted Moving Average uses a log-normal (continuous) distribution to calculate the probabilities of a range of lengths MAs to assess their performances, and respectively assign weights to a mean of this range. This assumes that the values of the MAs (call it A) aren't normally distributed, but instead log(A) is normally distributed, which can be a fair assumption. In P(t, t+X) where X is the number of trades assessed, it assumes the probability is not dependent on t and independent of previous price.
For P(t, t+X), the higher the value of X, the more trades are assessed.
Range of lengths comes in slow (default) or fast. Faster MAs are not preferable and should be limited to HTFs.
The color code can be either weighted (where lighter shades of blue suggests faster values have more weight, and darker shades suggest slower values have more weight) or coded for bull/bear: green when bullish, red when bearish.
Variety Distribution Probability Cone [Loxx]
Variety Distribution Probability Cone forecasts price within a range of confidence using Geometric Brownian Motion (GBM) calculated using selected probability distribution, volatility, and drift. Below is a detailed explanation of the inner workings of the indicator and the math involved. While normally this indicator would be used by options traders, this can also be used by regular directional traders who wish to observe a forecast of the confidence interval of possible prices over time.
What is a Random Walk
A random walk is a path which consists of a set of random steps. The starting point is zero and following movement may be one step to the left or to the right with equal probability. In the random walk process, there is no observable trend or pattern which are followed by the objects that is the movements are completely random. That is why the prices of a stock as it moves up and down can be modeled by random a walk process.
Stock Prices and Geometric Brownian Motion
Brownian motion, as first conceived by the botanist Robert Brown (1827), is a mathematical model used to describe random movements of small particles in a fluid or gas. These random movements are observed in the stock markets where the prices move up and down, randomly; hence, Brownian motion is considered as a mathematical model for stock prices.
P(exp(lnS0 + (mu - 1/2*sigma^2)t - z(0.05)*sigma*t^0.5) <= St <= exp(lnS0 + (mu - 1/2*sigma^2)t + z(0.05)*sigma*t^0.5)) = 0.95
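A minimal Pine sketch of the same confidence bounds, using the mean log return as the drift (so the -sigma^2/2 adjustment is already absorbed); the lookback, horizon, and z-score inputs are illustrative assumptions:
//@version=5
indicator("GBM Probability Cone Bounds (sketch)", overlay=true)
len     = input.int(252, "Drift/volatility lookback (bars)")
horizon = input.int(30, "Forecast horizon (bars)")
z       = input.float(1.96, "Z-score (95% two-sided)")
r       = math.log(close / close[1])
mu      = ta.sma(r, len)                      // mean log return per bar (log drift)
sigma   = ta.stdev(r, len)                    // volatility per bar
// Price bounds expected `horizon` bars ahead of each bar, at the chosen confidence
upper   = close * math.exp(mu * horizon + z * sigma * math.sqrt(horizon))
lower   = close * math.exp(mu * horizon - z * sigma * math.sqrt(horizon))
plot(upper, "Upper bound (horizon ahead)", color=color.green)
plot(lower, "Lower bound (horizon ahead)", color=color.red)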
Probability Distributions
Typically the normal distribution is used, but for our purposes here we extend this to Student t-distribution, Cauchy, Gaussian KDE, and Laplace
Student's t-Distribution
The probability density function of the Student’s t distribution is given by
g(x) = Γ((v+1)/2) / ( Γ(v/2) * sqrt(v*pi) ) * (1 + x^2/v)^(-(v+1)/2)
with v degrees of freedom and v >= 0, denoted by X ~ t(v). The mean is 0 and the variance is v/(v-2). It is known that as v tends to infinity, the Student’s t-distribution tends to a standard normal probability density function, which has a variance of one. Blattberg and Gonedes were the first to propose that stock returns could be modeled by this distribution. (Blattberg and Gonedes, 1974) Platen and Sidorowicz later reaffirmed these findings.(Platen and Rendek, 2007) Finally, Cassidy, Hamp, and Ouyed used these findings to derive the Gosset formula, which is the Student t version of the Black-Scholes model.(Cassidy et al., 2010) They found that v = 2.65 provides the best fit when looking at the past 100 years of returns. They realized that as markets become more turbulent, the degrees of freedom should be adjusted to a smaller value.(Cassidy et al., 2010)
Cauchy Distribution
The probability density function of the Cauchy distribution is given by
f(x) = 1 / (theta*pi*(1 + ((x-n)/theta)^2))
where n is the location parameter and theta is the scale parameter, for -infinity < x < infinity and is denoted by X ~ CAU(L,v). This model is similar to the normal distribution in that it is symmetric about zero, but the tails are fatter. This would mean that the probability of an extreme event occurring lies far out in the distributions tail. Using a crude example, if the normal distribution gave a probability of an extreme event occurring of 0.05% and the “best case” scenario of this event occurring 300 years, then using the Cauchy distribution one would find that the probability of occurring would be around 5% and now the “best case” scenario might have been reduced to only 63 years. Thus giving extreme events more of a likelihood of occurring. The mean, variance, and higher order moments are not defined (they are infinite); this implies that n and theta cannot be related to a mean and standard deviation. The Cauchy distribution is related to the Student’s t distribution T ~ CAU(1,0) when v = 1. In 1963, Benoit Mandelbrot was the first to suggest that stock returns follow a stable distribution, in particular, the Cauchy distribution.(Mandelbrot, 1963) His work was validated by Eugene Fama in 1965.(Fama, 1965) Recent research by Nassim Taleb came to the same conclusion as Mandelbrot, saying that stock returns follow a Cauchy distribution, as reported in his New York Times best-seller book “The Black Swan”.(Taleb, 2010)
Laplace Distribution
In probability theory and statistics, the Laplace distribution is a continuous probability distribution named after Pierre-Simon Laplace. It is also sometimes called the double exponential distribution, because it can be thought of as two exponential distributions (with an additional location parameter) spliced together along the abscissa, although the term is also sometimes used to refer to the Gumbel distribution. The difference between two independent identically distributed exponential random variables is governed by a Laplace distribution, as is a Brownian motion evaluated at an exponentially distributed random time. Increments of Laplace motion or a variance gamma process evaluated over the time scale also have a Laplace distribution.
The probability density function of the Laplace distribution is given by
f(x) = 1/(2b) * exp(-|x-µ|/b)
Here, µ is a location parameter and b > 0, which is sometimes referred to as the "diversity", is a scale parameter. If µ = 0 and b=1, the positive half-line is exactly an exponential distribution scaled by 1/2.
The probability density function of the Laplace distribution is also reminiscent of the normal distribution; however, whereas the normal distribution is expressed in terms of the squared difference from the mean µ, the Laplace density is expressed in terms of the absolute difference from the mean. Consequently, the Laplace distribution has fatter tails than the normal distribution.
Gaussian Kernel Density Estimation
In statistics, kernel density estimation (KDE) is the application of kernel smoothing for probability density estimation, i.e., a non-parametric method to estimate the probability density function of a random variable based on kernels as weights. KDE is a fundamental data smoothing problem where inferences about the population are made, based on a finite data sample. In some fields such as signal processing and econometrics it is also termed the Parzen–Rosenblatt window method, after Emanuel Parzen and Murray Rosenblatt, who are usually credited with independently creating it in its current form. One of the famous applications of kernel density estimation is in estimating the class-conditional marginal densities of data when using a naive Bayes classifier, which can improve its prediction accuracy.
Let (x1, x2, ..., xn) be independent and identically distributed samples drawn from some univariate distribution with an unknown density f at any given point x. We are interested in estimating the shape of this function f. Its kernel density estimator is:
f(x) = 1/(n*h) * sum(K((x - xi)/h), i = 1..n)
where K is the kernel—a non-negative function—and h > 0 is a smoothing parameter called the bandwidth. A kernel with subscript h is called the scaled kernel and defined as Kh(x) = 1/h K(x/h). Intuitively one wants to choose h as small as the data will allow; however, there is always a trade-off between the bias of the estimator and its variance.
The probability density function of Gaussian Kernel Density Estimation is given by
f(x) = 1 / (v * 2*pi)^0.5 * exp(-(x - m)^2 / (2 * v))
where v is the bandwidth component h squared
KDE Bandwidth Estimation
Bandwidth selection strongly influences the estimate obtained from the KDE (much more so than the actual shape of the kernel). Bandwidth selection can be done by a "rule of thumb", by cross-validation, by "plug-in methods" or by other means. The default is Scott's Rule.
Scott's Rule
n ^ (-1/(d+4))
with n the number of data points and d the number of dimensions.
In the case of unequally weighted points, this becomes
neff^(-1/(d+4))
with neff the effective number of datapoints.
Silverman's Rule
(n * (d + 2) / 4)^(-1 / (d + 4))
or in the case of unequally weighted points:
(neff * (d + 2) / 4)^(-1 / (d + 4))
With a set of weighted samples, the effective number of datapoints neff
is defined by:
neff = sum(weights)^2 / sum(weights^2)
Manual input
You can provide your own bandwidth input. This is useful for those who wish to run external to TradingView Grid Search Machine Learning algorithms to solve for the bandwidth per ticker.
Inverse CDF of KDE Calculation
1. Create an array of random normalized numbers, using an inverse CDF of a normal distribution with mean zero and standard deviation one
2. Create a line space range of values -3 to 3
3. Create a Gaussian Kernel Density Estimate CDF by iterating over the line space array created in step 2. For each line space item, find the mean difference between the line space and the random variable divided by the bandwidth.
4. Derive test statistics from the resulting KDE inverse CDF, we use cubic spline interpolation to solve for line space value for a given alpha computed using the user selected probability percent value in the settings.
Volatility
Close-to-Close
Close-to-Close volatility is a classic and most commonly used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a bigger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility calculated using only stock's closing prices. It is the simplest volatility estimator. But in many cases, it is not precise enough. Stock prices could jump considerably during a trading session, and return to the open value at the end. That means that a big amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close to close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than the close-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after market close. Hence it systematically undervalues volatility. That drawback is taken into account in the Garman-Klass's volatility estimator.
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates drift term (mean return not equal to zero). As a result, it provides a better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. It means an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchel formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight (close-to-open) volatility and a weighted average of the Rogers-Satchell volatility and the day’s open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
Garman-Klass-Yang-Zhang
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
Standard Deviation of Log Returns
This is the simplest calculation of volatility. It's the standard deviation of ln(close/close(1))
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θavg(var; M) + (1 − θ)avg(var; N) = 2θvar/(M+1-(M-1)L) + 2(1-θ)var/(N+1-(N-1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1var - avg(var; N) against avg(var; M) - avg(var; N) and using the resulting beta estimate as θ.
Manual
User input % value
Drift
Cost of Equity / Required Rate of Return (CAPM)
Standard Capital Asset Pricing Model used to solve for Cost of Equity of Required Rate of Return. Due to the processor overhead required to compute CAPM, the user must plug in values for beta, alpha, and expected market return using Loxx's CAPM indicator series. Used for stocks.
Mean of Log Returns
Average of the log returns for the underlying ticker over the user selected period of evaluation. General purpose use.
Risk-free Rate (r)
10, 20, or 30 year bond yields for the user selected currency. Under equilibrium the drift of the empirical GBM must be the risk-free rate. If the price process is a GBM under the empirical measure, then a consequence of viability is that it is also a GBM under an equivalent (risk-neutral) measure.
Risk-free Rate adjusted for Dividends (r-q)
This is the Risk-free Rate minus the Dividend Yield.
Forex (r-rf)
This is derived from the Garman and Kohlhagen (1983) modified Black-Scholes model, which can be used to price European currency options. This is simply the difference between the domestic risk-free rate and the foreign currency's risk-free rate (r - rf). This is used for Forex pricing.
Martingale (0)
When the drift parameter is 0, geometric Brownian motion is a martingale. In probability theory, a martingale is a sequence of random variables (i.e., a stochastic process) for which, at a particular time, the conditional expectation of the next value in the sequence is equal to the present value, regardless of all prior values. Typically used for futures or margined futures.
Manual
User input % value
Additional notes
Indicator can be used on any timeframe. The T (time) variable used to annualize volatility and inside the GBM formula is automatically calculated based on the timeframe of the chart.
Confidence interval of volatility is calculated using an inverse CDF of a Chi-Squared Distribution. You can change the volatility input used to create the probability cones from realized volatility to upper or lower confidence levels of volatility to better visualize extremes of range. Generally, you'd stick with realized volatility.
Days per year should be 252 for everything but Cryptocurrency. These are trading days per year. Maximum future forecast bars is 365. Forecast bars are limited to the maximum of selected days per year.
Includes the ability to overlay option expiration dates by bars to see the range of prices for that date at that bar
You can select confidence % you wish for both the cone in general and the volatility. There are three levels for the cones, this will show on the three different levels up and down on the chart.
The table on the right displays important calculated values so you don't have to remember what they are or what settings you selected
All values are annualized no matter the timeframe.
Additional distributions and measures of volatility and drift will be added in future releases.
Bayesian BBSMA + nQQE Oscillator + Bank funds (whales detector)
Three trend indicators in one. Fork of Gunslinger2005's indicator, with a fix to display the nQQE oscillator correctly and clearly, and converted to Pine Script v5 (allowing you to set a different timeframe and gaps).
How to use: Essentially, nQQE is a long term trend indicator which is more adequate in daily or weekly timeframe to indicate the current market cycle. Banker Fund seems better suited to indicate current local trend, although it is sensitive to relief rallies. Bayesian BBSMA is an awesome tool to visualize the buildup in bullish/bearish sentiment, and when it is more likely to get released, however it is unreliable, so it needs to be combined with other indicators.
Please show the original indicators some love:
Bayesian BBSMA:
nQQE:
L3 Banker Fund Flow Trend:
Originally mixed together by Gunslinger2005:
Probability Cloud BASIC [@AndorraInvestor]🔮☁️
This is the BASIC version of the PROBABILITY CLOUD indicator.
It is an evolution beyond traditional standard deviation probabilistic indicators only using bands or channels.
The new PROBABILITY CLOUD graphic representation with customizable transparent layers is based on -2 / +2 standard deviation calculated using 20 fixed predetermined time periods, and is available in several calculation MODES:
SMA, EMA, WMA, VWMA & VAWMA
The indicator is designed to let the trader visually understand the probabilistic depth of past, present and future price action, and its evolution over time.
Looking forward to your comments and feedback to guide me on future updates!
🙏 Big THANKS @Electrified for letting me use his work on Deviation Bands/ as a starting point for my first script.
Breakout Probability (Expo)
█ Overview
Breakout Probability is a valuable indicator that calculates the probability of a new high or low and displays it as a level with its percentage. The probability of a new high and low is backtested, and the results are shown in a table— a simple way to understand the next candle's likelihood of a new high or low. In addition, the indicator displays an additional four levels above and under the candle with the probability of hitting these levels.
The indicator helps traders to understand the likelihood of the next candle's direction, which can be used to set your trading bias.
█ Calculations
The algorithm calculates all the green and red candles separately depending on whether the previous candle was red or green and assigns scores if one or more lines were reached. The algorithm then calculates how many candles reached those levels in history and displays it as a percentage value on each line.
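A stripped-down Pine sketch of this counting logic for one case (probability of a new high when the previous candle was green; the actual script extends this to red candles and to multiple percentage levels):
//@version=5
indicator("Conditional New-High Probability (sketch)")
prevGreen = close[1] > open[1]
newHigh   = high > high[1]
var int greenCases = 0
var int greenHits  = 0
if barstate.isconfirmed and prevGreen
    greenCases += 1
    if newHigh
        greenHits += 1
probAfterGreen = greenCases > 0 ? 100.0 * greenHits / greenCases : na
plot(probAfterGreen, "P(new high | previous candle green) %", color=color.green)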
█ Example
In this example, the previous candlestick was green; we can see that a new high has been hit 72.82% of the time and the low only 28.29%. In this case, a new high was made.
█ Settings
Percentage Step
The space between the levels can be adjusted with a percentage step. 1% means that each level is located 1% above/under the previous one.
Disable 0.00% values
If a level got a 0% likelihood of being hit, the level is not displayed as default. Enable the option if you want to see all levels regardless of their values.
Number of Lines
Set the number of levels you want to display.
Show Statistic Panel
Enable this option if you want to display the backtest statistics for whether a new high or low is made (based only on whether the first levels have been reached or not).
█ Any Alert function call
An alert is sent on candle open, and you can select what should be included in the alert. You can enable the following options:
Ticker ID
Bias
Probability percentage
The first level high and low price
█ How to use
This indicator is a perfect tool for anyone that wants to understand the probability of a breakout and the likelihood that set levels are hit.
The indicator can be used for setting a stop loss based on where the price is most likely not to reach.
The indicator can help traders to set their bias based on probability. For example, look at the daily or a higher timeframe to get your trading bias, then go to a lower timeframe and look for setups in that direction.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Lune Market Analysis Premium
- Version 0.9 -
Lune Algo was developed and built by Lune Trading, utilizing years of their trading expertise. This indicator works on all stocks, cryptos, indices, forex, futures , currencies, ETF's, energy and commodities. All the tools and features you need to assist you on your trading journey. Best of all, Lune Algo is easy to use and many of our tools and strategies have been thoroughly backtested thousands of times to ensure that users have the best experience possible.
Overview
Trade Dashboard—Provides information about the current market conditions, Such as if the market is trending up or down, how much volatility is in the market and even displays information about the current signal.
Trade Statistics—This tool gives you a breakdown of the Statistics of the current selected strategy based on backtests. It tells you the percentage of how often a Take Profit or Stop Loss was hit within a specific time period. Risk and Trade management is very important in trading, and can be the difference between a winning and losing strategy. So we believe that this was mandatory.
Current Features:
Advanced Buy and Sell Signals
Exclusive built-in Strategies
Lune Confidence AI
EK Clouds
Reversal Bands
Vray (Volume Ray)
Divergence Signals
Reversal Signals
Support/Resistance Zones
Built-in Themes
Built-in Risk Management system (take profit/stop loss)
Trade Statistics
Trade Assistance
Trade Dashboard
Advanced Settings
+ More coming soon, Big plans!
Features Breakdown:
Lune Confirmation—Used to help you confirm your trades and trend direction. It uses unique calculations, and its settings can be adjusted to allow traders to adapt the settings to fit their trading style.
Lune Confidence AI—All strategies are equipped with our exclusive built-in Confidence AI. This feature tells you how much confluence there is in a trade. It uses a rating system where signals are given a number from 0 to 5. A rating of 0 indicates that there is not a lot of confluence or confidence in the signal, while a rating of 5 indicates that there is a lot of confidence in the trade. This feature is not perfect and will be improved overtime.
Support/Resistance Zones—Calculates the most important support/resistance levels based on how many times a level has been used as support or resistance. Traders also refer to these as supply and demand zones and key levels.
EK Clouds—Used to further help you confirm trend and was optimized to also be used as support and resistance. This feature is powered by custom moving averages.
Reversal Bands—An optimized and improved version of the well-known Bollinger Bands. When price action takes place within the Reversal Bands it usually indicates that the current symbol is overextended and a reversal is possible.
Vray—Also Known as "Volume Ray", Assists you in better visualizing volume. This helps you find key levels and areas of support that you wouldn't be able to see otherwise. It helps you trade like the institutions.
This indicator's signals DO NOT REPAINT.
If you are using this script you acknowledge past performance is not necessarily indicative of future results and there are many more factors that go into being a profitable trader.
season
This script is meant to help verify the existence of a seasonal effect in asset returns, using a Z-test. There are three steps:
1. Think of a way to identify a season. The available methods are: by month, by week of the year, by day of the month, by day of the week, by hour of the day, and by minute of the hour.
2. Set the chart to the unit of your season. For example, if you want to check whether a crop commodity's harvest season has a seasonal implication, select "month". If you want to investigate the exchange's opening or close, select "hour".
3. Using the inputs, select the unit (e.g. "month", "dayofweek", "hour", etc.) and the range that identifies the season. The example natural gas chart has set "start" to 8 and "end" to 12 for September through December.
The test logic is as follows:
The "season" you select has a fixed length; for example, months eight through twelve has a length of four. This length is used to compute a sample mean, which is the mean return of all September-December periods in the chart. It is also used to calculate the mean/stdev of every other four-month period in the chart history. The latter is considered the "population." Using a Z-test, the script scores the difference between the sample returns and the population returns, and displays the results at two levels of significance (P = 0.05 and P = 0.01). The null hypothesis is "there is no difference between the seasonal periods and the population of ordinary periods". If the Z-score is sufficiently large or small, we can reject the null hypothesis and say that there is a seasonal effect at the given level of confidence. The output table will show green for a rejection of the null hypothesis (meaning there is a seasonal effect) or red of acceptance (there is no seasonal effect).
The seasonal periods that you have defined will be highlighted on the chart, so you can make sure they are correct. Additionally, the output table shows the mean, median, standard deviation, and top and bottom percentiles for both the seasonal and population samples.
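For readers who want to see the comparison in code, here is a minimal Pine sketch of the Z-score computation described above, simplified to per-bar returns and month-based seasons; all names and inputs are hypothetical and this is not the script's actual source:
//@version=5
indicator("Seasonal Z-score sketch")
// Hypothetical inputs: season defined by a start and end month (inclusive).
startMonth = input.int(9, "Season start month", minval=1, maxval=12)
endMonth   = input.int(12, "Season end month", minval=1, maxval=12)
inSeason   = month >= startMonth and month <= endMonth
ret        = close / close[1] - 1
// Pool per-bar returns into a seasonal sample and a "population" of all other bars.
var float[] seasonRet = array.new_float()
var float[] popRet    = array.new_float()
if not na(ret)
    if inSeason
        array.push(seasonRet, ret)
    else
        array.push(popRet, ret)
// Z = (sample mean - population mean) / (population stdev / sqrt(sample size))
float z = na
if array.size(seasonRet) > 1 and array.size(popRet) > 1
    z := (array.avg(seasonRet) - array.avg(popRet)) / (array.stdev(popRet) / math.sqrt(array.size(seasonRet)))
plot(z, "Z-score")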
Many news sites, Twitter feeds, influencers, etc. enjoy posting statistics about past returns, like "the stock market has gone up on this day 85 out of the past 100 years" and so on. Unfortunately, these posts don't tell you that many of these statistics are meaningless, as even totally random price fluctuations will cause many such interesting figures to occur. This script provides a limited means of testing some such seasonal effects so you can see if they are probably just random, or if they may have some meaning.
Note that TradingView seems to use 1-based indexing for daily or higher timeframes, and 0-based indexing for intraday timeframes:
Months: 1-12
Weeks: 1-52
Days (of month): 1-31
Days (of week): 1-7
Hours (of day): 0-23
Minutes (of hour): 0-59
MathProbabilityDistribution
Library "MathProbabilityDistribution"
Probability Distribution Functions.
name(idx) Indexed names helper function.
Parameters:
idx : int, position in the range (0, 6).
Returns: string, distribution name.
usage:
.name(1)
Notes:
(0) => 'StdNormal'
(1) => 'Normal'
(2) => 'Skew Normal'
(3) => 'Student T'
(4) => 'Skew Student T'
(5) => 'GED'
(6) => 'Skew GED'
zscore(position, mean, deviation) Z-score helper function for x calculation.
Parameters:
position : float, position.
mean : float, mean.
deviation : float, standard deviation.
Returns: float, z-score.
usage:
.zscore(1.5, 2.0, 1.0)
std_normal(position) Standard Normal Distribution.
Parameters:
position : float, position.
Returns: float, probability density.
usage:
.std_normal(0.6)
normal(position, mean, scale) Normal Distribution.
Parameters:
position : float, position in the distribution.
mean : float, mean of the distribution, default=0.0 for standard distribution.
scale : float, scale of the distribution, default=1.0 for standard distribution.
Returns: float, probability density.
usage:
.normal(0.6)
skew_normal(position, skew, mean, scale) Skew Normal Distribution.
Parameters:
position : float, position in the distribution.
skew : float, skewness of the distribution.
mean : float, mean of the distribution, default=0.0 for standard distribution.
scale : float, scale of the distribution, default=1.0 for standard distribution.
Returns: float, probability density.
usage:
.skew_normal(0.8, -2.0)
ged(position, shape, mean, scale) Generalized Error Distribution.
Parameters:
position : float, position.
shape : float, shape.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.ged(0.8, -2.0)
skew_ged(position, shape, skew, mean, scale) Skew Generalized Error Distribution.
Parameters:
position : float, position.
shape : float, shape.
skew : float, skew.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.skew_ged(0.8, 2.0, 1.0)
student_t(position, shape, mean, scale) Student-T Distribution.
Parameters:
position : float, position.
shape : float, shape.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.student_t(0.8, 2.0, 1.0)
skew_student_t(position, shape, skew, mean, scale) Skew Student-T Distribution.
Parameters:
position : float, position.
shape : float, shape.
skew : float, skew.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.skew_student_t(0.8, 2.0, 1.0)
select(distribution, position, mean, scale, shape, skew, log) Conditional Distribution.
Parameters:
distribution : string, distribution name.
position : float, position.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
shape : float, shape.
skew : float, skew.
log : bool, if true apply log() to the result.
Returns: float, probability.
usage:
.select('StdNormal', __CYCLE4F__, log=true)
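As a point of reference for the density functions listed above, here is a minimal Pine sketch of the textbook standard normal density, which is presumably what a call like .std_normal(0.6) evaluates; it is illustrative only, not the library's code:
//@version=5
indicator("Standard normal PDF sketch")
// f(x) = exp(-x^2 / 2) / sqrt(2 * pi)
f_std_normal(x) =>
    math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
plot(f_std_normal(0.6))   // approximately 0.3332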
Probability Cones
A probability cone is an indicator that forecasts a statistical distribution from a set point in time into the future.
Features
Forecast a Standard or Laplace distribution.
Change how many bars the cones look back on and sample in their calculations.
Set how many bars to forecast the cones.
Let the cones follow price from a set number of bars back.
Anchor the cones and they will not update from their last location.
Show or hide any set of cones.
Change the deviation used of any cone's upper or lower line.
Change any line's color, style, or width.
Change or toggle the fill colors between any two cone lines.
Basic Interpretations
First, there is an assumption that the distribution starting from the cone's origin, based on the number of historical bars sampled, is likely to represent the distribution of future price (a minimal sketch of this projection follows the list below).
Price typically hangs around the mean.
About 68% of price stays within the first deviation cones.
About 95% of price stays within the second deviation cones.
About 99.7% of price stays within the third deviation cones.
When price is between the first and second deviation cones, there is a higher probability for a reversal.
However, strong momentum while above or below the first deviation can indicate a trend where price maintains itself past the first deviation. For this reason it's recommended to use a momentum indicator alongside the cones.
There is no mean reversion assumption when price deviates. Price can continue to stay deviated.
It's recommended that the cones are placed at the beginning of calendar periods, such as the month, week, or day.
Be mindful when using the cones on different timeframes: the lookback setting selects the number of bars back to sample from the cone's origin, so the same setting covers a different span of time on each timeframe.
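The sketch referenced above: a simplified, hypothetical deviation cone that projects the mean of log returns plus or minus k standard deviations, scaled by the square root of the forecast horizon, from the latest close. It only illustrates the geometry the interpretations rely on, not the indicator's actual calculations:
//@version=5
indicator("Probability cone sketch", overlay=true)
lookback = input.int(100, "Lookback bars")
forecast = input.int(20, "Forecast bars")
ret   = math.log(close / close[1])
mu    = ta.sma(ret, lookback)
sigma = ta.stdev(ret, lookback)
// Project the 1st and 2nd deviation boundaries `forecast` bars ahead of the last bar.
if barstate.islast
    for k = 1 to 2
        up = close * math.exp(forecast * mu + k * sigma * math.sqrt(forecast))
        dn = close * math.exp(forecast * mu - k * sigma * math.sqrt(forecast))
        line.new(bar_index, close, bar_index + forecast, up, color = color.teal)
        line.new(bar_index, close, bar_index + forecast, dn, color = color.teal)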
Second Deviation Strategy
How to react when price goes beyond the second deviation is contingent on your trading position.
If you are holding a losing trade and price has moved past the second deviation, it could be time to stop trading and exit.
If you are holding a winning trade and price has moved past the second deviation, it would be best to look at exit strategies to capitalize on the outperformance.
If price has moved beyond the second deviation and you hold no position, then do not open any new trades.
probability_of_touch
Based on historical data (rather than theory), calculates the probability of a price level being "touched" within a given time frame. A "touch" means that price exceeded that level at some point. The parameters are:
- level: the "level" to be touched. it can be a number of points, percentage points, or standard deviations away from the mark price. a positive level is above the mark price, and a negative level is below the mark price.
- type: determines the meaning of the "level" parameter. "price" means price points (i.e. the numbers you see on the chart). "percentage" is expressed as a whole number, not a fraction. "stdev" means number of standard deviations, which is computed from recent realized volatility.
- mark: the point from which the "level" is measured.
- length: the number of days within which the level must be touched.
- window: the number of days used to compute realized volatility. this parameter is only used when "type" is "stdev".
- debug: displays a fuchsia "X" over periods that touched the level. note that only a limited number of labels can be drawn.
- start: only include data after this time in the calculation.
- end: only include data before this time in the calculation.
Example: You want to know how many times Apple stock fell $1 from its closing price the next day, between 2020-02-26 and today. Use the following parameters:
level: -1
type: price
mark: close
length: 1
window:
debug:
start: 2020-02-26
end:
How does the script work? On every bar, the script looks back "length" days and checks whether any day exceeded the "mark" price from "length" days ago plus the level. The probability is the ratio of periods in which price exceeded the level to the total number of periods.
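A minimal Pine sketch of that counting logic, assuming the simplest case of a price-point level measured from the close "length" bars ago; the input names are hypothetical and the real script exposes more options:
//@version=5
indicator("Touch probability sketch")
level  = input.float(-1.0, "Level (price points from the mark)")
length = input.int(1, "Bars within which the level must be touched")
mark   = close[length]   // the mark price, `length` bars ago
touched = level > 0 ? ta.highest(high, length) >= mark + level : ta.lowest(low, length) <= mark + level
// Ratio of look-back windows that touched the level to all windows observed.
var int touches = 0
var int periods = 0
if not na(mark)
    periods += 1
    if touched
        touches += 1
plot(periods > 0 ? 100.0 * touches / periods : na, "Touch probability (%)")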
NEXT Regressive VWAP
Overview:
This version of the Volume-Weighted Average Price (VWAP) indicator features an extended algorithm, which, in addition to volume and price, also incorporates regression analysis. The result is a more responsive, often leading VWAP slope with a degree of statistical predictability built in. Just like with the original VWAP, NEXT Regressive VWAP offers two optional Standard Deviation bands that parallel it. These can be set to any deviation level, with the default being 1 and -1, indicating one standard deviation above and one below Regressive VWAP, respectively.
Below is a screenshot comparing NEXT Regressive VWAP (green) to the original VWAP (blue) on CME_MINI:ES1! M3 chart.
Application and Strategy Ideas:
Price above NEXT Regressive VWAP is interpreted to have a bullish bias, and below, bearish. You can use TradingView's native Set Alert functionality to be notified, in real-time, when price crosses Regressive VWAP, and/or any of its standard deviation bands. Another popular "probability play" strategy is to scalp price when it crosses under the upper band (short) and crosses over the lower band (long). The screenshot below visualizes such a strategy on NASDAQ:QQQ M1 chart:
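To make the idea concrete, here is a generic Pine sketch of that band-cross setup using a plain session VWAP and a simple deviation band as stand-ins; the proprietary regression weighting is not reproduced here, and every name below is hypothetical:
//@version=5
indicator("Band-cross scalp sketch", overlay=true)
src      = hlc3
vwapLine = ta.vwap(src)
dev      = ta.stdev(src - vwapLine, 100)
upper    = vwapLine + dev
lower    = vwapLine - dev
// Short when price crosses back under the upper band, long when it crosses back over the lower band.
shortSignal = ta.crossunder(close, upper)
longSignal  = ta.crossover(close, lower)
plot(vwapLine, "VWAP")
plot(upper, "Upper band")
plot(lower, "Lower band")
plotshape(shortSignal, style=shape.triangledown, location=location.abovebar, color=color.red)
plotshape(longSignal, style=shape.triangleup, location=location.belowbar, color=color.green)
alertcondition(shortSignal or longSignal, "Band cross", "Price crossed a deviation band")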
Input Parameters:
There are 3 groups of input.
Regression Settings
Length - controls the length of time (in bars) for regression analysis with higher values yielding smoother, more responsive values.
Regression Weighting - controls the degree of regression analysis incorporated into VWAP, with 5 being average, 0-4 less, 6-10 more. The higher the value, the more responsive the Regressive VWAP curve.
VWAP Settings
Anchor Period - controls the origin of VWAP calculations, start of session being the default.
Source - data used for calculating the VWAP, typically HLC/3, but can be used with other price formats and data sources as well.
Offset - shifting of the VWAP line forward (+) or backward (-).
Standard Deviation Bands Settings
Calculate Bands - checking this will add 2 bands, each equidistant (by the amount of Multiplier) from the NEXT Regressive VWAP line.
Bands Multiplier - standard deviation multiplier, with 1 being the default
Signals and Alerts:
Here is how to set price (close) crossing NEXT Regressive VWAP alerts: open a chart, attach NEXT Regressive VWAP, and right-click on chart -> Add Alert. Condition: Symbol e.g. ES (close) >> Crossing >> Regressive VWAP >> VWAP >> Once Per Bar Close.
FunctionProbabilityDistributionSampling
Library "FunctionProbabilityDistributionSampling"
Methods for probability distribution sampling selection.
sample(probabilities) Computes a randomly selected index from a probability distribution.
Parameters:
probabilities : float array, probabilities of sample.
Returns: int.
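A common way such a sample() helper is implemented is the cumulative-sum (inverse-CDF) method; the minimal Pine sketch below illustrates the idea and is not the library's actual code:
//@version=5
indicator("Weighted sampling sketch")
// Draw a uniform random number and walk the cumulative probabilities until it is exceeded.
f_sample(probabilities) =>
    float r = math.random(0.0, 1.0)
    float cumulative = 0.0
    int selected = array.size(probabilities) - 1
    for i = 0 to array.size(probabilities) - 1
        cumulative += array.get(probabilities, i)
        if r <= cumulative
            selected := i
            break
    selected
var float[] probs = array.from(0.2, 0.5, 0.3)   // hypothetical distribution
plot(f_sample(probs), "Sampled index")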
FunctionSMCMC
Library "FunctionSMCMC"
Methods to implement Markov Chain Monte Carlo Simulation (MCMC)
markov_chain(weights, actions, target_path, position, last_value) a basic implementation of the markov chain algorithm
Parameters:
weights : float array, weights of the Markov Chain.
actions : float array, actions of the Markov Chain.
target_path : float array, target path array.
position : int, index of the path.
last_value : float, base value to increment.
Returns: void, updates target array
mcmc(weights, actions, start_value, n_iterations) uses a monte carlo algorithm to simulate a markov chain at each step.
Parameters:
weights : float array, weights of the Markov Chain.
actions : float array, actions of the Markov Chain.
start_value : float, base value to start simulation.
n_iterations : integer, number of iterations to run.
Returns: float array with path.
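As a sketch of the idea, the snippet below simulates one step of a weighted Markov chain on each bar: a transition is drawn at random according to its weight and the matching action (a value increment) is applied to the running path. Names and numbers are hypothetical, not the library's code:
//@version=5
indicator("Markov chain path sketch")
f_step(weights, actions, last_value) =>
    float r = math.random(0.0, array.sum(weights))
    float cumulative = 0.0
    float result = last_value
    for i = 0 to array.size(weights) - 1
        cumulative += array.get(weights, i)
        if r <= cumulative
            result := last_value + array.get(actions, i)
            break
    result
var float[] w = array.from(0.5, 0.5)    // transition weights
var float[] a = array.from(1.0, -1.0)   // actions: move up or down by 1
var float path = 100.0                  // start value
path := f_step(w, a, path)
plot(path, "Simulated path")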
Probability
Library "Probability"
erf(value) Complementary error function
Parameters:
value : float, value to test.
Returns: float
ierf_mcgiles(value) Computes the inverse error function using the Mc Giles method, sacrifices accuracy for speed.
Parameters:
value : float, -1.0 >= _value >= 1.0 range, value to test.
Returns: float
ierf_double(value) computes the inverse error function using the Newton method with double refinement.
Parameters:
value : float, -1. > _value > 1. range, _value to test.
Returns: float
ierf(value) computes the inverse error function using the Newton method.
Parameters:
value : float, -1. > _value > 1. range, _value to test.
Returns: float
complement(probability) probability that the event will not occur.
Parameters:
probability : float, 0 >=_p >= 1, probability of event.
Returns: float
entropy_gini_impurity_single(probability) Gini Imbalance or Gini index for a given probability.
Parameters:
probability : float, 0>=x>=1, probability of event.
Returns: float
entropy_gini_impurity(events) Gini Imbalance or Gini index for a series of events.
Parameters:
events : float , 0>=x>=1, array with event probability's.
Returns: float
entropy_shannon_single(probability) Entropy information value of the probability of a single event.
Parameters:
probability : float, 0>=x>=1, probability value.
Returns: float, value as bits of information.
entropy_shannon(events) Entropy information value of a distribution of events.
Parameters:
events : float , 0>=x>=1, array with probability's.
Returns: float
inequality_chebyshev(n_stdeviations) Calculates Chebyshev Inequality.
Parameters:
n_stdeviations : float, positive, greater than or equal to 1.0
Returns: float
inequality_chebyshev_distribution(mean, std) Calculates Chebyshev Inequality.
Parameters:
mean : float, mean of a distribution
std : float, standard deviation of a distribution
Returns: float
inequality_chebyshev_sample(data_sample) Calculates Chebyshev Inequality for a array of values.
Parameters:
data_sample : float , array of numbers.
Returns: float
intersection_of_independent_events(events) Probability that all arguments will happen when neither outcome
is affected by the other (accepts 1 or more arguments)
Parameters:
events : float , 0 >= _p >= 1, list of event probabilities.
Returns: float
union_of_independent_events(events) Probability that either one of the arguments will happen when neither outcome
is affected by the other (accepts 1 or more arguments)
Parameters:
events : float , 0 >= _p >= 1, list of event probabilities.
Returns: float
mass_function(sample, n_bins) Probabilities for each bin in the range of sample.
Parameters:
sample : float , samples to pool probabilities.
n_bins : int, number of bins to split the range
Returns: float
cumulative_distribution_function(mean, stdev, value) Use the CDF to determine the probability that a random observation
that is taken from the population will be less than or equal to a certain value.
Or returns the area of probability for a known value in a normal distribution.
Parameters:
mean : float, mean of the distribution.
stdev : float, standard deviation of the distribution.
value : float, limit at which to stop.
Returns: float
transition_matrix(distribution) Transition matrix for the supplied distribution.
Parameters:
distribution : float , array with probability distribution. ex:.
Returns: float
diffusion_matrix(transition_matrix, dimension, target_step) Probability of reaching target_state at target_step after starting from start_state
Parameters:
transition_matrix : float , "pseudo2d" probability transition matrix.
dimension : int, size of the matrix dimension.
target_step : number of steps to find probability.
Returns: float
state_at_time(transition_matrix, dimension, start_state, target_state, target_step) Probability of reaching target_state at target_step after starting from start_state
Parameters:
transition_matrix : float , "pseudo2d" probability transition matrix.
dimension : int, size of the matrix dimension.
start_state : state at which to start.
target_state : state to find probability.
target_step : number of steps to find probability.
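To illustrate one of the helpers above, here is a minimal Pine sketch of Shannon entropy over a set of event probabilities, H = -sum(p * log2(p)), reported in bits; it mirrors what entropy_shannon() presumably computes but is not the library's code:
//@version=5
indicator("Shannon entropy sketch")
f_entropy_shannon(events) =>
    float h = 0.0
    for i = 0 to array.size(events) - 1
        float p = array.get(events, i)
        h -= p > 0 ? p * math.log(p) / math.log(2.0) : 0.0
    h
var float[] fairCoin = array.from(0.5, 0.5)   // a fair coin flip
plot(f_entropy_shannon(fairCoin), "Entropy (bits)")   // 1.0 bit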
Probability MTF [Anan]
█ OVERVIEW
Probability is simply how likely something is to happen.
Whenever we’re unsure about the outcome of an event, we can talk about the probabilities of certain outcomes—how likely they are.
The best example for understanding probability is flipping a coin: there are two possible outcomes, heads or tails.
In our case, the "coin" is green/red candles.
So:
Probability of an event = (# of ways it can happen) / (total number of outcomes)
P(A) = (# of ways A can happen) / (Total number of outcomes)
So:
The probability that the next candle is green (Up) = (# of past green candles) / (total number of candles in the lookback "length")
The probability that the next candle is red (Down) = (# of past red candles) / (total number of candles in the lookback "length")
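In Pine terms, the calculation above reduces to a few lines; this is a minimal sketch with hypothetical input names, not the indicator's source:
//@version=5
indicator("Up-candle probability sketch")
length = input.int(100, "Lookback length")
greens = math.sum(close > open ? 1 : 0, length)   // number of green candles in the window
plot(100.0 * greens / length, "P(next candle is green) %")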
█ FEATURES
- Full control of Probability (Source / Length)
- Show / Hide / customize three rows of Probability.
- Multi-timeframe Table.
- Full control of displaying any row or any column.
- Full control of table position, size, and colors.
xGhozt Prophecies - A Forecast on the Future
xGhozt Prophecies - A Forecast on the Future is an indicator based on past statistics and different dates.
The indicator goes back in time and checks all the candles of your selected time frame, and gives you the statistical potential outcome of the next candle. It has been created in order to anticipate potential violent moves from the markets when key dates arrive. On March 12, 2020, Bitcoin dropped by nearly 50% in one single day. Many crypto traders were left with a PTSD that emerged on March 11, 2021, as many anticipated another crash on March 12, 2021. Therefore I created this indicator to show you how a candle behaved in the past, on a certain date.
You can replicate the model on any given time frame, on any asset, and you can even pre-select important dates in the indicator settings box to keep an eye on these dates at any given time.
You can therefore check how an asset behaves on Mondays, or on the last day of the month, or how the 1h candle behaves for this asset on a Tuesday. Many combinations are available.
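As a simplified illustration of this kind of date-based lookback, the Pine sketch below tallies how often the candle closed green on a chosen day of the week; the inputs are hypothetical and this is not the indicator's code:
//@version=5
indicator("Day-of-week stats sketch")
targetDay = input.int(2, "Day of week (1 = Sunday ... 7 = Saturday)", minval=1, maxval=7)
var int total = 0
var int up    = 0
if dayofweek == targetDay
    total += 1
    up    += close > open ? 1 : 0
plot(total > 0 ? 100.0 * up / total : na, "% green candles on that day")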
Implied Target
This script attempts to estimate the targets that the current price may reach based on an exponentially weighted volatility model.
Overall, under the assumption that log returns are normally distributed, which might not always hold true, it calculates the estimated range within which the current candle will close. One, two, and three sigma give probabilities of around 68%, 95% and 99.7% respectively.
This can be used to give you a better sense of what is possible with the current level of volatility, thus assisting in risk management and position sizing.
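A minimal sketch of the underlying arithmetic, assuming a simple EWMA estimate of log-return volatility (the 0.94 decay is an arbitrary choice here, not necessarily the script's), with targets drawn around the previous close:
//@version=5
indicator("Implied close range sketch", overlay=true)
lambda = input.float(0.94, "EWMA decay", minval=0.5, maxval=0.999)
ret    = nz(math.log(close / close[1]))
// Exponentially weighted variance of log returns.
var float variance = 0.0
variance := lambda * variance + (1.0 - lambda) * ret * ret
sigma = math.sqrt(variance)
// Targets at +/- 1, 2 and 3 sigma around the previous close.
plot(close[1] * math.exp(sigma), "+1 sigma", color=color.green)
plot(close[1] * math.exp(-sigma), "-1 sigma", color=color.green)
plot(close[1] * math.exp(2.0 * sigma), "+2 sigma", color=color.orange)
plot(close[1] * math.exp(-2.0 * sigma), "-2 sigma", color=color.orange)
plot(close[1] * math.exp(3.0 * sigma), "+3 sigma", color=color.red)
plot(close[1] * math.exp(-3.0 * sigma), "-3 sigma", color=color.red)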
As with any indicator, it is recommended that you use this script as confirmation for your strategy, and not take the estimated range blindly to carry out, for instance, a mean-reversion trade. Again, it is merely an estimation with volatility at its core.
May you be on the right side of the trade.