2023-9 trading log

18
Entropy is a measure of disorder or randomness in a system. In thermodynamics, it measures how much of a system's thermal energy is unavailable for conversion into mechanical work. In information theory, entropy measures the uncertainty of the information content in a message or signal: the greater the entropy, the less predictable the message.
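To make the information-theory definition concrete, here is a minimal Python sketch that computes the Shannon entropy H(X) = -Σ p(x) log2 p(x) of a discretized return series. The sample price list and the 1%-bucket discretization are assumptions for illustration, not anything from this log.

```python
# Minimal sketch: Shannon entropy of a discretized return series.
# The prices below and the 1% bucketing are illustrative assumptions.
import math
from collections import Counter

def shannon_entropy(symbols):
    """H(X) = -sum(p * log2(p)) over the empirical distribution of symbols."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical daily closes; convert to returns, then to discrete symbols.
prices = [100.0, 101.2, 100.8, 102.5, 101.9, 103.0, 102.2, 104.1]
returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
bins = [round(r * 100) for r in returns]  # bucket each return into 1% bins

print(f"entropy: {shannon_entropy(bins):.3f} bits")
```

A flat, repetitive series would collapse into one bin and score near zero bits, while a choppy, unpredictable series spreads across many bins and scores higher, which matches the "less predictable" reading above.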
