Transfer Entropy
Statistical relationships among the variables of a complex system reveal a great deal about its physical
behavior. Identifying the relevant variables and characterizing their interactions are therefore
crucial for a better understanding of a complex system. Correlation-based techniques have been
widely used to elucidate linear statistical dependencies in many science and engineering
applications. For the analysis of nonlinear dependencies, however, information-theoretic quantities
such as Mutual Information (MI) and Transfer Entropy (TE) have proven superior.
MI quantifies the amount of information obtained about one random variable through another
random variable, and it is symmetric. As an asymmetric measure, TE quantifies the
directed (time-asymmetric) transfer of information between random processes and is therefore related
to measures of causality.
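
For concreteness, the standard definitions of the two quantities can be written as follows; the TE expression is given in Schreiber's form, restricted here to history length one for both processes as a simplifying assumption:

I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},

T_{Y \to X} = \sum_{x_{t+1},\,x_t,\,y_t} p(x_{t+1}, x_t, y_t)\,\log\frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)}.

Because swapping the roles of X and Y changes the conditioning, in general T_{Y \to X} \neq T_{X \to Y}; this asymmetry is precisely what makes TE a directional measure, in contrast to the symmetric MI.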