Mathematical Conversations
Entropy, Coding and Mean Dimension
How much information is needed to describe a trajectory in a dynamical system? The answer depends on what one means by dynamical system.
If our system is a probability measure space, and one has a time evolution (with either discrete or continuous time) that preserves this measure, there is only one reasonable invariant to use: the Kolmogorov-Sinai (or ergodic theoretic) entropy, which is closely related to Shannon's work in information theory.
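As an illustrative aside (not part of the abstract): Shannon's entropy measures the average information per symbol of a source, and the Kolmogorov-Sinai entropy is its dynamical analogue. The snippet below is a minimal sketch computing the Shannon entropy of the empirical distribution of a finite symbol sequence; the function name is hypothetical.

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Entropy, in bits per symbol, of the empirical distribution
    of a finite symbol sequence (illustrative sketch)."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A balanced two-symbol sample carries one bit per symbol.
print(shannon_entropy("HTHTTHHT"))  # → 1.0
```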
If the system is a compact metric space, and the evolution is given by continuous maps, there is a notion parallel to the Kolmogorov-Sinai entropy, namely topological entropy, and another, less well known but seemingly just as fundamental, called mean dimension.
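To make topological entropy concrete (again an aside, not from the abstract): for the full shift on k symbols, the number of distinct length-n blocks is k^n, so the exponential growth rate (1/n) log(#blocks) equals log k for every n, which is the topological entropy of that system. A small sketch, with hypothetical function names:

```python
from itertools import product
from math import log

def block_count(k, n):
    """Number of distinct n-blocks occurring in the full shift
    on k symbols (counted by brute-force enumeration)."""
    return sum(1 for _ in product(range(k), repeat=n))

# Growth rate of block counts recovers the entropy log k:
# (1/n) * log(k**n) = log k, independent of n.
k, n = 2, 10
print(log(block_count(k, n)) / n)  # ≈ log 2 ≈ 0.693
```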
I will try to explain how these notions are related to each other, and to bandwidth, sampling, and analog-to-digital conversion.