Open Access (CC BY-NC-ND 3.0 license). Published by De Gruyter, June 2, 2014

Information Dimension and the Probabilistic Structure of Chaos

  • J. Doyne Farmer

The concepts of entropy and dimension as applied to dynamical systems are reviewed from a physical point of view. The information dimension, which measures the rate at which the information contained in a probability density scales with resolution, fills a logical gap in the classification of attractors in terms of metric entropy, fractal dimension, and topological entropy. Several examples are presented of chaotic attractors that have a self-similar, geometrically scaling structure in their probability distribution; for these attractors the information dimension and fractal dimension are different. Just as the metric (Kolmogorov-Sinai) entropy places an upper bound on the information gained in a sequence of measurements, the information dimension can be used to estimate the information obtained in an isolated measurement. The metric entropy can be expressed in terms of the information dimension of a probability distribution constructed from a sequence of measurements. An algorithm is presented that allows the experimental determination of the information dimension and metric entropy.
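To make the scaling idea in the abstract concrete: in the usual notation, the information dimension is the limit d_I = lim_{ε→0} I(ε)/log(1/ε), where I(ε) = -Σ_i p_i log p_i is the Shannon information of the attractor's probability distribution coarse-grained into boxes of side ε. The sketch below is a minimal, generic box-counting estimate of d_I from a sampled orbit; it is offered only as an illustration of that scaling relation, not as the specific algorithm presented in the paper. The function names, the choice of the Hénon map as a test attractor, and the range of box sizes are assumptions made for the example.

```python
import numpy as np

def information_dimension(points, epsilons):
    """Estimate the information dimension of a point cloud by box counting.

    For each box size eps, the points are binned into a grid of boxes of
    side eps, the probability p_i of each occupied box is estimated from
    the fraction of points it contains, and the Shannon information
    I(eps) = -sum_i p_i * log(p_i) is computed.  The information dimension
    is the slope of I(eps) versus log(1/eps) as eps becomes small.
    """
    informations = []
    for eps in epsilons:
        # Assign each point to an integer box index at resolution eps.
        boxes = np.floor(points / eps).astype(np.int64)
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / counts.sum()
        informations.append(-np.sum(p * np.log(p)))
    # Fit a line I(eps) ~ d_I * log(1/eps) + const; the slope estimates d_I.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(epsilons)), informations, 1)
    return slope

def henon_orbit(n, a=1.4, b=0.3):
    """Generate n iterates of the Hénon map, a standard chaotic test case."""
    x, y = 0.1, 0.1
    orbit = np.empty((n, 2))
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        orbit[i] = x, y
    return orbit

points = henon_orbit(100_000)[1000:]      # discard the initial transient
epsilons = np.logspace(-3, -1, 8)         # assumed range of box sizes
print(information_dimension(points, epsilons))
```

With enough points and a suitable range of ε, the estimate converges to a noninteger value between 1 and 2, illustrating the statement in the abstract that the information dimension of such attractors need not coincide with an integer phase-space dimension.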

Received: 1982-5-18
Published Online: 2014-6-2
Published in Print: 1982-11-1

© 1946 – 2014: Verlag der Zeitschrift für Naturforschung

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.
