Abstract
In his paper [1], Pfaffelhuber deals with the approximation of the entropy H of finite information sources on the basis of independent observations. He derives an error estimate for the experimental entropy that depends only on the number of possible source outputs. Using this result, he gives an upper bound on the number of independent observations needed to obtain a "reliable" approximation of the exact entropy. The aim of this short correspondence is to improve his error estimate and thereby to give a new upper bound on the number of necessary observations, one that is roughly the logarithm of E. Pfaffelhuber's bound. We note only that the same assertion holds for the information rate T of observation channels as well.
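For orientation, the "experimental entropy" referred to above is the estimate obtained by substituting relative frequencies for the unknown source probabilities, as indicated by the title of [1]. A minimal sketch of the two quantities involved, where K denotes the number of possible source outputs, n the number of independent observations, and k_i the number of times output i was observed (symbols introduced here for illustration only):

    H = -\sum_{i=1}^{K} p_i \log p_i                          (exact entropy)

    \hat{H}_n = -\sum_{i=1}^{K} \frac{k_i}{n} \log \frac{k_i}{n}    (experimental entropy)

The error estimates discussed in [1], and improved in this correspondence, concern the deviation |\hat{H}_n - H| as a function of n and K.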
References
Pfaffelhuber, E.: Error estimation for the determination of entropy and information rate from relative frequencies. Kybernetik 8, 50–51 (1971).
Wozencraft, J.M., Jacobs, I.M.: Principles of communication engineering. London: Wiley 1965.
Cite this article
Nemetz, T. On the experimental determination of the entropy. Kybernetik 10, 137–139 (1972). https://doi.org/10.1007/BF00290511