Abstract
Woolhouse¹ remarks that the work of Shannon and Brillouin showed the fundamental relationship between information, defined as I = −ΣP_i log P_i (where 0 ⩽ P_i ⩽ 1, ΣP_i = 1 and P_i is the relative probability of the ith symbol generated by a source), and entropy, defined in statistical terms as S = −KΣP_i log P_i (where ΣP_i = 1 and P_i is, in this case, the probability of an idealized physical system being in the state i of n possible equivalent states or complexions). It is the unwarranted extrapolation of this relationship to biological systems which, Woolhouse says, leads to erroneous conclusions. He points to the warning given by Brillouin himself, that the theory of information ignores the value or the meaning of the information which is quantified by the definition. Yet in spite of these warnings by Brillouin, the confusion is already present in his work even before its extension to biology.
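The Shannon measure quoted above can be sketched numerically. The following is a minimal illustration, not part of the original article; the function name and the example symbol probabilities are chosen for demonstration only.

```python
import math

def information(probs, base=2):
    """Shannon information I = -sum(P_i * log P_i) for a discrete source.

    probs: relative probabilities of the source's symbols, summing to 1.
    Terms with P_i = 0 contribute nothing, by the usual convention.
    """
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A source emitting four equiprobable symbols: I = log2(4) = 2 bits.
print(information([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

The statistical-mechanical entropy S differs only by the constant K (Boltzmann's constant) and by the interpretation of P_i as the probability of a physical complexion, which is precisely the formal parallel the abstract describes.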
References
Woolhouse, H. W., Nature, 216, 200 (1967).
Brillouin, L., Science and Information Theory, second ed. (Academic Press, New York, 1962).
WILSON, J. Entropy, not Negentropy. Nature 219, 535–536 (1968). https://doi.org/10.1038/219535a0