Measurement of Information Content
Numerically, information is measured in bits (short for binary digit). One bit is the amount of information needed to decide between two equally likely alternatives.
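As a quick illustration (a sketch added here, not part of the encyclopedia entry): the information conveyed by learning that an event of probability p has occurred is -log2(p) bits, so the outcome of a fair coin toss conveys exactly one bit.

```python
import math

def information_content(p: float) -> float:
    """Information, in bits, conveyed by an event of probability p."""
    return -math.log2(p)

# A fair coin toss has two equally likely outcomes (p = 1/2): exactly 1 bit.
print(information_content(0.5))     # 1.0
# Identifying one card drawn from a 52-card deck: log2(52), about 5.7 bits.
print(information_content(1 / 52))  # ~5.7
```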
Interestingly, the mathematical expression for information content closely resembles the expression for entropy in thermodynamics. The greater the uncertainty about which message a source will produce, the more information a message conveys when it arrives, and hence the larger the entropy. Since the information content is, in general, associated with a source that generates messages, it is often called the entropy of the source. Often, because of constraints such as grammar, a source does not use its full range of choice. A source that uses just 70% of its freedom of choice is said to have a relative entropy of 0.7. The redundancy of such a source is defined as 100% minus the relative entropy, or, in this case, 30%. The redundancy of English is estimated to be about 50%; i.e., about half of the elements used in writing or speaking are freely chosen, and the rest are required by the structure of the language.
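These figures can be reproduced from Shannon's entropy formula, H = -Σ p·log2(p). The sketch below (an illustration under assumed example probabilities, not part of the encyclopedia entry) computes a source's entropy, its relative entropy (entropy divided by the maximum value log2(n) attainable with n symbols), and its redundancy:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def relative_entropy(probs):
    """Entropy as a fraction of the maximum log2(n) for n symbols."""
    return entropy(probs) / math.log2(len(probs))

def redundancy(probs):
    """Redundancy = 1 - relative entropy."""
    return 1 - relative_entropy(probs)

# A hypothetical 4-symbol source that uses its choices unevenly:
source = [0.5, 0.25, 0.15, 0.10]
print(f"entropy:          {entropy(source):.3f} bits/symbol")  # below the 2-bit maximum
print(f"relative entropy: {relative_entropy(source):.3f}")
print(f"redundancy:       {redundancy(source):.1%}")
```

With these definitions, a source whose relative entropy is 0.7 has a redundancy of 30%, matching the figures in the paragraph above.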
The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2024, Columbia University Press. All rights reserved.