Entropy (information theory)


Entropy in information theory is a measure of the uncertainty, randomness, or unpredictability in a set of data. The concept was introduced into information theory by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".

Pronunciation

en·tro·py /ˈentrəpē/

Etymology

The term "entropy" comes from the Greek word "entropia," which means "a turning toward" or "transformation." In the context of information theory, it was first used by Claude Shannon in 1948.

Definition

In information theory, entropy quantifies the uncertainty or randomness of a data source: it is the expected amount of information contained in a message drawn from that source. The higher the entropy, the greater the uncertainty about the outcome, and the more information is conveyed, on average, once the outcome is learned.
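
For a discrete random variable X that takes value x with probability p(x), the Shannon entropy is

  H(X) = -\sum_{x} p(x) \log_2 p(x)

measured in bits when the logarithm is taken in base 2. For example, a fair coin flip has an entropy of 1 bit, while a coin that always lands heads has an entropy of 0 bits.

A minimal sketch of this calculation in Python (the function name and example distributions below are illustrative, not part of the source):

  import math

  def shannon_entropy(probabilities):
      # Shannon entropy, in bits, of a discrete probability distribution;
      # zero-probability outcomes are skipped, since 0 log 0 is taken as 0.
      return -sum(p * math.log2(p) for p in probabilities if p > 0)

  print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
  print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits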

Related Terms

See Also

References

  • Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27(3), 379-423.
