Entropy (information theory)

ID: entropy-information-theory

In information theory, entropy is a measure of the uncertainty or unpredictability associated with a random variable or a probability distribution. It quantifies the average amount of information produced by a stochastic source of data. The concept was introduced by Claude Shannon in his seminal 1948 paper "A Mathematical Theory of Communication."
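
For a discrete random variable X with outcomes x_1, ..., x_n and probabilities p(x_i), the entropy is defined as H(X) = -\sum_i p(x_i) \log_2 p(x_i), measured in bits when the logarithm is base 2. A fair coin flip, for example, has an entropy of exactly 1 bit. The sketch below is a minimal illustration of this definition (the function name and example distributions are chosen here for illustration, not taken from the text above):

```python
from math import log2

def shannon_entropy(probs):
    # Entropy in bits of a discrete distribution given as probabilities.
    # Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.469 bits
```

The biased coin is more predictable than the fair coin, so its entropy is lower, matching the informal description of entropy as a measure of unpredictability.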
