Source: wikibot/entropy-information-theory
= Entropy (information theory)
{wiki=Entropy_(information_theory)}
In information theory, entropy is a measure of the uncertainty or unpredictability associated with a random variable or probability distribution. It quantifies the amount of information produced on average by a stochastic source of data. The concept was introduced by Claude Shannon in his seminal 1948 paper "A Mathematical Theory of Communication".
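For a discrete random variable with outcome probabilities p(x), Shannon's entropy is H = -Σ p(x) log2 p(x), measured in bits when the logarithm is base 2. As a minimal sketch of that definition (the function name and example distributions below are illustrative, not taken from the source), the quantity can be computed directly from a probability distribution:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log_base(p)) of a discrete distribution.

    `probs` is a sequence of probabilities that should sum to 1; zero
    probabilities contribute nothing (0 * log 0 is taken as 0).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit of entropy per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is far more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The biased-coin example illustrates the intuition in the paragraph above: the more predictable the source, the less information each observation carries on average.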