Source: wikibot/differential-entropy

= Differential entropy
{wiki=Differential_entropy}

Differential entropy is a concept in information theory that extends the idea of traditional (or discrete) Shannon entropy to continuous probability distributions. While discrete entropy measures the uncertainty associated with a discrete random variable, differential entropy quantifies the uncertainty of a continuous random variable in terms of its probability density function. Unlike discrete entropy, it is not a direct limit of the discrete case and can take negative values.
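As a concrete illustration (not part of the original text): for a continuous random variable with density f, the differential entropy is h(X) = -∫ f(x) ln f(x) dx (in nats), and for a Gaussian with standard deviation σ this has the closed form ½ ln(2πeσ²). A minimal sketch in Python, assuming NumPy and SciPy are available, compares the closed form, SciPy's built-in value, and a direct numerical integration:

```python
import numpy as np
from scipy.stats import norm

# Differential entropy h(X) = -∫ f(x) ln f(x) dx, measured in nats.
sigma = 2.0

# Closed form for a Gaussian: 0.5 * ln(2 * pi * e * sigma^2).
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# scipy.stats distributions expose the same quantity via .entropy().
scipy_value = norm(loc=0.0, scale=sigma).entropy()

# Numerical check: Riemann-sum approximation of -∫ f ln f over a wide grid.
x = np.linspace(-10 * sigma, 10 * sigma, 100001)
dx = x[1] - x[0]
f = norm.pdf(x, loc=0.0, scale=sigma)
numeric = -np.sum(f * np.log(f)) * dx

print(closed_form, scipy_value, numeric)
```

All three values agree; shrinking σ below 1/√(2πe) would make them negative, which illustrates how differential entropy differs from the always-nonnegative discrete entropy.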