Differential entropy

ID: differential-entropy

Differential entropy is a concept in information theory that extends the idea of traditional (or discrete) entropy to continuous probability distributions. While discrete entropy measures the uncertainty of a discrete random variable by summing over its probability mass function, differential entropy quantifies the uncertainty of a continuous random variable by integrating over its probability density function: the sum -Σ p(x) log p(x) is replaced by the integral h(X) = -∫ f(x) log f(x) dx.
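The defining integral h(X) = -∫ f(x) log f(x) dx can be checked numerically. The sketch below (an illustrative example, not part of the original text) approximates it with a midpoint Riemann sum for a standard normal density and compares the result with the known closed form h = ½ log(2πeσ²); the helper names `normal_pdf` and `differential_entropy` are assumptions for this example.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of a normal distribution N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def differential_entropy(pdf, lo, hi, n=200_000):
    """Midpoint Riemann-sum approximation of -∫ f(x) log f(x) dx over [lo, hi]."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = pdf(x)
        if f > 0.0:          # skip points where the density vanishes
            total -= f * math.log(f) * dx
    return total

# Standard normal: the tails beyond |x| = 10 contribute negligibly.
numeric = differential_entropy(normal_pdf, -10.0, 10.0)
closed_form = 0.5 * math.log(2 * math.pi * math.e)  # ½ log(2πe·1²), in nats
print(numeric, closed_form)
```

For the standard normal the two values agree to several decimal places, which illustrates that the integral definition recovers the familiar closed-form entropy of a Gaussian.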
