= Conditional entropy
{wiki=Conditional_entropy}
Conditional entropy is a concept from information theory that quantifies the uncertainty remaining about a random variable \\( Y \\) once the value of another random variable \\( X \\) is known. Written \\( H(Y|X) \\), it is defined as \\( H(Y|X) = -\sum_{x,y} p(x,y) \log p(y|x) \\), the expected amount of additional information needed to describe \\( Y \\) given \\( X \\). If \\( Y \\) is completely determined by \\( X \\), then \\( H(Y|X) = 0 \\); if \\( Y \\) is independent of \\( X \\), then \\( H(Y|X) = H(Y) \\).
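As a minimal illustrative sketch (not from the original text), the definition can be computed directly from a joint probability table; the function name and table layout here are assumptions for the example:

```python
from math import log2

def conditional_entropy(joint):
    """H(Y|X) in bits, given joint[x][y] = p(x, y)."""
    h = 0.0
    for row in joint:
        p_x = sum(row)                        # marginal p(x)
        for p_xy in row:
            if p_xy > 0:                      # zero cells contribute nothing
                h -= p_xy * log2(p_xy / p_x)  # -p(x,y) * log2 p(y|x)
    return h

# Y fully determined by X: no remaining uncertainty
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))   # 0.0

# Y independent of X and uniform: one full bit of uncertainty remains
print(conditional_entropy([[0.25, 0.25], [0.25, 0.25]]))  # 1.0
```

The two calls check the boundary cases noted above: a deterministic relationship gives zero conditional entropy, while independence leaves the full entropy of \\( Y \\).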