= Joint entropy
{wiki=Joint_entropy}

Joint entropy is a concept in information theory that quantifies the total uncertainty (entropy) of a pair of random variables taken together. For discrete variables X and Y with joint distribution p(x, y), it is defined as H(X, Y) = -sum over all (x, y) of p(x, y) log p(x, y). It satisfies H(X, Y) <= H(X) + H(Y), with equality exactly when X and Y are independent.
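A minimal sketch of the definition in Python, computing H(X, Y) in bits from a joint probability table (the `joint_entropy` helper and the example distributions are illustrative, not from any particular library):

```python
import math

def joint_entropy(pxy):
    """Joint entropy H(X,Y) in bits, given a dict {(x, y): probability}."""
    # Terms with p = 0 contribute 0 by convention (lim p->0 of p log p = 0).
    return -sum(p * math.log2(p) for p in pxy.values() if p > 0)

# Two independent fair coins: H(X,Y) = H(X) + H(Y) = 1 + 1 = 2 bits.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(joint_entropy(independent))  # 2.0

# Fully dependent case (Y always equals X): H(X,Y) = H(X) = 1 bit.
dependent = {(0, 0): 0.5, (1, 1): 0.5}
print(joint_entropy(dependent))  # 1.0
```

The two cases illustrate the bounds: independence gives the maximum H(X) + H(Y), while full dependence collapses the joint entropy down to the entropy of a single variable.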