Joint entropy
Joint entropy is a concept in information theory that quantifies the total uncertainty (entropy) associated with a pair of random variables considered together.
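
For discrete random variables X and Y with joint probability mass function p(x, y), the standard definition is

$$
H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log_2 p(x, y)
$$

measured in bits when the logarithm is base 2. If X and Y are independent, H(X, Y) = H(X) + H(Y); in general, H(X, Y) ≤ H(X) + H(Y).

As a concrete illustration, here is a minimal Python sketch that computes joint entropy from a joint distribution supplied as a 2D array; the function name and the example probabilities are illustrative, not from any particular library:

```python
import numpy as np

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits for a 2D array of joint probabilities."""
    p = np.asarray(p_xy, dtype=float)
    p = p[p > 0]  # drop zero entries: 0 * log(0) is taken as 0 by convention
    return -np.sum(p * np.log2(p))

# Example: a fair coin X and an exact copy Y = X.
# All probability mass sits on the diagonal, so H(X, Y) = H(X) = 1 bit.
p = [[0.5, 0.0],
     [0.0, 0.5]]
print(joint_entropy(p))  # 1.0
```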
