Pinsker's inequality is a fundamental result in information theory and probability theory that bounds the total variation distance between two probability distributions in terms of their Kullback-Leibler divergence (also known as relative entropy).
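A minimal statement, assuming probability distributions P and Q on a common measurable space and the Kullback-Leibler divergence measured in nats (the constant changes with the base of the logarithm):

\[
\delta(P, Q) \;=\; \sup_{A} \lvert P(A) - Q(A) \rvert \;\le\; \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, Q)},
\]

where \(\delta(P, Q)\) denotes the total variation distance and \(D_{\mathrm{KL}}(P \,\|\, Q)\) the Kullback-Leibler divergence of P from Q.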