Pinsker's inequality is a fundamental result in information theory and probability theory that bounds the total variation distance between two probability distributions in terms of their Kullback-Leibler divergence (also known as relative entropy): for distributions $P$ and $Q$,
$$\delta(P, Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)},$$
where $\delta(P, Q)$ denotes the total variation distance and the divergence is measured in nats (natural logarithm).
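As a quick numerical sanity check, here is a minimal sketch that computes both sides of the inequality for two small discrete distributions. The example distributions `p` and `q` and the helper names `kl_divergence` and `total_variation` are illustrative choices, not part of the original text; the KL divergence is taken in nats to match the $\sqrt{D_{\mathrm{KL}}/2}$ form above.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(P || Q) in nats for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    """Total variation distance: half the L1 distance between P and Q."""
    return 0.5 * float(np.sum(np.abs(np.asarray(p) - np.asarray(q))))

# Two example distributions on a 3-element alphabet.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

tv = total_variation(p, q)          # 0.1
bound = np.sqrt(0.5 * kl_divergence(p, q))  # about 0.112

print(f"TV(P, Q)           = {tv:.4f}")
print(f"sqrt(D_KL(P||Q)/2) = {bound:.4f}")
assert tv <= bound  # Pinsker's inequality holds
```

Running the script prints a total variation distance of about 0.10 against a Pinsker bound of about 0.11, consistent with the inequality.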