Kullback–Leibler divergence
ID: kullback-leibler-divergence
Kullback–Leibler divergence, often abbreviated as KL divergence, is a measure from information theory that quantifies how one probability distribution diverges from a second, reference probability distribution. It is widely used in statistics, machine learning, and signal processing.
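For discrete distributions P and Q defined over the same set of outcomes, the divergence is D_KL(P ‖ Q) = Σ_x P(x) log(P(x) / Q(x)); it is zero exactly when P and Q agree. Below is a minimal sketch of this formula in Python (the `kl_divergence` helper name and the use of NumPy are illustrative assumptions, not part of this entry):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats.

    p, q: probability vectors over the same outcomes; q must be
    nonzero wherever p is nonzero for the divergence to be finite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: how far a biased coin (90/10) diverges from a fair coin
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.368 nats
```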