Source: wikibot/kullback-leibler-divergence

= Kullback–Leibler divergence
{wiki=Kullback–Leibler_divergence}

Kullback–Leibler divergence, often abbreviated as KL divergence, quantifies how one probability distribution differs from a second, reference probability distribution. It is a central quantity in information theory and is widely used in statistics and machine learning.
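
For discrete probability distributions P and Q defined on the same sample space, the divergence of P from Q is

D_{\mathrm{KL}}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},

with the sum replaced by an integral over probability densities in the continuous case. As a small worked example, for P = (0.5, 0.5) and Q = (0.9, 0.1), using the natural logarithm, D_{\mathrm{KL}}(P \parallel Q) = 0.5 \log(0.5/0.9) + 0.5 \log(0.5/0.1) \approx 0.51 nats, while D_{\mathrm{KL}}(Q \parallel P) \approx 0.37 nats, illustrating that the divergence is not symmetric.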