Source: wikibot/convergence-of-probability-measures
= Convergence of Probability Measures
{wiki=Convergence_of_Probability_Measures}
Convergence of probability measures is a concept in probability theory concerned with the conditions under which a sequence of probability measures approaches a limiting probability measure. The most widely used notion is weak convergence: a sequence \(\mu_n\) converges weakly to \(\mu\) if \(\int f \, d\mu_n \to \int f \, d\mu\) for every bounded continuous function \(f\). Other modes, such as total-variation and setwise convergence, impose stronger requirements. Each mode is important in different contexts, particularly in statistics, stochastic processes, and analysis.
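As a minimal illustrative sketch (not part of the original article), weak convergence can be checked numerically for a classical example: the uniform measure \(\mu_n\) on the points \(\{1/n, 2/n, \dots, 1\}\) converges weakly to the uniform distribution on \([0,1]\), since \(\int f \, d\mu_n\) is a Riemann sum for \(\int_0^1 f(x)\,dx\). The test function `f` chosen here is arbitrary; any bounded continuous function would do.

```python
import math

def f(x):
    # A bounded continuous test function (arbitrary choice for illustration).
    return math.cos(x)

def expectation_mu_n(n):
    # E_{mu_n}[f], where mu_n puts mass 1/n on each of 1/n, 2/n, ..., n/n.
    return sum(f(k / n) for k in range(1, n + 1)) / n

# E_{mu}[f] under the limit measure Uniform[0,1]:
# the integral of cos(x) over [0,1] is sin(1).
limit = math.sin(1.0)

for n in (10, 100, 1000):
    err = abs(expectation_mu_n(n) - limit)
    print(f"n={n:5d}  |E_mu_n[f] - E_mu[f]| = {err:.6f}")
```

The printed errors shrink toward zero as \(n\) grows, which is exactly the defining condition of weak convergence applied to this one test function.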