Convergence of Probability Measures

ID: convergence-of-probability-measures

Convergence of probability measures is a concept in probability theory that describes how a sequence of probability measures approaches a limiting probability measure. Several modes of convergence characterize this behavior, most notably weak convergence (also called convergence in distribution), setwise convergence, and convergence in total variation. Weak convergence, the most widely used mode, requires that the integrals of every bounded continuous function against the measures converge to its integral against the limit. Each mode is important in different contexts, particularly in statistics, stochastic processes, and analysis; for example, the central limit theorem is a statement about weak convergence of the laws of standardized sums.
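As a concrete sketch, the snippet below illustrates weak convergence empirically through the central limit theorem: the law of the standardized sum of n Uniform(0, 1) variables converges weakly to the standard normal distribution, so the empirical CDF of simulated draws should approach the normal CDF as n grows. The function names and the choice of test points are illustrative, not canonical.

```python
import random
from statistics import NormalDist

# Weak convergence via the CLT: the law of (S_n - n/2) / sqrt(n/12),
# where S_n is a sum of n Uniform(0, 1) variables, converges weakly
# to N(0, 1).  We compare the empirical CDF of Monte Carlo draws with
# the standard normal CDF at a few test points.

random.seed(0)

def standardized_sum(n):
    """One draw of (S_n - n/2) / sqrt(n/12)."""
    s = sum(random.random() for _ in range(n))
    return (s - n / 2) / (n / 12) ** 0.5

def empirical_cdf_at(x, n, samples=20000):
    """Fraction of simulated standardized sums that fall at or below x."""
    draws = [standardized_sum(n) for _ in range(samples)]
    return sum(1 for d in draws if d <= x) / samples

phi = NormalDist().cdf  # CDF of the limiting N(0, 1) measure
for n in (1, 2, 10, 50):
    gap = max(abs(empirical_cdf_at(x, n) - phi(x)) for x in (-1.0, 0.0, 1.0))
    print(f"n={n:3d}  max |F_n - Phi| at test points = {gap:.3f}")
```

The printed gap shrinks as n increases, which is exactly what weak convergence predicts: pointwise convergence of the CDFs at every continuity point of the limit (here, everywhere, since the normal CDF is continuous).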