= Hamburger moment problem
{wiki=Hamburger_moment_problem}
The Hamburger moment problem is a classical problem in the theory of moments. Given a sequence of real numbers \\( m_n \\) (\\( n = 0, 1, 2, \\ldots \\)), called moments, it asks whether there exists a probability measure \\( \\mu \\) on the real line \\( \\mathbb\{R\} \\) such that \\( m_n = \\int_\{-\\infty\}^\{\\infty\} x^n \\, d\\mu(x) \\) for every \\( n \\). By a theorem of Hamburger, a positive measure with these moments exists if and only if every Hankel matrix \\( (m_\{i+j\})_\{i,j=0\}^k \\) is positive semidefinite; for a probability measure one additionally requires \\( m_0 = 1 \\).
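As an illustrative sketch (not part of the classical statement), the Hankel positive-semidefiniteness criterion can be checked numerically. The example below uses the moments of the standard normal distribution, \\( m_n = (n-1)!! \\) for even \\( n \\) and \\( 0 \\) for odd \\( n \\), which do form a valid moment sequence; the tolerance value is an arbitrary choice to absorb floating-point error.

```python
import numpy as np

# Moments m_0..m_6 of the standard normal distribution:
# 0 for odd n, the double factorial (n-1)!! for even n.
moments = [1, 0, 1, 0, 3, 0, 15]

def hankel_psd(m, k, tol=1e-12):
    """Check whether the Hankel matrix H_k = (m_{i+j})_{i,j=0..k}
    built from the moment sequence m is positive semidefinite."""
    H = np.array([[m[i + j] for j in range(k + 1)]
                  for i in range(k + 1)], dtype=float)
    # eigvalsh is appropriate because H is symmetric by construction.
    return bool(np.all(np.linalg.eigvalsh(H) >= -tol))

# With m_0..m_6 available, the largest checkable matrix is H_3.
for k in range(4):
    print(k, hankel_psd(moments, k))
```

Since the Gaussian moments genuinely come from a measure, every Hankel matrix here is positive semidefinite, so each line prints `True`; replacing, say, \\( m_2 \\) with a negative number makes the criterion fail immediately.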