OurBigBook
by Ciro Santilli (@cirosantilli)
Markov chain
A directed weighted graph where each node represents a state, each edge weight gives the probability of transitioning from its source state to its target state, and the weights of all outgoing edges of each node sum to 1.
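This definition can be sketched with a transition matrix whose rows sum to 1; a minimal Python example (the states and probabilities are made up for illustration):

```python
import numpy as np

# Transition matrix: entry P[i][j] is the weight of the edge from
# state i to state j, i.e. the probability of that transition.
P = np.array([
    [0.9, 0.1, 0.0],   # state 0: mostly stays put
    [0.4, 0.4, 0.2],   # state 1
    [0.0, 0.0, 1.0],   # state 2: self-loop with weight 1
])

# The defining property: the outgoing weights of every node sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Simulate one step from state 0 by sampling an outgoing edge.
rng = np.random.default_rng(0)
next_state = rng.choice(3, p=P[0])
print(next_state)
```

Sampling repeatedly in this way produces a random walk on the graph.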
Table of contents
Average number of steps until reaching a state of a Markov chain
Average number of steps spent on a node of a Markov chain
Absorbing Markov chain
Average number of steps until reaching a state of a Markov chain
The expected number of steps h(i) to first reach a target state t starting from state i satisfies h(t) = 0 and, for every other state i, h(i) = 1 + sum over j of P(i, j) h(j), where P(i, j) is the transition probability from i to j. This is a system of linear equations that can be solved with standard linear algebra.
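One standard way to compute this quantity is to solve the linear system of expected hitting times, h(t) = 0 and h(i) = 1 + sum_j P(i, j) h(j) for i != t. A Python sketch with a made-up 3-state chain (the numbers are illustrative, not from the original text):

```python
import numpy as np

# Illustrative transition matrix (made-up numbers).
P = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])
target = 2

# For non-target states i: h(i) - sum_j P[i][j] h(j) = 1, with h(target) = 0.
# Restrict P to the non-target states and solve (I - Q) h = 1.
others = [i for i in range(len(P)) if i != target]
Q = P[np.ix_(others, others)]   # transitions among non-target states
h = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
print(dict(zip(others, h)))     # expected steps to reach `target` from each state
```

For this particular chain the solver returns h(0) = 8 and h(1) = 6, which can be checked by substituting back into the equations.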
Average number of steps spent on a node of a Markov chain
For an absorbing Markov chain, order the states so that the transient states come first; the top-left block Q of the transition matrix then contains the transient-to-transient probabilities. The fundamental matrix N = (I - Q)^-1 gives in entry N(i, j) the expected number of visits to transient state j when starting from transient state i.
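One standard approach, via the fundamental matrix of an absorbing chain: if Q is the block of transition probabilities among the transient states, then N = (I - Q)^-1 has in entry N(i, j) the expected number of visits to transient state j starting from transient state i. A Python sketch with made-up numbers:

```python
import numpy as np

# Transient-to-transient block Q of a made-up absorbing chain:
# two transient states; the remaining probability mass in each row
# leaks to absorbing states not shown here.
Q = np.array([
    [0.50, 0.25],
    [0.25, 0.50],
])

# Fundamental matrix: N[i][j] = expected visits to j starting from i.
N = np.linalg.inv(np.eye(2) - Q)
print(N)
```

For this Q the diagonal entries come out to 8/3, i.e. starting from a transient state the chain visits that state about 2.67 times on average before absorption.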
Absorbing Markov chain
A Markov chain with at least one absorbing state, i.e. a state whose only outgoing edge is a self-loop with weight 1, and in which an absorbing state is reachable from every state. Once the chain enters an absorbing state it never leaves it.
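As a sketch of the standard theory (not from the original text): partition the transition matrix into the transient-to-transient block Q and the transient-to-absorbing block R; the absorption probabilities are then B = (I - Q)^-1 R, where B(i, k) is the probability of eventually being absorbed in absorbing state k when starting from transient state i. Made-up Python example:

```python
import numpy as np

# Absorbing chain with states [t0, t1, a0, a1]; a0 and a1 are absorbing.
P = np.array([
    [0.50, 0.25, 0.25, 0.00],
    [0.25, 0.50, 0.00, 0.25],
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
])
Q = P[:2, :2]   # transient -> transient
R = P[:2, 2:]   # transient -> absorbing

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
B = N @ R                          # B[i][k]: prob. of ending in absorbing state k from i
assert np.allclose(B.sum(axis=1), 1.0)  # absorption is certain, rows sum to 1
print(B)
```

Here each row of B sums to 1 because an absorbing state is reachable from every transient state, so the chain is absorbed with probability 1.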