by Ciro Santilli (@cirosantilli)
Stochastic process
Table of contents
  Markov chain
    Average number of steps until reaching a state of a Markov chain
    Average number of steps spent on a node of a Markov chain
    Absorbing Markov chain
Markov chain
A directed weighted graph in which each node is a state, each edge weight is the probability of transitioning along that edge, and the weights of all outgoing edges of every node sum to 1.
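That definition can be sketched directly as a transition matrix: row i lists the outgoing edge weights of state i, so every row must sum to 1, and taking a step means sampling a successor from that row. The matrix below is a made-up 3-state example, not from the article.

```python
import numpy as np

# Hypothetical 3-state Markov chain: entry P[i, j] is the weight of the
# directed edge i -> j, i.e. the probability of moving from state i to j.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.0, 1.0],  # state 2 only transitions to itself
])

# The defining condition: outgoing edge weights of every node sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

def step(state, rng):
    """Sample the next state from the outgoing edge weights of `state`."""
    return rng.choice(len(P), p=P[state])

rng = np.random.default_rng(0)
print(step(0, rng))  # either 0 or 1, each with probability 0.5
```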
Average number of steps until reaching a state of a Markov chain
TODO how to calculate
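One standard way to calculate this: let t_i be the expected number of steps to reach the target from state i. Conditioning on the first step gives t = 1 + Q t, where Q is the transition matrix restricted to the non-target states, so t = (I - Q)^{-1} 1. The chain below is a hypothetical example chosen for illustration.

```python
import numpy as np

# Hypothetical 3-state chain; we want the expected number of steps to
# first reach state 2 from each other state.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.0, 1.0],
])

target = 2
others = [s for s in range(len(P)) if s != target]

# Q is P restricted to the non-target states. The hitting times satisfy
# t = 1 + Q t, equivalently (I - Q) t = 1.
Q = P[np.ix_(others, others)]
t = np.linalg.solve(np.eye(len(Q)) - Q, np.ones(len(Q)))

print(t)  # expected steps to reach state 2 from states 0 and 1
```

Solving by hand for this matrix gives t = [4.8, 2.8]: from state 1 you pay one step plus detours back through states 0 and 1, and state 0 must first reach state 1 at all.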
Average number of steps spent on a node of a Markov chain
TODO how to calculate
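One standard way to calculate this, for visits before hitting an absorbing state: form the fundamental matrix N = (I - Q)^{-1}, where Q is the transition matrix restricted to the transient states. Entry N[i][j] is the expected number of visits to transient state j when starting from transient state i, counting the starting visit. Again a hypothetical example chain:

```python
import numpy as np

# Hypothetical chain: states 0 and 1 are transient, state 2 is absorbing.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.0, 1.0],
])

transient = [0, 1]
Q = P[np.ix_(transient, transient)]

# Fundamental matrix: N[i][j] = expected visits to transient state j
# starting from transient state i, before absorption.
N = np.linalg.inv(np.eye(len(Q)) - Q)

print(N)
```

A consistency check: each row of N sums to the expected total number of steps before absorption from that state, which matches the hitting times of the previous section (4.8 from state 0, 2.8 from state 1).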
Absorbing Markov chain
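An absorbing Markov chain is one where every state can reach some absorbing state, i.e. a state whose only outgoing edge is a self-loop of weight 1. With several absorbing states, the fundamental matrix also gives the absorption probabilities B = N R, where R holds the transition weights from transient to absorbing states. A minimal sketch on a hypothetical 4-state chain:

```python
import numpy as np

# Hypothetical absorbing chain: states 0, 1 transient; states 2, 3 absorbing.
P = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.3, 0.0, 0.0, 0.7],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

absorbing = [s for s in range(len(P)) if P[s, s] == 1.0]
transient = [s for s in range(len(P)) if s not in absorbing]

Q = P[np.ix_(transient, transient)]   # transient -> transient weights
R = P[np.ix_(transient, absorbing)]   # transient -> absorbing weights
N = np.linalg.inv(np.eye(len(Q)) - Q) # fundamental matrix

# B[i][j]: probability of ending up absorbed in absorbing[j]
# when starting from transient[i].
B = N @ R
print(B)
```

Each row of B sums to 1, since an absorbing chain is eventually absorbed somewhere with probability 1.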