by Ciro Santilli (@cirosantilli)
Stochastic process
Updated
2024-11-15
Table of contents
Markov chain
  Average number of steps until reaching a state of a Markov chain
  Average number of steps spent on a node of a Markov chain
  Absorbing Markov chain
Markov chain
A directed weighted graph in which, for each node, the weights of its outgoing edges sum to 1.
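This definition can be sketched directly as code. The following is a minimal illustration (the chain itself is hypothetical, not from the article): each node maps to its weighted outgoing edges, the defining sum-to-1 property is checked, and one step of the chain samples an outgoing edge by weight.

```python
import random

# A Markov chain as a directed weighted graph: for each node, the
# outgoing edge weights sum to 1. This example chain is hypothetical.
chain = {
    "A": [("A", 0.5), ("B", 0.5)],
    "B": [("A", 0.9), ("C", 0.1)],
    "C": [("C", 1.0)],  # only outgoing edge is a self-loop
}

# Check the defining property: outgoing weights sum to 1 at every node.
for node, edges in chain.items():
    assert abs(sum(w for _, w in edges) - 1.0) < 1e-9

def step(node):
    """Move one step by sampling the outgoing edges by weight."""
    targets, weights = zip(*chain[node])
    return random.choices(targets, weights=weights)[0]

random.seed(0)
state = "A"
for _ in range(10):
    state = step(state)
print(state)
```

Sampling a walk like this is the simulation view of the chain; the sections below concern quantities one would rather compute exactly.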
Average number of steps until reaching a state of a Markov chain
TODO how to calculate
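The article leaves this as a TODO. One standard method (an assumption here, not necessarily the author's intended one) solves a linear system for the expected hitting times h_i: h_target = 0, and for every other state i, h_i = 1 + sum_j P[i][j] * h_j. Restricting P to the non-target states gives (I - Q) h = 1. The transition matrix below is hypothetical.

```python
import numpy as np

# Hypothetical 3-state chain; P[i][j] = probability of moving i -> j.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.9, 0.0, 0.1],
    [0.0, 0.0, 1.0],
])
target = 2

# For non-target states i: h_i = 1 + sum_j P[i][j] * h_j, with h_target = 0.
# Over the non-target states this rearranges to (I - Q) h = 1, where Q is
# P restricted to those states.
others = [i for i in range(len(P)) if i != target]
Q = P[np.ix_(others, others)]
h = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
print(dict(zip(others, h)))  # expected steps to reach state 2 from 0 and 1
```

For this chain the system gives h_0 = 30 and h_1 = 28, which can be checked by substitution: 30 = 1 + 0.5*30 + 0.5*28 and 28 = 1 + 0.9*30.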
Average number of steps spent on a node of a Markov chain
TODO how to calculate
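Also a TODO in the article. For a chain with absorbing states, one standard answer (again an assumption, not taken from the source) is the fundamental matrix N = (I - Q)^{-1}, where Q is the transition matrix restricted to the transient states: N[i][j] is the expected number of visits to transient state j starting from i, before absorption. Same hypothetical chain as above.

```python
import numpy as np

# Hypothetical chain; state 2 is absorbing (self-loop with probability 1).
P = np.array([
    [0.5, 0.5, 0.0],
    [0.9, 0.0, 0.1],
    [0.0, 0.0, 1.0],
])
transient = [0, 1]  # states the chain eventually leaves forever
Q = P[np.ix_(transient, transient)]

# Fundamental matrix: N[i][j] = expected number of visits to transient
# state j when starting from transient state i, before absorption.
N = np.linalg.inv(np.eye(len(transient)) - Q)
print(N)
```

A consistency check: the row sums of N (total expected steps spent among transient states) equal the expected absorption times, 30 and 28 here.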
Absorbing Markov chain
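The section body is empty in the article. By the standard definition (an addition here, not the author's text), an absorbing Markov chain has at least one absorbing state, i.e. a state whose only transition is a self-loop with probability 1, and every state can reach some absorbing state. A small check of both conditions, on the same hypothetical chain:

```python
import numpy as np

# Hypothetical chain; state 2 is absorbing.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.9, 0.0, 0.1],
    [0.0, 0.0, 1.0],
])

def absorbing_states(P):
    """A state is absorbing if its only transition is a self-loop."""
    return [i for i in range(len(P)) if P[i, i] == 1.0]

def is_absorbing_chain(P):
    """True if the chain has absorbing states and every state reaches one."""
    n = len(P)
    reach = set(absorbing_states(P))
    if not reach:
        return False
    # Propagate reachability backwards over edges with positive weight.
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in reach and any(P[i, j] > 0 and j in reach for j in range(n)):
                reach.add(i)
                changed = True
    return len(reach) == n

print(absorbing_states(P), is_absorbing_chain(P))
```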