Markov Chains
A Markov chain is a sequence of random values whose probability distribution at each time step depends only on the value at the previous time step.
A simple example is the non-returning random walk, in which the walker may not move back to the location it visited immediately before. Each possible position in such a walk is known as a state.
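As a rough sketch of the non-returning walk described above, consider a walker on a 2-D grid (the grid, function name, and step count are illustrative assumptions, not part of the original text):

```python
import random

def non_returning_walk(steps, seed=0):
    """Simulate a non-returning random walk on a 2-D grid.

    At each step the walker chooses uniformly among the four
    neighbouring cells, excluding the cell it just came from.
    """
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    pos, prev = (0, 0), None
    path = [pos]
    for _ in range(steps):
        options = [(pos[0] + dx, pos[1] + dy) for dx, dy in moves]
        # Exclude the previously visited cell (the "no return" rule).
        options = [p for p in options if p != prev]
        prev, pos = pos, rng.choice(options)
        path.append(pos)
    return path

path = non_returning_walk(10)
```

Note that the next position depends only on the current position and the one just left, so the pair (current, previous) forms the state of a Markov chain.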
The controlling factor in a Markov chain is the transition probability: the conditional probability that the system moves to a particular new state, given its current state. Because the next state depends only on the state that precedes it, the transition probability is by definition a conditional probability.
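Transition probabilities can be written out as a table of conditional probabilities P(next state | current state). A minimal sketch, using a hypothetical two-state weather chain (the states and probability values are illustrative assumptions):

```python
import random

# Hypothetical 2-state chain: P(next state | current state).
# Each row must sum to 1, since some next state always occurs.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state given only the current one."""
    states = list(P[current])
    weights = [P[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]

rng = random.Random(0)
state = "sunny"
chain = [state]
for _ in range(5):
    state = next_state(state, rng)
    chain.append(state)
```

The sampler looks at nothing but the current state, which is exactly the conditional-probability structure the text describes.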
A Markov chain (process) has the following properties:
- The set of all possible states of the stochastic (probabilistic) system is finite.
- The system moves from one state to another, and the probability of transition from a given state depends only on the present state of the system, not on how that state was reached.
- The probabilities of reaching the various states from any given state are measurable and remain constant over time (i.e. throughout the system's operation).
Markov chains are classified by their order:
- If the probability of occurrence of each state depends only upon the immediately preceding state, the chain is known as a first-order Markov chain.
- A zero-order Markov chain is a memoryless chain: each value occurs independently of all preceding values.
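The difference in order can be sketched side by side: a zero-order sequence draws every value from the same fixed distribution, while a first-order sequence conditions each draw on the previous value (the two-state alphabet and probabilities below are illustrative assumptions):

```python
import random

rng = random.Random(1)
states = ["A", "B"]

# Zero order: each draw ignores the past entirely (fixed distribution).
zero_order = [rng.choices(states, weights=[0.5, 0.5])[0] for _ in range(8)]

# First order: each draw depends only on the immediately
# preceding state, via a table of P(next | current).
P = {"A": [0.9, 0.1], "B": [0.1, 0.9]}
seq = ["A"]
for _ in range(7):
    seq.append(rng.choices(states, weights=P[seq[-1]])[0])
```

With the probabilities chosen here, the first-order sequence tends to stay in its current state for several steps, whereas the zero-order sequence shows no such persistence.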