# Markov Property

To recap, the Markov property states that, given the present state, past events have no influence on future events. It is formulated as [Eq.1]:

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i) \tag{Eq.1}$$
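As a minimal sketch of this property (using a made-up two-state chain), note that the sampling of the next state below reads only the current state, never the earlier history:

```python
import random

# Hypothetical two-state chain for illustration; the numbers are invented.
# P[i][j] = probability of moving from state i to state j in one step.
P = {0: [0.9, 0.1], 1: [0.5, 0.5]}

def step(current_state):
    # The Markov property in code: the whole history is irrelevant,
    # only `current_state` enters the sampling.
    return random.choices([0, 1], weights=P[current_state])[0]

random.seed(0)
path = [0]
for _ in range(10):
    path.append(step(path[-1]))
print(path)
```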

# Transition Probability Matrix

## Transition Property of Markov Chain

Denoting $p_{ij} = P(X_{n+1} = j \mid X_n = i)$, the probability of reaching the *j*-th state from the *i*-th state in two steps can be expanded over the intermediate state *k*:

$$P(X_{n+2} = j \mid X_n = i) = \sum_k P(X_{n+2} = j \mid X_{n+1} = k, X_n = i)\, P(X_{n+1} = k \mid X_n = i)$$

$$= \sum_k P(X_{n+2} = j \mid X_{n+1} = k)\, P(X_{n+1} = k \mid X_n = i) \qquad \text{(by the Markov property [Eq.1])}$$

$$= \sum_k p_{ik}\, p_{kj} \qquad \text{(denote as } p_{ij}^{(2)}\text{)}$$
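This sum over the intermediate state is exactly a matrix multiplication, which we can check numerically (the two-state matrix here is made up for illustration):

```python
import numpy as np

# Hypothetical one-step transition matrix; rows are "from", columns are "to".
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Two-step probability 0 -> 1 by summing over the intermediate state k ...
manual = sum(P[0, k] * P[k, 1] for k in range(2))

# ... equals the (0, 1) entry of the matrix product P @ P.
assert np.isclose(manual, (P @ P)[0, 1])
print(manual)
```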

## Transition Probability written as Matrix

The above describes the transition property of a Markov chain. If we write the probabilities of going to the *j*-th state from the *i*-th state as the components of a matrix, we call this a **Transition Probability Matrix (TPM)** or, more commonly, a **Stochastic Matrix**. In general, it can be written as follows [Eq.2]:

$$P = \begin{pmatrix} p_{11} & p_{12} & \cdots & p_{1m} \\ p_{21} & p_{22} & \cdots & p_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ p_{m1} & p_{m2} & \cdots & p_{mm} \end{pmatrix}, \qquad p_{ij} = P(X_{n+1} = j \mid X_n = i) \tag{Eq.2}$$
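A TPM as a concrete array might look like the following (a hypothetical three-state chain with invented numbers); since each row collects all the probabilities out of one state, every row must sum to 1:

```python
import numpy as np

# Hypothetical 3-state TPM; entry P[i, j] is the probability of i -> j.
P = np.array([[0.8, 0.15, 0.05],
              [0.3, 0.4,  0.3 ],
              [0.2, 0.3,  0.5 ]])

# Defining property of a (row-)stochastic matrix: rows sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

print(P[0, 2])  # one-step probability of going from state 0 to state 2
```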

which also carries the transition property; for instance, the probability of going into the *j*-th state from the *i*-th state in exactly *n* steps would be $(P^n)_{ij}$.
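In code, the *n*-step probabilities are just a matrix power (again with a made-up two-state matrix):

```python
import numpy as np

# Hypothetical one-step TPM.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

n = 5
Pn = np.linalg.matrix_power(P, n)  # P raised to the n-th power

print(Pn[0, 1])  # probability of going from state 0 to state 1 in exactly 5 steps
```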

### Example:

Suppose we want to know the **probabilities of the system ending in each state**, given that it initially started from the ***i*-th state** and went through *n* steps. First, construct a unit state vector $e_i$ where only the *i*-th element is 1:

$$e_i = (0, \dots, 0, \underbrace{1}_{i\text{-th}}, 0, \dots, 0)$$

Then the probabilities we want can be calculated by the multiplication:

$$e_i P^n$$
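The steps above can be sketched as follows (the two-state TPM is again invented for illustration):

```python
import numpy as np

# Hypothetical one-step TPM.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

i, n = 0, 4
e = np.zeros(2)
e[i] = 1.0  # unit state vector: only the i-th element is 1

# Distribution over states after n steps, starting from state i.
dist = e @ np.linalg.matrix_power(P, n)

print(dist)
assert np.isclose(dist.sum(), 1.0)  # the result is a probability distribution
```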

## Properties of TPM

(To be continued)