Markov chain models | Vose Software

Markov chain models

See also: Time series introduction, Time series modeling in finance, Autoregressive models, Credit ratings and Markov Chain models, Time series in ModelRisk

A Markov Chain model follows a number of individuals who each begin in one of the allowed states of a system and who may or may not randomly change (transition) into other allowed states over time.

A Markov Chain has no memory, meaning that the joint distribution of how many individuals will be in each allowed state depends only on how many were in each state the moment before, not on the pathways that led there. This lack of memory is known as the Markov property. Markov chains come in two flavours: continuous time and discrete time. We will look at a discrete time process first because it is the easiest to model.

Discrete time Markov chain

In a discrete time Markov process the individuals can move between states only at set (usually equally spaced) intervals of time. Consider a set of 100 individuals in the following four marital states:

  • 43 are Single

  • 29 are Married

  • 11 are Separated

  • 17 are Divorced

We write this as a vector: {43, 29, 11, 17}.

Given sufficient time (let's say a year) there is a reasonable probability that the individuals can change state. We can construct a matrix of the transition probabilities as follows:

Transition matrix

  Was \ Is now:   Single   Married   Separated   Divorced
  Single           0.85     0.12      0.02        0.01
  Married          0        0.88      0.08        0.04
  Separated        0        0.13      0.45        0.42
  Divorced         0        0.09      0.02        0.89

We read this matrix row by row. For example, the first row says that a single person has an 85% chance of still being single one year later, a 12% chance of being married, a 2% chance of being separated and a 1% chance of being divorced. Since these are the only allowed states (e.g. we haven't included 'Engaged', so that must be rolled up into 'Single') the probabilities in each row must sum to 100%. Of course we'd have to decide what a death would mean: the transition matrix could either be defined such that if a person dies they retain their marital status for this model, or we could make this a transition matrix conditional on the person surviving the year.

Notice that the Single column is all 0's, except the Single/Single cell, because once one is married the only states allowed after that are Married, Separated and Divorced. Also note that one can go directly from Single to Separated or Divorced, which implies that during that year the individual passed through the Married state. Markov chain transition matrices describe the probability that one is in a state at some precise time, given some state at a previous time, and are not concerned with how one got there, i.e. all the other states one might have passed through.
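In code, the transition matrix can be written down and sanity-checked directly. A minimal sketch (the state labels and variable names here are our own, not ModelRisk's):

```python
import numpy as np

# One-year marital-status transition matrix from the table above.
states = ["Single", "Married", "Separated", "Divorced"]
P = np.array([
    [0.85, 0.12, 0.02, 0.01],   # Was: Single
    [0.00, 0.88, 0.08, 0.04],   # Was: Married
    [0.00, 0.13, 0.45, 0.42],   # Was: Separated
    [0.00, 0.09, 0.02, 0.89],   # Was: Divorced
])

# Every row must sum to 1: the four listed states are the only allowed
# destinations, so the probabilities in each row are exhaustive.
assert np.allclose(P.sum(axis=1), 1.0)
```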

We now have the two elements of the model, the initial state vector and the transition matrix, to estimate how many individuals will be in each state after a year. Let's go through an example calculation to estimate how many people will be Married in one year:

  • For the Single people: Binomial(43,0.12) will be married

  • For the Married people: Binomial(29,0.88) will be married

  • For the Separated people: Binomial(11,0.13) will be married

  • For the Divorced people: Binomial(17,0.09) will be married

Add together these four Binomial distributions and we get an estimate of the number of people from our group who will be married next year. However, the above calculation does not work when we want to look at the joint distribution of how many people will be in each state: we cannot simply simulate four independent Binomial distributions for each initial state, because the transitions out of each state are not independent - the individuals leaving each state must add up to that state's initial count, and the overall total must sum to 100 people. Instead, we need to use the Multinomial distribution. The number of people who were Single but are now {Single, Married, Separated, Divorced} equals Multinomial(43,{0.85, 0.12, 0.02, 0.01}). Applying the Multinomial distribution for the other three initial states we can take a random sample from each Multinomial and add up how many are in each state, as shown in the following model:
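The multinomial method above can be sketched in a few lines of numpy (the seed is arbitrary, chosen only for reproducibility; the example model below uses ModelRisk's own functions):

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed for reproducibility

# One-year transition matrix and initial head-counts from the text.
P = np.array([
    [0.85, 0.12, 0.02, 0.01],
    [0.00, 0.88, 0.08, 0.04],
    [0.00, 0.13, 0.45, 0.42],
    [0.00, 0.09, 0.02, 0.89],
])
n0 = np.array([43, 29, 11, 17])  # Single, Married, Separated, Divorced

# One Multinomial draw per starting state: row i of `transitions` splits
# the n0[i] individuals among the four destination states.
transitions = np.array([rng.multinomial(n, p) for n, p in zip(n0, P)])

# Column sums give the joint head-count per state one year later.
n1 = transitions.sum(axis=0)

assert n1.sum() == 100  # head-counts are conserved
```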

Example model   Markov_chain_multinomial_method - Multinomial method of performing a Markov Chain model

Let's now look at extending the model to predict further ahead in time, say five years. If we can assume that the probability transition matrix remains valid for that period, and that nobody in our group dies, we could repeat the above exercise five times - calculating in each year how many individuals are in each state and using that as the input into the next year, etc. However, there is a more efficient method.

The probability a person starting in state i is in state j after two years is determined by looking at the probability of the person going from state i to each state after one year, and then going from that state to state j in the second year. So, for example, the probability of changing from Single to Divorced after two years is:

P(Single to Single) * P(Single to Divorced) +

P(Single to Married) * P(Married to Divorced) +

P(Single to Separated) * P(Separated to Divorced) +

P(Single to Divorced) * P(Divorced to Divorced)

=0.85*0.01 + 0.12*0.04 + 0.02*0.42 + 0.01*0.89 = 0.0306

Notice how we have multiplied the elements in the first row (Single) by the elements in the last column (Divorced) and added them. This is the operation performed in matrix multiplication. We can therefore determine the probability transition matrix over the two year period by simply multiplying the one year transition matrix by itself (using Excel's MMULT function) in the following model:
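The equivalent of that MMULT step in numpy is a single matrix product, and it reproduces the hand calculation above:

```python
import numpy as np

P = np.array([
    [0.85, 0.12, 0.02, 0.01],
    [0.00, 0.88, 0.08, 0.04],
    [0.00, 0.13, 0.45, 0.42],
    [0.00, 0.09, 0.02, 0.89],
])

# Two-year transition matrix: the one-year matrix multiplied by itself.
P2 = P @ P

# Single -> Divorced over two years matches the worked example: 0.0306
assert np.isclose(P2[0, 3], 0.0306)

# P2 is itself a valid transition matrix: rows still sum to 1.
assert np.allclose(P2.sum(axis=1), 1.0)
```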

 

Example model   Markov chain multinomial method 2 - multinomial method of performing a Markov Chain model with time an integer > 1 unit

When one wants to forecast T periods in advance where T is large, performing the matrix multiplication (T-1) times can become rather tedious, but there is some mathematics based on transforming the matrix that allows one to directly determine the transition matrix over any number of periods. ModelRisk provides some efficient means to do this: the VoseMarkovMatrix function calculates the transition matrix for any time length, and the VoseMarkovSample goes the next step, simulating how many individuals are in each final state after some period. In this next example we calculate the transition matrix and simulate how many individuals will be in each state after 25 years:

Example model   Marital_status_in_25_years  - ModelRisk methods for performing a Markov Chain model with time an integer > 1 unit

Notice how after 25 years the probability of being Married is about 45% irrespective of what state one started in: a similar situation occurs for Separated and Divorced. This stabilizing property is very common and, as a matter of interest, is the basis of a statistical technique discussed briefly elsewhere in this help file. Of course, the above calculation does assume that the transition matrix for one year is valid to apply over such a long period (a big assumption in this case).
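The stabilizing behaviour is easy to verify numerically: raising the one-year matrix to the 25th power (numpy's matrix_power does the repeated multiplication in one call - a stand-in here for VoseMarkovMatrix, whose internals we don't reproduce) gives rows that are nearly identical:

```python
import numpy as np

P = np.array([
    [0.85, 0.12, 0.02, 0.01],
    [0.00, 0.88, 0.08, 0.04],
    [0.00, 0.13, 0.45, 0.42],
    [0.00, 0.09, 0.02, 0.89],
])

# Transition matrix over 25 years.
P25 = np.linalg.matrix_power(P, 25)

# The rows are almost equal: the chain has (nearly) forgotten its
# starting state after 25 years.
assert np.abs(P25 - P25[0]).max() < 0.05

# P(Married) is about 45% irrespective of the initial state.
assert abs(P25[0, 1] - 0.45) < 0.05
```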

Application: credit ratings

Markov chain modeling is a useful tool for modeling the variation in a corporation's credit rating over time. This application is explained in the topic Credit ratings and Markov Chain models.

Continuous time Markov chain

For a continuous time Markov process we need to be able to produce the transition matrix for any positive time increment, not just an integer multiple of the time that applies to the base transition matrix. So, for example, we might have the above marital status transition matrix for a single year but wish to know what the matrix is for half a year, or 2.5 years.

There is a mathematical technique for finding the required matrix, based on converting the multinomial probabilities in the matrix into Poisson intensities that match the required probability. The mathematical manipulation is somewhat complex, particularly when one has to wrestle with numerical stability. The ModelRisk functions VoseMarkovMatrix and VoseMarkovSample detect when you are using non-integer time and automatically convert to the alternative mathematics.
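One way to sketch a fractional-time matrix (this is an eigendecomposition shortcut, not ModelRisk's Poisson-intensity algorithm) is to diagonalise P and raise its eigenvalues to the fractional power, P^t = V diag(w^t) V^-1. This assumes P is diagonalisable and that the result is itself a valid transition matrix - in general the entries should be checked for negativity:

```python
import numpy as np

P = np.array([
    [0.85, 0.12, 0.02, 0.01],
    [0.00, 0.88, 0.08, 0.04],
    [0.00, 0.13, 0.45, 0.42],
    [0.00, 0.09, 0.02, 0.89],
])

# Diagonalise once; fractional powers then come from powering the
# eigenvalues. np.real() discards negligible imaginary round-off.
w, V = np.linalg.eig(P)
V_inv = np.linalg.inv(V)

def transition_matrix(t):
    """Transition matrix over t years (t need not be an integer)."""
    return np.real(V @ np.diag(w ** t) @ V_inv)

P_half = transition_matrix(0.5)  # half-year matrix

assert np.allclose(P_half @ P_half, P)       # two half-years = one year
assert np.allclose(P_half.sum(axis=1), 1.0)  # rows still sum to 1
```

For this particular matrix all eigenvalues are real and positive, so the half-year matrix is well-behaved; for matrices with negative or complex eigenvalues the fractional power may not exist as a valid transition matrix at all, which is part of why the production implementation has to wrestle with numerical stability.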

Read on: Birth and death models
