Absorbing Markov Models

The chart review example is known as a regular Markov chain. The transition probabilities are constant and depend only on the current state of the process, and any state can be reached from any other state, although not necessarily in one step (e.g., Maureen cannot immediately follow Maureen, but she can after two or more cycles). Regular chains converge to a limiting set of probabilities. The other principal category of Markov model is the absorbing chain. In these systems there is a state that can be reached, in a finite number of moves, from any other state, but from which no movement to any other state is possible. Once the process enters the absorbing state, it terminates (i.e., it stays in that state forever). The analogy with clinical decision models is obvious: an absorbing Markov model has a state equivalent to death in the clinical problem.

Behavior of the Absorbing Model

Figure 4.1 shows a simplified three-state absorbing clinical Markov model. In a clinical model, the notion of time arises naturally. Assume the model represents a clinical process in which progression of disease is possible and in which death often follows progressive disease. In any given month the patient may be in the Well state, shown at the upper left of Figure 4.1, the Progressive (sick) state at the upper right, or the Dead state at the lower center. A patient in the Well state most likely remains well through the ensuing month and is next found in the Well state. Alternatively, the patient could become sick and enter the Progressive state, or die and move to the Dead state. A patient in the Progressive state most likely stays in that state, but could also die from it, presumably with a higher probability than from the Well state; there is also a very small probability of returning to the Well state.

A possible transition probability matrix for this model is shown in Table 4.5. In the upper row, a Well patient remains so with probability 80%, has a 15% chance of having progressive disease over one cycle, and a 5% chance of dying in the cycle. A sick patient with progressive disease is shown with a 2% chance of returning to the Well state, a 28% chance of dying in one month, and the remainder (i.e., 70% chance) staying in the Progressive state. Of course, the Dead state is absorbing, reflected by a 100% chance of staying Dead.

Table 4.5 is a probability matrix, so it can be multiplied just as in the prior example. The matrix after two cycles is shown in Table 4.6.

FIGURE 4.1 Simple Three-State Absorbing Markov Model.

TABLE 4.5
Transition Probability Matrix for Clinical Example

                        Next
Current        Well     Progressive    Dead
Well           0.80     0.15           0.05
Progressive    0.02     0.70           0.28
Dead           0.00     0.00           1.00
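
As a minimal sketch (assuming numpy, which is not part of the original text), the matrix in Table 4.5 can be written down directly; the only check performed here is that each row sums to 1, as every transition probability matrix must.

    import numpy as np

    # Transition probability matrix from Table 4.5.
    # Rows = current state, columns = next state, ordered Well, Progressive, Dead.
    STATES = ["Well", "Progressive", "Dead"]
    P = np.array([
        [0.80, 0.15, 0.05],   # Well -> Well, Progressive, Dead
        [0.02, 0.70, 0.28],   # Progressive -> Well, Progressive, Dead
        [0.00, 0.00, 1.00],   # Dead is absorbing: it transitions only to itself
    ])

    # Every row of a transition probability matrix must sum to 1.
    assert np.allclose(P.sum(axis=1), 1.0)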

TABLE 4.6
Two-Cycle State Matrix for Clinical Example

               Well     Progressive    Dead
Well           0.643    0.225          0.132
Progressive    0.030    0.493          0.477
Dead           0.000    0.000          1.000
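
Assuming the same numpy representation of Table 4.5 as above, squaring the matrix reproduces the two-cycle values in Table 4.6; a sketch:

    import numpy as np

    P = np.array([
        [0.80, 0.15, 0.05],
        [0.02, 0.70, 0.28],
        [0.00, 0.00, 1.00],
    ])

    # Two applications of the transition matrix give the two-cycle probabilities.
    P2 = np.linalg.matrix_power(P, 2)
    print(np.round(P2, 3))   # top row is [0.643, 0.225, 0.132], matching Table 4.6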

Thus, after two cycles of the Markov process, someone who started in the Well state has slightly less than a two-thirds chance of staying well and a 22.5% chance of having Progressive disease. By the 10th cycle, the top row of the transition matrix is

Well     Progressive    Dead
0.124    0.126          0.750

So, someone starting well has a 75% chance of being dead within 10 cycles and, of the remaining 25%, a roughly even chance of being well or having Progressive disease. The matrix converges slowly because the probability of death in any single cycle is only moderate, but eventually every row of this matrix converges to

0   0   1

Everyone in this process eventually dies.
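
The same repeated multiplication reproduces the 10-cycle row quoted above and shows the eventual absorption; a sketch under the same numpy assumption:

    import numpy as np

    P = np.array([
        [0.80, 0.15, 0.05],
        [0.02, 0.70, 0.28],
        [0.00, 0.00, 1.00],
    ])

    # Top row after 10 cycles: roughly 0.124 Well, 0.126 Progressive, 0.750 Dead.
    print(np.round(np.linalg.matrix_power(P, 10)[0], 3))

    # After a great many cycles every row approaches (0, 0, 1):
    # everyone is eventually absorbed into the Dead state.
    print(np.round(np.linalg.matrix_power(P, 500), 3))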

Clinical Markov models offer interesting insights into the natural history of a process. If the top row of the transition matrix is taken at each cycle and graphed, Figure 4.2 results. This graph can be interpreted as the fate of a cohort of patients who begin together in the Well state. Membership in the Well state decreases rapidly, as the forward transitions to Progressive and Dead overwhelm the back transition from Progressive to Well. The Progressive state grows at first, as it collects patients transitioning from Well, but soon the transitions to Dead, which, of course, are permanent, cause the state to lose members. The Progressive state peaks at Cycle 4, with 25.6% of the cohort. The Dead state traces a sigmoid (S-shaped) curve: it rises modestly for the first few cycles, while most of the cohort is still Well, then steepens as the 28% mortality from the Progressive state takes effect, and finally flattens as few people remain alive. This graph is typical of absorbing Markov process models.
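
One way Figure 4.2 could be reproduced (a sketch, again assuming numpy; the plotting itself is omitted) is to follow the top row of the matrix cycle by cycle, which is the same as tracking a cohort that starts entirely in the Well state:

    import numpy as np

    P = np.array([
        [0.80, 0.15, 0.05],
        [0.02, 0.70, 0.28],
        [0.00, 0.00, 1.00],
    ])

    cohort = np.array([1.0, 0.0, 0.0])   # everyone starts in the Well state
    history = [cohort]
    for _ in range(20):                  # follow the cohort for 20 cycles
        cohort = cohort @ P              # one cycle of the Markov process
        history.append(cohort)
    history = np.array(history)          # rows = cycles; columns = Well, Progressive, Dead

    # The Progressive state peaks at cycle 4 with about 25.6% of the cohort.
    peak = int(history[:, 1].argmax())
    print(peak, round(history[peak, 1], 3))   # 4 0.256

    # Plotting the three columns of `history` against the cycle number
    # gives curves of the same shape as Figure 4.2.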

 