CS345, Machine Learning
Prof. Alvarez

The Forward-Backward Algorithm

This algorithm addresses the problem of estimating the parameters of a Hidden Markov Model (HMM) under which an observed sequence y_1...y_k of output values is most likely to be generated. The underlying topology of the HMM is assumed to be known in advance. The objective is to find the values of the state transition probabilities, and of the output probabilities for each transition, that maximize the likelihood of the observed output sequence. The forward-backward algorithm may be viewed as an Expectation Maximization (EM) approach to this problem.
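To fix ideas, the quantities to be estimated can be laid out concretely in code. The sketch below, in Python/NumPy, is an illustration only: the array names, the random initialization, and the inclusion of an initial-state distribution are assumptions, not part of these notes. Outputs are attached to transitions, matching the q(y | s --> s') notation used below.

    import numpy as np

    # Illustrative HMM parameterization (names are assumptions):
    #   trans[s, s2]   = p(s --> s2), probability of the transition s --> s2
    #   emit[s, s2, o] = q(o | s --> s2), probability that output o is
    #                    produced on the transition s --> s2
    #   init[s]        = probability that the chain starts in state s
    n_states, n_outputs = 3, 4
    rng = np.random.default_rng(0)

    trans = rng.random((n_states, n_states))
    trans /= trans.sum(axis=1, keepdims=True)    # each row is a distribution
    emit = rng.random((n_states, n_states, n_outputs))
    emit /= emit.sum(axis=2, keepdims=True)      # distribution over outputs
    init = np.full(n_states, 1.0 / n_states)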

Key Quantities

The forward-backward algorithm iteratively updates probability estimates using pseudo-counts associated with a given output sequence y_1...y_k. The key target quantities are the following:
P*(i, s', s) = P(y_1...y_k observed AND the i-th state transition is s --> s')

c(s --> s') = sum from i=0 to i=k-1 of P*(i, s', s)

(The i-th transition leads from the i-th state to the (i+1)-st state and produces output y_{i+1}, so for a sequence of k outputs the last transition is transition k-1.)

Notice that c is defined in terms of P*. In turn, P* can be computed from intermediate quantities alpha_i(s) and beta_i(s) defined as follows:
alpha_i(s) = P(y_1...y_i observed AND the i-th state is s)

beta_i(s) = P(y_{i+1}...y_k observed GIVEN that the i-th state is s)

Note that beta_i, unlike alpha_i, is a conditional probability; this is what makes the product expression below correct.

In fact, we have:
P*(i, s', s) = alpha_i(s) * p(s --> s') * q(y_{i+1} | s --> s') * beta_{i+1}(s')

where p(s --> s') is the probability of the transition s --> s' and q(y | s --> s') is the probability that output y is produced on that transition.
Each iteration of the forward-backward algorithm recomputes the alphas and betas under the current parameter estimates, obtains P* and the pseudo-counts c from the preceding expressions, and then re-estimates the transition and output probabilities as the maximum likelihood values given the pseudo-counts. See the pseudocode below.
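Before turning to the pseudocode, here is a sketch of how the alphas, betas, and P* can be computed for given parameter values, again in Python/NumPy. The function name and array layout are assumptions for illustration; pstar[i, s, s2] corresponds to P*(i, s2, s) above, for the transition s --> s2, and init is an assumed initial-state distribution.

    import numpy as np

    def forward_backward_quantities(y, trans, emit, init):
        # y is a list of output indices; y[i] is y_{i+1} in the notes'
        # 1-based notation, the output of the i-th transition.
        k = len(y)
        n = trans.shape[0]

        # Forward stage: alpha_0 is the initial-state distribution;
        # alpha_{i+1}(s') = sum_s alpha_i(s) * p(s --> s') * q(y_{i+1} | s --> s')
        alpha = np.zeros((k + 1, n))
        alpha[0] = init
        for i in range(k):
            alpha[i + 1] = alpha[i] @ (trans * emit[:, :, y[i]])

        # Backward stage: beta_k(s) = 1 for all s;
        # beta_i(s) = sum_{s'} p(s --> s') * q(y_{i+1} | s --> s') * beta_{i+1}(s')
        beta = np.zeros((k + 1, n))
        beta[k] = 1.0
        for i in range(k - 1, -1, -1):
            beta[i] = (trans * emit[:, :, y[i]]) @ beta[i + 1]

        # P*: pstar[i, s, s2] =
        #   alpha_i(s) * p(s --> s2) * q(y_{i+1} | s --> s2) * beta_{i+1}(s2)
        pstar = np.zeros((k, n, n))
        for i in range(k):
            pstar[i] = alpha[i][:, None] * trans * emit[:, :, y[i]] * beta[i + 1]

        return alpha, beta, pstar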

Forward-Backward Algorithm Pseudocode

Iterate the following stages until the termination condition is met (for example, until the likelihood of y_1...y_k stops improving):

Forward stage: compute alpha_0(s), ..., alpha_k(s) from left to right, starting from the initial-state distribution, using the recurrence alpha_{i+1}(s') = sum over s of alpha_i(s) * p(s --> s') * q(y_{i+1} | s --> s').

Backward stage: compute beta_k(s), ..., beta_0(s) from right to left, starting from beta_k(s) = 1 for all s, using the recurrence beta_i(s) = sum over s' of p(s --> s') * q(y_{i+1} | s --> s') * beta_{i+1}(s').

Parameter updating: compute P* and the pseudo-counts c as described above, then re-estimate each transition probability as p(s --> s') = c(s --> s') / (sum over s'' of c(s --> s'')), and re-estimate the output probabilities q analogously from the pseudo-counts of each output on each transition.
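Putting the stages together, here is a minimal sketch of the full loop, reusing forward_backward_quantities from the sketch above. The specific termination test (stop when the likelihood of y_1...y_k stops improving by more than tol) and the choice to hold the initial-state distribution fixed are assumptions, since the notes leave both unspecified.

    def forward_backward(y, trans, emit, init, tol=1e-8, max_iters=200):
        prev_like = -np.inf
        for _ in range(max_iters):
            alpha, _, pstar = forward_backward_quantities(y, trans, emit, init)
            like = alpha[-1].sum()        # P(y_1...y_k) under current parameters
            if like - prev_like <= tol:   # assumed termination condition
                break
            prev_like = like

            # Pseudo-counts: c(s --> s') = sum over i of P*(i, s', s).
            # The tiny constant guards against division by zero for
            # transitions that are never used.
            c = pstar.sum(axis=0) + 1e-12

            # Parameter updating: normalize pseudo-counts into probabilities.
            trans = c / c.sum(axis=1, keepdims=True)

            # Output probabilities: expected count of each output on each
            # transition, normalized into a distribution per transition.
            out_counts = np.full_like(emit, 1e-12)
            for i, o in enumerate(y):
                out_counts[:, :, o] += pstar[i]
            emit = out_counts / out_counts.sum(axis=2, keepdims=True)
        return trans, emit

For long sequences the products of many probabilities underflow, so practical implementations scale the alphas and betas or work in log space; the sketch above omits this for clarity.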