Tutorial 10

1) Hidden Markov Model Algorithms


Question 1: The Viterbi, forward, backward and Baum-Welch algorithms can all be applied to hidden Markov models. What does each algorithm do?

Viterbi algorithm:  

Forward algorithm:  

Backward algorithm:  

Baum-Welch algorithm:  
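To make "most probable state path" concrete, here is a minimal Viterbi decoder on a made-up two-state HMM (the states A/O and symbols T/S echo the alphabet used later in this tutorial, but all probabilities below are invented for illustration, not the HMM from Question 2):

```python
import numpy as np

states = ["A", "O"]
# Made-up toy parameters -- not the HMM from this tutorial.
trans = np.array([[0.7, 0.3],       # from A: a_AA, a_AO
                  [0.4, 0.6]])      # from O: a_OA, a_OO
emit = {"T": np.array([0.8, 0.3]),  # P(T | A), P(T | O)
        "S": np.array([0.2, 0.7])}
init = np.array([0.5, 0.5])

def viterbi(x):
    """Most probable state path for symbol sequence x."""
    v = init * emit[x[0]]            # best-path probability ending in each state
    ptr = []                         # backpointers, one array per position
    for sym in x[1:]:
        scores = v[:, None] * trans  # scores[k, l]: best path via k, then k -> l
        best = scores.argmax(axis=0)
        ptr.append(best)
        v = scores.max(axis=0) * emit[sym]
    path = [int(v.argmax())]
    for best in reversed(ptr):       # trace the backpointers
        path.append(int(best[path[-1]]))
    return "".join(states[k] for k in reversed(path))

print(viterbi("TTS"))  # -> "AAO"
```

The forward algorithm uses the same recursion with `sum` in place of `max` (giving $P(x)$ rather than the best path), and the backward algorithm runs the analogous recursion from the end of the sequence.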








2) Baum-Welch Algorithm





Suppose that we have observed the following two sequences, each a symbol sequence paired with its state path: $(x^1, \pi^1) = (TTS, AOH)$ and $(x^2, \pi^2) = (TST, AOA)$.

We will use the Baum-Welch algorithm on these training sequences to estimate the parameters $a_{AA}$, $a_{AH}$, $a_{AO}$, $e_A(T)$ and $e_A(S)$ for the HMM above. This can be done using the following equations (where $a'$ and $e'$ are expected counts, and $f$ and $b$ are the forward and backward variables):

$$ a'_{kl} = \sum\limits_{j=1}^2 \frac{1}{P(x^j)} \sum\limits_{i=1}^{3-1} f^j_k(i)\; a_{kl} \; e_l(x^j_{i+1}) \; b^j_l(i+1) $$

$$ e'_k(b) = \sum\limits_{j=1}^2 \frac{1}{P(x^j)} \sum\limits_{i:x_i^j=b} f^j_k(i) \; b^j_k(i) $$
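The $f^j_k(i)$ and $b^j_k(i)$ in these equations come from the forward and backward algorithms. A minimal sketch of both recursions, using uniform placeholder parameters (the real starting values come from Question 2a, and the start distribution here is an assumption):

```python
import numpy as np

states, symbols = ["H", "A", "O"], ["S", "T"]
# Uniform placeholder parameters; replace with the Question 2a estimates.
a = np.full((3, 3), 1 / 3)   # a[k, l] = P(next state l | current state k)
e = np.full((3, 2), 1 / 2)   # e[k, b] = P(symbol b | state k)
start = np.full(3, 1 / 3)    # assumed initial state distribution

def forward(x):
    """f[i, k] = P(x_1..x_i, state at position i is k)."""
    f = np.zeros((len(x), 3))
    f[0] = start * e[:, symbols.index(x[0])]
    for i in range(1, len(x)):
        f[i] = (f[i - 1] @ a) * e[:, symbols.index(x[i])]
    return f

def backward(x):
    """b[i, k] = P(x_{i+1}..x_L | state at position i is k)."""
    b = np.ones((len(x), 3))
    for i in range(len(x) - 2, -1, -1):
        b[i] = a @ (e[:, symbols.index(x[i + 1])] * b[i + 1])
    return b

f, b = forward("TTS"), backward("TTS")
px = f[-1].sum()   # P(x), read off the final forward column
# Sanity check: at every position i, sum_k f_k(i) b_k(i) = P(x).
print(px, (f * b).sum(axis=1))
```

The product $f_k(i)\,b_k(i)/P(x)$ is the posterior probability of being in state $k$ at position $i$, which is exactly what the $e'$ equation accumulates.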

Question 2a) Use a pseudo-count of 1 to estimate the initial values for $a$ and $e$ (i.e. count the observed number of each event and add 1):

$a = $ (rows: from state, columns: to state)

|       | H | A | O |
|-------|---|---|---|
| **H** |   |   |   |
| **A** |   |   |   |
| **O** |   |   |   |

$e = $ (rows: state, columns: emitted symbol)

|       | S | T |
|-------|---|---|
| **H** |   |   |
| **A** |   |   |
| **O** |   |   |
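The counting procedure for Question 2a can be sketched directly from the two labelled training pairs: tally each observed transition and emission, add the pseudo-count of 1, then normalise each row. Check the numbers this produces against your own hand counts and HMM.ipynb:

```python
states = ["H", "A", "O"]
symbols = ["S", "T"]
training = [("TTS", "AOH"), ("TST", "AOA")]  # (x, pi) pairs from the question

# Start every count at the pseudo-count of 1.
a_count = {k: {l: 1 for l in states} for k in states}
e_count = {k: {b: 1 for b in symbols} for k in states}

for x, pi in training:
    for i in range(len(pi) - 1):          # tally state transitions
        a_count[pi[i]][pi[i + 1]] += 1
    for sym, st in zip(x, pi):            # tally emissions
        e_count[st][sym] += 1

# Normalise each row of counts into probabilities.
a = {k: {l: n / sum(a_count[k].values()) for l, n in a_count[k].items()}
     for k in states}
e = {k: {b: n / sum(e_count[k].values()) for b, n in e_count[k].items()}
     for k in states}

print(a["A"], e["A"])
```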


Question 2b) What are the three steps in the Baum-Welch algorithm? Use HMM.ipynb to help you perform these steps. We will apply only one iteration.

Step 1:

Step 2:

Step 3:
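One iteration might be sketched as below, assuming the three steps are: (1) run the forward and backward algorithms on each sequence, (2) accumulate the expected counts $a'$ and $e'$ using the equations above, and (3) normalise the counts into new parameters. The starting parameters and start distribution here are uniform placeholders, not the Question 2a estimates, and no pseudo-counts are added; compare against HMM.ipynb:

```python
import numpy as np

states, symbols = ["H", "A", "O"], ["S", "T"]
xs = ["TTS", "TST"]             # Baum-Welch uses only the symbol sequences
a = np.full((3, 3), 1 / 3)      # placeholder starting parameters
e = np.full((3, 2), 1 / 2)
start = np.full(3, 1 / 3)       # assumed, and held fixed below

def forward(x):
    f = np.zeros((len(x), 3))
    f[0] = start * e[:, symbols.index(x[0])]
    for i in range(1, len(x)):
        f[i] = (f[i - 1] @ a) * e[:, symbols.index(x[i])]
    return f

def backward(x):
    b = np.ones((len(x), 3))
    for i in range(len(x) - 2, -1, -1):
        b[i] = a @ (e[:, symbols.index(x[i + 1])] * b[i + 1])
    return b

a_new, e_new = np.zeros_like(a), np.zeros_like(e)
for x in xs:
    f, b = forward(x), backward(x)              # step 1: forward/backward
    px = f[-1].sum()
    for i in range(len(x) - 1):                 # step 2: expected counts
        el = e[:, symbols.index(x[i + 1])]
        # a'_{kl} += f_k(i) a_{kl} e_l(x_{i+1}) b_l(i+1) / P(x)
        a_new += np.outer(f[i], el * b[i + 1]) * a / px
    for i, sym in enumerate(x):
        # e'_k(b) += f_k(i) b_k(i) / P(x)
        e_new[:, symbols.index(sym)] += f[i] * b[i] / px
a = a_new / a_new.sum(axis=1, keepdims=True)    # step 3: normalise
e = e_new / e_new.sum(axis=1, keepdims=True)
```

With uniform starting parameters the transitions stay uniform after one iteration, while the emissions move toward the observed symbol frequencies (4 of the 6 observed symbols are T).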




3) UPGMA Trees

$$ D_1 = \begin{bmatrix} 0 & 6 & 8 \\ & 0 & 5 \\ & & 0 \end{bmatrix} $$

$$ D_2 = \begin{bmatrix} 0 & 4 & 5 & 2 \\ & 0 & 1 & 3 \\ & & 0 & 8 \\ & & & 0 \end{bmatrix} $$

Question 3: Construct the UPGMA trees for the distance matrices above.

Tree 1:

Tree 2:
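To check your hand-built trees, here is a minimal UPGMA sketch: repeatedly merge the closest pair of clusters at half their distance, replacing their distances to every other cluster by the size-weighted average. The leaf names (A, B, C for $D_1$; A, B, C, D for $D_2$, in matrix order) are an assumption, since the matrices above are unlabelled:

```python
def upgma(dist):
    """dist: {frozenset({x, y}): distance} over leaf names.
    Returns (nested-tuple tree, root height)."""
    dist = dict(dist)                    # work on a copy
    size = {leaf: 1 for pair in dist for leaf in pair}
    height = {leaf: 0.0 for leaf in size}
    while len(size) > 1:
        pair = min(dist, key=dist.get)   # closest pair of clusters
        x, y = tuple(pair)
        h = dist.pop(pair) / 2           # both children hang at depth h
        merged = (x, y)
        for z in [c for c in size if c not in (x, y)]:
            dxz = dist.pop(frozenset({x, z}))
            dyz = dist.pop(frozenset({y, z}))
            # size-weighted average = mean distance over all leaf pairs
            dist[frozenset({merged, z})] = (
                size[x] * dxz + size[y] * dyz) / (size[x] + size[y])
        size[merged] = size.pop(x) + size.pop(y)
        height[merged] = h
    root = next(iter(size))
    return root, height[root]

D1 = {frozenset({"A", "B"}): 6, frozenset({"A", "C"}): 8,
      frozenset({"B", "C"}): 5}
D2 = {frozenset({"A", "B"}): 4, frozenset({"A", "C"}): 5,
      frozenset({"A", "D"}): 2, frozenset({"B", "C"}): 1,
      frozenset({"B", "D"}): 3, frozenset({"C", "D"}): 8}
print(upgma(D1))
print(upgma(D2))
```

For $D_1$, B and C merge first at height 2.5; for $D_2$, B and C merge at height 0.5 and then A and D at height 1, before the final merge.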