Question
All info is provided. Don't mark the question for review if you aren't sure how to do it as is.
We consider the following hidden Markov model (HMM).
[Diagram: a two-step HMM with hidden states S1 → S2 and emitted observations Z1 and Z2.]
We have binary states (labeled s and s̄) and binary observations (labeled z and z̄), with probabilities as in the tables below.
| S1 | P(S1) |
|----|-------|
| s  | 0.3   |
| s̄  | 0.7   |
| Si | Si+1 | P(Si+1 \| Si) |
|----|------|---------------|
| s  | s    | 0.2           |
| s  | s̄    | 0.8           |
| s̄  | s    | 0.9           |
| s̄  | s̄    | 0.1           |
| Si | Zi | P(Zi \| Si) |
|----|----|-------------|
| s  | z  | 0.1         |
| s  | z̄  | 0.9         |
| s̄  | z  | 0.8         |
| s̄  | z̄  | 0.2         |
The above is the setup for the following question:
We observed Z1=z̄ and then Z2=z. Using the forward algorithm, we obtained P(S1=s|Z1=z̄) = 0.6585. Please use the forward algorithm to obtain the following probabilities (keep only four decimal places for all probabilities):
- [5 points] P(S2=s|Z1=z̄)
- [5 points] P(S2=s|Z1=z̄, Z2=z)
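The forward recursion (filter, predict, update) for this two-step chain can be sketched as follows. This is a minimal sketch assuming the index encoding {s: 0, s̄: 1} for states and {z: 0, z̄: 1} for observations; it reproduces the given value P(S1=s|Z1=z̄) = 0.6585 from the tables above.

```python
# Forward algorithm for the two-step binary HMM above.
# States: 0 = s, 1 = s_bar; observations: 0 = z, 1 = z_bar.
prior = [0.3, 0.7]                # P(S1)
trans = [[0.2, 0.8], [0.9, 0.1]]  # trans[i][j] = P(S_{i+1}=j | S_i=i)
emit = [[0.1, 0.9], [0.8, 0.2]]   # emit[i][k] = P(Z_i=k | S_i=i)

def normalize(v):
    total = sum(v)
    return [x / total for x in v]

# Filtering step: P(S1 | Z1=z_bar), z_bar has observation index 1.
alpha1 = normalize([prior[i] * emit[i][1] for i in range(2)])
# alpha1[0] should match the given P(S1=s | Z1=z_bar) = 0.6585.

# Prediction step: P(S2 | Z1=z_bar), pushing the filtered belief
# through the transition model.
pred = [sum(alpha1[i] * trans[i][j] for i in range(2)) for j in range(2)]

# Update step: fold in Z2=z (observation index 0) and renormalize
# to get P(S2 | Z1=z_bar, Z2=z).
alpha2 = normalize([pred[j] * emit[j][0] for j in range(2)])

print(round(alpha1[0], 4), round(pred[0], 4), round(alpha2[0], 4))
```

The unnormalized filter at step 1 is [0.3·0.9, 0.7·0.2] = [0.27, 0.14], and normalizing by 0.41 gives the stated 0.6585.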
The following is for bonus points; you should leave it for last in your to-do list.
After observing both pieces of evidence, we can make a better inference about the state S1. Use either (i) Bayesian inference or (ii) Bayes' rule and conditional independence to calculate the following probability:
- [6 bonus points] P(S1=s|Z1=z̄, Z2=z)
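The bonus quantity can be sketched with Bayes' rule plus the chain's conditional independences: given S1, the likelihood of Z2 is obtained by marginalizing over S2, so P(S1 | Z1, Z2) ∝ P(S1) · P(Z1 | S1) · Σ_{S2} P(S2 | S1) · P(Z2 | S2). A minimal sketch, assuming the same index encoding as before ({s: 0, s̄: 1} and {z: 0, z̄: 1}):

```python
# Smoothing sketch: P(S1 | Z1=z_bar, Z2=z) via Bayes' rule and
# conditional independence of Z1 and Z2 given the state chain.
prior = [0.3, 0.7]                # P(S1)
trans = [[0.2, 0.8], [0.9, 0.1]]  # P(S2=j | S1=i)
emit = [[0.1, 0.9], [0.8, 0.2]]   # P(Z=k | S=i)

# Likelihood of Z2=z given S1=i, marginalizing out S2:
# P(Z2=z | S1=i) = sum_j P(S2=j | S1=i) * P(Z2=z | S2=j)
like_z2 = [sum(trans[i][j] * emit[j][0] for j in range(2))
           for i in range(2)]

# Posterior over S1, proportional to prior * P(Z1=z_bar|S1) * P(Z2=z|S1).
post = [prior[i] * emit[i][1] * like_z2[i] for i in range(2)]
total = sum(post)
post = [p / total for p in post]

print(round(post[0], 4))  # P(S1=s | Z1=z_bar, Z2=z)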