Is a GMM-HMM equivalent to a no-mixture HMM enriched with more states?
I'm trying to model sequence data that has 5 hidden states. The observation data conditional on each state is Gaussian, except for one state, for which a mixture of 2 Gaussians seems more appropriate. Unfortunately, the R package I'm using (depmix) does not seem to support a GMM as a response distribution without extending the package. So I was considering adding a 6th state, so that one state in this enriched set could be interpreted as having the first Gaussian of my mixture as its observation distribution, and another state as having the second Gaussian.
Am I wrong in thinking that the two approaches are equivalent?
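Concretely, this is roughly what I have in mind (a minimal sketch using depmixS4, the maintained successor to depmix; the data frame df and column y are placeholders for my actual sequence data):

library(depmixS4)

# 6-state HMM with a single Gaussian response per state; the extra
# state stands in for the second mixture component of the original
# 5th state. NOTE: df and y are placeholders for the real data.
mod  <- depmix(y ~ 1, data = df, nstates = 6, family = gaussian())
fit6 <- fit(mod)

# Most likely state sequence; the two states playing the role of the
# mixture components can be merged after decoding.
head(posterior(fit6))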
hidden-markov-model gaussian-mixture
asked 1 hour ago by Patrick
2 Answers
No, you are not wrong to think that.
If $Y \mid X_1 \sim \alpha f_1(y) + (1-\alpha) f_2(y)$, then you can also let $X_2 \sim \text{Bernoulli}(\alpha)$ independently and say
$$
Y \mid X_1, X_2 = 1 \sim f_1(y)
$$
and
$$
Y \mid X_1, X_2 = 0 \sim f_2(y).
$$
This is because
$$
f_Y(y \mid x_1) = \sum_{x_2 \in \{0,1\}} f_Y(y \mid x_1, x_2)\, f(x_2) = \alpha f_1(y) + (1-\alpha) f_2(y).
$$
Keep in mind that the sequence through time $\{X_2^t\}_t$ is iid, so the Markov structure is overkill (but still perfectly fine).
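To see the identity numerically, here is a small simulation sketch (the weight $\alpha = 0.3$ and the two normal components are made-up values):

# Simulation sketch: drawing Y via the Bernoulli indicator X_2
# reproduces the mixture density. alpha and the two Gaussian
# components below are made-up values.
set.seed(1)
alpha <- 0.3
n <- 1e5

x2 <- rbinom(n, size = 1, prob = alpha)   # X_2 ~ Bernoulli(alpha)
y  <- ifelse(x2 == 1, rnorm(n, 0, 1), rnorm(n, 4, 1))

# Empirical density (solid) vs. alpha*f1 + (1-alpha)*f2 (dashed)
plot(density(y), main = "Two-stage sampling matches the mixture")
curve(alpha * dnorm(x, 0, 1) + (1 - alpha) * dnorm(x, 4, 1),
      add = TRUE, lty = 2)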
answered 52 mins ago, edited 39 mins ago, by Taylor
It is not exactly equivalent: the 6-state HMM can model everything the GMM-HMM can, but not the other way around.
Suppose you start with the GMM-HMM, with $s_5$ being the GMM state, and turn it into the 6-state HMM with states $s_6$ and $s_7$ in place of $s_5$.
Let $p_6$ and $p_7$ be the prior probabilities of the two components of the GMM (the components that become states $s_6$ and $s_7$).
For every transition from a state $s_i$ to $s_5$ in the GMM-HMM (with probability $t$), create two transition probabilities in the 6-state HMM:
$s_i$ to $s_6$ with probability $t \cdot p_6$
$s_i$ to $s_7$ with probability $t \cdot p_7$
For every transition from $s_5$ to a state $s_i$ in the GMM-HMM (with probability $t$), create two transitions, from $s_6$ and from $s_7$ respectively, going to $s_i$, both with the same probability $t$.
If I am not mistaken, the resulting 6-state HMM is equivalent to the GMM-HMM; see the sketch of the construction below.
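A sketch of that construction on the transition matrix (the 5-state matrix A and the weight p6 are hypothetical inputs, with state 5 as the mixture state):

# Sketch: expand a 5-state transition matrix A (state 5 = the GMM
# state) into a 6-state matrix, splitting state 5 into two states
# with component weights p6 and p7 = 1 - p6. A and p6 are
# hypothetical inputs; A is assumed row-stochastic.
expand_gmm_state <- function(A, p6) {
  p7 <- 1 - p6
  B <- matrix(0, 6, 6)
  B[1:4, 1:4] <- A[1:4, 1:4]          # transitions among s1..s4 unchanged
  B[1:4, 5]   <- A[1:4, 5] * p6       # s_i -> s6 gets t * p6
  B[1:4, 6]   <- A[1:4, 5] * p7       # s_i -> s7 gets t * p7
  B[5, 1:4]   <- A[5, 1:4]            # s6 -> s_i keeps probability t
  B[6, 1:4]   <- A[5, 1:4]            # s7 -> s_i keeps probability t
  B[5, 5] <- B[6, 5] <- A[5, 5] * p6  # the s5 self-transition is both
  B[5, 6] <- B[6, 6] <- A[5, 5] * p7  # duplicated and split
  B
}

# Sanity check: rowSums(expand_gmm_state(A, 0.3)) should all equal 1
# whenever A is a valid transition matrix.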
However, the other way around doesn't always work. Imagine you start with the 6-state HMM.
Suppose that the transition probabilities for $s_i \rightarrow s_6$ and $s_i \rightarrow s_7$ do not have the same ratio as $p_6$ and $p_7$. You could not carry this information into the GMM-HMM.
In short, the 6-state HMM should be able to represent everything the GMM-HMM can, and more.
answered 52 mins ago, edited 7 mins ago, by Vincent B. Lortie (new contributor)
I'm not sure that last "suppose" is correct. What if the mixture probabilities are, say, 0.3 and 0.7? Wouldn't that imply that the transition probabilities into $s_6$ and $s_7$ weren't equal? – jbowman, 26 mins ago
On the other hand, the GMM does imply that $p_{i,6}/p_{i,7} = p_{j,6}/p_{j,7}$ for all $i, j$ (except for zero transition probabilities, of course), which the 6-state HMM does not have to enforce, so your fundamental point is correct. – jbowman, 25 mins ago
@jbowman you are correct, I've changed it. – Vincent B. Lortie, 5 mins ago