Is a GMM-HMM equivalent to a no-mixture HMM enriched with more states?

I'm trying to model sequence data that has 5 hidden states. The observations conditional on each state are Gaussian, except for one state for which a mixture of two Gaussians seems more appropriate. Unfortunately, the R package I'm using (depmix) does not seem to support a GMM as a response distribution without extending the package. So I was considering adding a 6th state, so that one state in this enriched set could be interpreted as having the first Gaussian of my mixture as its observation distribution, and another as having the second Gaussian.



Am I wrong in thinking that the two approaches are equivalent?
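For concreteness, here is a minimal sketch of that workaround in depmixS4 (the S4 successor of depmix, which as far as I can tell shares the same limitation); the data frame `df` and column `y` are placeholders for my actual data:

```r
library(depmixS4)

# Placeholder data; df$y would hold my observed sequence.
set.seed(1)
df <- data.frame(y = rnorm(500))

# Six plain-Gaussian states instead of five states with one
# two-component mixture state.
mod <- depmix(y ~ 1, data = df, nstates = 6, family = gaussian())
fm  <- fit(mod)

summary(fm)          # per-state means/sds and the transition matrix
head(posterior(fm))  # decoded states; the two "split" states together
                     # would play the role of the mixture state
```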










Tags: hidden-markov-model, gaussian-mixture






asked by Patrick
2 Answers

















No, you are not wrong in thinking that.



If $Y \mid X_1 \sim \alpha f_1(y) + (1-\alpha)f_2(y)$, then you can also let $X_2 \sim \text{Bernoulli}(\alpha)$ independently and say
$$
Y \mid X_1, X_2 = 1 \sim f_1(y)
$$

and
$$
Y \mid X_1, X_2 = 0 \sim f_2(y).
$$

This is because
$$
f_Y(y \mid x_1) = \sum_{x_2 \in \{0,1\}} f_Y(y \mid x_1, x_2)\, f(x_2) = \alpha f_1(y) + (1-\alpha)f_2(y).
$$

Keep in mind that the sequence through time $\{X_2^{(t)}\}_t$ is iid, so the Markov structure is overkill (but still perfectly fine).
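A quick simulation sketch of the identity above, with hypothetical components $f_1 = N(-2, 1)$, $f_2 = N(3, 1)$ and $\alpha = 0.3$:

```r
# Drawing X2 ~ Bernoulli(alpha) and then sampling from the chosen
# component reproduces the mixture density alpha*f1 + (1 - alpha)*f2.
set.seed(42)
alpha <- 0.3
n     <- 1e5

x2 <- rbinom(n, 1, alpha)               # component indicator
y  <- ifelse(x2 == 1,
             rnorm(n, -2, 1),           # f1 when X2 = 1
             rnorm(n,  3, 1))           # f2 when X2 = 0

hist(y, breaks = 100, freq = FALSE)
curve(alpha * dnorm(x, -2, 1) + (1 - alpha) * dnorm(x, 3, 1),
      add = TRUE, col = "red")          # matches the histogram
```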






answered by Taylor










































            It is not exactly equivalent: the 6-state HMM can model everything the GMM-HMM can, but not the other way around.



            Suppose you start with the GMM-HMM, with $s_5$ being the GMM state, and turn it into the 6-state HMM with states $s_6$ and $s_7$ instead of $s_5$.



Let $p_6$ and $p_7$ be the prior probabilities of the two components of the GMM (which then become states $s_6$ and $s_7$).



            For every transition from a state $s_i$ to $s_5$ in the GMM-HMM (with probability $t$), create two transition probabilities in the 6-state HMM:




• $s_i$ to $s_6$ with probability $t \cdot p_6$

• $s_i$ to $s_7$ with probability $t \cdot p_7$

            For every transition from $s_5$ to a state $s_i$ in the GMM-HMM (with probability $t$), create two transition probabilities, respectively from $s_6$ and $s_7$, going to $s_i$, both with the same probability $t$.



            If I am not mistaken, the resulting 6-state HMM is equivalent to the GMM-HMM.
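A small R sketch of this transition-matrix surgery (a hypothetical helper, not from any package; here the split pair sits at positions $k$ and $n+1$ rather than under the labels $s_6$ and $s_7$ used above):

```r
# Split state k of transition matrix A into two copies, routing the
# incoming probability mass by the mixture weights w and 1 - w; both
# copies inherit state k's outgoing row.  (The initial-state
# distribution would be split the same way.)
split_state <- function(A, k, w) {
  n   <- nrow(A)
  idx <- c(seq_len(n), k)            # new state n+1 starts as a copy of k
  B   <- A[idx, idx]
  B[, k]     <- A[idx, k] * w        # incoming mass: weight w to one copy...
  B[, n + 1] <- A[idx, k] * (1 - w)  # ...and 1 - w to the other
  B                                  # rows still sum to 1
}

A <- matrix(0.2, 5, 5)               # toy 5-state transition matrix
B <- split_state(A, k = 5, w = 0.3)  # its 6-state counterpart
rowSums(B)                           # all equal to 1
```

Because both copies keep the original outgoing row, any path probability through the split pair sums back to the probability of the corresponding path through $s_5$.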



However, the other way around doesn't always work. Imagine you start with the 6-state HMM.



Suppose that the transition probabilities for $s_i \rightarrow s_6$ and $s_i \rightarrow s_7$ do not have the same ratio as $p_6$ and $p_7$. You could not carry this information back into the GMM-HMM.



            In short, the 6-state HMM should be able to represent everything the GMM-HMM can, and more.






answered by Vincent B. Lortie























• I'm not sure that last "suppose" is correct. What if the mixture probabilities are, say, 0.3 and 0.7? Wouldn't that imply that the transition probabilities into $s_6$ and $s_7$ weren't equal? – jbowman






• On the other hand, there is an implication of the GMM that $p_{i,6}/p_{i,7} = p_{j,6}/p_{j,7}$ for all $i,j$ (except for the zero transition probabilities, of course), which doesn't have to be enforced by the 6-state HMM, so your fundamental point is correct. – jbowman











• @jbowman you are correct, I've changed it. – Vincent B. Lortie










