Is there a *simple* proof that the intersection of a CFL and a regular language is a CFL?

I am following a course on complexity theory in which formal languages are part of the material. There is one proof that, no matter how hard I try, I cannot understand; I can't even make it halfway through. Namely, the proof of the statement that the intersection of a CFL and a regular language is again a CFL.



The proof that was provided to us is 2–3 pages of dense text and notation. The ones online are also heavy on notation, and unfortunately Sipser does not cover this result in his book *Introduction to the Theory of Computation*. I'm wondering if anyone knows a straightforward, less notation-dependent proof that would help me understand, or even reproduce, the argument. Because at the moment, I don't understand the proof at all.










      regular-languages context-free proof-techniques






edited 39 mins ago by Thinh D. Nguyen

asked 1 hour ago by Abdul Malek Altawekji
2 Answers
I don't know of a simple proof, but there is a proof where the underlying idea is simple to understand.

Suppose you have a PDA $N$ (deterministic or nondeterministic; it doesn't matter) which accepts the language $L_N$ and whose set of states is $S$. Further suppose you have a DFA $M$, with $T$ as its set of states, which accepts the language $L_M$.

Conceptually, you can construct a new PDA whose set of states is the Cartesian product $S \times T$, and which "runs" the PDA and the DFA in parallel. A transition from state $(s_i, t_j)$ to state $(s_p, t_q)$ does whatever the PDA does when transitioning from $s_i$ to $s_p$, but also takes the appropriate DFA transition from $t_j$ to $t_q$.

An accept state in this new PDA is any state which corresponds to an accept state in both $M$ and $N$. Then this PDA should, assuming we get the details right, accept the language $L_M \cap L_N$.

There are a lot of details to get right, which makes the proof fairly technical, so it's not a short proof. But the idea behind it is simple to understand.

If you want to understand how this works, I would suggest trying to write a reasonably formal proof, using a similar construction, that the intersection of two regular languages is regular.

And as an additional exercise: why can't you use this technique to prove that the intersection of two context-free languages is context-free?
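The suggested exercise can be sketched concretely. Below is a minimal Python sketch of the product construction for two DFAs; the two example machines ("even number of a's" and "ends in b") and the dictionary encoding of transition functions are invented here purely for illustration:

```python
# A DFA is encoded as a transition dict mapping (state, symbol) -> state,
# plus a start state and a set of accepting states.

def accepts(delta, start, accepting, word):
    """Run a DFA on a word and report acceptance."""
    state = start
    for ch in word:
        state = delta[(state, ch)]
    return state in accepting

# DFA A: strings over {a, b} with an even number of a's.
delta_a = {("even", "a"): "odd", ("even", "b"): "even",
           ("odd", "a"): "even", ("odd", "b"): "odd"}

# DFA B: strings that end in b (the state records whether the last symbol was b).
delta_b = {("no", "a"): "no", ("no", "b"): "yes",
           ("yes", "a"): "no", ("yes", "b"): "yes"}

# Product construction: states are pairs, and each transition moves
# both component machines at once on the same input symbol.
delta_p = {((s, t), ch): (delta_a[(s, ch)], delta_b[(t, ch)])
           for s in ("even", "odd") for t in ("no", "yes")
           for ch in "ab"}

start_p = ("even", "no")
accept_p = {("even", "yes")}   # accept iff BOTH components accept

# The product accepts exactly the words that both DFAs accept.
for w in ["", "b", "ab", "aab", "aabb"]:
    both = (accepts(delta_a, "even", {"even"}, w)
            and accepts(delta_b, "no", {"yes"}, w))
    assert accepts(delta_p, start_p, accept_p, w) == both
```

The PDA-times-DFA construction is the same idea, except that only one of the two components carries a stack.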






answered 35 mins ago by Pseudonym




















• For the additional exercise: it is obviously impossible to merge two stacks, which may differ vastly in length as the two PDAs run.
  – Thinh D. Nguyen, 15 mins ago

• This exercise may help the OP if they already understand what is going on in the proof for DFAs. But it is of no help if they have not yet figured out the cross product in your answer, since in that case there is no second stack to add to the picture.
  – Thinh D. Nguyen, 12 mins ago






























We should imagine the way a pushdown automaton (PDA) works.

A PDA has a read-only head pointing at the input string. At each moment, the head rests at some index of the input, and while the PDA runs it only ever moves from left to right. But it may stay at some indices to "think" for a while.

Having said that, we can see that there are two kinds of transitions:

1. Consuming transitions: those that move the input head one step to the right.

2. Non-consuming transitions: those that keep the input head in place; these only manipulate the stack.

Now, construct the cross product of the PDA $P$ and the DFA $A$ as usual, i.e. with the set of states being the Cartesian product of the set of states of $P$ and the set of states of $A$.

For each consuming transition of $P$, we allow it in the cross-product PDA only when the corresponding transition of $A$ exists; the paired state then moves according to both machines at once.

And while the PDA is "thinking" (i.e. performing non-consuming moves), the DFA $A$'s state must remain unchanged.
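The two rules above can be sketched as a function that builds the product's transition relation. This is a hedged illustration, not a full PDA simulator: the encoding (a dict mapping `(state, input_or_None, popped_symbol)` to a set of `(state, pushed_string)` pairs, with `None` marking a non-consuming move) is invented for this sketch.

```python
def product_transitions(pda_delta, dfa_delta):
    """Pair each PDA move with the DFA: consuming moves advance both
    machines together; non-consuming (epsilon) moves leave the DFA
    component of the paired state frozen."""
    dfa_states = {q for (q, _) in dfa_delta} | set(dfa_delta.values())
    prod = {}
    for (p, a, pop), targets in pda_delta.items():
        for q in dfa_states:
            if a is None:
                # Non-consuming move: DFA state q stays unchanged.
                prod.setdefault(((p, q), None, pop), set()).update(
                    ((p2, q), push) for (p2, push) in targets)
            elif (q, a) in dfa_delta:
                # Consuming move: allowed only if the DFA also moves on a.
                q2 = dfa_delta[(q, a)]
                prod.setdefault(((p, q), a, pop), set()).update(
                    ((p2, q2), push) for (p2, push) in targets)
    return prod

# Toy fragments (invented): one consuming and one epsilon PDA move.
pda_delta = {("p0", "a", "Z"): {("p0", "XZ")},
             ("p0", None, "X"): {("p1", "X")}}
dfa_delta = {("q0", "a"): "q1", ("q1", "a"): "q1"}

prod = product_transitions(pda_delta, dfa_delta)
# Consuming 'a' from (p0, q0) advances the DFA component to q1:
assert (("p0", "q1"), "XZ") in prod[(("p0", "q0"), "a", "Z")]
# The epsilon move keeps the DFA component at q0:
assert (("p1", "q0"), "X") in prod[(("p0", "q0"), None, "X")]
```

The stack behavior (`pop`/`push`) is copied unchanged from the PDA, which is exactly why the construction works: the single stack belongs to $P$, and $A$ contributes only finite state.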






edited 10 mins ago; answered 17 mins ago by Thinh D. Nguyen





















