Linear algebra: eigenvalues

Why is it that if a matrix satisfies a particular equation, the eigenvalues of that matrix also satisfy that equation?

linear-algebra eigenvalues-eigenvectors

edited 3 hours ago
GNUSupporter 8964民主女神 地下教會

asked 4 hours ago
Caitlyn

  • I am not sure whether this is true beyond polynomial equations, and perhaps some involving exponential terms. I hope someone answers that.
    – dineshdileep
    4 hours ago

  • If $A \in M_n(\mathbb{C})$, $\operatorname{imag}(A) = 0$ does not imply that $\operatorname{imag}(\lambda(A)) = 0$.
    – user1551
    46 mins ago
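
For instance, the real $90°$ rotation matrix has eigenvalues $\pm i$. A minimal NumPy sketch of user1551's point (the matrix choice here is illustrative, not from the thread):

```python
import numpy as np

# A real matrix (imag(A) = 0): rotation by 90 degrees.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lams = np.linalg.eigvals(A)
print(lams)                       # [0.+1.j 0.-1.j]
print(np.allclose(A.imag, 0))     # True:  imag(A) = 0
print(np.allclose(lams.imag, 0))  # False: the eigenvalues are not real
```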

3 Answers

Take a (non-zero!) eigenvector $v$ corresponding to an eigenvalue $\lambda$ of a matrix $A$.

That means $Av = \lambda v$. Applying $A$ to both sides, one gets $A^2 v = A(Av) = A(\lambda v) = \lambda Av = \lambda^2 v$.

The hardest part is over. Similarly one can conclude $A^3 v = \lambda^3 v$, etc.

We have a sequence of equalities $A^k v = \lambda^k v$ for $k = 0, 1, 2, 3, \ldots$ (the case $k = 0$, with $A^0 = I$, is trivial).

For each $k$, multiply by some scalar $c_k$ and add up. This gives us
$\sum_k c_k A^k v = \big(\sum_k c_k \lambda^k\big) v$.

Note that in the final equation the right-hand side is a scalar multiple of the non-zero vector $v$. So if the left-hand side is zero for some choice of the $c_k$ (that is the meaning of a relation satisfied by the matrix, as stated in your question), then the same equation holds for $\lambda$.






– P Vanchinathan
answered 4 hours ago, edited 1 hour ago
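
A small numerical check of this argument, sketched with NumPy (the idempotent matrix below is an illustrative choice, not from the answer): $A$ satisfies $A^2 - A = 0$, and each eigenvalue then satisfies $\lambda^2 - \lambda = 0$.

```python
import numpy as np

# An idempotent matrix: A @ A - A = 0, i.e. A satisfies p(x) = x^2 - x.
A = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(A @ A - A, 0)

# Every eigenvalue must then satisfy lambda^2 - lambda = 0.
for lam in np.linalg.eigvals(A):
    print(lam, lam**2 - lam)      # eigenvalues 1 and 0; residues are (numerically) zero
```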
By the Jordan decomposition $A = PJP^{-1}$ (or the spectral decomposition of a Hermitian matrix), any polynomial in $A$ conjugates through: $p(A) = P\,p(J)\,P^{-1}$. The matrices formed by the bases of the eigenspaces cancel, and only the middle part (a diagonal matrix for a Hermitian matrix, a bidiagonal Jordan matrix in general), which contains the eigenvalues, is retained. So if the matrix satisfies $p(A) = 0$, then $p(J) = 0$, and in particular its diagonal entries, namely $p(\lambda)$ for each eigenvalue $\lambda$, vanish as well.






– Lin Xuelei
answered 4 hours ago
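
Concretely, for a diagonalizable matrix the cancellation can be checked numerically; a NumPy sketch (the symmetric matrix and the polynomial $p(x) = x^2 - 3x$ are illustrative assumptions):

```python
import numpy as np

# Symmetric, hence diagonalizable: A = V @ diag(w) @ V.T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eigh(A)          # eigenvalues w, orthonormal eigenvectors V

def p(X):
    return X @ X - 3.0 * X        # p(x) = x^2 - 3x, applied to a matrix

# The basis matrices cancel: p(A) = V @ diag(p(w)) @ V.T,
# so p acts entrywise on the eigenvalues.
print(np.allclose(p(A), V @ np.diag(w**2 - 3.0 * w) @ V.T))  # True
```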
It follows from the derivation and overarching idea of eigenvalues: if $A$ is a matrix and $\lambda$ is an eigenvalue of $A$, then there is some non-zero vector $\vec{v}$ (an eigenvector) for which

$$A\vec{v} = \lambda \vec{v}$$

or equivalently, where $I$ is the identity matrix,

$$A\vec{v} - \lambda \vec{v} = (A - \lambda I)\vec{v} = 0$$

This doesn't mean it works for "every" equation. Otherwise linear algebra as a field of study would just be reduced to finding eigenvalues. An eigenvalue basically means "multiplying an eigenvector by this matrix is no different from scaling that vector by the eigenvalue"; it says nothing about other vectors.

An example where this fails is multiplication by the identity matrix. A simple example:

$$\begin{bmatrix} 6 & -1 \\ 2 & 3 \end{bmatrix}
\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}
= \begin{bmatrix} 6 & -1 \\ 2 & 3 \end{bmatrix}
\;\;\;\neq\;\;\;
4\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}
= \begin{bmatrix} 4 & 0 \\ 0 & 4 \end{bmatrix}$$

The left matrix has eigenvalues $4$ (and $5$), but clearly these products are not equal. Eigenvalues have some pretty neat properties and applications, but they're not quite a cure-all.

Of course I might be playing dumb by pulling such a trivial example out of my hat, so if there's some relevant context you have in mind, it might help explain what you're trying to get at.






– Eevee Trainer
answered 4 hours ago
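
A quick NumPy confirmation of the counterexample above (a minimal sketch; the check itself is straightforward):

```python
import numpy as np

A = np.array([[6.0, -1.0],
              [2.0,  3.0]])
I = np.eye(2)

print(np.linalg.eigvals(A))       # eigenvalues 4 and 5 (order may vary)
print(np.allclose(A @ I, A))      # True:  A @ I = A
print(np.allclose(A @ I, 4 * I))  # False: A @ I != 4 I
```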