Showing that a given matrix does not have negative eigenvalues, without using the fact that it is positive definite.

Let $a,b,c$ be positive real numbers such that $b^2+c^2<a<1$. Let
$A=\begin{bmatrix} 1 & b & c\\ b & a & 0\\ c & 0 & 1 \end{bmatrix}$.



Consider the above matrix. I want to determine the nature of its eigenvalues: whether they are all positive, all negative, a mix of positive and negative, nonzero, real or non-real, etc.



My efforts



We look at the matrix first to see whether it matches one of the familiar types introduced in standard linear algebra texts.



We can see that this matrix is symmetric.



As soon as we see the term "symmetric matrix" together with the term "eigenvalue" in the question, we turn to the standard result that a real symmetric matrix is diagonalizable with all eigenvalues real.



Conclusion so far: the given matrix has only real eigenvalues.



Another standard result is that the sum of the eigenvalues equals the trace of the matrix. The trace is positive here because of the conditions specified.



So not all eigenvalues can be negative.



So we are left with two choices:

  1. All eigenvalues are positive.

  2. The eigenvalues of $A$ are a mix of positive and negative.


I know this matrix is positive definite (I have already proved it by showing that all leading principal minors are positive), so all eigenvalues are positive.



My aim is to show that there are no negative eigenvalues without going into the theory of positive definite matrices.
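Not a proof, but the question is easy to spot-check numerically: sample a concrete $(a,b,c)$ satisfying $b^2+c^2<a<1$ (the values below are an arbitrary admissible choice, not from the question) and inspect the spectrum with NumPy.

```python
import numpy as np

# Arbitrary sample point: b^2 + c^2 = 0.09 + 0.16 = 0.25 < a = 0.5 < 1.
a, b, c = 0.5, 0.3, 0.4
A = np.array([[1.0,   b,   c],
              [  b,   a, 0.0],
              [  c, 0.0, 1.0]])

# eigvalsh is specialized for symmetric matrices and returns real
# eigenvalues in ascending order.
eigs = np.linalg.eigvalsh(A)
assert np.all(eigs > 0)                           # all positive for this sample
assert np.isclose(eigs.sum(), np.trace(A))        # sum of eigenvalues = trace
assert np.isclose(eigs.prod(), np.linalg.det(A))  # product = determinant
```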










  • Assume the opposite of what you're trying to prove and demonstrate that that assumption leads to a contradiction.
    – phdmba7of12
    3 hours ago










  • Suppose it has some negative eigenvalue; how can I conclude that the sum of all eigenvalues will be negative? @phdmba
    – StammeringMathematician
    3 hours ago










  • Reproduce a proof of the claim for positive definite matrices. I wonder whether you can simplify it (in your particular case).
    – metamorphy
    3 hours ago











  • As far as I know, if we add two matrices, the eigenvalues need not add up. Right? @amsmath
    – StammeringMathematician
    3 hours ago











  • Forget it. The matrix $A_1$ is not positive semi-definite. I was mistaken.
    – amsmath
    3 hours ago














linear-algebra






asked 3 hours ago by StammeringMathematician

3 Answers

















The characteristic polynomial of your matrix is
$$
p(x) = (a-x)(1-x)^2 - c^2(a-x) - b^2(1-x).
$$
Now, if $x < 0$, then
\begin{align*}
p(x)
&> (a-x)(1-x)^2 - c^2(1-x) - b^2(1-x) = (1-x)\cdot[(a-x)(1-x) - c^2 - b^2]\\
&> (1-x)\cdot[x^2 - (a+1)x] = x(1-x)(x-a-1) > 0.
\end{align*}
Therefore, $p$ cannot have zeros in $(-\infty,0)$. Also, $p(0) = \det A > 0$. Thus, the eigenvalues of $A$ are positive.

– amsmath, answered 3 hours ago
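This answer's two claims, that $p$ is the characteristic polynomial and that it is positive for negative $x$, can be spot-checked numerically. The sketch below uses an arbitrary admissible choice of $a,b,c$ (not from the question).

```python
import numpy as np

a, b, c = 0.5, 0.3, 0.4  # arbitrary values with b^2 + c^2 < a < 1
A = np.array([[1.0, b, c], [b, a, 0.0], [c, 0.0, 1.0]])

def p(x):
    # Characteristic polynomial in the factored form used in the answer.
    return (a - x) * (1 - x) ** 2 - c ** 2 * (a - x) - b ** 2 * (1 - x)

# p(x) really is det(A - x I) ...
for x in np.linspace(-5.0, 5.0, 41):
    assert np.isclose(p(x), np.linalg.det(A - x * np.eye(3)))

# ... and it stays strictly positive for negative x, as the bound predicts.
assert all(p(x) > 0 for x in np.linspace(-10.0, -0.01, 100))
```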







You may use Sylvester's Law of Inertia. Here is a matrix of determinant $1$:

$$
R =
\left(
\begin{array}{ccc}
1 & -b & \frac{-ac}{a-b^2} \\
0 & 1 & \frac{bc}{a-b^2} \\
0 & 0 & 1 \\
\end{array}
\right)
$$
and a "congruence diagonalization" $R^T A R = D$:
$$
\left(
\begin{array}{ccc}
1 & 0 & 0 \\
-b & 1 & 0 \\
\frac{-ac}{a-b^2} & \frac{bc}{a-b^2} & 1 \\
\end{array}
\right)
\left(
\begin{array}{ccc}
1 & b & c \\
b & a & 0 \\
c & 0 & 1 \\
\end{array}
\right)
\left(
\begin{array}{ccc}
1 & -b & \frac{-ac}{a-b^2} \\
0 & 1 & \frac{bc}{a-b^2} \\
0 & 0 & 1 \\
\end{array}
\right) =
\left(
\begin{array}{ccc}
1 & 0 & 0 \\
0 & a-b^2 & 0 \\
0 & 0 & \frac{a - ac^2 - b^2}{a-b^2} \\
\end{array}
\right)
$$

I found $R$ using a fairly clean algorithm; see http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr

– Will Jagy, answered 25 mins ago (edited 16 mins ago)
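The congruence above can be checked mechanically. The sketch below (arbitrary admissible $a,b,c$, not from the question) verifies that $R$ has determinant $1$ and that $R^T A R$ equals the stated diagonal matrix with positive entries, so by Sylvester's law $A$ has inertia $(3,0,0)$: three positive eigenvalues.

```python
import numpy as np

a, b, c = 0.5, 0.3, 0.4  # arbitrary values with b^2 + c^2 < a < 1
A = np.array([[1.0, b, c], [b, a, 0.0], [c, 0.0, 1.0]])
R = np.array([[1.0, -b, -a * c / (a - b ** 2)],
              [0.0, 1.0, b * c / (a - b ** 2)],
              [0.0, 0.0, 1.0]])

D = R.T @ A @ R
expected = np.diag([1.0, a - b ** 2, (a - a * c ** 2 - b ** 2) / (a - b ** 2)])

assert np.isclose(np.linalg.det(R), 1.0)  # unit-determinant change of basis
assert np.allclose(D, expected)           # congruence diagonalization holds
assert np.all(np.diag(D) > 0)             # same inertia as A: (3, 0, 0)
```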








Your argument is very constructive.
\begin{equation}
A=\begin{bmatrix} 1 & b & c\\ b & a & 0\\ c & 0 & 1 \end{bmatrix}
\end{equation}
\begin{equation}
\det A = c(-ac) + (a-b^2) = a - b^2 - c^2 + c^2 - ac^2 = \underbrace{a - (b^2+c^2)}_{>0} + c^2(\underbrace{1-a}_{>0})
\end{equation}
Now you know that both the trace and determinant are positive. That leaves you with two choices:

1) Either all three eigenvalues are positive.

2) Or two are negative and one is positive.

The eigenvalue/eigenvector relation $Av = \lambda v$ gives
\begin{align}
v_1 + bv_2 + cv_3 &= \lambda v_1\\
bv_1 + av_2 &= \lambda v_2\\
cv_1 + v_3 &= \lambda v_3
\end{align}
Note that $\lambda < 0$ implies $\lambda \neq a$ and $\lambda \neq 1$, and then $v_1 \neq 0$ (otherwise the last two equations would force $v = 0$). So the last two equations give
\begin{equation}
v_3 = \frac{c}{\lambda - 1}v_1
\end{equation}
\begin{equation}
v_2 = \frac{b}{\lambda - a}v_1
\end{equation}
and substituting into the first,
\begin{equation}
\frac{b^2}{\lambda - a} + \frac{c^2}{\lambda - 1} = \lambda - 1
\end{equation}
If $\lambda < 0$, then $\lambda - 1 < -1$. But since $\lambda - a < -a < 0$ and $\lambda - 1 < -1 < 0$,
\begin{equation}
\frac{b^2}{\lambda - a} + \frac{c^2}{\lambda - 1} > -\frac{b^2}{a} - c^2 > -\frac{b^2 + c^2}{a} > -1
\end{equation}
This means that
\begin{equation}
\underbrace{\frac{b^2}{\lambda - a} + \frac{c^2}{\lambda - 1}}_{>-1} = \underbrace{\lambda - 1}_{<-1},
\end{equation}
a contradiction.
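As a sanity check on the scalar relation derived above, the sketch below (arbitrary admissible $a,b,c$, not from the question; for this sample no eigenvalue equals $a$ or $1$, so the divisions are safe) evaluates $\frac{b^2}{\lambda-a}+\frac{c^2}{\lambda-1}$ at each computed eigenvalue and confirms it equals $\lambda - 1$.

```python
import numpy as np

a, b, c = 0.5, 0.3, 0.4  # arbitrary values with b^2 + c^2 < a < 1
A = np.array([[1.0, b, c], [b, a, 0.0], [c, 0.0, 1.0]])

eigs = np.linalg.eigvalsh(A)
for lam in eigs:
    # The scalar relation holds at every eigenvalue of A.
    lhs = b ** 2 / (lam - a) + c ** 2 / (lam - 1)
    assert np.isclose(lhs, lam - 1)

# No eigenvalue is negative, so the contradiction never materializes here.
assert np.all(eigs > 0)
```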






      share|cite|improve this answer






















        Your Answer




        StackExchange.ifUsing("editor", function ()
        return StackExchange.using("mathjaxEditing", function ()
        StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
        StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
        );
        );
        , "mathjax-editing");

        StackExchange.ready(function()
        var channelOptions =
        tags: "".split(" "),
        id: "69"
        ;
        initTagRenderer("".split(" "), "".split(" "), channelOptions);

        StackExchange.using("externalEditor", function()
        // Have to fire editor after snippets, if snippets enabled
        if (StackExchange.settings.snippets.snippetsEnabled)
        StackExchange.using("snippets", function()
        createEditor();
        );

        else
        createEditor();

        );

        function createEditor()
        StackExchange.prepareEditor(
        heartbeatType: 'answer',
        convertImagesToLinks: true,
        noModals: false,
        showLowRepImageUploadWarning: true,
        reputationToPostImages: 10,
        bindNavPrevention: true,
        postfix: "",
        noCode: true, onDemand: true,
        discardSelector: ".discard-answer"
        ,immediatelyShowMarkdownHelp:true
        );



        );













         

        draft saved


        draft discarded


















        StackExchange.ready(
        function ()
        StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f2916826%2fshowing-that-given-matrix-does-not-have-negative-eigenvalues-without-using-the-k%23new-answer', 'question_page');

        );

        Post as a guest






























        3 Answers
        3






        active

        oldest

        votes








        3 Answers
        3






        active

        oldest

        votes









        active

        oldest

        votes






        active

        oldest

        votes








        up vote
        4
        down vote













        The characteristic polynomial of your matrix is
        $$
        p(x) = (a-x)(1-x)^2-c^2(a-x)-b^2(1-x).
        $$
        Now, if $x < 0$, then
        beginalign*
        p(x)
        &> (a-x)(1-x)^2-c^2(1-x)-b^2(1-x) = (1-x)cdot[(a-x)(1-x)-c^2-b^2]\
        &> (1-x)cdot[x^2-(a+1)x] = x(1-x)(x-a-1) > 0.
        endalign*
        Therefore, $p$ cannot have zeros in $(-infty,0)$. Also, $p(0) = det A > 0$. Thus, the eigenvalues of $A$ are positive.






        share|cite|improve this answer
























          up vote
          4
          down vote













          The characteristic polynomial of your matrix is
          $$
          p(x) = (a-x)(1-x)^2-c^2(a-x)-b^2(1-x).
          $$
          Now, if $x < 0$, then
          beginalign*
          p(x)
          &> (a-x)(1-x)^2-c^2(1-x)-b^2(1-x) = (1-x)cdot[(a-x)(1-x)-c^2-b^2]\
          &> (1-x)cdot[x^2-(a+1)x] = x(1-x)(x-a-1) > 0.
          endalign*
          Therefore, $p$ cannot have zeros in $(-infty,0)$. Also, $p(0) = det A > 0$. Thus, the eigenvalues of $A$ are positive.






          share|cite|improve this answer






















            up vote
            4
            down vote










            up vote
            4
            down vote









            The characteristic polynomial of your matrix is
            $$
            p(x) = (a-x)(1-x)^2-c^2(a-x)-b^2(1-x).
            $$
            Now, if $x < 0$, then
            beginalign*
            p(x)
            &> (a-x)(1-x)^2-c^2(1-x)-b^2(1-x) = (1-x)cdot[(a-x)(1-x)-c^2-b^2]\
            &> (1-x)cdot[x^2-(a+1)x] = x(1-x)(x-a-1) > 0.
            endalign*
            Therefore, $p$ cannot have zeros in $(-infty,0)$. Also, $p(0) = det A > 0$. Thus, the eigenvalues of $A$ are positive.






            share|cite|improve this answer












            The characteristic polynomial of your matrix is
            $$
            p(x) = (a-x)(1-x)^2-c^2(a-x)-b^2(1-x).
            $$
            Now, if $x < 0$, then
            beginalign*
            p(x)
            &> (a-x)(1-x)^2-c^2(1-x)-b^2(1-x) = (1-x)cdot[(a-x)(1-x)-c^2-b^2]\
            &> (1-x)cdot[x^2-(a+1)x] = x(1-x)(x-a-1) > 0.
            endalign*
            Therefore, $p$ cannot have zeros in $(-infty,0)$. Also, $p(0) = det A > 0$. Thus, the eigenvalues of $A$ are positive.







            share|cite|improve this answer












            share|cite|improve this answer



            share|cite|improve this answer










            answered 3 hours ago









            amsmath

            2,525114




            2,525114




















                up vote
                2
                down vote













                You may use Sylvester's Law of Inertia. Here is a matrix of determinant $1$



                $$
                R =
                left(
                beginarrayccc
                1& -b & frac-aca-b^2 \
                0&1 & fracbca-b^2 \
                0&0 &1 \
                endarray
                right)
                $$
                and a "congruence diagonalization" $R^T AR = D$
                $$
                left(
                beginarrayccc
                1&0 &0 \
                -b&1 &0 \
                frac-aca-b^2& fracbca-b^2&1 \
                endarray
                right)
                left(
                beginarrayccc
                1&b &c \
                b& a&0 \
                c& 0&1 \
                endarray
                right)
                left(
                beginarrayccc
                1& -b & frac-aca-b^2 \
                0&1 & fracbca-b^2 \
                0&0 &1 \
                endarray
                right) =
                left(
                beginarrayccc
                1&0 &0 \
                0&a-b^2 &0 \
                0&0 & fraca-ac^2 - b^2a-b^2\
                endarray
                right)
                $$



                I found $R$ using a fairly clean algorithm, see http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr






                share|cite|improve this answer


























                  up vote
                  2
                  down vote













                  You may use Sylvester's Law of Inertia. Here is a matrix of determinant $1$



                  $$
                  R =
                  left(
                  beginarrayccc
                  1& -b & frac-aca-b^2 \
                  0&1 & fracbca-b^2 \
                  0&0 &1 \
                  endarray
                  right)
                  $$
                  and a "congruence diagonalization" $R^T AR = D$
                  $$
                  left(
                  beginarrayccc
                  1&0 &0 \
                  -b&1 &0 \
                  frac-aca-b^2& fracbca-b^2&1 \
                  endarray
                  right)
                  left(
                  beginarrayccc
                  1&b &c \
                  b& a&0 \
                  c& 0&1 \
                  endarray
                  right)
                  left(
                  beginarrayccc
                  1& -b & frac-aca-b^2 \
                  0&1 & fracbca-b^2 \
                  0&0 &1 \
                  endarray
                  right) =
                  left(
                  beginarrayccc
                  1&0 &0 \
                  0&a-b^2 &0 \
                  0&0 & fraca-ac^2 - b^2a-b^2\
                  endarray
                  right)
                  $$



                  I found $R$ using a fairly clean algorithm, see http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr






                  share|cite|improve this answer
























                    up vote
                    2
                    down vote










                    up vote
                    2
                    down vote









                    You may use Sylvester's Law of Inertia. Here is a matrix of determinant $1$



                    $$
                    R =
                    left(
                    beginarrayccc
                    1& -b & frac-aca-b^2 \
                    0&1 & fracbca-b^2 \
                    0&0 &1 \
                    endarray
                    right)
                    $$
                    and a "congruence diagonalization" $R^T AR = D$
                    $$
                    left(
                    beginarrayccc
                    1&0 &0 \
                    -b&1 &0 \
                    frac-aca-b^2& fracbca-b^2&1 \
                    endarray
                    right)
                    left(
                    beginarrayccc
                    1&b &c \
                    b& a&0 \
                    c& 0&1 \
                    endarray
                    right)
                    left(
                    beginarrayccc
                    1& -b & frac-aca-b^2 \
                    0&1 & fracbca-b^2 \
                    0&0 &1 \
                    endarray
                    right) =
                    left(
                    beginarrayccc
                    1&0 &0 \
                    0&a-b^2 &0 \
                    0&0 & fraca-ac^2 - b^2a-b^2\
                    endarray
                    right)
                    $$



                    I found $R$ using a fairly clean algorithm, see http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr






                    share|cite|improve this answer














                    You may use Sylvester's Law of Inertia. Here is a matrix of determinant $1$



                    $$
                    R =
                    left(
                    beginarrayccc
                    1& -b & frac-aca-b^2 \
                    0&1 & fracbca-b^2 \
                    0&0 &1 \
                    endarray
                    right)
                    $$
                    and a "congruence diagonalization" $R^T AR = D$
                    $$
                    left(
                    beginarrayccc
                    1&0 &0 \
                    -b&1 &0 \
                    frac-aca-b^2& fracbca-b^2&1 \
                    endarray
                    right)
                    left(
                    beginarrayccc
                    1&b &c \
                    b& a&0 \
                    c& 0&1 \
                    endarray
                    right)
                    left(
                    beginarrayccc
                    1& -b & frac-aca-b^2 \
                    0&1 & fracbca-b^2 \
                    0&0 &1 \
                    endarray
                    right) =
                    left(
                    beginarrayccc
                    1&0 &0 \
                    0&a-b^2 &0 \
                    0&0 & fraca-ac^2 - b^2a-b^2\
                    endarray
                    right)
                    $$



                    I found $R$ using a fairly clean algorithm, see http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr







                    share|cite|improve this answer














                    share|cite|improve this answer



                    share|cite|improve this answer








                    edited 16 mins ago

























                    answered 25 mins ago









                    Will Jagy

                    97.9k595196




                    97.9k595196




















                        up vote
                        0
                        down vote













                        Your argument is very constructive.
                        beginequation
                        A=beginbmatrix 1&b&c\ b&a & 0\ c & 0 & 1endbmatrix
                        endequation
                        beginequation
                        det A = c(-ac) +(a-b^2) = a - b^2 -c^2+c^2 -ac^2 = underbracea -(b^2+c^2)_>0 +c^2(underbrace1-a_>0)
                        endequation
                        Now you know that both the trace and determinant are positive. That leaves you with two choices:



                        1) Either all three eigenvalues are positive.



                        2) Or Two are negative and one is positive.



                        The eigenvalue/eigenvector relation give ($Av = lambda v$)
                        beginalign
                        v_1 + bv_2 + cv_3 &= lambda v_1\
                        bv_1 + av_2 &= lambda v_2\
                        cv_1 + v_3 &= lambda v_3
                        endalign
                        which gives
                        beginequation
                        v_3 = fracclambda -1v_1
                        endequation
                        beginequation
                        v_2 = fracblambda - av_1
                        endequation
                        beginequation
                        fracb^2lambda - a+ fracc^2lambda-1 = lambda -1
                        endequation
                        If $lambda < 0$, then $lambda - 1<-1$. But
                        beginequation
                        fracb^2lambda - a+ fracc^2lambda-1 > fracb^2lambda - 1+ fracc^2lambda-1 = fracb^2 + c^2lambda - 1 > -(b^2 + c^2) > -a > - 1
                        endequation
                        This means that
                        beginequation
                        underbracefracb^2lambda - a+ fracc^2lambda-1_>-1 = underbracelambda -1 _< -1
                        endequation
                        CONTRADICTION






                        share|cite|improve this answer


























                          up vote
                          0
                          down vote













                          Your argument is very constructive.
                          beginequation
                          A=beginbmatrix 1&b&c\ b&a & 0\ c & 0 & 1endbmatrix
                          endequation
                          beginequation
                          det A = c(-ac) +(a-b^2) = a - b^2 -c^2+c^2 -ac^2 = underbracea -(b^2+c^2)_>0 +c^2(underbrace1-a_>0)
                          endequation
                          Now you know that both the trace and determinant are positive. That leaves you with two choices:



                          1) Either all three eigenvalues are positive.



                          2) Or Two are negative and one is positive.



                          The eigenvalue/eigenvector relation give ($Av = lambda v$)
                          beginalign
                          v_1 + bv_2 + cv_3 &= lambda v_1\
                          bv_1 + av_2 &= lambda v_2\
                          cv_1 + v_3 &= lambda v_3
                          endalign
                          which gives
                          beginequation
                          v_3 = fracclambda -1v_1
                          endequation
                          beginequation
                          v_2 = fracblambda - av_1
                          endequation
                          beginequation
                          fracb^2lambda - a+ fracc^2lambda-1 = lambda -1
                          endequation
                          If $lambda < 0$, then $lambda - 1<-1$. But
                          beginequation
                          fracb^2lambda - a+ fracc^2lambda-1 > fracb^2lambda - 1+ fracc^2lambda-1 = fracb^2 + c^2lambda - 1 > -(b^2 + c^2) > -a > - 1
                          endequation
                          This means that
                          beginequation
                          underbracefracb^2lambda - a+ fracc^2lambda-1_>-1 = underbracelambda -1 _< -1
                          endequation
                          CONTRADICTION






                          share|cite|improve this answer
























                            up vote
                            0
                            down vote










                            up vote
                            0
                            down vote









                            edited 2 hours ago

























                            answered 3 hours ago









                            Ahmad Bazzi



























