Showing that a given matrix does not have negative eigenvalues, without using the knowledge that it is positive definite
Let $a,b,c$ be positive real numbers such that $b^2+c^2<a<1$, and let
$$A=\begin{bmatrix} 1 & b & c\\ b & a & 0\\ c & 0 & 1\end{bmatrix}.$$
Consider the above matrix. I want to comment on the nature of the eigenvalues of this matrix: are they all positive, all negative, a mix of positive and negative, nonzero, real or non-real, etc.?
My efforts
We look at the matrix first and see if it looks like one of the familiar types introduced in standard linear algebra texts.
We can see that this matrix is symmetric.
As soon as we hear the term "symmetric matrix", and the term "eigenvalue" already appears in the question, we go to the next standard result, which says that a real symmetric matrix is diagonalizable with all eigenvalues real.
Conclusion so far: the given matrix has only real eigenvalues.
Another standard result is that the sum of the eigenvalues equals the trace of the matrix. The trace is positive here because of the conditions specified.
So not all eigenvalues can be negative.
So we are left with two choices:
all eigenvalues of $A$ are positive;
the eigenvalues of $A$ are a mix of positive and negative.
I know this matrix is positive definite (I have already proved it by showing that all leading principal minors are positive), so all eigenvalues are positive.
My aim is to show that there are no negative eigenvalues without going into the theory of positive definite matrices.
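Before hunting for a proof, a quick numerical sanity check can be reassuring. The sketch below is my own addition (the sampling scheme for $a,b,c$ is an assumption, not from the question): it draws random parameters satisfying $b^2+c^2<a<1$ and checks that all eigenvalues of the resulting matrix come out positive.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_instance(rng):
    # Hypothetical sampling scheme: draw a in (0, 1), then b, c > 0
    # with b^2 + c^2 strictly less than a.
    a = rng.uniform(0.1, 1.0)
    r = np.sqrt(a) * rng.uniform(0.01, 0.99)     # b^2 + c^2 = r^2 < a
    theta = rng.uniform(0.01, np.pi / 2 - 0.01)  # keeps b, c > 0
    b, c = r * np.cos(theta), r * np.sin(theta)
    A = np.array([[1, b, c], [b, a, 0], [c, 0, 1]])
    return a, b, c, A

for _ in range(1000):
    a, b, c, A = random_instance(rng)
    assert b**2 + c**2 < a < 1        # the hypothesis of the question
    eig = np.linalg.eigvalsh(A)       # symmetric matrix -> real eigenvalues
    assert eig.min() > 0              # all positive in every sampled case
```

This is evidence, not a proof, but it rules out a mis-stated claim before investing in an argument.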
linear-algebra
Assume the opposite of what you're trying to prove and demonstrate that that assumption leads to a contradiction.
– phdmba7of12
3 hours ago
Suppose it has some negative eigenvalue; how can I effectively say that the sum of all eigenvalues will be negative? @phdmba
– StammeringMathematician
3 hours ago
Reproduce a proof of the claim for positive definite matrices. I wonder whether you can simplify it (in your particular case).
– metamorphy
3 hours ago
As far as I know, if we add two matrices, the eigenvalues need not add up. Right? @amsmath
– StammeringMathematician
3 hours ago
Forget it. The matrix $A_1$ is not positive semi-definite. I was mistaken.
– amsmath
3 hours ago
asked 3 hours ago by StammeringMathematician
3 Answers
The characteristic polynomial of your matrix is
$$
p(x) = (a-x)(1-x)^2 - c^2(a-x) - b^2(1-x).
$$
Now, if $x < 0$, then
\begin{align*}
p(x)
&> (a-x)(1-x)^2 - c^2(1-x) - b^2(1-x) = (1-x)\cdot\left[(a-x)(1-x) - c^2 - b^2\right]\\
&> (1-x)\cdot\left[x^2 - (a+1)x\right] = x(1-x)(x-a-1) > 0.
\end{align*}
Therefore, $p$ cannot have zeros in $(-\infty,0)$. Also, $p(0) = \det A > 0$. Thus, the eigenvalues of $A$ are positive.

answered 3 hours ago by amsmath
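As a sanity check on this computation (my own addition, not part of the answer), one can verify numerically that $p$ agrees with $\det(A - xI)$ and spot-check the positivity claim at random negative points. The particular choice $b = c$ below is just a convenient way to satisfy $b^2+c^2 < a$.

```python
import numpy as np

rng = np.random.default_rng(1)

def p(x, a, b, c):
    # The characteristic polynomial as written in the answer.
    return (a - x) * (1 - x)**2 - c**2 * (a - x) - b**2 * (1 - x)

for _ in range(1000):
    a = rng.uniform(0.1, 1.0)
    b = c = 0.9 * np.sqrt(a / 2)    # convenient choice: b^2 + c^2 = 0.81 a < a
    A = np.array([[1, b, c], [b, a, 0], [c, 0, 1]])
    x = -rng.uniform(0.0, 10.0)     # arbitrary non-positive trial point
    # p matches det(A - xI), and is strictly positive for x <= 0.
    assert np.isclose(p(x, a, b, c), np.linalg.det(A - x * np.eye(3)))
    assert p(x, a, b, c) > 0
```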
You may use Sylvester's Law of Inertia. Here is a matrix of determinant $1$:
$$
R =
\left(
\begin{array}{ccc}
1 & -b & \frac{-ac}{a-b^2} \\
0 & 1 & \frac{bc}{a-b^2} \\
0 & 0 & 1
\end{array}
\right)
$$
and a "congruence diagonalization" $R^T A R = D$:
$$
\left(
\begin{array}{ccc}
1 & 0 & 0 \\
-b & 1 & 0 \\
\frac{-ac}{a-b^2} & \frac{bc}{a-b^2} & 1
\end{array}
\right)
\left(
\begin{array}{ccc}
1 & b & c \\
b & a & 0 \\
c & 0 & 1
\end{array}
\right)
\left(
\begin{array}{ccc}
1 & -b & \frac{-ac}{a-b^2} \\
0 & 1 & \frac{bc}{a-b^2} \\
0 & 0 & 1
\end{array}
\right) =
\left(
\begin{array}{ccc}
1 & 0 & 0 \\
0 & a-b^2 & 0 \\
0 & 0 & \frac{a - ac^2 - b^2}{a-b^2}
\end{array}
\right)
$$
Since $b^2 + c^2 < a < 1$, we have $a - b^2 > 0$ and $a - ac^2 - b^2 = \big(a - (b^2+c^2)\big) + c^2(1-a) > 0$, so $D$ has a positive diagonal and, by Sylvester's law, $A$ has only positive eigenvalues.
I found $R$ using a fairly clean algorithm; see http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr

edited 16 mins ago, answered 25 mins ago by Will Jagy
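The congruence above is easy to check numerically. The sketch below is my own addition; the particular values of $a,b,c$ are illustrative assumptions satisfying the hypothesis.

```python
import numpy as np

# Illustrative values with b^2 + c^2 < a < 1.
a, b, c = 0.8, 0.5, 0.5
assert b**2 + c**2 < a < 1

A = np.array([[1, b, c], [b, a, 0], [c, 0, 1]])
R = np.array([[1, -b, -a * c / (a - b**2)],
              [0,  1,  b * c / (a - b**2)],
              [0,  0,  1]])

D = R.T @ A @ R
expected = np.diag([1.0, a - b**2, (a - a * c**2 - b**2) / (a - b**2)])

assert np.allclose(D, expected)       # the congruence diagonalization holds
assert np.all(np.diag(expected) > 0)  # all three pivots are positive
```

Since $R$ is invertible, positivity of the diagonal of $D$ gives positivity of the eigenvalues of $A$ by Sylvester's law.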
Your argument is very constructive.
\begin{equation}
A=\begin{bmatrix} 1 & b & c\\ b & a & 0\\ c & 0 & 1\end{bmatrix}
\end{equation}
\begin{equation}
\det A = c(-ac) + (a - b^2) = a - b^2 - c^2 + c^2 - ac^2 = \underbrace{a - (b^2+c^2)}_{>0} + c^2(\underbrace{1-a}_{>0})
\end{equation}
Now you know that both the trace and the determinant are positive. That leaves you with two choices:
1) either all three eigenvalues are positive, or
2) two are negative and one is positive.
The eigenvalue/eigenvector relation $Av = \lambda v$ gives
\begin{align}
v_1 + bv_2 + cv_3 &= \lambda v_1\\
bv_1 + av_2 &= \lambda v_2\\
cv_1 + v_3 &= \lambda v_3
\end{align}
which gives
\begin{equation}
v_3 = \frac{c}{\lambda - 1}v_1
\end{equation}
\begin{equation}
v_2 = \frac{b}{\lambda - a}v_1
\end{equation}
\begin{equation}
\frac{b^2}{\lambda - a} + \frac{c^2}{\lambda - 1} = \lambda - 1
\end{equation}
If $\lambda < 0$, then $\lambda - 1 < -1$. But for $\lambda < 0$ we also have $\lambda - a < -a < 0$ and $\lambda - 1 < -1$, so
\begin{equation}
\frac{b^2}{\lambda - a} + \frac{c^2}{\lambda - 1} > -\frac{b^2}{a} - c^2 \ge -\frac{b^2 + c^2}{a} > -\frac{a}{a} = -1.
\end{equation}
This means that
\begin{equation}
\underbrace{\frac{b^2}{\lambda - a} + \frac{c^2}{\lambda - 1}}_{>-1} = \underbrace{\lambda - 1}_{<-1},
\end{equation}
a CONTRADICTION.

edited 2 hours ago, answered 3 hours ago by Ahmad Bazzi