Linear algebra eigenvalues
Why is it that, if a matrix satisfies a particular equation, the eigenvalues of that matrix also satisfy that equation?
linear-algebra eigenvalues-eigenvectors
asked 4 hours ago by Caitlyn
edited 3 hours ago by GNUSupporter 8964民主女神 地下教會
I am not sure if this is true beyond polynomial equations, and perhaps some with exponential terms. I hope someone answers that. – dineshdileep, 4 hours ago

If $A \in M_n(\mathbb C)$, then $\operatorname{imag}(A) = 0$ does not imply that $\operatorname{imag}(\lambda(A)) = 0$. – user1551, 46 mins ago
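To illustrate user1551's point, here is a minimal sketch (the rotation matrix below is an illustrative choice, not from the comment): a matrix whose entries are all real can still have non-real eigenvalues, so "the entries of $A$ satisfy $\operatorname{imag}(\cdot) = 0$" does not transfer to the eigenvalues.

```python
import numpy as np

# A real matrix (imag(A) = 0 entrywise) need not have real eigenvalues.
# A 90-degree rotation is a classic case.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

print(np.linalg.eigvals(A))   # [0.+1.j 0.-1.j]: purely imaginary eigenvalues
```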
3 Answers
Take a (non-zero!) eigenvector $v$ corresponding to an eigenvalue $\lambda$ of a matrix $A$.
That means $Av = \lambda v$. Applying $A$ to both sides, one gets $A^2 v = A(Av) = A(\lambda v) = \lambda Av = \lambda^2 v$.
The hardest part is over. Now, similarly, one can conclude $A^3 v = \lambda^3 v$, etc.
We have a sequence of equalities $A^k v = \lambda^k v$ for $k = 1, 2, 3, \ldots$ (and also for $k = 0$, since $A^0 = I$ gives $Iv = \lambda^0 v$).
For each $k$, multiply by some scalar $c_k$ and add up. This gives us
$$\sum_k c_k A^k v = \Big(\sum_k c_k \lambda^k\Big) v.$$
Note that in the final equation the RHS is a scalar multiple of the non-zero vector $v$. So if, for some choice of the $c_k$, the LHS is zero (that is what it means for the matrix to satisfy the relation stated in your question), the same equation holds for $\lambda$.
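As a numerical sanity check of this argument, here is a minimal sketch (the idempotent matrix below is an illustrative choice, not part of the answer): if $A$ satisfies the polynomial relation $A^2 - A = 0$, each eigenvalue satisfies $\lambda^2 - \lambda = 0$.

```python
import numpy as np

# A matrix satisfying the polynomial relation p(A) = 0 with p(x) = x^2 - x
# (an idempotent/projection matrix, chosen purely for illustration).
A = np.array([[1.0, 1.0],
              [0.0, 0.0]])

# The matrix satisfies the relation: A^2 - A = 0.
assert np.allclose(A @ A - A, 0)

# Each eigenvalue satisfies the same relation: lambda^2 - lambda = 0.
for lam in np.linalg.eigvals(A):
    print(lam, lam**2 - lam)   # prints 1.0 0.0 and 0.0 0.0
```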
answered 4 hours ago by P Vanchinathan (edited 1 hour ago)
By the Jordan decomposition (or, for a Hermitian matrix, the spectral decomposition), the matrices formed from a basis of eigenvectors cancel, and only the middle factor is retained (a diagonal matrix for a Hermitian matrix, a bidiagonal Jordan matrix for a general matrix), which contains the eigenvalues. In this fashion the eigenvalues also satisfy the matrix equation.
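To make this concrete, here is a minimal sketch assuming $A$ is diagonalizable (the matrix and polynomial below are illustrative, not from the answer): writing $A = VDV^{-1}$ gives $p(A) = V\,p(D)\,V^{-1}$ for any polynomial $p$, so $p(A) = 0$ forces $p(\lambda) = 0$ for each diagonal entry of $D$.

```python
import numpy as np

# Assumption: A is diagonalizable, so A = V D V^{-1}.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
eigvals, V = np.linalg.eig(A)
D = np.diag(eigvals)

# p(x) = (x - 2)(x - 3) = x^2 - 5x + 6 annihilates A (Cayley-Hamilton).
def p(X):
    return X @ X - 5 * X + 6 * np.eye(2)

print(np.allclose(p(A), V @ p(D) @ np.linalg.inv(V)))  # True: basis factors carry through p
print(np.allclose(p(A), 0))                            # True: the matrix relation
print(np.diag(p(D)))                                   # [0. 0.]: each eigenvalue satisfies p
```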
answered 4 hours ago by Lin Xuelei
It follows from the derivation and overarching idea of eigenvalues: if $A$ is a matrix and $\lambda$ is an eigenvalue of $A$, then for some non-zero vector $\vec v$ (an eigenvector of $A$), we have
$$A\vec v = \lambda \vec v$$
or equivalently, where $I$ is the identity matrix,
$$A\vec v - \lambda \vec v = (A - \lambda I)\vec v = 0$$
This doesn't mean it works for "every" equation; otherwise linear algebra as a field of study would just be reduced to finding eigenvalues. The eigenvalue equation basically means "multiplying this particular vector by the matrix is no different from scaling it by one of the matrix's eigenvalues."
An example where this fails is a matrix multiplied by the identity matrix. A simple example:
$$\begin{bmatrix} 6 & -1 \\ 2 & 3 \end{bmatrix}\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 6 & -1 \\ 2 & 3 \end{bmatrix} \;\;\; \neq \;\;\; 4\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 4 & 0 \\ 0 & 4 \end{bmatrix}$$
The left matrix has eigenvalues $4$ and $5$, but clearly these products are not equal. Eigenvalues have some pretty neat properties and applications, but they're not quite a cure-all.
Of course, I might be playing dumb by pulling such a trivial example out of my hat, so if there's some relevant context you think appropriate, it might help explain what you're trying to get at.
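A quick numerical check of this counterexample (a minimal sketch using the values from the answer):

```python
import numpy as np

# The matrix from the counterexample above.
A = np.array([[6.0, -1.0],
              [2.0,  3.0]])
I = np.eye(2)

print(np.sort(np.linalg.eigvals(A)))   # [4. 5.]: the eigenvalues of A
print(np.allclose(A @ I, 4 * I))       # False: A times I is A itself, not 4 times I
```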
answered 4 hours ago by Eevee Trainer