Showing that every rational eigenvalue of a graph is integral
(This is taken from the exercises in Bondy and Murty's Graph Theory.) Let the adjacency matrix of a graph $G$ be denoted by $\mathbf{A}$. The eigenvalues $\lambda$ of $G$ are defined as the roots of the characteristic polynomial of $\mathbf{A}$, $\lvert \mathbf{A}-\lambda\mathbf{I}\rvert=0$. I am asked to show that if $\lambda\in\mathbb{Q}$, then $\lambda\in\mathbb{Z}$.
I have already managed to show that $\lambda$ is real: since $\mathbf{A}$ is symmetric by definition, it is a standard linear algebra fact that its eigenvalues are real. Then, if $G$ is simple, all diagonal entries of $\mathbf{A}$ are $0$, and since the eigenvalues of a matrix have the same signs as the diagonal entries (I hope!), all the eigenvalues would be $0$. However, I'm not given that $G$ is simple, so this line of reasoning doesn't work, and in any case I have no idea where the rationality of $\lambda$ would enter the proof. Any help is appreciated!
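A quick NumPy check (an illustrative aside, not part of the original question) shows the sign claim cannot hold in general: for the single edge $K_2$ the diagonal is all zeros, yet the eigenvalues are $\pm 1$.

```python
import numpy as np

# Adjacency matrix of K_2 (a single edge): every diagonal entry is 0,
# yet the eigenvalues are -1 and +1, so eigenvalue signs need not
# match the signs of the diagonal entries.
A = np.array([[0, 1],
              [1, 0]])
print(np.linalg.eigvalsh(A))  # [-1.  1.]
```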
linear-algebra matrices graph-theory eigenvalues-eigenvectors symmetric-matrices
The eigenvalues of a matrix do not necessarily have the same signs as the diagonal entries.
– Misha Lavrov
3 hours ago
I remember learning about something like that in linear algebra... Or is it only true for a tridiagonal matrix or something?
– user496634
3 hours ago
The characteristic polynomial $\det(tI-A)\in\mathbb{Z}[t]$ is monic, so ...
– user10354138
3 hours ago
You can argue that for a symmetric matrix $A$, if some of the diagonal entries are negative, then the eigenvalues can't all be positive (or vice versa), which might be what you're thinking of. But that's a much weaker claim.
– Misha Lavrov
3 hours ago
I see, thanks. I will update the question to reflect that.
– user496634
3 hours ago
asked 4 hours ago
user496634
2 Answers
Accepted answer (answered 3 hours ago by Misha Lavrov):
Let $\mathbf{x}$ be an eigenvector for a rational eigenvalue $\lambda$: that is, $(\mathbf{A} - \lambda\mathbf{I})\mathbf{x} = \mathbf{0}$.
First, we can argue that if $\lambda$ is rational, then the entries of $\mathbf{x}$ can all be taken rational. There's probably some clever argument here, but basically the reason is that solving this system of equations by Gaussian elimination never introduces irrational numbers, and all the entries of $\mathbf{A} - \lambda\mathbf{I}$ start out rational.
Next, we can assume that the entries of $\mathbf{x}$ are integers with greatest common divisor $1$, because we can scale $\mathbf{x}$ to get another eigenvector for which this holds. (Stop if you think you see where this is going.) We have $\mathbf{A}\mathbf{x} = \lambda\mathbf{x}$, and here at least $\mathbf{A}\mathbf{x}$ is an all-integer vector. So $\lambda\mathbf{x}$ is also an all-integer vector.
If $\lambda = \frac{p}{q}$ had a denominator $q$ other than $\pm 1$, then we could find an entry $x_i$ of $\mathbf{x}$ not divisible by $q$ (since the entries have GCD $1$), and then $\lambda x_i$ would not be an integer. Since we know this doesn't happen, $\lambda$ must be an integer.
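The scaling step can be sketched in a few lines of Python (a minimal illustration of the argument, not from the original answer; `primitive_integer_vector` is a hypothetical helper, and $K_3$ is used as the example graph):

```python
from fractions import Fraction
from functools import reduce
from math import gcd, lcm

def primitive_integer_vector(x):
    """Scale a rational vector to integer entries with gcd 1."""
    m = reduce(lcm, (xi.denominator for xi in x))  # clear denominators
    ints = [int(xi * m) for xi in x]
    g = reduce(gcd, (abs(v) for v in ints))        # make the entries coprime
    return [v // g for v in ints]

# Adjacency matrix of K_3; lambda = 2 is a rational (in fact integer)
# eigenvalue with eigenvector (1, 1, 1).
A = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
lam = 2
x = primitive_integer_vector([Fraction(1, 3)] * 3)  # -> [1, 1, 1]

# A x = lambda x entrywise, and A x is integral, so lambda x is too;
# for a primitive x this is only possible when lambda is an integer.
Ax = [sum(a * xi for a, xi in zip(row, x)) for row in A]
assert Ax == [lam * xi for xi in x]
print(x, Ax)  # [1, 1, 1] [2, 2, 2]
```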
Answer (answered 1 hour ago by Robert Israel):
More generally, if $A$ is a square matrix with integer entries, the characteristic polynomial of $A$ is a monic polynomial with integer coefficients, so the eigenvalues of $A$ are algebraic integers. Every algebraic integer that is a rational number is an ordinary integer (this is the rational root theorem applied to a monic polynomial).
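The rational-root step can be checked in Python (an illustrative sketch, not from the original answer, using $K_3$, whose characteristic polynomial is $\det(tI-A) = t^3 - 3t - 2$):

```python
def eval_poly(coeffs, t):
    """Evaluate a polynomial given by coefficients, highest degree first."""
    v = 0
    for c in coeffs:
        v = v * t + c
    return v

def rational_roots_monic(coeffs):
    """Rational roots of a monic integer polynomial with nonzero constant
    term. By the rational root theorem, a root p/q in lowest terms has q
    dividing the leading coefficient 1, so every rational root is an
    integer dividing the constant term."""
    const = coeffs[-1]
    assert coeffs[0] == 1 and const != 0
    divisors = [d for d in range(1, abs(const) + 1) if const % d == 0]
    candidates = {s * d for d in divisors for s in (1, -1)}
    return sorted(t for t in candidates if eval_poly(coeffs, t) == 0)

# Characteristic polynomial of K_3's adjacency matrix:
# det(tI - A) = t^3 - 3t - 2 = (t - 2)(t + 1)^2.
print(rational_roots_monic([1, 0, -3, -2]))  # [-1, 2]
```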