Showing that every rational eigenvalue of a graph is integral












(This is taken from the exercises in Bondy and Murty's Graph Theory.) Let the adjacency matrix of a graph $G$ be denoted by $\mathbf{A}$. The eigenvalues $\lambda$ of $G$ are defined as the roots of the characteristic polynomial of $\mathbf{A}$, $\lvert \mathbf{A}-\lambda\mathbf{I}\rvert=0$. I am asked to show that, if $\lambda\in\mathbb{Q}$, then $\lambda\in\mathbb{Z}$.



I have already managed to show that $\lambda$ is real: since $\mathbf{A}$ is symmetric by definition, it is a standard linear algebra fact that its eigenvalues are real. Then, if $G$ is simple, all diagonal entries of $\mathbf{A}$ are $0$, and since the eigenvalues of a matrix have the same signs as the diagonal entries (I hope!), all the eigenvalues would be $0$. However, I'm not given that $G$ is simple, so this line of reasoning doesn't work, and in any case I have no idea where the rationality of $\lambda$ enters the proof. Any help is appreciated!
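(A quick numerical check, added for illustration and not part of the original post, shows why the diagonal-sign idea can't work in general: the adjacency matrix of the simple graph $K_2$ has an all-zero diagonal, yet its eigenvalues are $+1$ and $-1$.)

```python
# Illustrative counterexample (not from the original question): the
# adjacency matrix of K_2 has an all-zero diagonal, yet its eigenvalues
# are +1 and -1, so eigenvalue signs need not match the diagonal entries.
import numpy as np

A = np.array([[0, 1],
              [1, 0]])                 # adjacency matrix of K_2
eigenvalues = np.linalg.eigvalsh(A)    # eigvalsh: eigenvalues of a symmetric matrix
print(sorted(eigenvalues))             # approximately [-1.0, 1.0]
```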































  • The eigenvalues of a matrix do not necessarily have the same signs as the diagonal entries.
    – Misha Lavrov
    3 hours ago










  • I remember learning about something like that in linear algebra... Or is it only true for a tridiagonal matrix or something?
    – user496634
    3 hours ago










  • The characteristic polynomial $\det(tI-A)\in\mathbb{Z}[t]$ is monic, so ...
    – user10354138
    3 hours ago










  • You can argue that for a symmetric matrix $A$, if some of the diagonal entries are negative, then the eigenvalues can't all be positive (or vice versa), which might be what you're thinking of. But that's a much weaker claim.
    – Misha Lavrov
    3 hours ago










  • I see, thanks. I will update the question to reflect that.
    – user496634
    3 hours ago














linear-algebra matrices graph-theory eigenvalues-eigenvectors symmetric-matrices






asked 4 hours ago by user496634





















2 Answers























(Accepted answer)










Let $\mathbf{x}$ be an eigenvector for a rational eigenvalue $\lambda$: that is, $(\mathbf{A} - \lambda \mathbf{I})\mathbf{x} = \mathbf{0}$.



First, we can argue that if $\lambda$ is rational, then the entries of $\mathbf{x}$ can all be taken rational. There's probably some clever argument here, but essentially the reason is that solving this system of equations by Gaussian elimination never introduces irrational numbers, and all the entries of $\mathbf{A} - \lambda \mathbf{I}$ start out rational.



Next, we can assume that the entries of $\mathbf{x}$ are integers with greatest common divisor $1$, because we can scale $\mathbf{x}$ to get another eigenvector for which this holds. (Stop if you think you see where this is going.) We have $\mathbf{A}\mathbf{x} = \lambda\mathbf{x}$, and the left-hand side $\mathbf{A}\mathbf{x}$ is an all-integer vector. So $\lambda\mathbf{x}$ is also an all-integer vector.



If $\lambda = \frac{p}{q}$ (in lowest terms) had a denominator $q$ other than $\pm 1$, then we could find an entry $x_i$ of $\mathbf{x}$ not divisible by $q$ (since the entries have GCD $1$), and then $\lambda x_i$ would not be an integer. Since we know this doesn't happen, $\lambda$ must be an integer.

– Misha Lavrov, answered 3 hours ago
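As a concrete illustration of this argument (my own sketch, assuming SymPy, not part of the answer): the complete graph $K_4$ has rational eigenvalues $3$ and $-1$, and both are integers, as the proof predicts.

```python
# Sketch of a sanity check (assumed example, not from the answer):
# every rational eigenvalue of an integer adjacency matrix is an integer.
from sympy import Matrix

A = Matrix([[0, 1, 1, 1],
            [1, 0, 1, 1],
            [1, 1, 0, 1],
            [1, 1, 1, 0]])            # adjacency matrix of K_4

for lam in A.eigenvals():             # exact eigenvalues (keys of a dict)
    if lam.is_rational:
        assert lam.is_integer         # the claim being proved
print(sorted(A.eigenvals()))          # [-1, 3]
```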














































More generally, if $A$ is a square matrix whose entries are integers, the characteristic polynomial of $A$ is a monic polynomial with integer coefficients, so the eigenvalues of $A$ are algebraic integers. Every algebraic integer that is a rational number is an ordinary integer (see e.g. here).

– Robert Israel, answered 1 hour ago
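To see the monic-with-integer-coefficients point concretely (a sketch of my own, assuming SymPy), one can compute $\det(tI - A)$ for a small adjacency matrix; by the rational root theorem, any rational root of a monic integer polynomial is an integer.

```python
# Sketch (assumed example): the characteristic polynomial of an integer
# matrix is monic with integer coefficients, so by the rational root
# theorem any rational eigenvalue is an integer.
from sympy import Matrix, symbols

t = symbols('t')
A = Matrix([[0, 1, 0],
            [1, 0, 1],
            [0, 1, 0]])               # adjacency matrix of the path P_3

p = A.charpoly(t)                     # det(tI - A)
coeffs = p.all_coeffs()               # t^3 - 2t  ->  [1, 0, -2, 0]
assert coeffs[0] == 1                 # monic
assert all(c.is_integer for c in coeffs)  # integer coefficients
print(coeffs)                         # [1, 0, -2, 0]
```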





