Can a set of vectors be linearly independent in one vector space, but be linearly dependent in another vector space?

For example, let $S = \{(x_1, x_1) : x_1 \in \mathbb{R}\}$ be a subspace of $\mathbb{R}^2$.

By definition, $\dim(S) = 1$ and $\dim(\mathbb{R}^2) = 2$.

Then the set $\{(1, 1)\}$ only has one vector, so is it linearly independent in $S$ but linearly dependent in $\mathbb{R}^2$?

I know this doesn't make any sense, but we learned in class that a set of vectors can only be linearly independent if it spans the vector space that it is in.

Since $(1,1)$ is in both $S$ and $\mathbb{R}^2$, but the set $\{(1, 1)\}$ only spans $S$, how come it is not linearly independent only in $S$ and linearly dependent in $\mathbb{R}^2$ (since $(1,1)$ does not span $\mathbb{R}^2$)?

Sorry for this stupid question.










linear-algebra

asked 1 hour ago by Tim Weah








  • Vectors are linearly independent of other vectors in the same space, so it doesn't make sense to discuss them in different spaces.
    – CyclotomicField
    1 hour ago






  • If the set $\{v_1,\dots,v_n\}$ doesn't span $V$, then it need not be linearly dependent, unless $n=\dim V$. In your case the set consists of one vector and $\mathbb{R}^2$ has dimension $2$.
    – egreg
    1 hour ago






  • "we learned in class that a set of vectors can only be linearly independent if it spans the vector space that it is in." You must have misunderstood something. Two non-parallel vectors are linearly independent in $\Bbb R^3$, for example, but they certainly don't span $\Bbb R^3$.
    – dxiv
    1 hour ago










  • $\{1\}$ is linearly independent in $\mathbb{R}$ with the usual $\mathbb{R}$-vector space structure; but if instead you use the vector space structure $x \oplus y = x + y - 1$, $\lambda \odot x = \lambda x - \lambda + 1$, then $\{1\}$ is linearly dependent with respect to this structure.
    – Daniel Schepler
    1 hour ago










  • On the other hand, if $W$ is a subspace of $V$ and $S \subseteq W$, then $S$ is linearly independent as a subset of the vector space $W$ if and only if $S$ is linearly independent as a subset of the vector space $V$.
    – Daniel Schepler
    1 hour ago















5 Answers













Suppose $\{v_1,v_2,\dots,v_n\}$ is a basis of $V$. Then
$$
\{v_2,\dots,v_n\}
$$
doesn't span $V$, but it is linearly independent.

A set of vectors may fail to span $V$, but it can still be linearly independent.

The only case when you can infer linear dependence from the fact that a set fails to span $V$ is when the set has the same number of elements as the dimension of $V$.

In your case this condition is not satisfied, because $\{(1,1)\}$ consists of one element, but the dimension of $\mathbb{R}^2$ is two.
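Concretely, in the asker's case one can check the definition directly (using the standard operations on $\mathbb{R}^2$):
$$
c\,(1,1) = (0,0) \;\Longrightarrow\; (c,c) = (0,0) \;\Longrightarrow\; c = 0,
$$
so $\{(1,1)\}$ is linearly independent in $\mathbb{R}^2$ even though it does not span $\mathbb{R}^2$.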






– egreg
answered 1 hour ago













You said you “learned in class that a set of vectors can only be linearly independent if it spans the vector space that it is in.” This isn’t correct, unless “the vector space that it is in” means “the smallest vector space that it is in,” which would not be a typical reader’s understanding. But even with that understanding, the statement is not useful, because the fact is not special to linearly independent sets. Every set of vectors spans the smallest vector space that contains them: a set of vectors spans its span. (That’s practically the definition of span.)

I suspect what you were supposed to learn in class was that “A set of vectors in a vector space $V$ can only be a basis for $V$ if the set spans $V$.” (In addition, it must be a linearly independent set of vectors.)

In particular, the set $\{(1,1)\}$ is a linearly independent set, whether it is considered as a set of vectors in $\mathbb{R}^2$ or as a set of vectors in what you call $S$. The fact that $\{(1,1)\}$ does not span $\mathbb{R}^2$ does not tell you anything about the linear independence of the set. (And by the way, any set containing only one nonzero vector is a linearly independent set of vectors.)
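That last parenthetical remark has a one-line justification, valid over any field: if $v \neq 0$ and
$$
c\,v = 0 \quad\text{with}\quad c \neq 0,
$$
then $v = c^{-1}(c\,v) = c^{-1}\,0 = 0$, a contradiction; so the only relation is the trivial one $c = 0$, and $\{v\}$ is linearly independent.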






– Steve Kass
answered 1 hour ago









    • Slight correction in first paragraph: A set is not linearly independent if it spans the smallest space that it's contained in...
      – Chris Custer
      46 mins ago






























As egreg explained, your example is wrong.

Suppose $V$ and $W$ are both subspaces of a vector space $X$.
Then any set of vectors $v_1, \ldots, v_k$ in the intersection $V \cap W$ that is linearly dependent as a subset of the vector space $V$ is also linearly dependent as a subset of $W$. This is because both linear dependence statements are equivalent to the existence of scalars $c_1, \ldots, c_k$, not all $0$, such that $c_1 v_1 + \ldots + c_k v_k = 0$.
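As an illustration in the asker's setting, with $S = \{(x_1,x_1) : x_1 \in \mathbb{R}\}$ and $X = \mathbb{R}^2$: the vectors $(1,1)$ and $(2,2)$ lie in $S$, and the single relation
$$
2\,(1,1) - (2,2) = (0,0)
$$
witnesses their linear dependence whether they are regarded as elements of $S$ or of $\mathbb{R}^2$.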






– Robert Israel
answered 1 hour ago













Firstly, the definition

... that a set of vectors can only be linearly independent if it spans the vector space that it is in.

is not correct.

A set $S$ of vectors is linearly dependent if there is some $v \in S$ that can be written as a linear combination of other vectors from $S$. It is linearly independent otherwise.

More formally, a set of vectors $S = \{v_1,\ldots,v_k\}$ is linearly independent if
$$\sum_{i=1}^{k}\alpha_i v_i = 0 \quad \Rightarrow \quad \alpha_1 = \cdots = \alpha_k = 0$$
That is, the only way we can make zero from vectors in $S$ is by multiplying every vector by zero.

It is easy to see that if we remove vectors from a linearly independent $S$, it stays linearly independent. Conversely, if we add a vector to $S$, it may become dependent. The maximum cardinality (number of elements) of a linearly independent set of vectors in an $n$-dimensional vector space $V$ is $n$, and such a set is then called a basis for $V$. It has the property that it spans all of $V$; that is, every $v \in V$ can be represented as a linear combination of vectors from the basis.

Note that the cardinality of a set of vectors tells us nothing precise about its linear (in)dependence.

example 1: $\{(1,1),(2,2)\}$ is linearly dependent in $\mathbb{R}^2$ because $2 \cdot (1,1) = (2,2)$.
example 2: $\{(1,1)\}$ is linearly independent in $\mathbb{R}^2$ but not a basis for $\mathbb{R}^2$ because $|S| = 1 \neq 2 = \dim \mathbb{R}^2$.
example 3: $\{(1,1),(2,3)\}$ is linearly independent in $\mathbb{R}^2$ and also a basis for $\mathbb{R}^2$ because $|S| = 2 = \dim \mathbb{R}^2$.
example 4: $\{(1,1),(2,3),v\}$ must be linearly dependent in $\mathbb{R}^2$, for any $v \in \mathbb{R}^2$.
example 5: $\{(1,1)\}$ is linearly independent in your space $S := \{(x_1,x_1) : x_1 \in \mathbb{R}\}$ and a basis for it.
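A quick way to verify the independence claimed in example 3 is the determinant test: the matrix with $(1,1)$ and $(2,3)$ as columns satisfies
$$
\det\begin{pmatrix} 1 & 2 \\ 1 & 3 \end{pmatrix} = 1\cdot 3 - 2\cdot 1 = 1 \neq 0,
$$
so the two vectors are linearly independent, and since there are $2 = \dim \mathbb{R}^2$ of them, they indeed form a basis.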






– Sandro Lovnički
answered 1 hour ago













I think it's safe to say that if a set of vectors is linearly independent in one space, then it will be in any space that contains it.

This seems clear for any superspace $W \supset V$...

Also, when a different space contains our space... For instance, take $V =$ the $x$-axis, $W =$ the $xy$-plane and $U =$ the $xz$-plane in $\mathbb{R}^3$. We simply get that $(1,0,0)$ is linearly independent in (actually) all three spaces.

The reason is that the defining condition, namely that only the trivial linear combination is zero, is independent of the surrounding space...

One could say, for instance, that a set is linearly independent $\iff$ it is a basis for the space it spans. Again, notice there is no mention of any (larger) surrounding space.

As pointed out by @egreg, if there are $n$ vectors and $\dim V = n$, then it is true that the set is linearly independent iff it spans $V$.

In light of @Daniel Schepler's comments, we should note that by changing the base field, or the vector space "structure", we can get some different results. This shows your question may be a little better than you originally thought (though it was an error that led you to it)...
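To spell out Daniel Schepler's example: with $x \oplus y = x + y - 1$ and $\lambda \odot x = \lambda x - \lambda + 1$ on $\mathbb{R}$, the zero vector of this structure is $1$, because $x \oplus 1 = x + 1 - 1 = x$ for every $x$. A set containing the zero vector is automatically linearly dependent; here, for instance,
$$
2 \odot 1 = 2\cdot 1 - 2 + 1 = 1,
$$
so the nonzero scalar $2$ already produces the zero vector from $\{1\}$, a nontrivial dependence.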






– Chris Custer
answered 57 mins ago