Show that, for square matrices $A$ and $B$, $A+B=AB$ implies $AB=BA$.
Let $A$ and $B$ be two $n$-by-$n$ real matrices such that $A+B = AB$.
How do I prove that $AB= BA$?
I have tried using the trace of $A+B-AB$, but I could not get anywhere with it.
Kindly provide me with hints.
Tags: linear-algebra, matrices
I want to remark that, for a vector space $V$ over a field $K$ and for two $K$-linear operators $A,B\colon V\to V$ such that $A+B=A\circ B$, if $V$ is finite-dimensional over $K$, then $A$ and $B$ commute (i.e., $A\circ B=B\circ A$). However, the result does not hold if $V$ is infinite-dimensional over $K$.
– Batominovski, 6 mins ago
Take, for example, $V$ to be the vector space of infinite sequences $\mathbf{x}:=\left(x_i\right)_{i\in\mathbb{Z}_{>0}}$ of elements $x_1,x_2,x_3,\ldots\in K$. For each $\mathbf{x}:=\left(x_i\right)_{i\in\mathbb{Z}_{>0}}\in V$, define $$A(\mathbf{x}):=\left(x_1-x_2,x_2-x_3,x_3-x_4,x_4-x_5,x_5-x_6,\ldots\right)$$ and $$B(\mathbf{x}):=\left(x_1,x_2-x_1,x_3-x_2,x_4-x_3,x_5-x_4,\ldots\right)\,.$$
– Batominovski, 6 mins ago
Then, for every $\mathbf{x}:=(x_i)_{i\in\mathbb{Z}_{>0}}$ in $V$, $$(A+B)(\mathbf{x})=\left(2x_1-x_2,2x_2-x_1-x_3,2x_3-x_2-x_4,\ldots\right)$$ and $$(A\circ B)(\mathbf{x})=A\left(x_1,x_2-x_1,x_3-x_2,\ldots\right)=\left(2x_1-x_2,2x_2-x_1-x_3,2x_3-x_2-x_4,\ldots\right)\,,$$ but $$(B\circ A)(\mathbf{x})=B\left(x_1-x_2,x_2-x_3,x_3-x_4,\ldots\right)=\left(x_1-x_2,2x_2-x_1-x_3,2x_3-x_2-x_4,\ldots\right)\,.$$ Thus, $$A+B=A\circ B\text{ but }A\circ B\neq B\circ A\,.$$
– Batominovski, 5 mins ago
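These operators are just forward and backward difference maps, and the failure of commutativity can be checked numerically on truncated sequences. A minimal sketch of such a check (my own addition, assuming NumPy; only the first few coordinates are compared, since each application of $A$ loses one trailing entry):

import numpy as np

# Sketch: verify the counterexample from the comments on a truncated sequence.
rng = np.random.default_rng(0)
n = 12
x = rng.standard_normal(n)             # stand-in for an infinite sequence

def A(v):
    # A(x)_i = x_i - x_{i+1}; the last coordinate is lost on truncation
    return v[:-1] - v[1:]

def B(v):
    # B(x)_1 = x_1 and B(x)_i = x_i - x_{i-1} for i >= 2
    return np.concatenate(([v[0]], v[1:] - v[:-1]))

k = n - 2                              # coordinates where all three maps are defined
sum_AB = (A(x) + B(x)[:-1])[:k]        # (A + B)(x)
A_of_B = A(B(x))[:k]                   # (A o B)(x)
B_of_A = B(A(x))[:k]                   # (B o A)(x)

print(np.allclose(sum_AB, A_of_B))     # True:  A + B = A o B
print(np.allclose(A_of_B, B_of_A))     # False: B o A differs in the first coordinate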
2 Answers
Answer (score 7), by Lord Shark the Unknown, answered 22 mins ago:
$A+B=AB$ is equivalent to $(I-A)(I-B)=I$. As $I-A$ and $I-B$ are square, this implies $(I-B)(I-A)=I$, etc.
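Filling in the steps of this hint (my own expansion, not part of the original answer): since $A+B=AB$, $$(I-A)(I-B)=I-A-B+AB=I\,.$$ Because $I-A$ and $I-B$ are square, a one-sided inverse is automatically two-sided, so also $$(I-B)(I-A)=I-A-B+BA=I\,,$$ and comparing the two expansions gives $$BA=A+B=AB\,.$$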
@tony then mark it as answered.
– user25959, 14 mins ago
How do I do that?
– tony, 13 mins ago
@user25959 You cannot accept an answer until 15 minutes after it was posted.
– cansomeonehelpmeout, 12 mins ago
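A quick numerical sanity check of the statement, built on the factorization above (my own sketch, assuming NumPy; the construction $A=I-(I-B)^{-1}$ is just one convenient way to manufacture a pair with $A+B=AB$):

import numpy as np

# Sketch: construct a random pair with A + B = AB and confirm AB = BA.
rng = np.random.default_rng(1)
n = 5
I = np.eye(n)
B = rng.standard_normal((n, n))    # I - B is invertible with probability 1
A = I - np.linalg.inv(I - B)       # then (I - A)(I - B) = I, hence A + B = AB

print(np.allclose(A + B, A @ B))   # True: the hypothesis holds
print(np.allclose(A @ B, B @ A))   # True: A and B commute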
Answer (score -3), by Lukas Kofler, answered 19 mins ago:
Addition of matrices is always commutative, so $$BA = B + A = A + B = AB$$
How do you know that $BA = B + A$?
– Theo Bendit, 18 mins ago
...I don't think that this deserves the down vote, as the commutative addition $A + B = B + A$ implies that swapping the values does not change the result.
– simon.watts, 1 min ago
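To make Theo Bendit's objection concrete (my own addition, assuming NumPy): for matrices that do not satisfy the hypothesis $A+B=AB$, addition still commutes, yet $BA=B+A$ fails, so that step is exactly what needs to be proved rather than a consequence of commutative addition alone.

import numpy as np

# Sketch: addition commutes for any A, B, but BA = B + A does not follow from that.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

print(np.allclose(A + B, B + A))   # True: matrix addition is commutative
print(np.allclose(B @ A, B + A))   # False (generically): the claimed step needs A + B = AB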