Proof verification for Identity matrices
So I have the following question:
Analyze the following 'Claim' (which may or may not be true) and the corresponding 'Proof', by writing 'TRUE' or 'FALSE' (together with the reason) for each step. [Note: $I_n$ is the $n \times n$ identity matrix.]
Claim: Let $A$ be any $n \times n$ matrix satisfying $A^2=I_n$. Then either $A=I_n$ or $A=-I_n$.
'Proof'.
Step 1: $A$ satisfies $A^2-I_n = 0$ (True or False)
True.
My reasoning: Clearly, this is true. $A^2=I_n$ is not true for an arbitrary matrix, but here it is given, so I should have no problem moving the identity matrix to the LHS.
Step 2: So $(A+I_n)(A-I_n)=0$ (True or false)
True.
My reasoning: Because $I_n$ is the identity matrix, there should be no issues with factoring just like normal algebra.
Step 3: $A+I_n=0$ or $A-I_n=0$
I'm not sure about this part. I'm very tempted to say this is fine but I am not sure how I can justify this, if I even can.
Therefore $A=-I_n$ or $A=I_n$. (End of 'Proof'.)
Is what I am doing right so far or am I messing up somewhere?
linear-algebra matrices proof-verification
asked Nov 30 at 7:51
Future Math person
What's the square of $A = \begin{pmatrix}1 & 0 \\ 0 & -1\end{pmatrix}$?
– md2perpe
Nov 30 at 17:55
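The comment's counterexample is easy to verify directly; here is a minimal NumPy sketch of that check (an illustration only, not part of the original thread):

```python
import numpy as np

# The suggested counterexample: A is neither I_2 nor -I_2,
# yet its square is the 2x2 identity.
A = np.array([[1, 0],
              [0, -1]])

print(A @ A)                                        # [[1 0] [0 1]]
print(np.array_equal(A @ A, np.eye(2, dtype=int)))  # True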
3 Answers
Rather than saying that we move the identity to the LHS, it is more precise to say that we add $-I$ to both sides.
We have $A^2-I=(A-I)(A+I)$; we just have to expand the right-hand side to verify that.
For matrices, $AB=0$ doesn't imply that $A=0$ or $B=0$. For example,
$$\begin{bmatrix} 2 & 0 \\ 0 & 0\end{bmatrix}\begin{bmatrix} 0 & 0 \\ 0 & -2\end{bmatrix}= \begin{bmatrix} 0 & 0 \\ 0 & 0\end{bmatrix}$$
In particular,
$$\left(\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}+\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}\right)\left(\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}-\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}\right)= \begin{bmatrix} 0 & 0 \\ 0 & 0\end{bmatrix}$$
That is, we can't conclude that $(A+I)(A-I)=0$ implies $A+I=0$ or $A-I=0$.
edited Dec 1 at 5:08
answered Nov 30 at 7:57
Siong Thye Goh
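As a numerical sanity check of this answer, here is a short NumPy sketch of the zero-divisor phenomenon (the arrays simply mirror the matrices above):

```python
import numpy as np

# Nonzero matrices whose product is the zero matrix:
# AB = 0 does not force A = 0 or B = 0.
A = np.array([[2, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, -2]])
print(A @ B)  # [[0 0] [0 0]]

# The same failure with factors of the form M + I and M - I,
# matching the 'Proof' being analyzed: M^2 = I but M != +/-I.
M = np.array([[1, 0],
              [0, -1]])
I = np.eye(2, dtype=int)
print((M + I) @ (M - I))  # [[0 0] [0 0]], yet neither factor is zero
```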
Aha. I knew something looked fishy with that last part. Thanks!
– Future Math person
Nov 30 at 7:59
When expanding the right hand side, be sure to respect the non-commutativity of matrix multiplication: $$(A+I)(A-I) = (A+I)A - (A+I)I$$ $$\qquad = A^2 + IA - AI - I^2 = A^2 - I^2.$$ Although the difference in this particular problem is negligible, it is not generally so.
– Eric Towers
Nov 30 at 13:36
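A quick numerical illustration of this comment (a sketch, assuming NumPy): the cross terms $IA$ and $-AI$ cancel only because one factor is the identity, which commutes with everything, so the expansion holds even for a generic $A$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # a generic matrix; it need not commute
I = np.eye(n)                    # with anything, but I commutes with all

lhs = (A + I) @ (A - I)
rhs = A @ A - I                  # the careful expansion from the comment
print(np.allclose(lhs, rhs))     # True
```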
In the third point it should be "that $A=0$ or $B=0$".
– Lonidard
Nov 30 at 14:29
thanks for pointing that out.
– Siong Thye Goh
Nov 30 at 14:33
I would like this answer better if $A$ and $B$ were of the form $A\pm I$ so that it matches OP's proof. Using the other answer, you could take $A=\operatorname{diag}(2,0)$ and $B=\operatorname{diag}(0,-2)$, so that they are $\operatorname{diag}(1,-1)\pm I$.
– Teepeemm
Nov 30 at 18:45
Consider any diagonal matrix $A$ with diagonal entries $\pm 1$. It is easy to show that $A^{2}=I_n$, so you get $2^{n}$ matrices whose square is $I_n$.
answered Nov 30 at 7:53
Kavi Rama Murthy
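A brute-force check of this count (a minimal sketch, assuming NumPy; $n=3$ keeps the output small):

```python
import numpy as np
from itertools import product

n = 3
# All 2**n diagonal matrices with +/-1 entries on the diagonal.
matrices = [np.diag(signs) for signs in product([1, -1], repeat=n)]

# Every one of them squares to the identity.
assert all(np.array_equal(A @ A, np.eye(n, dtype=int)) for A in matrices)
print(len(matrices))  # 8, i.e. 2**3 such square roots of I_3
```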
Furthermore, if you want a concrete example of a matrix whose square is the identity but which is not itself such a simple matrix, consider this one:
$$\begin{bmatrix}\frac{1}{2} & \frac{3}{4} \\ 1 & -\frac{1}{2}\end{bmatrix}$$
Such matrices are called involutory.
answered Nov 30 at 15:43
gota
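Verifying this particular matrix numerically (a sketch, assuming NumPy; the entries are dyadic rationals, so the floating-point arithmetic happens to be exact here):

```python
import numpy as np

# gota's example: not diagonal, yet A^2 = I_2.
A = np.array([[0.5,  0.75],
              [1.0, -0.5]])

print(A @ A)                          # [[1. 0.] [0. 1.]]
print(np.allclose(A @ A, np.eye(2)))  # True
```

Such examples are easy to manufacture: this matrix has trace $0$ and determinant $-1$, and by Cayley-Hamilton any $2 \times 2$ matrix with trace $0$ and determinant $-1$ satisfies $A^2 = I$.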