Is it possible to find the square of a matrix without calculating it directly?
There is a matrix
$$A = \begin{pmatrix} 0 & a & b \\ 1 & -b & -b \\ -1 & a & a \end{pmatrix},$$
where $a - b = 1$, and $A^2$ is the identity matrix of order $3$. Is it possible to establish this without actually multiplying out $A^2$? Please help.
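For reference, here is the direct computation the question hopes to avoid, as a minimal brute-force sketch. The sample values of $b$ are arbitrary assumptions; only the constraint $a - b = 1$ matters, and exact rationals are used so the check is not clouded by floating-point error.

```python
# Check A^2 = I for several (a, b) pairs with a - b = 1, using exact arithmetic.
from fractions import Fraction

def matmul(X, Y):
    # plain 3x3 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def A(a, b):
    return [[0, a, b], [1, -b, -b], [-1, a, a]]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

for b in [Fraction(0), Fraction(3), Fraction(-5, 2)]:
    a = b + 1                      # the constraint a - b = 1
    assert matmul(A(a, b), A(a, b)) == I3
```

Every tested pair satisfies $A^2 = I$, which is consistent with the claim in the question.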
matrices matrix-equations
Can you tell us why you don't want to calculate $A^2$? It seems to me like the most direct way to answer the problem.
– JonathanZ
Oct 27 '18 at 2:50
I want to know if there is any other way to find $A^2$, using elementary row operations or other matrix rules.
– Nilabja Saha
Oct 27 '18 at 4:49
It looks like you wrote the answer in your question already ("$A^2$ is an identity matrix of order $3$").
– D.B.
Oct 27 '18 at 4:58
asked Oct 27 '18 at 2:35 by Nilabja Saha (edited Oct 29 '18 at 17:13)
4 Answers
Yes, there's another way. It's certainly not easier, but it works. I suspect it uses a number of ideas that you have not yet encountered as well, but that's how life is sometimes. Anyhow, here goes:
First, you can compute the characteristic polynomial, $c(x) = det(A - xI)$, which is
\begin{align}
c(x)
&= \det \pmatrix{-x & a & b \\ 1 & -b - x & -b \\ -1 & a & a - x}\\
&= (-x)\left((-b-x)(a-x) + ab\right) - a\left((a-x) - b\right) + b\left(a + (-b-x)\right)\\
&= x\left((b+x)(a-x) - ab\right) - a\left(a - x - b\right) + b\left(a - b - x\right) && \text{substitute $a - b = 1$ to get}\\
&= x\left((b+x)(a-x) - ab\right) - a\left(1-x\right) + b\left(1-x\right) \\
&= x\left(ba + (a-b)x - x^2 - ab\right) - a\left(1-x\right) + b\left(1-x\right) \\
&= x\left(x - x^2\right) - a\left(1-x\right) + b\left(1-x\right) \\
&= x^2 - x^3 + (b-a)\left(1-x\right) \\
&= x^2 - x^3 - 1\cdot\left(1-x\right) \\
&= x^2 - x^3 - 1 + x \\
&= -x^3 + x^2 + x - 1 \\
&= -(x-1)^2(x+1)
\end{align}
That means that the eigenvalues of $A$ are $1, 1, -1$. Hence the Jordan normal form of $A$ is either
$$
A = P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}P
$$
or
$$
A = P^{-1}\pmatrix{1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}P
$$
for some invertible matrix $P$.
In the first case, there are two independent eigenvectors corresponding to the eigenvalue $1$; in the second there's only one. We can check which of these happens by looking at the nullspace of $A - 1\cdot I$, whose dimension is the number of independent eigenvectors for $+1$:
\begin{align}
A - 1\cdot I &=
\pmatrix{-1 & a & b \\ 1 & -b-1 & -b \\ -1 & a & a-1} && \text{substitute $a = b+1,\; a-1 = b$}\\
&= \pmatrix{-1 & b+1 & b \\ 1 & -(b+1) & -b \\ -1 & b+1 & b}
\end{align}
The second and third columns are obviously multiples of the first, so $A - 1\cdot I$ has rank $1$, i.e., there are two independent eigenvectors for the eigenvalue $1$. That means that for some matrix $P$, we have
$$
A = P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}P
$$
whence
\begin{align}
A^2
&= \left(P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}P\right)
\left(P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}P\right) \\
&= P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}(PP^{-1})\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}P \\
&= P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1}^2 P \\
&= P^{-1}\pmatrix{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1}P \\
&= P^{-1}IP \\
&= P^{-1}P \\
&= I.
\end{align}
It seems to me that it would have been a great deal easier to just compute $A^2$ directly. :)
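The factorization derived above can be double-checked numerically. A cubic polynomial is determined by its values at four or more points, so evaluating $\det(A - tI)$ at seven integers and matching $-(t-1)^2(t+1)$ confirms the characteristic polynomial; the sample value of $b$ below is an arbitrary assumption, and only $a - b = 1$ matters.

```python
# Verify det(A - t*I) == -(t-1)^2 (t+1) at enough points to pin down a cubic.
from fractions import Fraction

def det3(M):
    # cofactor expansion along the first row of a 3x3 matrix
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

b = Fraction(5); a = b + 1       # sample b; the constraint is a - b = 1
A = [[0, a, b], [1, -b, -b], [-1, a, a]]

for t in range(-3, 4):
    AtI = [[A[i][j] - (t if i == j else 0) for j in range(3)] for i in range(3)]
    assert det3(AtI) == -(t - 1) ** 2 * (t + 1)
```

Since the two cubics agree at seven points, they are identical, which is exactly the claim $c(x) = -(x-1)^2(x+1)$.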
What is $P$ here?
– Nilabja Saha
Oct 27 '18 at 6:37
In the Jordan Normal form, a theorem guarantees that there's some (square) matrix $P$ with the property that $P^{-1} A P$ is in "Jordan normal form". The exact nature of the matrix $P$ doesn't matter for this particular line of argument, however. I've added a phrase to clarify this.
– John Hughes
Oct 27 '18 at 12:53
answered Oct 27 '18 at 6:27 (edited Oct 27 '18 at 12:54) – John Hughes
Here is another approach. It is not as nice as the other answer, but it is shorter. Perform the paired elementary row and column operations (in the C programming language's notation)
row3 += row2; col2 -= col3
(adding row $2$ to row $3$ and then subtracting column $3$ from column $2$ is a similarity transform). We see that $A$ is similar to
$$
B = \pmatrix{0 & 1 & b \\ 1 & 0 & -b \\ 0 & 0 & 1}.
$$
One can verify that $\pmatrix{1 \\ 1 \\ 0}, \pmatrix{1 \\ -1 \\ 0}$ and $\pmatrix{b \\ -b \\ 2}$ are eigenvectors of $B$ corresponding to the eigenvalues $1, -1$ and $1$ respectively. When the underlying field is not of characteristic $2$, these eigenvectors are linearly independent. Therefore $B$, and in turn $A$, is similar to $\operatorname{diag}(1,-1,1)$. Hence their squares are equal to the identity matrix.
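The similarity and the three eigenpairs above can be verified mechanically. As a sketch, `E` and `Einv` are hypothetical names for the elementary matrices implementing the two operations (left-multiplying by `E` adds row 2 to row 3; right-multiplying by its inverse subtracts column 3 from column 2), and the sample value of $b$ is an assumption.

```python
# Verify E A E^{-1} == B and the three claimed eigenpairs of B.
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

b = Fraction(7); a = b + 1
A = [[0, a, b], [1, -b, -b], [-1, a, a]]
E    = [[1, 0, 0], [0, 1, 0], [0, 1, 1]]    # row 3 += row 2
Einv = [[1, 0, 0], [0, 1, 0], [0, -1, 1]]   # inverse: undoes the row operation

B = matmul(matmul(E, A), Einv)
assert B == [[0, 1, b], [1, 0, -b], [0, 0, 1]]

assert matvec(B, [1, 1, 0])  == [1, 1, 0]    # eigenvalue  1
assert matvec(B, [1, -1, 0]) == [-1, 1, 0]   # eigenvalue -1
assert matvec(B, [b, -b, 2]) == [b, -b, 2]   # eigenvalue  1
```

Because similarity preserves eigenvalues, this also confirms that $A$ is diagonalizable with spectrum $\{1, -1, 1\}$.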
Kindly explain "When the underlying field is not of characteristic 2, these eigenvectors are linearly independent. Therefore B and in turn A are similar to diag(1,−1,1) Hence their squares are equal to the identity matrix."
– Nilabja Saha
Oct 27 '18 at 13:35
@NilabjaSaha The first two vectors have zero last entries, therefore the third vector cannot be a linear combination of the first two. And the first two vectors are not parallel to each other. Hence all three vectors are linearly independent.
– user1551
Oct 27 '18 at 20:01
"Therefore B and in turn A are similar to diag(1,−1,1) . Hence their squares are equal to the identity matrix." - Please explain this .
– Nilabja Saha
Oct 28 '18 at 6:19
answered Oct 27 '18 at 9:59 – user1551
What?? If "$A^2$ is the identity matrix" then $A^3= A(A^2)= A$. No "calculation" at all required!
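The remark above is immediate once $A^2 = I$ is accepted; a tiny sketch (the sample value of $b$ is an assumption) confirms both facts at once.

```python
# Once A^2 = I is checked, A^3 = A * A^2 = A follows with no further work.
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

b = Fraction(2); a = b + 1                      # sample value; a - b = 1
A = [[0, a, b], [1, -b, -b], [-1, a, a]]
A2 = matmul(A, A)
assert A2 == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # A^2 = I
assert matmul(A, A2) == A                       # hence A^3 = A
```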
answered Oct 27 '18 at 10:06 – user247327
A solution without calculation.
From $\operatorname{tr}(A) = a - b = 1$, we deduce that
$A^2 = I$ iff $A$ is diagonalizable and $\{1,1\} \subset \operatorname{spectrum}(A)$ (the third eigenvalue is then $\operatorname{tr}(A) - 2 = -1$),
iff $\operatorname{rank}(A - I) = 1$.
Clearly, we see that $\operatorname{rank}(A-I) = \operatorname{rank}\begin{pmatrix} -1 & b+1 & b \\ 1 & -b-1 & -b \\ -1 & b+1 & b \end{pmatrix} = 1$.
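The rank claim is easy to check by hand, since every column of $A - I$ is a scalar multiple of the first; a minimal sketch (sample $b$ assumed, $a = b + 1$ enforcing $a - b = 1$) makes those multiples explicit.

```python
# Show each column of A - I is a multiple of the first, so rank(A - I) = 1.
from fractions import Fraction

b = Fraction(4); a = b + 1
A = [[0, a, b], [1, -b, -b], [-1, a, a]]
AmI = [[A[i][j] - (1 if i == j else 0) for j in range(3)] for i in range(3)]

col = lambda j: [AmI[i][j] for i in range(3)]
assert col(1) == [-(b + 1) * x for x in col(0)]   # col 2 = -(b+1) * col 1
assert col(2) == [-b * x for x in col(0)]         # col 3 = -b     * col 1
```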
answered Nov 21 '18 at 23:17 – loup blanc