Eigenvalues and Eigenvectors of a Sum of Powers of a Symmetric Matrix
Question:
Let $A = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$.
Find all eigenvalues and eigenvectors of the matrix
$$\sum_{n=1}^{100} A^n = A^{100} + A^{99} + \cdots + A^2 + A.$$
I know that the eigenvectors of $A$ are $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$, but I do not see any connection between the sum and $A$'s eigenvectors.
linear-algebra eigenvalues-eigenvectors symmetric-matrices
asked Dec 11 '18 at 19:13
mhall14
Try evaluating $\left(\sum_{n=1}^{100} A^n\right) \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and do the same with the other eigenvector. What happens?
– Giuseppe Negro
Dec 11 '18 at 19:24
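Taking up the comment's suggestion numerically makes the answer visible at once. The following is a minimal sketch (an editorial addition, not part of the thread), using sympy so the large integers stay exact:

```python
# Apply the truncated sum to each eigenvector of A and observe the
# scaling factors that come out; these are the eigenvalues of the sum.
import sympy as sp

A = sp.Matrix([[1, 1], [1, 1]])
S = sum((A ** n for n in range(1, 101)), sp.zeros(2, 2))

print(S * sp.Matrix([1, 1]))    # both entries equal 2**101 - 2
print(S * sp.Matrix([1, -1]))   # the zero vector
```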
6 Answers
Hint: If $$A = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix},$$ then we have
$$A^2 = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 2 & 2 \\ 2 & 2 \end{bmatrix}, \qquad A^3 = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 2 & 2 \\ 2 & 2 \end{bmatrix} = \begin{bmatrix} 4 & 4 \\ 4 & 4 \end{bmatrix}, \qquad A^4 = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 4 & 4 \\ 4 & 4 \end{bmatrix} = \begin{bmatrix} 8 & 8 \\ 8 & 8 \end{bmatrix}, \quad \dots$$
and you can prove by induction that $$A^k = \begin{bmatrix} 2^{k-1} & 2^{k-1} \\ 2^{k-1} & 2^{k-1} \end{bmatrix}.$$ Can you finish now?
answered Dec 11 '18 at 19:27
Mostafa Ayaz
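As a sanity check on the induction (an editorial addition, not part of the answer), the claimed pattern can be verified for the first few powers with exact integer arithmetic:

```python
# Verify A^k = 2^(k-1) * J for small k (J is the all-ones 2x2 matrix),
# using plain Python integers so nothing overflows or rounds.

def matmul2(X, Y):
    """Product of two 2x2 matrices given as nested lists of ints."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [1, 1]]
P = A
for k in range(1, 11):
    assert P == [[2 ** (k - 1)] * 2] * 2   # every entry is 2^(k-1)
    P = matmul2(P, A)
print("A^k = 2^(k-1) * J holds for k = 1..10")
```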
By linearity, given any polynomial $p$ and matrix $A$, the eigenvectors of $p(A)$ are the same as the eigenvectors of $A$, and the associated eigenvalues are $p(\lambda)$; see this question.
For instance, in this case, if $Av = \lambda v$, then $A^n v = \lambda^n v$, and $\left(\sum_{n=1}^{100} A^n\right)v = \sum_{n=1}^{100} A^n v = \sum_{n=1}^{100} \lambda^n v = \left(\sum_{n=1}^{100} \lambda^n\right)v$. Thus $v$ is an eigenvector with eigenvalue $p(\lambda) = \sum_{n=1}^{100} \lambda^n$. $A$ has eigenvector $v = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ with eigenvalue $\lambda = 2$ and eigenvector $v = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$ with eigenvalue $\lambda = 0$. $p(2)$ is a geometric series: $p(2) = \sum_{n=1}^{100} 2^n = 2^{101} - 2$. $p(0)$ is just zero. So $p(A)$ has eigenvector $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ with eigenvalue $2^{101} - 2$ and eigenvector $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$ with eigenvalue $0$.
answered Dec 11 '18 at 22:46
Acccumulation
This is the simplest way to go.
– Giuseppe Negro
Dec 12 '18 at 22:14
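A small sympy check of the $p(\lambda)$ correspondence (an editorial addition, not part of the answer): the eigenvalues of $p(A)$ agree with $p(0)$ and $p(2)$.

```python
# Compare the eigenvalues of p(A) = sum_{n=1}^{100} A^n with p evaluated
# at the eigenvalues of A, all in exact arithmetic.
import sympy as sp

A = sp.Matrix([[1, 1], [1, 1]])
S = sum((A ** n for n in range(1, 101)), sp.zeros(2, 2))

p = lambda lam: sum(lam ** n for n in range(1, 101))
assert sorted(S.eigenvals()) == [0, 2 ** 101 - 2]
assert p(0) == 0 and p(2) == 2 ** 101 - 2
print("eigenvalues of the sum: p(0) = 0 and p(2) = 2**101 - 2")
```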
Hint:
Recall the Cayley–Hamilton theorem (quoting Wikipedia):
For a general $n \times n$ invertible matrix $A$, i.e., one with nonzero determinant, $A^{-1}$ can thus be written as an $(n-1)$-th order polynomial expression in $A$. As indicated, the Cayley–Hamilton theorem amounts to the identity
$$p(A) = A^n + c_{n-1}A^{n-1} + \dots + c_1 A + (-1)^n \det(A) I_n = O.$$
The coefficients $c_i$ are given by the elementary symmetric polynomials of the eigenvalues of $A$. Using Newton's identities, the elementary symmetric polynomials can in turn be expressed in terms of the power-sum symmetric polynomials of the eigenvalues:
$$s_k = \sum_{i=1}^n \lambda_i^k = \operatorname{tr}(A^k).$$
answered Dec 11 '18 at 19:20
Rebellos
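For the matrix in question this machinery collapses nicely: the characteristic polynomial is $\lambda^2 - 2\lambda$, so Cayley–Hamilton gives $A^2 = 2A$ and hence $A^k = 2^{k-1}A$. A quick numeric check (an editorial addition, not part of the answer):

```python
# Check the Cayley-Hamilton identity A^2 = 2A for this A, and the power-sum
# identity s_k = tr(A^k) = sum of lambda_i^k, numerically with numpy.
import numpy as np

A = np.array([[1, 1], [1, 1]])
assert np.array_equal(A @ A, 2 * A)      # Cayley-Hamilton for this A

lams = np.linalg.eigvalsh(A)             # eigenvalues of symmetric A: [0, 2]
for k in range(1, 6):
    Ak = np.linalg.matrix_power(A, k)
    assert np.isclose(np.trace(Ak), np.sum(lams ** k))
print("s_k = tr(A^k) = sum(lambda_i^k) for k = 1..5")
```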
You can explicitly compute $\sum_{i=1}^{100} A^i$. First diagonalize $A$, namely rewrite $A$ as $A = PDP^{-1}$.
Now
\begin{align}
\sum_{i=1}^{100} A^i &= \sum_{i=1}^{100} P D^i P^{-1} \\
&= P\left(\sum_{i=1}^{100} D^i\right)P^{-1}.
\end{align}
Notice that the sum telescopes:
$$(D - I)\left(\sum_{i=1}^{100} D^i\right) = D^{101} - D.$$
Since $D - I$ is invertible (you can check this),
$$\sum_{i=1}^{100} D^i = (D - I)^{-1}(D^{101} - D).$$
Therefore
$$\sum_{i=1}^{100} A^i = P(D - I)^{-1}(D^{101} - D)P^{-1}.$$
answered Dec 11 '18 at 19:37
user9077
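Carrying this recipe out exactly takes a few lines of sympy (an editorial sketch, not part of the answer); `diagonalize()` returns $P$ and $D$ with $A = PDP^{-1}$:

```python
# Compute P (D - I)^(-1) (D^101 - D) P^(-1) and confirm it matches the
# brute-force sum of powers, with eigenvalues 0 and 2**101 - 2.
import sympy as sp

A = sp.Matrix([[1, 1], [1, 1]])
P, D = A.diagonalize()                  # D = diag(0, 2) up to ordering
I = sp.eye(2)

closed = P * (D - I).inv() * (D ** 101 - D) * P.inv()
brute = sum((A ** i for i in range(1, 101)), sp.zeros(2, 2))

assert closed == brute
print(sorted(closed.eigenvals()))       # [0, 2**101 - 2]
```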
It is easy to prove that for $k \in \Bbb{N}$, $$A^k = \begin{bmatrix} 2^{k-1} & 2^{k-1} \\ 2^{k-1} & 2^{k-1} \end{bmatrix}.$$ The sum is
$$\Sigma = \begin{bmatrix} 2^{100}-1 & 2^{100}-1 \\ 2^{100}-1 & 2^{100}-1 \end{bmatrix},$$ whence the eigenvalues are $0$ and $2^{101} - 2$.
Each matrix $A^k$, $k = 1, \dots, 100$, has eigenvalues $0$ and $2^k$; the corresponding eigenvectors are those of $A$: $(1,-1)^T$ and $(1,1)^T$.
Thus $(1,-1)^T$ and $(1,1)^T$ are eigenvectors of $\Sigma$.
edited Dec 11 '18 at 23:13
answered Dec 11 '18 at 19:55
user376343
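A direct check of these claims (an editorial addition), building $\Sigma$ from the closed form:

```python
# Build Sigma entrywise from 2**100 - 1 and confirm the stated eigenvalues
# and eigenvectors exactly.
import sympy as sp

v = 2 ** 100 - 1
Sigma = sp.Matrix([[v, v], [v, v]])

assert sorted(Sigma.eigenvals()) == [0, 2 ** 101 - 2]
assert Sigma * sp.Matrix([1, 1]) == (2 ** 101 - 2) * sp.Matrix([1, 1])
assert Sigma * sp.Matrix([1, -1]) == sp.zeros(2, 1)
print("Sigma: eigenvalues 0 and 2**101 - 2, eigenvectors (1,-1) and (1,1)")
```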
Since you have two linearly independent eigenvectors, $A$ is diagonalizable. You may find it useful to replace $A$ in your polynomial expression by its diagonalization, because this will simplify the operations you need to do.
answered Dec 11 '18 at 19:23
Javi