Show that $e^{tA} = \sum\limits_{k=0}^{n-1} f_k(t)A^k$
Let $A$ be an $n\times n$ matrix whose characteristic polynomial is $$P(\lambda)=\lambda^n+a_{n-1}\lambda^{n-1}+...+a_1\lambda+a_0.$$
Now consider the $n$th-order differential equation $$\frac{d^nx}{dt^n}+a_{n-1}\frac{d^{n-1}x}{dt^{n-1}}+...+a_1\frac{dx}{dt}+a_0x=0$$ and let $f_0(t),\: f_1(t),\: ...,\: f_{n-1}(t)$ be the unique solutions satisfying the initial conditions $$f_j^{(l)}(0)= \begin{cases} 1 & j=l \\ 0 & j \neq l\end{cases}$$ for $0 \leq j,\: l\leq n-1$.
Show that $e^{tA} = f_0(t)I+f_1(t)A+f_2(t)A^2+...+f_{n-1}(t)A^{n-1}$.
I know that $e^{tA}$ is the unique matrix function $M(t)$ such that $\frac{dM}{dt}=AM$ and $M(0) = I$, so all I need to do is show that the right-hand side of the equation satisfies these two conditions. The second one is easy enough to prove; the first one, however, presents some trouble. My plan is to show that $\frac{d^n}{dt^n}f_j(t) = A^nf_j(t)$ for each of the solutions $f_j$, and based on the definitions of $A$ and of the solutions to the differential equation, I suspect that the Cayley-Hamilton theorem could be used here. However, I haven't been able to engineer it in a way that proves the above statement.
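A numerical sanity check of the claimed identity, as a minimal sketch assuming NumPy/SciPy (the companion-matrix construction is an illustration, not part of the problem statement): writing $y=(x,x',...,x^{(n-1)})$ turns the scalar ODE into $y'=Cy$ with $C$ the companion matrix of $P$, and the solution started at the $j$th standard basis vector has $f_j(t)$ as its first component, so $f_j(t)$ is the $(0,j)$ entry of $e^{tC}$ in zero-based indexing.

```python
# Sketch: verify e^{tA} = sum_k f_k(t) A^k numerically (NumPy/SciPy assumed).
# With y = (x, x', ..., x^{(n-1)}), the scalar ODE becomes y' = C y for the
# companion matrix C of P, and f_j(t) = expm(tC)[0, j].
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

p = np.poly(A)                 # char. poly coefficients [1, a_{n-1}, ..., a_0]
a = p[1:][::-1]                # [a_0, a_1, ..., a_{n-1}]

C = np.zeros((n, n))           # companion matrix of P
C[:-1, 1:] = np.eye(n - 1)
C[-1, :] = -a

t = 0.7
f = expm(t * C)[0, :]          # f[j] = f_j(t)
lhs = expm(t * A)
rhs = sum(f[k] * np.linalg.matrix_power(A, k) for k in range(n))
print(np.allclose(lhs, rhs))   # True
```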
linear-algebra ordinary-differential-equations cayley-hamilton
asked Dec 4 '18 at 19:11 by Ryan Greyling, edited Dec 13 '18 at 13:48 by Martin Sleziak
You could look at the special case of a diagonal Hermitian matrix. – Keith McClary, Dec 5 '18 at 17:25
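In that special case the identity is transparent: for $A=\operatorname{diag}(\lambda_1,...,\lambda_n)$ with distinct $\lambda_i$, it reduces to the $n$ scalar equations $e^{t\lambda_i}=\sum_k f_k(t)\lambda_i^k$, a Vandermonde system determining the $f_k(t)$. A minimal sketch, assuming NumPy/SciPy:

```python
# Sketch of the comment's special case: for diagonal A, solve the Vandermonde
# system e^{t lam_i} = sum_k f_k(t) lam_i^k for the f_k (distinct eigenvalues
# assumed) and check the matrix identity.
import numpy as np
from scipy.linalg import expm

lam = np.array([1.0, -0.5, 2.0])          # distinct eigenvalues
n, t = len(lam), 0.3
V = np.vander(lam, increasing=True)       # V[i, k] = lam_i**k
f = np.linalg.solve(V, np.exp(t * lam))   # f_k(t) from the scalar equations
A = np.diag(lam)
rhs = sum(f[k] * np.linalg.matrix_power(A, k) for k in range(n))
print(np.allclose(expm(t * A), rhs))      # True
```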
2 Answers
In the space of matrices $c(A)=c_0I+c_1A+...+c_{n-1}A^{n-1}$, multiplication by $A$ gives
$$
A\,c(A)=c_0A+c_1A^2+...+c_{n-2}A^{n-1}-c_{n-1}(a_0I+a_1A+...+a_{n-1}A^{n-1})
\\
=-a_0c_{n-1}I+(c_0-a_1c_{n-1})A+...+(c_{n-2}-a_{n-1}c_{n-1})A^{n-1}.
$$
This shows that the algebra generated by $A$ is spanned by the powers $I,A,...,A^{n-1}$, so there are coefficient functions $f_0(t),...,f_{n-1}(t)$ with $e^{tA}=f_0(t)I+f_1(t)A+...+f_{n-1}(t)A^{n-1}$. Comparing coefficients in the differential equation $(e^{tA})'=Ae^{tA}$, we thus get
\begin{align}
f_0'(t)&=-a_0f_{n-1}(t)\\
f_1'(t)&=f_0(t)-a_1f_{n-1}(t)\\
&~~\vdots\\
f_{n-2}'(t)&=f_{n-3}(t)-a_{n-2}f_{n-1}(t)\\
f_{n-1}'(t)&=f_{n-2}(t)-a_{n-1}f_{n-1}(t)
\end{align}
Substituting backwards (solving the last equation for $f_{n-2}$, the next for $f_{n-3}$, and so on) one finds
$$
0=f_{n-1}^{(n)}(t)+a_{n-1}f_{n-1}^{(n-1)}(t)+...+a_1f_{n-1}'(t)+a_0f_{n-1}(t)
$$
with $0=f_{n-1}(0)=f_{n-1}'(0)=\dots=f_{n-1}^{(n-2)}(0)$ and $f_{n-1}^{(n-1)}(0)=1$.
Then deduce the corresponding properties of $f_{n-2}(t)=f_{n-1}'(t)+a_{n-1}f_{n-1}(t)$, which is a solution of the same differential equation with $f_{n-2}^{(k)}(0)=0$ for $k<n-2$, $f_{n-2}^{(n-2)}(0)=1$ and $f_{n-2}^{(n-1)}(0)=f_{n-1}^{(n)}(0)+a_{n-1}=0$, and so on.
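A small numerical sketch of this coefficient system, assuming NumPy/SciPy (an illustration, not part of the original answer): integrating $f'=Bf$ from $f(0)=(1,0,...,0)$, which encodes $e^{0\cdot A}=I$, reproduces $e^{tA}$ in the basis $I,A,...,A^{n-1}$.

```python
# Sketch: integrate the coefficient system above and compare with expm(tA).
# The system reads f' = B f, where B has ones on the subdiagonal and
# -(a_0, ..., a_{n-1}) as its last column.  (NumPy/SciPy assumed.)
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
a = np.poly(A)[1:][::-1]            # [a_0, ..., a_{n-1}]

B = np.zeros((n, n))
B[1:, :-1] = np.eye(n - 1)          # f_k' picks up f_{k-1} ...
B[:, -1] -= a                       # ... minus a_k * f_{n-1}

t = 0.7
f0 = np.zeros(n); f0[0] = 1.0       # e^{0*A} = I, so f(0) = (1, 0, ..., 0)
sol = solve_ivp(lambda s, f: B @ f, (0.0, t), f0, rtol=1e-10, atol=1e-12)
f = sol.y[:, -1]

rhs = sum(f[k] * np.linalg.matrix_power(A, k) for k in range(n))
print(np.allclose(rhs, expm(t * A)))   # True
```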
In other terms: assume first that $P$ is not only the characteristic but also the minimal polynomial of $A$. Set $$Q(x,y)=\frac{P(x)-P(y)}{x-y};$$ $Q$ is a polynomial, so that $$P(D)I-P(A)=Q(D,A)(DI-A)$$ for the differentiation operator $D=\frac{d}{dt}$ and the identity matrix $I$. As $P(A)=0$ per Cayley-Hamilton, we get that any solution of $Dx=Ax$ also satisfies $P(D)x=0$.
Thus if $M(t)=\sum_{k=0}^{n-1} f_k(t)A^k$ is a matrix solution, we get $\sum_{k=0}^{n-1}P(D)f_k(t)\,A^k=0$, and as these powers of $A$ are linearly independent, $$P(D)f_k(t)=0$$ separately for each $k$.
$M(0)=I$ implies $f_k(0)=\delta_{0,k}$; $M^{(j)}(0)=A^j$ similarly implies $f_k^{(j)}(0)=\delta_{j,k}$.
Now argue that $Q(D,A)$ is invertible on the solution space of the ODE system $Dx=Ax$, by the minimality of $P$, to find that a function $M$ constructed in this way conversely also satisfies $DM=AM$. Then use continuation from a dense set of matrices to also cover matrices $A$ whose characteristic polynomial is not minimal.
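For concreteness, a symbolic sketch with SymPy for a generic cubic $P$ (again an illustration) that the divided difference $Q(x,y)=\frac{P(x)-P(y)}{x-y}$ really is a polynomial:

```python
# Sketch: the divided difference (P(x) - P(y))/(x - y) is a polynomial,
# checked symbolically for a generic cubic P.  (SymPy assumed.)
import sympy as sp

x, y, a0, a1, a2 = sp.symbols('x y a0 a1 a2')
P = lambda s: s**3 + a2 * s**2 + a1 * s + a0

Q = sp.cancel((P(x) - P(y)) / (x - y))   # the factor (x - y) cancels exactly
print(sp.expand(Q))
# a1 + a2*x + a2*y + x**2 + x*y + y**2  -- no (x - y) left in any denominator
```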
answered Dec 5 '18 at 14:21, edited Dec 13 '18 at 14:36 – LutzL
How do you know that $0=f_{n-1}(0)=f_{n-1}'(0)=\dots=f_{n-1}^{(n-2)}(0)$ and $f_{n-1}^{(n-1)}(0)=1$? Also, how exactly are you "inserting backwards" to obtain $0=f_{n-1}^{(n)}(t)+a_{n-1}f_{n-1}^{(n-1)}(t)+...+a_1f_{n-1}'(t)+a_0f_{n-1}(t)$? – Ryan Greyling, Dec 5 '18 at 18:33
Hint (but not a full answer): Using the definition of the matrix exponential, we write
$$
e^{tA} = \sum_{k=0}^\infty \frac{1}{k!}(tA)^k .
$$
Then, using the Cayley-Hamilton theorem, we have $P(A) = 0$. Hence, all powers of $A$ with exponent greater than or equal to $n$ can be expressed recursively as linear combinations of $I, \dots, A^{n-1}$:
$$
\begin{aligned}
A^n &= -a_{n-1}A^{n-1} - \dots - a_1 A - a_0 I \\
A^{n+1} &= -a_{n-1}A^{n} - \dots - a_1 A^2 - a_0 A \\
&\;\; \vdots
\end{aligned}
$$
which may be written
$$
A^k = \sum_{i=0}^{n-1} b_i^{(k)} A^i , \qquad k\geq n.
$$
Finally,
$$
\begin{aligned}
e^{tA} &= \sum_{k=0}^{n-1} \frac{1}{k!}(tA)^k + \sum_{k=n}^\infty \frac{1}{k!}(tA)^k \\
&= \sum_{k=0}^{n-1} \frac{1}{k!}(tA)^k + \sum_{k=n}^{\infty} \frac{1}{k!}t^k \sum_{i=0}^{n-1} b_i^{(k)} A^i \\
&= \sum_{k=0}^{n-1} \left( \frac{1}{k!}t^k + \sum_{p=n}^{\infty} \frac{1}{p!}t^p\, b_k^{(p)} \right) A^k
\end{aligned}
$$
which is of the expected form.
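A rough numerical sketch of this rearrangement, assuming NumPy/SciPy and truncating the outer series at $p=60$ (an illustration; the $b_k^{(p)}$ are generated by the recursion sketched above):

```python
# Sketch: sum the scalar coefficient series g_k(t) = t^k/k! + sum_{p>=n}
# t^p/p! * b_k^{(p)} (truncated) and compare with expm(tA).  (NumPy/SciPy assumed.)
import numpy as np
from scipy.linalg import expm
from math import factorial

rng = np.random.default_rng(1)
n, t, pmax = 4, 0.7, 60
A = rng.standard_normal((n, n))
a = np.poly(A)[1:][::-1]                     # [a_0, ..., a_{n-1}]

g = np.array([t**k / factorial(k) for k in range(n)])
b = -a.copy()                                # b^{(n)}
for p in range(n, pmax):
    g += (t**p / factorial(p)) * b           # add the p-th term to each g_k
    b = np.concatenate(([0.0], b[:-1])) - a * b[-1]

approx = sum(g[k] * np.linalg.matrix_power(A, k) for k in range(n))
print(np.allclose(approx, expm(t * A)))      # True, up to truncation error
```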
answered Dec 5 '18 at 13:44, edited Dec 5 '18 at 14:38 – Harry49
I don't see how the scalar of each term $i$ in the expansion of $A^k$ is $b_i^{(k)}$. For example, the scalar of $I$ in the expansion of $A^n$ is $-a_0$, but the scalar of $I$ in the expansion of $A^{n+1}$ is $a_{n-1}a_0$. According to your logic, from $A^n$ to $A^{n+1}$ the scalar of $I$ should be multiplied by a factor of $b_0$, so would this mean that $b_0=-a_{n-1}$? – Ryan Greyling, Dec 5 '18 at 18:42

After performing the steps in your answer, would I show that for every $k$, $\frac{1}{k!}t^k+\sum_{p=n}^\infty \frac{1}{p!}t^p b_k^{(p)}$ satisfies the originally stated differential equation with the given initial conditions? – Ryan Greyling, Dec 5 '18 at 18:47

@RyanGreyling It should be correct, but the link is not straightforward with the present approach. Cf. LutzL's answer. – Harry49, Dec 5 '18 at 18:51