Linear Algebra: proving the decomposition of a vector in an orthonormal basis
I want to express my vector $v$ in an arbitrary orthonormal basis $U = \{u_1, u_2, u_3\}$.
This would be
$$v = \sum_i \langle u_i, v\rangle\, u_i = \sum_i (u_i^T v)\, u_i.$$
How do I prove that this decomposition is correct?
linear-algebra
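Independently of a formal proof, the identity is easy to sanity-check numerically. Below is a minimal pure-Python sketch; the orthonormal basis and the vector $v$ are illustrative choices, not taken from the question:

```python
# Sanity check (illustrative example): verify v = sum_i <u_i, v> u_i
# for one concrete orthonormal basis of R^3 and one concrete vector v.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

s = 1 / math.sqrt(2)
# An orthonormal basis of R^3 (hypothetical choice for illustration).
basis = [(s, s, 0.0), (s, -s, 0.0), (0.0, 0.0, 1.0)]

# Check orthonormality: <u_i, u_j> should be 1 if i == j, else 0.
for i, ui in enumerate(basis):
    for j, uj in enumerate(basis):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(ui, uj) - expected) < 1e-12

v = (3.0, -1.0, 2.0)

# Reconstruct v from its coefficients c_i = <u_i, v>.
recon = [0.0, 0.0, 0.0]
for u in basis:
    c = dot(u, v)
    recon = [r + c * x for r, x in zip(recon, u)]

print(recon)  # ≈ [3.0, -1.0, 2.0], up to floating-point rounding
```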
@Masacroso Sorry, but I still cannot connect your suggestion to the solution above.
– hadi k
Nov 12 at 17:15
To clarify: do you want to prove it mathematically? Or are you looking for some numerical validation (e.g. maybe you are programming a function and need to unit-test your code)? The result you provide (given that the basis is orthonormal) is almost the definition of the decomposition, so a bit more context on the allowed assumptions would be needed if you are looking for a formal proof.
– Mefitico
Nov 12 at 18:09
asked Nov 12 at 16:30
hadi k
3 Answers
You could have specified the coordinates of the vector in the new basis as $c_i$, with $c_i = \langle u_i, v\rangle$.
That being said, you only need to prove one thing:
$$
\sum_i c_i\, u_i = v
$$
which holds essentially by the definition of coordinates in a basis.
answered Nov 12 at 18:06
Mefitico
You want to show that
$$
v = \sum_i \langle v, u_i\rangle\, u_i .
$$
Compute, for any $j$,
$$
\left\langle v - \sum_i \langle v, u_i\rangle u_i,\, u_j\right\rangle
= \langle v, u_j\rangle - \sum_i \langle v, u_i\rangle\langle u_i, u_j\rangle
= \langle v, u_j\rangle - \langle v, u_j\rangle\langle u_j, u_j\rangle = 0,
$$
since orthonormality gives $\langle u_i, u_j\rangle = 0$ for $i \ne j$ and $\langle u_j, u_j\rangle = 1$.
A vector $w$ is zero if and only if $\langle w, u_j\rangle = 0$ for every $j$, because the $u_j$ form a basis; applied to $w = v - \sum_i \langle v, u_i\rangle u_i$, this proves the identity.
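The argument above can be illustrated numerically: the residual $w = v - \sum_i \langle v, u_i\rangle u_i$ has zero inner product with every $u_j$, and is therefore the zero vector. A sketch in plain Python (the basis and vector are illustrative choices, not from the original post):

```python
# Illustrative check: the residual w = v - sum_i <v, u_i> u_i is
# orthogonal to every basis vector u_j, hence zero.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

s = 1 / math.sqrt(2)
basis = [(s, s, 0.0), (s, -s, 0.0), (0.0, 0.0, 1.0)]  # orthonormal basis
v = (5.0, 2.0, -4.0)

# w = v - sum_i <v, u_i> u_i
w = list(v)
for u in basis:
    c = dot(v, u)
    w = [wi - c * ui for wi, ui in zip(w, u)]

# <w, u_j> = 0 for every j, so w must be the zero vector.
for u in basis:
    assert abs(dot(w, u)) < 1e-12
print(w)  # ≈ [0.0, 0.0, 0.0]
```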
answered Nov 12 at 18:21
egreg
To express the vector $\mathbf{v} \in \mathbb{R}^n$ in a different, general basis $U = \left[\mathbf{u}_1 \dots \mathbf{u}_n\right]$, where $\mathbf{u}_i \in \mathbb{R}^n$ for all $i$, you need to find a $\mathbf{v}'$ such that
$$U \mathbf{v}' = \mathbf{v}.$$
For a general basis, the coordinates of $\mathbf{v}$ in the basis $U$ are obtained by multiplying both sides by the inverse:
$$\mathbf{v}' = U^{-1}\mathbf{v}.$$
In the case where $U$ is an orthonormal basis (which means $U$ is an orthogonal matrix), we know that
$$U^T U = I = U U^T.$$
Hence $U^{-1} = U^T$, and therefore $\mathbf{v}'$ becomes
$$\mathbf{v}' = U^T \mathbf{v}.$$
Now your decomposition can be proved in a simple way:
$$\mathbf{v} = U \mathbf{v}' = U\left(U^T \mathbf{v}\right) = U U^T \mathbf{v} = \mathbf{v}.$$
Note that this is exactly your formula:
$$\mathbf{v} = U U^T \mathbf{v} = \sum_i \mathbf{u}_i \left\langle \mathbf{u}_i, \mathbf{v}\right\rangle.$$
Hope this answers your question.
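The matrix identities above can be checked with a short sketch in plain Python (the concrete orthonormal basis below is an illustrative choice): build $U$ from orthonormal columns, verify that $U^T U = I$, and recover $\mathbf{v}$ via $U(U^T\mathbf{v})$.

```python
# Matrix form of the argument, as an illustrative sketch: with the u_i as
# the columns of U, orthonormality means U^T U = I, so v = U (U^T v).
import math

s = 1 / math.sqrt(2)
# Columns of U (an orthonormal basis of R^3, chosen for illustration).
U_cols = [(s, s, 0.0), (s, -s, 0.0), (0.0, 0.0, 1.0)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

v = (1.0, 2.0, 3.0)

# v' = U^T v : the coordinates of v in the basis U.
v_prime = [dot(u, v) for u in U_cols]

# U v' : back to standard coordinates, summing the columns of U.
back = [sum(c * u[k] for c, u in zip(v_prime, U_cols)) for k in range(3)]

# U^T U should be the 3x3 identity matrix (up to rounding).
gram = [[dot(ui, uj) for uj in U_cols] for ui in U_cols]

print(back)  # ≈ [1.0, 2.0, 3.0]
```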
answered Nov 12 at 23:21
pedroth