Linear Algebra: proving a decomposition of vector to orthonormal basis























I want to express my vector $v$ in an arbitrary orthonormal basis $U = \{u_1, u_2, u_3\}$.

This would be

$$v = \sum_i \langle u_i, v \rangle\, u_i = \sum_i (u_i^T v)\, u_i$$



How do I prove the above decomposition is correct?
































  • @Masacroso Sorry, but I still cannot connect your suggestion to a solution for the above.
    – hadi k
    Nov 12 at 17:15










  • To clarify: do you want to prove this mathematically? Or are you looking for some cross-validation (e.g. maybe you are programming a function and need to unit-test your code)? The result you provide (given that the basis is orthonormal) is almost the definition of the decomposition, so a bit more context on the allowed assumptions would be needed if you are looking for a formal proof of sorts.
    – Mefitico
    Nov 12 at 18:09















linear-algebra






asked Nov 12 at 16:30









hadi k

3 Answers
3




































You could have specified the coordinates of the vector in the new basis to be $c_i$, with $c_i = \langle u_i, v\rangle$.

That being said, you only need to prove one thing:

$$
\sum_i c_i u_i = v
$$

which holds essentially by definition, given that $U$ is a basis.
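This reconstruction can be checked numerically; below is a minimal sketch in NumPy, where the orthonormal basis of $\mathbb{R}^3$ and the vector $v$ are hand-picked examples (not from the question):

```python
import numpy as np

# A concrete orthonormal basis of R^3, built by hand.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
u3 = np.array([0.0, 1.0, -1.0]) / np.sqrt(2)
basis = [u1, u2, u3]

v = np.array([3.0, -1.0, 2.0])

# Coordinates in the new basis: c_i = <u_i, v>.
c = [u @ v for u in basis]

# Reconstruct v as sum_i c_i u_i and compare with the original vector.
v_reconstructed = sum(ci * ui for ci, ui in zip(c, basis))
assert np.allclose(v_reconstructed, v)
```

As a sanity check, the squared coordinates also sum to $\|v\|^2$ (Parseval's identity), which only holds because the basis is orthonormal.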






answered Nov 12 at 18:06 by Mefitico









































You want to show that
$$
v=\sum_i \langle v,u_i\rangle u_i .
$$

Using $\langle u_i,u_j\rangle=\delta_{ij}$, you can compute
$$
\left\langle v-\sum_i \langle v,u_i\rangle u_i,\, u_j\right\rangle=
\langle v,u_j\rangle-\langle v,u_j\rangle\langle u_j,u_j\rangle=0 .
$$

A vector $w$ is zero if and only if $\langle w,u_j\rangle=0$ for every $j$; applying this to $w=v-\sum_i \langle v,u_i\rangle u_i$ finishes the proof.
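The same argument can be sketched numerically: build the residual $w = v - \sum_i \langle v,u_i\rangle u_i$ and verify its inner product with every basis vector vanishes. The random basis below (via QR, a standard way to get orthonormal columns) is an illustrative assumption, not part of the answer:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random orthonormal basis of R^3: the columns of Q from a QR factorization.
q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
basis = [q[:, i] for i in range(3)]

v = rng.standard_normal(3)

# Residual w = v - sum_i <v, u_i> u_i.
w = v - sum((v @ u) * u for u in basis)

# <w, u_j> = 0 for every j, hence w = 0 and the decomposition holds.
inner_products = [w @ u for u in basis]
assert np.allclose(inner_products, 0.0)
assert np.allclose(w, 0.0)
```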






answered Nov 12 at 18:21 by egreg









































To express the vector $\mathbf{v} \in \mathbb{R}^n$ in a different, general basis $U = \left[ \mathbf{u}_1 \dots \mathbf{u}_n\right]$, where $\mathbf{u}_i \in \mathbb{R}^n\ \forall i$, you need to find a $\mathbf{v}'$ such that:

$$U \mathbf{v}' = \mathbf{v}$$

For a general basis, the coordinates of $\mathbf{v}$ in the basis $U$ are obtained by multiplying both sides by the inverse:
$$\mathbf{v}' = U^{-1}\mathbf{v}$$

In the case where $U$ is an orthonormal basis (meaning $U$ is an orthogonal matrix), we know that:

$$U^TU = I = UU^T$$

Hence $U^{-1} = U^T$, and therefore $\mathbf{v}'$ becomes:

$$\mathbf{v}' = U^T \mathbf{v}$$

Now your decomposition follows in a simple way:
$$U \mathbf{v}' = U\left( U^T \mathbf{v}\right) = UU^T \mathbf{v} = \mathbf{v}$$

Note that this is exactly your formula:
$$\mathbf{v} = UU^T\mathbf{v} = \sum_{i} \mathbf{u}_i \left\langle \mathbf{u}_i, \mathbf{v}\right\rangle$$

Hope this answers your question.
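The matrix identities above can be sketched in NumPy; the random orthogonal $U$ below (again obtained via QR) and the dimension $n=4$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random orthogonal U: its columns form an orthonormal basis of R^4.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Orthogonality: U^T U = I = U U^T, so U^{-1} = U^T.
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(U @ U.T, np.eye(4))
assert np.allclose(np.linalg.inv(U), U.T)

v = rng.standard_normal(4)
v_prime = U.T @ v                   # coordinates of v in the basis U
assert np.allclose(U @ v_prime, v)  # v = U v' = U U^T v
```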






answered Nov 12 at 23:21 by pedroth




















