Matrix function











If we have an $n \times n$ matrix $A$, I want to ask about the general method for computing a matrix function. For example, how can I compute

$\cos(A)$ or
$\sin(A)$ or
$e^{A}$ or
$\log(A)$

or any other function?










      matrix-calculus






asked Nov 15 at 2:59 by hmeteir, edited Nov 15 at 3:22 by Seth

3 Answers



























A consequence of the Cayley-Hamilton theorem is that any analytic function $f$ of an $n\times n$ matrix $A$ can be expressed as a polynomial $p(A)$ of degree at most $n-1$. It's also the case that if $\lambda$ is an eigenvalue of $A$, then $f(\lambda)=p(\lambda)$. If you know $A$'s eigenvalues, you can therefore generate a system of linear equations in the unknown coefficients of $p$. If there are repeated eigenvalues, this system will be underdetermined, but you can generate additional independent equations by repeatedly differentiating $f$ and $p$.
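As a rough illustration of this approach (not part of the original answer), here is a minimal NumPy sketch assuming $A$ has $n$ distinct eigenvalues, so the interpolation conditions $p(\lambda_i)=f(\lambda_i)$ give a square Vandermonde system; the helper name `matrix_function_via_polynomial` is made up for the example, and the repeated-eigenvalue case described above is not handled.

```python
import numpy as np

# Illustrative helper (name and structure are my own, not from the answer).
# Assumes A has n distinct eigenvalues, so the conditions p(lambda_i) = f(lambda_i)
# determine the n coefficients of p through a Vandermonde system.
def matrix_function_via_polynomial(A, f):
    n = A.shape[0]
    lam = np.linalg.eigvals(A)                 # eigenvalues of A
    V = np.vander(lam, n, increasing=True)     # rows: [1, lam_i, ..., lam_i^(n-1)]
    c = np.linalg.solve(V, f(lam))             # coefficients c_0, ..., c_{n-1} of p
    # Evaluate p(A) = c_0 I + c_1 A + ... + c_{n-1} A^(n-1) by Horner's scheme
    P = c[-1] * np.eye(n)
    for ck in c[-2::-1]:
        P = P @ A + ck * np.eye(n)
    return P

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                     # eigenvalues 2 and 3 (distinct)
print(matrix_function_via_polynomial(A, np.exp))
```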






answered Nov 15 at 7:38 by amd (accepted)












We can define functions $F: \mathcal{M}_{n\times n}(\mathbb{R})\rightarrow \mathcal{M}_{n\times n}(\mathbb{R})$ analogous to analytic functions $f: \mathbb{R}\rightarrow\mathbb{R}$ in the following way:



Let $A\in\mathcal{M}_{n\times n}(\mathbb{R})$. Define $A^0 = I$. Then, define $A^k = A\cdot A^{k-1}$ recursively, via the usual matrix product. For any polynomial $p(x) = \sum\limits_{i=0}^k c_ix^i$, we can now define the matrix polynomial $P(A) = \sum\limits_{i=0}^k c_iA^i$.



For functions which are not polynomials, but are analytic, we can use their power series expansion. Let $f(x)$ be an analytic function with power series $\sum\limits_{i=0}^\infty c_ix^i$. Then, we can define the matrix function $F(A)$ by its power series $\sum\limits_{i=0}^\infty c_iA^i$, provided that this series converges to a unique matrix value for each $A$. Convergence can be proved, for example, for the matrix exponential, $\exp(A) := \sum\limits_{i=0}^\infty \frac{1}{i!}A^i$; a similar method can be used to show convergence of other matrix power series corresponding to real analytic functions.
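For concreteness (this sketch is mine, not from the answer), here is a small Python example that sums the truncated power series for $\exp(A)$ and compares it with `scipy.linalg.expm`; the helper name `exp_via_series` and the tolerance are illustrative choices, and direct Taylor summation is used here only as a teaching device.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative sketch (my own, not from the answer): sum the series for exp(A)
# term by term until the terms become negligible. Production code such as
# scipy.linalg.expm uses more robust scaling-and-squaring algorithms instead.
def exp_via_series(A, tol=1e-12, max_terms=200):
    n = A.shape[0]
    result = np.eye(n)            # i = 0 term: A^0 / 0! = I
    term = np.eye(n)
    for i in range(1, max_terms):
        term = term @ A / i       # term now equals A^i / i!
        result = result + term
        if np.linalg.norm(term) < tol:
            break
    return result

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
print(np.allclose(exp_via_series(A), expm(A)))   # True up to the tolerance
```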






answered Nov 15 at 3:15 by AlexanderJ93













Calculate the eigenvalue decomposition of the matrix
$$A=QDQ^{-1}$$ where $D$ is a diagonal matrix whose entries are the eigenvalues of $A$, and the columns of $Q$ are the corresponding eigenvectors.

With this decomposition in hand, any function can be evaluated as
$$f(A)=Q\,f(D)\,Q^{-1}$$
which is very convenient; just evaluate the function at each diagonal element.

If $A$ cannot be diagonalized or $Q$ is ill-conditioned, add a small random perturbation to the matrix and try again.
$$\begin{aligned}
A' &= A+E \\
\|E\| &\approx \|A\|\cdot 10^{-14}
\end{aligned}$$
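A minimal NumPy sketch of this recipe (my own illustration, assuming $A$ is diagonalizable and `numpy.linalg.eig` returns a reasonably well-conditioned $Q$; the helper name `func_of_matrix` is made up):

```python
import numpy as np

# Illustrative helper (my own name, not from the answer): evaluate f(A) through
# the eigendecomposition A = Q D Q^{-1}, applying f to each eigenvalue.
# Assumes A is diagonalizable with a reasonably well-conditioned Q.
def func_of_matrix(A, f):
    eigvals, Q = np.linalg.eig(A)                    # columns of Q are eigenvectors
    FA = Q @ np.diag(f(eigvals)) @ np.linalg.inv(Q)
    return np.real_if_close(FA)                      # drop negligible imaginary parts

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                           # symmetric: eigenvalues 1 and 3
print(func_of_matrix(A, np.cos))                     # cos(A)
print(func_of_matrix(A, np.log))                     # log(A), eigenvalues are positive
```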






answered Nov 15 at 3:50 by greg