Distance from eigenspace of matrix














In linear algebra, is there a separate name / concept for the notion of distance between linear vector subspaces?



I'm asking this because I'm considering a problem in numerical linear algebra where a Krylov subspace iterative method is used. At every step $n$ such a method implicitly adds one more basis vector to the Krylov subspace, which increasingly approaches an eigenspace of the matrix $A$ for which the problem $$Ax=b$$ is being solved. It therefore seems that if $b$ lies in the span of an eigenspace of $A$, convergence should happen faster.



But what if $b$ is very "far" from the eigenspace? I'm trying to think about what a notion of distance between two vector subspaces could mean and how it could be defined. Would a vector $b$ contained in a subspace "far away" from the eigenspace of $A$ make a Krylov subspace method take more iterations than in the general case?
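For concreteness, here is a small numerical experiment sketching that intuition (the matrix, spectrum, and helper function below are my own hypothetical setup, not part of the original problem): when $b$ is an exact eigenvector, the Krylov space $\mathcal{K}_n(A,b)$ is just $\operatorname{span}\{b\}$ for every $n$ and conjugate gradients converges in one step, while a generic $b$ that excites many eigenvectors needs far more iterations.

    import numpy as np
    from scipy.sparse.linalg import cg

    # Hypothetical example: CG iteration counts when b is a single eigenvector
    # of A versus a generic right-hand side.
    rng = np.random.default_rng(0)
    n = 200
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))    # random orthogonal matrix
    A = Q @ np.diag(np.linspace(1.0, 100.0, n)) @ Q.T   # SPD, spread-out spectrum

    def cg_iterations(A, b):
        count = 0
        def cb(xk):            # scipy calls this once per CG iteration
            nonlocal count
            count += 1
        cg(A, b, callback=cb)
        return count

    b_eig = Q[:, 0]                   # b spans a one-dimensional eigenspace of A
    b_gen = rng.standard_normal(n)    # generic b excites all eigenvectors

    print(cg_iterations(A, b_eig))    # 1: the Krylov space never grows
    print(cg_iterations(A, b_gen))    # many more iterations for the same tolerance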










linear-algebra eigenvalues-eigenvectors algorithms terminology numerical-linear-algebra






edited Dec 9 '18 at 11:27 by Omnomnomnom

asked Dec 9 '18 at 9:17 by sequence








A distance between two planes in $\mathbb{R}^3$ can be defined by $|\sin(\theta)|$, where $\theta$ is the angle between their normals (in particular, the triangle inequality is satisfied). I recall that this result can be generalized, but I would have to look up references.
– Jean Marie, Dec 9 '18 at 11:22






1 Answer






The common notion of distance is to consider the orthogonal projection $P$ onto the first linear subspace $V$ and the orthogonal projection $Q$ onto the other subspace $W$.

We can then define
$$d(V,W) = \| P - Q \|$$
as the distance between these subspaces, where the norm used is the operator norm. For properties and applications see Section 2.5.3 of Golub and Van Loan (Matrix Computations).

This distance is used throughout Golub and Van Loan's treatment of the unsymmetric eigenvalue problem (which involves Krylov methods); see Chapter 7.
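For what it's worth, here is a minimal NumPy sketch of this definition (the helper name and the example planes are my own illustration, not from the book): orthonormalize bases of $V$ and $W$, form the orthogonal projectors, and take the operator 2-norm of their difference. For subspaces of equal dimension this norm equals the sine of the largest principal angle between them, which is exactly the $|\sin\theta|$ distance mentioned in the comment above.

    import numpy as np

    def subspace_distance(V_basis, W_basis):
        """d(V, W) = ||P - Q||_2, with P, Q the orthogonal projectors onto V, W.
        V_basis, W_basis: (n, k) arrays whose columns span the subspaces."""
        Qv, _ = np.linalg.qr(V_basis)     # orthonormal basis for V
        Qw, _ = np.linalg.qr(W_basis)     # orthonormal basis for W
        P = Qv @ Qv.T                     # orthogonal projector onto V
        Qp = Qw @ Qw.T                    # orthogonal projector onto W
        return np.linalg.norm(P - Qp, 2)  # operator norm = largest singular value

    # Example: the xy-plane vs. a plane tilted by an angle theta about the x-axis.
    theta = 0.3
    V = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
    W = np.array([[1.0, 0.0], [0.0, np.cos(theta)], [0.0, np.sin(theta)]])
    print(subspace_distance(V, W))        # approximately sin(theta)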
















answered Dec 9 '18 at 19:28 by cdipaolo





























