Prove the matrix is positive























Consider the matrix $A=\begin{bmatrix}
1 & 1/2 & 1/3 & \dots & 1/n \\
1/2 & 1/3 & 1/4 & \dots & 1/(n+1) \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1/n & 1/(n+1) & 1/(n+2) & \dots & 1/(2n-1)
\end{bmatrix}$



Prove that $A$ is positive.



My work: $A$ is symmetric, hence diagonalisable, but I can't seem to put these facts together to help me. I tried to prove by induction (a naive attempt) that all leading principal minors have positive determinant, but knowing $\det(A^{k,k})>0$ gives no information about $\det(A^{k+1,k+1})$.
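As a sanity check on the induction attempt: by Sylvester's criterion, a symmetric matrix is positive-definite if and only if every leading principal minor has positive determinant, and for this matrix the minors can be computed exactly in rational arithmetic. A small sketch, assuming Python's standard `fractions` module (the helper name `leading_minor_dets` is just illustrative):

```python
from fractions import Fraction

def leading_minor_dets(n):
    """Exact determinants of the leading principal minors of the
    n x n Hilbert matrix A[i][j] = 1/(i + j - 1), 1-based indices."""
    A = [[Fraction(1, i + j - 1) for j in range(1, n + 1)]
         for i in range(1, n + 1)]
    dets = []
    for k in range(1, n + 1):
        # Copy the k x k leading block, then Gaussian elimination
        # without pivoting: det = product of the pivots.
        M = [row[:k] for row in A[:k]]
        det = Fraction(1)
        for c in range(k):
            det *= M[c][c]
            for r in range(c + 1, k):
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
        dets.append(det)
    return dets

# All leading minors come out strictly positive, consistent with
# positive-definiteness via Sylvester's criterion.
assert all(d > 0 for d in leading_minor_dets(6))
```

The determinants shrink extremely fast ($1, 1/12, 1/2160, \dots$), which is one reason a direct induction on them is awkward.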




























  • A pretty quick way is to see $A$ as a Gram matrix of an inner product under some basis.
    – xbh
    Nov 13 at 1:44










  • That was fast, thank you! Can you please post it as answer so I can accept it?
    – 2ndYearFreshman
    Nov 13 at 1:50










  • @2ndYearFreshman if you wish someone to be notified of your reply to a comment, begin your reply with an @ sign and the beginning of that username.
    – Will Jagy
    Nov 13 at 2:02










  • @SangchulLee I think the OP here is addressing you, requesting you to post a full answer
    – Will Jagy
    Nov 13 at 2:02










  • @WillJagy Thank you.
    – 2ndYearFreshman
    Nov 13 at 2:04















linear-algebra matrices






asked Nov 13 at 1:41 – 2ndYearFreshman








3 Answers

















Answer (accepted, score 2) – Sangchul Lee, Nov 13 at 2:05










(Migrated from comment)



The matrix in question is called the Hilbert matrix. To see that $A$ is positive-definite, let $x \in \mathbb{R}^n$. Then

$$ x^T A x
= \sum_{i, j = 1}^{n} \frac{x_i x_j}{i+j-1}
= \sum_{i, j = 1}^{n} \int_{0}^{1} t^{i+j-2} x_i x_j \, dt
= \int_{0}^{1} \left( \sum_{i=1}^{n} x_i t^{i-1} \right)^2 \, dt
\geq 0 $$

Moreover, equality in the last step holds if and only if $\sum_{i=1}^{n} x_i t^{i-1} \equiv 0$ on $[0, 1]$, which is equivalent to $x = 0$. Therefore $A$ is positive-definite, as required.
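The conclusion is easy to verify numerically: a symmetric matrix is positive-definite exactly when its Cholesky factorization exists, equivalently when all of its eigenvalues are strictly positive. A quick check assuming NumPy (`hilbert` here is a hypothetical helper, not a library function):

```python
import numpy as np

def hilbert(n):
    """n x n Hilbert matrix: A[i][j] = 1/(i + j - 1) with 1-based indices."""
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)  # 0-based i, j: (i+1) + (j+1) - 1 = i + j + 1

for n in range(1, 9):
    A = hilbert(n)
    np.linalg.cholesky(A)                   # raises LinAlgError if A is not PD
    assert np.linalg.eigvalsh(A).min() > 0  # all eigenvalues strictly positive
```

One caveat: Hilbert matrices are famously ill-conditioned, so in double precision this numerical check breaks down for moderate $n$ (the smallest eigenvalue falls below machine noise), even though the mathematical statement holds for every $n$.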


































Answer (score 2) – Will Jagy, Nov 13 at 1:55













I tried two small cases using Sylvester's law of inertia: a congruence $Q^T D Q = H$ with $Q$ unit upper-triangular and $D$ diagonal with strictly positive entries shows that $H$ is positive-definite. Here $H$ is a positive integer multiple of the Hilbert matrix ($60A$ for $n=3$, $420A$ for $n=4$), which has the same definiteness as $A$.

$$ Q^T D Q = H $$
$$\left(
\begin{array}{rrr}
1 & 0 & 0 \\
\frac{1}{2} & 1 & 0 \\
\frac{1}{3} & 1 & 1 \\
\end{array}
\right)
\left(
\begin{array}{rrr}
60 & 0 & 0 \\
0 & 5 & 0 \\
0 & 0 & \frac{1}{3} \\
\end{array}
\right)
\left(
\begin{array}{rrr}
1 & \frac{1}{2} & \frac{1}{3} \\
0 & 1 & 1 \\
0 & 0 & 1 \\
\end{array}
\right)
= \left(
\begin{array}{rrr}
60 & 30 & 20 \\
30 & 20 & 15 \\
20 & 15 & 12 \\
\end{array}
\right)
$$

$$ Q^T D Q = H $$
$$\left(
\begin{array}{rrrr}
1 & 0 & 0 & 0 \\
\frac{1}{2} & 1 & 0 & 0 \\
\frac{1}{3} & 1 & 1 & 0 \\
\frac{1}{4} & \frac{9}{10} & \frac{3}{2} & 1 \\
\end{array}
\right)
\left(
\begin{array}{rrrr}
420 & 0 & 0 & 0 \\
0 & 35 & 0 & 0 \\
0 & 0 & \frac{7}{3} & 0 \\
0 & 0 & 0 & \frac{3}{20} \\
\end{array}
\right)
\left(
\begin{array}{rrrr}
1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} \\
0 & 1 & 1 & \frac{9}{10} \\
0 & 0 & 1 & \frac{3}{2} \\
0 & 0 & 0 & 1 \\
\end{array}
\right)
= \left(
\begin{array}{rrrr}
420 & 210 & 140 & 105 \\
210 & 140 & 105 & 84 \\
140 & 105 & 84 & 70 \\
105 & 84 & 70 & 60 \\
\end{array}
\right)
$$
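The congruence can be reproduced exactly in rational arithmetic: an $L D L^T$ factorization of $H$ (with $L = Q^T$ unit lower-triangular) recovers the positive diagonal $D = \mathrm{diag}(60, 5, 1/3)$ from the $3 \times 3$ example above. A sketch assuming the standard `fractions` module; `ldl` is an illustrative helper:

```python
from fractions import Fraction

def ldl(H):
    """Exact LDL^T factorization of a symmetric rational matrix:
    H = L D L^T with L unit lower-triangular and D diagonal."""
    n = len(H)
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    D = [Fraction(0)] * n
    for j in range(n):
        D[j] = H[j][j] - sum(L[j][k] ** 2 * D[k] for k in range(j))
        for i in range(j + 1, n):
            L[i][j] = (H[i][j]
                       - sum(L[i][k] * L[j][k] * D[k] for k in range(j))) / D[j]
    return L, D

# The scaled 3x3 Hilbert matrix H = 60*A from the answer above.
H = [[Fraction(x) for x in row]
     for row in ([60, 30, 20], [30, 20, 15], [20, 15, 12])]
L, D = ldl(H)
assert D == [Fraction(60), Fraction(5), Fraction(1, 3)]  # matches the diagonal above
assert all(d > 0 for d in D)  # positive inertia, so H is positive-definite
```

By Sylvester's law of inertia, the signs of $D$ determine the signature of $H$, so all-positive pivots certify positive-definiteness without any floating-point error.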



























    • Upvoted because it taught me the Sylvester Inertia.
      – 2ndYearFreshman
      Nov 13 at 20:26


















Answer (score 1) – Fnacool, Nov 13 at 2:12













This is not a new answer; it is essentially the same idea as already presented by Sangchul Lee. I'm giving it just as yet another application of the identity

$$ \frac{1}{x} = \int_0^\infty e^{-sx} \, ds. $$

Observe that $A_{i,j} = \frac{1}{i+j-1}$, so writing $e^{-(i+j-1)s} = e^{-(i-1/2)s} e^{-(j-1/2)s}$,

\begin{align*} (Av,v) &= \sum_{i,j} \frac{v_j v_i}{i+j-1} \\
&= \sum_{i,j} \int_0^\infty (e^{-(j-1/2)s} v_j)(e^{-(i-1/2)s} v_i) \, ds \\
&= \int_0^\infty \sum_{i,j} (e^{-(j-1/2)s} v_j)(e^{-(i-1/2)s} v_i) \, ds \\
&= \int_0^\infty \Big( \sum_{j} e^{-(j-1/2)s} v_j \Big)^2 \, ds \\
&\ge 0.
\end{align*}

Note that the proof works whenever $A_{i,j} = 1/(f(i) + f(j))$ where $f$ is a positive function; here $f(i) = i - 1/2$.
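The identity can be spot-checked numerically for a random vector. Note the exponents have to be $i - 1/2$ and $j - 1/2$, so that each cross term integrates to $1/(i+j-1)$. A sketch assuming NumPy and SciPy:

```python
import numpy as np
from scipy.integrate import quad

n = 5
rng = np.random.default_rng(0)
v = rng.standard_normal(n)

# Hilbert matrix with 1-based entries A[i][j] = 1/(i + j - 1).
A = np.array([[1.0 / (i + j - 1) for j in range(1, n + 1)]
              for i in range(1, n + 1)])

def integrand(s):
    # (sum_j e^{-(j - 1/2) s} v_j)^2: exponents j - 1/2 make each
    # cross term integrate to 1/(i + j - 1).
    return sum(np.exp(-(j - 0.5) * s) * v[j - 1]
               for j in range(1, n + 1)) ** 2

val, _ = quad(integrand, 0, np.inf)
assert abs(val - v @ A @ v) < 1e-8  # (Av, v) equals the integral
assert val >= 0                     # and the integrand is manifestly >= 0
```

Since the integrand is a square, the integral is nonnegative for every $v$, which is exactly the positive-semidefiniteness statement; strict positivity for $v \ne 0$ follows as in the accepted answer.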





