Prove the matrix is positive
Consider the matrix $A=\begin{bmatrix}
1 & 1/2 & 1/3 & \dots & 1/n \\
1/2 & 1/3 & 1/4 & \dots & 1/(n+1) \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1/n & 1/(n+1) & 1/(n+2) & \dots & 1/(2n-1)
\end{bmatrix}$
Prove that $A$ is positive.
My work: $A$ is symmetric, hence diagonalisable, but I can't seem to put these facts together to help me. I tried to prove by induction (a naive attempt) that its leading principal minors are all positive, but knowing $\det(A^{k,k})>0$ gives no information about $\det(A^{k+1,k+1})$.
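A quick numerical sanity check (not a proof, and assuming numpy, which is nowhere in this question) is to build $A$ with 0-based indices and confirm that its smallest eigenvalue is positive for small $n$:

```python
import numpy as np

# A[i, j] = 1/(i + j + 1) with 0-based i, j reproduces the matrix above.
for n in range(1, 8):
    A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
    # A is symmetric, so eigvalsh (for symmetric/Hermitian matrices) applies.
    print(n, np.linalg.eigvalsh(A).min() > 0)  # expect True for each n
```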
linear-algebra matrices
asked Nov 13 at 1:41
2ndYearFreshman
A pretty quick way is to see $A$ as the Gram matrix of the basis $1, t, \dots, t^{n-1}$ under the inner product $\langle f,g\rangle = \int_0^1 fg \, dt$; a small sketch of this appears right after these comments.
– xbh
Nov 13 at 1:44
That was fast, thank you! Can you please post it as an answer so I can accept it?
– 2ndYearFreshman
Nov 13 at 1:50
@2ndYearFreshman if you wish someone to be notified of your reply to a comment, begin your reply with an @ sign and the beginning of that username.
– Will Jagy
Nov 13 at 2:02
@SangchulLee I think the OP here is addressing you, requesting you to post a full answer
– Will Jagy
Nov 13 at 2:02
@WillJagy Thank you.
– 2ndYearFreshman
Nov 13 at 2:04
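To make the Gram-matrix comment above concrete: a minimal sketch, assuming sympy (not used anywhere in this thread), that computes the Gram matrix of the monomial basis under $\langle f,g\rangle = \int_0^1 fg \, dt$ and recovers exactly the entries $1/(i+j-1)$:

```python
import sympy as sp

t = sp.symbols('t')
n = 4
# Entry (i, j), 0-based, is <t^i, t^j> = integral of t^(i+j) over [0, 1].
G = sp.Matrix(n, n, lambda i, j: sp.integrate(t**i * t**j, (t, 0, 1)))
print(G)  # exact rationals: G[i, j] = 1/(i + j + 1), i.e. 1/(row + col - 1) 1-based
```

A Gram matrix is always positive semidefinite, and it is definite exactly when the underlying vectors are linearly independent, as the monomials are.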
3 Answers
(Migrated from comment)
The matrix in question is called the Hilbert matrix. To see that $A$ is positive-definite, let $x \in \mathbb{R}^n$. Then
$$ x^T A x
= \sum_{i, j = 1}^{n} \frac{x_i x_j}{i+j-1}
= \sum_{i, j = 1}^{n} \int_{0}^{1} t^{i+j-2} x_i x_j \, dt
= \int_{0}^{1} \left( \sum_{i=1}^{n} x_i t^{i-1} \right)^2 \, dt
\geq 0. $$
Moreover, equality in the last step holds if and only if $\sum_{i=1}^{n} x_i t^{i-1} \equiv 0$ for $t \in [0, 1]$, which is equivalent to $x = 0$. Therefore $A$ is positive-definite, as required.
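A spot check of the key identity, assuming numpy and scipy (neither appears in the answer itself): for a random $x$, the quadratic form $x^T A x$ should match the integral of $\left(\sum_{i=1}^{n} x_i t^{i-1}\right)^2$ over $[0,1]$:

```python
import numpy as np
from scipy.integrate import quad

n = 5
rng = np.random.default_rng(0)
x = rng.standard_normal(n)
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
# np.polyval wants the highest-degree coefficient first, hence x[::-1].
integral, _ = quad(lambda t: np.polyval(x[::-1], t) ** 2, 0, 1)
print(np.isclose(x @ A @ x, integral))  # expect True
```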
answered Nov 13 at 2:05 (accepted)
Sangchul Lee
I tried two examples using Sylvester's law of inertia, writing $Q^T D Q = H$ with $Q$ unit upper triangular and $D$ diagonal. Here $H$ is the Hilbert matrix scaled to integer entries ($60$ times the $3 \times 3$ case and $420$ times the $4 \times 4$ case); scaling by a positive constant changes neither the inertia nor positivity, and since every diagonal entry of $D$ is positive, $H$ is positive definite.
$$\left(
\begin{array}{rrr}
1 & 0 & 0 \\
\frac{1}{2} & 1 & 0 \\
\frac{1}{3} & 1 & 1 \\
\end{array}
\right)
\left(
\begin{array}{rrr}
60 & 0 & 0 \\
0 & 5 & 0 \\
0 & 0 & \frac{1}{3} \\
\end{array}
\right)
\left(
\begin{array}{rrr}
1 & \frac{1}{2} & \frac{1}{3} \\
0 & 1 & 1 \\
0 & 0 & 1 \\
\end{array}
\right)
= \left(
\begin{array}{rrr}
60 & 30 & 20 \\
30 & 20 & 15 \\
20 & 15 & 12 \\
\end{array}
\right)
$$
$$\left(
\begin{array}{rrrr}
1 & 0 & 0 & 0 \\
\frac{1}{2} & 1 & 0 & 0 \\
\frac{1}{3} & 1 & 1 & 0 \\
\frac{1}{4} & \frac{9}{10} & \frac{3}{2} & 1 \\
\end{array}
\right)
\left(
\begin{array}{rrrr}
420 & 0 & 0 & 0 \\
0 & 35 & 0 & 0 \\
0 & 0 & \frac{7}{3} & 0 \\
0 & 0 & 0 & \frac{3}{20} \\
\end{array}
\right)
\left(
\begin{array}{rrrr}
1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} \\
0 & 1 & 1 & \frac{9}{10} \\
0 & 0 & 1 & \frac{3}{2} \\
0 & 0 & 0 & 1 \\
\end{array}
\right)
= \left(
\begin{array}{rrrr}
420 & 210 & 140 & 105 \\
210 & 140 & 105 & 84 \\
140 & 105 & 84 & 70 \\
105 & 84 & 70 & 60 \\
\end{array}
\right)
$$
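For larger $n$, the same computation can be sketched with scipy's $LDL^T$ routine, which produces an equivalent factorization (this code and the libraries are my addition, not part of the answer); all pivots in $D$ coming out positive means the inertia is $(n, 0, 0)$:

```python
import numpy as np
from scipy.linalg import ldl

n = 4
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
L, D, perm = ldl(H)                 # H = L @ D @ L.T, D (block) diagonal
print(np.diag(D))                   # expect n positive pivots
print(np.allclose(L @ D @ L.T, H))  # expect True
```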
answered Nov 13 at 1:55
Will Jagy
Upvoted because it taught me Sylvester's law of inertia.
– 2ndYearFreshman
Nov 13 at 20:26
This is not a new answer; it uses essentially the same idea as the one already presented by Sangchul Lee. I'm giving it just as yet another application of the identity
$$ \frac{1}{x} = \int_0^\infty e^{-sx} \, ds. $$
Observe that $A_{i,j} = \frac{1}{i+j-1}$, so
\begin{align*} (Av,v) &= \sum_{i,j} \frac{v_j v_i}{i+j-1} \\
&= \sum_{i,j} \int_0^\infty \left(e^{-(j-1/2)s} v_j\right) \left(e^{-(i-1/2)s} v_i\right) ds \\
&= \int_0^\infty \sum_{i,j} \left(e^{-(j-1/2)s} v_j\right) \left(e^{-(i-1/2)s} v_i\right) ds \\
&= \int_0^\infty \left( \sum_{j} e^{-(j-1/2)s} v_j \right)^2 ds \\
&\ge 0.
\end{align*}
Note that the proof works whenever $A_{i,j} = 1/(f(i) + f(j))$ where $f$ is a positive function; here $f(i) = i - \tfrac{1}{2}$.
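The only analytic input is the Laplace-transform identity above; a minimal numerical check, assuming scipy (my addition, not part of the answer):

```python
import numpy as np
from scipy.integrate import quad

# Verify 1/x = integral of exp(-s*x) ds over [0, inf) for a few x > 0.
for x in (1.0, 2.5, 7.0):
    val, _ = quad(lambda s: np.exp(-s * x), 0, np.inf)
    print(np.isclose(val, 1.0 / x))  # expect True
```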
answered Nov 13 at 2:12
Fnacool