How do I determine the divergence/convergence of $\sum_n \frac{1}{\log(\log(n))}$?
I am working through some problems in Durrett's probability book, and one of them involves a variant of the law of the iterated logarithm.
I've managed to reduce the result to showing that
$$\sum_n \frac{1}{\log \log (n)}\exp(-\log \log(n)) < \infty,$$
using upper bounds for tail probabilities of the standard normal. But, as I'm really not good with this stuff, I'm unsure how I am supposed to show this (if it's indeed true).
Clearly without the exponential the series would diverge, since $\log(n) \leq n$. But with the exponential it seems as though it could converge.
Could someone advise me how to complete this last step?
asked Dec 1 '18 at 6:12 by Xiaomi; edited Dec 1 '18 at 6:15 by Chinnapparaj R
5 Answers
We can simplify
$$\exp(-\log \log n) = \frac{1}{e^{\log \log n}} = \frac{1}{\log n}.$$
Since $(\log \log n)\,\log n \leq n$ for large $n$, the terms satisfy $\frac{1}{(\log \log n)\log n} \geq \frac{1}{n}$, so the series diverges by comparison with the harmonic series.
answered Dec 1 '18 at 6:17 by platty
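A quick numerical illustration of that comparison (a rough sanity check only, not a proof; the starting index $n=3$ and the printout cutoffs below are arbitrary choices): the partial sums of $\frac{1}{(\log\log n)\,\log n}$ keep growing without levelling off.

    # Partial sums of 1 / ((log log n) * log n); their steady growth is
    # consistent with divergence of the series (but proves nothing).
    import math

    total = 0.0
    for n in range(3, 10_000_001):          # start at 3 so that log(log n) > 0
        total += 1.0 / (math.log(math.log(n)) * math.log(n))
        if n in (10**3, 10**4, 10**5, 10**6, 10**7):
            print(f"n = {n:>10,}   partial sum = {total:.3f}")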
$\sum_n \frac{1}{n}$ is divergent, and so is $\sum_n \frac{1}{\log(\log(n))}$.
But I feel $\sum_n \frac{\exp(-\log(\log(n)))}{\log(\log(n))}$ might still be convergent; you just need to find another way to prove it.
answered Dec 1 '18 at 6:19 by MoonKnight
$$\frac{1}{(\log \log n)\exp(\log \log n)} = \frac{1}{(\log \log n)\cdot \log n} \geqslant \frac{1}{\log^2 n} \geqslant \frac{1}{n^{1/2 \times 2}} = \frac{1}{n},$$
using $\log \log n \leqslant \log n$ and $\log n \leqslant n^{1/2}$ for large $n$, so it still diverges.
answered Dec 1 '18 at 6:18 by xbh
Oh, ok. Thanks a lot. Do you have any advice on how to show $\sum_n P(Z>\sqrt{2\log \log n}\,(1+\epsilon)) < \infty$ for $Z \sim N(0,1)$? – Xiaomi, Dec 1 '18 at 6:21
@Xiaomi Sorry, I suck at probability theory. – xbh, Dec 1 '18 at 7:15
Cauchy condensation shows
$$\sum_n \frac{1}{\log(\log(n))} \sim \sum_n \frac{2^n}{\log(\log(2^n))} = \sum_n \frac{2^n}{\log(n\log(2))} = \sum_n \frac{2^n}{\log(n) + \log(\log(2))}.$$
The condensed terms tend to infinity, so the condensed series, and hence the original one, is divergent.
answered Dec 1 '18 at 6:20 by trancelocation
Thanks a lot. Do you have any advice on how to show $\sum_n P(Z>\sqrt{2\log \log n}\,(1+\epsilon)) < \infty$ for $Z \sim N(0,1)$? The common probability inequalities do not seem tight enough to show this. – Xiaomi, Dec 1 '18 at 6:24
You may use the integral for $N(0,1)$ for each member of the series and estimate it: $$P(Z>\sqrt{2\log \log n}\,(1+\epsilon)) = \frac{1}{\sqrt{2\pi}}\int_{\sqrt{2\log \log n}\,(1+\epsilon)}^{\infty} e^{-x^2/2}\,dx.$$ – trancelocation, Dec 1 '18 at 6:48
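Expanding on that comment (a sketch only; the subsequence $n_k$ and the constant $\alpha>1$ below are notation introduced here, not from the thread): the standard Gaussian tail bound $P(Z>x) \leq \frac{1}{x\sqrt{2\pi}}\,e^{-x^2/2}$ for $x>0$, applied with $x = \sqrt{2\log\log n}\,(1+\epsilon)$, gives
$$P\bigl(Z>\sqrt{2\log\log n}\,(1+\epsilon)\bigr) \leq \frac{(\log n)^{-(1+\epsilon)^2}}{(1+\epsilon)\sqrt{4\pi\log\log n}}.$$
Summed over all $n$ this still diverges (the terms decay only logarithmically), but in the usual proof of the law of the iterated logarithm the Borel-Cantelli estimate is applied along a geometric subsequence $n_k \approx \alpha^k$, where $(\log n_k)^{-(1+\epsilon)^2} \approx (k\log\alpha)^{-(1+\epsilon)^2}$ is summable because $(1+\epsilon)^2 > 1$.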
Credits to user MoonKnight.
$\log n < n$; and taking $\log$ once more: $\log(\log n) < \log n < n$. Hence
$$\dfrac{1}{n} < \dfrac{1}{\log n} < \dfrac{1}{\log(\log n)}.$$
Since the harmonic series $\sum_n \frac{1}{n}$ diverges, the comparison test shows that $\sum_n \frac{1}{\log(\log n)}$ diverges as well.
answered Dec 1 '18 at 7:15 by Peter Szilas