$\lim E[f^2(X_n)] \neq E[f^2(X)]$ even if $X_n \rightarrow^d X$


























Let $X_n \sim N(0,1/n)$. Is there a continuous function $f$ such that

  1. $E[f^2(X_n)]<\infty$,

  2. $\lim_{n\rightarrow \infty} E[f^2(X_n)] \neq f^2(0)$?

Also, what would happen if I added the condition

  3. $E[|f(X)f(Y)|]<\infty$ for all jointly normal $X,Y$ such that $EX=EY=0$?

I know that there is no such $f$ if we additionally require $f$ to be bounded, since $X_n \stackrel{d}{\rightarrow} \delta_0$.

However, I am totally clueless when it comes to proving (or disproving) the existence of such an $f$ if we drop the boundedness condition.

I appreciate every hint!
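For concreteness, a quantity like $E[f^2(X_n)]$ can be evaluated numerically for a candidate $f$. The sketch below (my own illustration, not part of the question; the helper `expect_f2` and the choice $f(x)=x$ are assumptions) approximates the expectation by a midpoint Riemann sum against the $N(0,1/n)$ density:

```python
import math

def expect_f2(f, n, half_width=10.0, steps=100_000):
    """Approximate E[f(X_n)^2] for X_n ~ N(0, 1/n) by a midpoint Riemann sum
    over [-half_width*sigma, half_width*sigma]."""
    sigma = 1.0 / math.sqrt(n)
    a = -half_width * sigma
    h = 2 * half_width * sigma / steps
    total = 0.0
    for i in range(steps):
        y = a + (i + 0.5) * h
        dens = math.exp(-y * y / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))
        total += f(y) ** 2 * dens * h
    return total

identity = lambda x: x
for n in (1, 10, 100):
    print(n, expect_f2(identity, n))  # approximately 1/n, tending to 0 = identity(0)**2
```

For $f(x)=x$ the output is $\approx 1/n$, consistent with $E[X_n^2]=1/n \to 0 = f^2(0)$, so the identity is not a counterexample.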










  • Does the identity work? – Meneer-Beer, Nov 28 '18 at 8:37

  • @Meneer-Beer Which identity are you referring to? – Mhr, Nov 28 '18 at 8:48

  • Choose for $f$ the map that sends $x \mapsto x$. – Meneer-Beer, Nov 28 '18 at 8:50

  • @Meneer-Beer That does not work: the limit is $0$, which equals $f^2(0)$. – Mhr, Nov 28 '18 at 8:52

  • If it helps: if $f^2(x)=x^k$ for some $k>0$, then $E(f^2(X_n))=0$ if $k$ is odd and $\frac{(k-1)!!}{n^{k/2}}$ if $k$ is even. Taking the limit in $n$ gives $0$. Since $f^2(0)=0$, no polynomial satisfies condition 2. – Meneer-Beer, Nov 28 '18 at 11:39
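The Gaussian moment formula used in the last comment can be checked numerically. The sketch below (my own illustration; `moment` and `double_factorial` are hypothetical helpers) compares a midpoint Riemann sum for $E[X_n^k]$ against the closed form $(k-1)!!/n^{k/2}$ for even $k$:

```python
import math

def moment(k, n, half_width=12.0, steps=200_000):
    """Approximate E[X_n^k] for X_n ~ N(0, 1/n) by a midpoint Riemann sum."""
    sigma = 1.0 / math.sqrt(n)
    a = -half_width * sigma
    h = 2 * half_width * sigma / steps
    total = 0.0
    for i in range(steps):
        y = a + (i + 0.5) * h
        total += y ** k * math.exp(-y * y / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi)) * h
    return total

def double_factorial(m):
    # (m)!! = m * (m-2) * ... down to 1 or 2; empty product is 1
    return 1 if m <= 0 else m * double_factorial(m - 2)

# Even k: E[X_n^k] = (k-1)!! / n^(k/2); e.g. k=4, n=5 gives 3/25 = 0.12.
print(moment(4, 5), double_factorial(3) / 5 ** 2)
# Odd k: the moment vanishes by symmetry of the density.
print(moment(3, 5))
```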


















real-analysis functional-analysis probability-theory convergence normal-distribution






edited Nov 29 '18 at 7:51 by saz

asked Nov 28 '18 at 6:42 by Mhr
























1 Answer

No, there doesn't exist such a function $f$.




Theorem: Let $X_n \sim N(0,1/n)$ and let $g \geq 0$ be a continuous function. If $\mathbb{E}g(X_1)<\infty$, then $\mathbb{E}g(X_n) \to g(0)$ as $n \to \infty$.




For the proof of the statement we use the following auxiliary result.




Lemma: Let $X_n \sim N(0,1/n)$ and let $g \geq 0$ be a continuous function. If $\mathbb{E}g(X_1)<\infty$, then $$\lim_{R \to \infty} \sup_{n \in \mathbb{N}} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) = 0.$$




Proof of the lemma: If we set $R_n := \sqrt{\log(n)/(n-1)}$ for $n \geq 2$, then a straightforward computation shows that $$\exp \left(- \frac{y^2}{2} (n-1) \right) \leq \frac{1}{\sqrt{n}} \quad \text{for all $|y| \geq R_n$}.$$ (Equality holds for $|y|=R_n$, and the monotonicity of the left-hand side then gives the desired inequality for $|y| \geq R_n$.) Hence, $$\exp \left(-n \frac{y^2}{2} \right) \leq \frac{1}{\sqrt{n}} \exp \left(-\frac{y^2}{2} \right) \quad \text{for all $|y| \geq R_n$.}$$ Since $R_n \to 0$ as $n \to \infty$, the supremum $R_0 := \sup_{n \geq 2} R_n$ is finite, and $$\exp \left(-n \frac{y^2}{2} \right) \leq \frac{1}{\sqrt{n}} \exp \left(-\frac{y^2}{2} \right) \quad \text{for all $|y| \geq R_0$, $n \in \mathbb{N}$.} \tag{1}$$ (Note that $(1)$ is trivially satisfied for $n=1$.) Hence, \begin{align*} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) &= \sqrt{\frac{n}{2\pi}} \int_{|y| \geq R} g(y) \exp \left(-n \frac{y^2}{2} \right) \,dy \\ &\leq \frac{1}{\sqrt{2\pi}} \int_{|y| \geq R} g(y) \exp \left(- \frac{y^2}{2} \right) \, dy \\ &= \mathbb{E}(g(X_1) 1_{\{|X_1| \geq R\}}) \tag{2} \end{align*} for all $R \geq R_0$. By assumption, $\mathbb{E}g(X_1)<\infty$, and therefore the monotone convergence theorem yields $$\sup_{n \in \mathbb{N}} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) \stackrel{(2)}{\leq} \mathbb{E}(g(X_1) 1_{\{|X_1| \geq R\}}) \xrightarrow{R \to \infty} 0.$$
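Inequality $(1)$ is easy to sanity-check numerically. The script below is an illustrative sketch (my own, not part of the proof): it computes $R_0 = \sup_{n \geq 2} R_n$, which is attained at $n=2$ since $R_n$ is decreasing, and verifies the inequality on a grid of $(n, y)$ values:

```python
import math

# R_n = sqrt(log(n)/(n-1)); the sup over n >= 2 is attained at n = 2,
# so R_0 = sqrt(log 2) ~ 0.8326.
R0 = max(math.sqrt(math.log(n) / (n - 1)) for n in range(2, 10_000))

# Check exp(-n y^2 / 2) <= n^{-1/2} exp(-y^2 / 2) for |y| >= R0 on a grid.
ok = True
for n in range(1, 500):
    y = R0
    while y <= 5.0:
        if math.exp(-n * y * y / 2) > math.exp(-y * y / 2) / math.sqrt(n) + 1e-15:
            ok = False
        y += 0.01
print(R0, ok)
```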





Proof of the theorem: Fix $\epsilon>0$. According to the above lemma, we can choose $R>0$ such that

$$\sup_{n \in \mathbb{N}} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) \leq \epsilon.$$

Without loss of generality, we may assume that $R$ is sufficiently large that

$$\sup_{n \in \mathbb{N}} \mathbb{P}(|X_n| \geq R) \leq \epsilon.$$

By the triangle inequality, this implies that

$$\begin{align*} \mathbb{E}(|g(X_n)-g(0)|) &\leq \mathbb{E}(|g(X_n)-g(0)| 1_{\{|X_n| \leq R\}}) + \mathbb{E}(|g(X_n)-g(0)| 1_{\{|X_n| > R\}}) \\ &\leq \mathbb{E}(|g(X_n)-g(0)| 1_{\{|X_n| \leq R\}}) + 2 \epsilon. \end{align*}$$

The random variable $X_n \sim N(0,1/n)$ equals in distribution $U/\sqrt{n}$ for $U \sim N(0,1)$. Hence,

$$\mathbb{E}(|g(X_n)-g(0)|) \leq \mathbb{E} \left[ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{\{|U|/\sqrt{n} \leq R\}} \right]+ 2 \epsilon. \tag{3}$$

Noting that

$$ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{\{|U|/\sqrt{n} \leq R\}} \leq 2 \sup_{|y| \leq R} |g(y)| < \infty$$

and

$$ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{\{|U|/\sqrt{n} \leq R\}} \xrightarrow{n \to \infty} 0$$

by the continuity of $g$, we conclude from the dominated convergence theorem that

$$\lim_{n \to \infty} \mathbb{E} \left[ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{\{|U|/\sqrt{n} \leq R\}} \right] = 0,$$

and so $(3)$ gives

$$\lim_{n \to \infty} \mathbb{E}(|g(X_n)-g(0)|) =0.$$

This implies, in particular, that

$$|\mathbb{E}g(X_n)-g(0)| \leq \mathbb{E}(|g(X_n)-g(0)|) \to 0,$$

i.e. $\mathbb{E}g(X_n) \to g(0)$.
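As a concrete illustration of the theorem (my own example, not from the answer): take the unbounded continuous function $g(x) = e^{x^2/4}$. The Gaussian identity $\mathbb{E}e^{tX^2} = (1-2t\sigma^2)^{-1/2}$ for $X \sim N(0,\sigma^2)$, $2t\sigma^2 < 1$, gives a closed form for $\mathbb{E}g(X_n)$, which indeed tends to $g(0)=1$:

```python
# g(x) = exp(x^2 / 4) is continuous, unbounded, and E[g(X_1)] = sqrt(2) < infinity,
# so the theorem applies. With sigma^2 = 1/n and t = 1/4:
#   E[g(X_n)] = (1 - 1/(2n))^{-1/2}
def expected_g(n):
    return (1.0 - 1.0 / (2.0 * n)) ** -0.5

for n in (1, 10, 100, 10_000):
    print(n, expected_g(n))  # decreases toward 1.0 = g(0)
```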






  • Wow, great answer, thanks so much! It seems that this result relies on the assumption that the variance is $1/n$. Do you think a similar result holds for any $X_n \sim N(0,\sigma_n)$ such that $\sigma_n \rightarrow 0$? – Mhr, Nov 29 '18 at 6:20

  • @Mhr I would think so, yes. The only part which needs to be modified is the first part of the proof of the lemma; for general $\sigma_n$ it is more difficult to find a suitable $R_0$ such that (1) holds, but it should be possible. – saz, Nov 29 '18 at 7:51











Your Answer





StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");

StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});

function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});


}
});














draft saved

draft discarded


















StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3016825%2flim-ef2x-n-neq-ef2x-even-if-x-n-rightarrow-d-x%23new-answer', 'question_page');
}
);

Post as a guest















Required, but never shown

























1 Answer
1






active

oldest

votes








1 Answer
1






active

oldest

votes









active

oldest

votes






active

oldest

votes









5












$begingroup$

No, there doesn't exist such a function $f$.




Theorem: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $mathbb{E}g(X_n) to g(0)$ as $n to infty$.




For the proof of the statement we use the following auxiliary result.




Lemma: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $$lim_{R to infty} sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) = 0.$$




Proof of the lemma: If we set $R_n := sqrt{log(n)/(n-1)}$ for $n geq 2$, then a straight-forward computation shows that $$exp left(- frac{y^2}{2} (n-1) right) leq frac{1}{sqrt{n}} quad text{for all $|y| geq R_n$}.$$ (Equality holds for $|y|=R_n$ and the monotonicity of the function of left-hand side then gives the desired inequality for $|y| geq R_n$.) Hence, $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_n$.}$$ Since $R_n to 0$ as $n to infty$, we have $R_0 := sup_{n geq 2} R_n$ and $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_0$, $n in mathbb{N}$.} tag{1}$$ (Note that $(1)$ is trivially satisfied for $n=1$.) Hence, begin{align*} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) &= sqrt{frac{n}{2pi}} int_{|y| geq R} g(y) exp left(-n frac{y^2}{2} right) ,d y \ &leq frac{1}{sqrt{2pi}} int_{|y| geq R} g(y) exp left(- frac{y^2}{2} right) , dy \ &= mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) tag{2} end{align*} for all $R geq R_0$. By assumption, $mathbb{E}g(X_1)<infty$, and therefore the monotone convergence theorem yields that $$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) stackrel{(2)}{leq} mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) xrightarrow{R to infty} 0.$$





Proof of the theorem: Fix $epsilon>0$. According to the above lemma, we can choose $R>0$ such that



$$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) leq epsilon.$$



Without loss of generality, we may assume that $R$ is sufficiently large such that



$$sup_{n in mathbb{N}} mathbb{P}(|X_n| geq R) leq epsilon.$$
By the triangle inequality, this implies that



$$begin{align*} mathbb{E}(|g(X_n)-g(0)|) &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| > R}}) \ &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + 2 epsilon. end{align*}$$



The random variable $X_n sim N(0,1/n)$ equals in distribution $U/sqrt{n}$ for any $U sim N(0,1)$. Hence,



$$mathbb{E}(|g(X_n)-g(0)|) leq mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right]+ 2 epsilon tag{3}$$



Noting that



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } leq 2 sup_{|y| leq R} |g(y)| < infty$$



and



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } xrightarrow{n to infty} 0$$



by the continuity of $g$, we conclude from the dominated convergence theorem that



$$lim_{n to infty} mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right] = 0,$$



and so $(3)$ gives



$$lim_{n to infty} mathbb{E}(|g(X_n)-g(0)|) =0.$$



This implies, in particular, that



$$|mathbb{E}g(X_n)-g(0)| leq mathbb{E}(|g(X_n)-g(0)|) to 0,$$



i.e. $mathbb{E}g(X_n) to g(0)$.






share|cite|improve this answer











$endgroup$













  • $begingroup$
    Wow. Great answer. So much thanks. It seems that this result relies on the assumption that the variance is $1/n$. Do you think the similar result holds for any $X_n sim N(0,sigma_n)$ such that $sigma_n rightarrow 0$?
    $endgroup$
    – Mhr
    Nov 29 '18 at 6:20






  • 1




    $begingroup$
    @Mhr I would think so, yes. The only part which needs to be modified is the first part of the proof of the lemma... for general $sigma_n$ it's more difficult to find a suitable $R_0$ such that (1) holds but it should be possible.
    $endgroup$
    – saz
    Nov 29 '18 at 7:51
















5












$begingroup$

No, there doesn't exist such a function $f$.




Theorem: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $mathbb{E}g(X_n) to g(0)$ as $n to infty$.




For the proof of the statement we use the following auxiliary result.




Lemma: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $$lim_{R to infty} sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) = 0.$$




Proof of the lemma: If we set $R_n := sqrt{log(n)/(n-1)}$ for $n geq 2$, then a straight-forward computation shows that $$exp left(- frac{y^2}{2} (n-1) right) leq frac{1}{sqrt{n}} quad text{for all $|y| geq R_n$}.$$ (Equality holds for $|y|=R_n$ and the monotonicity of the function of left-hand side then gives the desired inequality for $|y| geq R_n$.) Hence, $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_n$.}$$ Since $R_n to 0$ as $n to infty$, we have $R_0 := sup_{n geq 2} R_n$ and $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_0$, $n in mathbb{N}$.} tag{1}$$ (Note that $(1)$ is trivially satisfied for $n=1$.) Hence, begin{align*} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) &= sqrt{frac{n}{2pi}} int_{|y| geq R} g(y) exp left(-n frac{y^2}{2} right) ,d y \ &leq frac{1}{sqrt{2pi}} int_{|y| geq R} g(y) exp left(- frac{y^2}{2} right) , dy \ &= mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) tag{2} end{align*} for all $R geq R_0$. By assumption, $mathbb{E}g(X_1)<infty$, and therefore the monotone convergence theorem yields that $$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) stackrel{(2)}{leq} mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) xrightarrow{R to infty} 0.$$





Proof of the theorem: Fix $epsilon>0$. According to the above lemma, we can choose $R>0$ such that



$$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) leq epsilon.$$



Without loss of generality, we may assume that $R$ is sufficiently large such that



$$sup_{n in mathbb{N}} mathbb{P}(|X_n| geq R) leq epsilon.$$
By the triangle inequality, this implies that



$$begin{align*} mathbb{E}(|g(X_n)-g(0)|) &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| > R}}) \ &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + 2 epsilon. end{align*}$$



The random variable $X_n sim N(0,1/n)$ equals in distribution $U/sqrt{n}$ for any $U sim N(0,1)$. Hence,



$$mathbb{E}(|g(X_n)-g(0)|) leq mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right]+ 2 epsilon tag{3}$$



Noting that



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } leq 2 sup_{|y| leq R} |g(y)| < infty$$



and



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } xrightarrow{n to infty} 0$$



by the continuity of $g$, we conclude from the dominated convergence theorem that



$$lim_{n to infty} mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right] = 0,$$



and so $(3)$ gives



$$lim_{n to infty} mathbb{E}(|g(X_n)-g(0)|) =0.$$



This implies, in particular, that



$$|mathbb{E}g(X_n)-g(0)| leq mathbb{E}(|g(X_n)-g(0)|) to 0,$$



i.e. $mathbb{E}g(X_n) to g(0)$.






share|cite|improve this answer











$endgroup$













  • $begingroup$
    Wow. Great answer. So much thanks. It seems that this result relies on the assumption that the variance is $1/n$. Do you think the similar result holds for any $X_n sim N(0,sigma_n)$ such that $sigma_n rightarrow 0$?
    $endgroup$
    – Mhr
    Nov 29 '18 at 6:20






  • 1




    $begingroup$
    @Mhr I would think so, yes. The only part which needs to be modified is the first part of the proof of the lemma... for general $sigma_n$ it's more difficult to find a suitable $R_0$ such that (1) holds but it should be possible.
    $endgroup$
    – saz
    Nov 29 '18 at 7:51














5












5








5





$begingroup$

No, there doesn't exist such a function $f$.




Theorem: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $mathbb{E}g(X_n) to g(0)$ as $n to infty$.




For the proof of the statement we use the following auxiliary result.




Lemma: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $$lim_{R to infty} sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) = 0.$$




Proof of the lemma: If we set $R_n := sqrt{log(n)/(n-1)}$ for $n geq 2$, then a straight-forward computation shows that $$exp left(- frac{y^2}{2} (n-1) right) leq frac{1}{sqrt{n}} quad text{for all $|y| geq R_n$}.$$ (Equality holds for $|y|=R_n$ and the monotonicity of the function of left-hand side then gives the desired inequality for $|y| geq R_n$.) Hence, $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_n$.}$$ Since $R_n to 0$ as $n to infty$, we have $R_0 := sup_{n geq 2} R_n$ and $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_0$, $n in mathbb{N}$.} tag{1}$$ (Note that $(1)$ is trivially satisfied for $n=1$.) Hence, begin{align*} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) &= sqrt{frac{n}{2pi}} int_{|y| geq R} g(y) exp left(-n frac{y^2}{2} right) ,d y \ &leq frac{1}{sqrt{2pi}} int_{|y| geq R} g(y) exp left(- frac{y^2}{2} right) , dy \ &= mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) tag{2} end{align*} for all $R geq R_0$. By assumption, $mathbb{E}g(X_1)<infty$, and therefore the monotone convergence theorem yields that $$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) stackrel{(2)}{leq} mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) xrightarrow{R to infty} 0.$$





Proof of the theorem: Fix $epsilon>0$. According to the above lemma, we can choose $R>0$ such that



$$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) leq epsilon.$$



Without loss of generality, we may assume that $R$ is sufficiently large such that



$$sup_{n in mathbb{N}} mathbb{P}(|X_n| geq R) leq epsilon.$$
By the triangle inequality, this implies that



$$begin{align*} mathbb{E}(|g(X_n)-g(0)|) &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| > R}}) \ &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + 2 epsilon. end{align*}$$



The random variable $X_n sim N(0,1/n)$ equals in distribution $U/sqrt{n}$ for any $U sim N(0,1)$. Hence,



$$mathbb{E}(|g(X_n)-g(0)|) leq mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right]+ 2 epsilon tag{3}$$



Noting that



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } leq 2 sup_{|y| leq R} |g(y)| < infty$$



and



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } xrightarrow{n to infty} 0$$



by the continuity of $g$, we conclude from the dominated convergence theorem that



$$lim_{n to infty} mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right] = 0,$$



and so $(3)$ gives



$$lim_{n to infty} mathbb{E}(|g(X_n)-g(0)|) =0.$$



This implies, in particular, that



$$|mathbb{E}g(X_n)-g(0)| leq mathbb{E}(|g(X_n)-g(0)|) to 0,$$



i.e. $mathbb{E}g(X_n) to g(0)$.






share|cite|improve this answer











$endgroup$



No, there doesn't exist such a function $f$.




Theorem: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $mathbb{E}g(X_n) to g(0)$ as $n to infty$.




For the proof of the statement we use the following auxiliary result.




Lemma: Let $X_n sim N(0,1/n)$ and let $g geq 0$ be a continuous function. If $mathbb{E}g(X_1)<infty$ then $$lim_{R to infty} sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) = 0.$$




Proof of the lemma: If we set $R_n := sqrt{log(n)/(n-1)}$ for $n geq 2$, then a straight-forward computation shows that $$exp left(- frac{y^2}{2} (n-1) right) leq frac{1}{sqrt{n}} quad text{for all $|y| geq R_n$}.$$ (Equality holds for $|y|=R_n$ and the monotonicity of the function of left-hand side then gives the desired inequality for $|y| geq R_n$.) Hence, $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_n$.}$$ Since $R_n to 0$ as $n to infty$, we have $R_0 := sup_{n geq 2} R_n$ and $$exp left(-n frac{y^2}{2} right) leq frac{1}{sqrt{n}} exp left(-frac{y^2}{2} right) quad text{for all $|y| geq R_0$, $n in mathbb{N}$.} tag{1}$$ (Note that $(1)$ is trivially satisfied for $n=1$.) Hence, begin{align*} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) &= sqrt{frac{n}{2pi}} int_{|y| geq R} g(y) exp left(-n frac{y^2}{2} right) ,d y \ &leq frac{1}{sqrt{2pi}} int_{|y| geq R} g(y) exp left(- frac{y^2}{2} right) , dy \ &= mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) tag{2} end{align*} for all $R geq R_0$. By assumption, $mathbb{E}g(X_1)<infty$, and therefore the monotone convergence theorem yields that $$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) stackrel{(2)}{leq} mathbb{E}(g(X_1) 1_{{|X_1| geq R}}) xrightarrow{R to infty} 0.$$





Proof of the theorem: Fix $epsilon>0$. According to the above lemma, we can choose $R>0$ such that



$$sup_{n in mathbb{N}} mathbb{E}(g(X_n) 1_{{|X_n| geq R}}) leq epsilon.$$



Without loss of generality, we may assume that $R$ is sufficiently large such that



$$sup_{n in mathbb{N}} mathbb{P}(|X_n| geq R) leq epsilon.$$
By the triangle inequality, this implies that



$$begin{align*} mathbb{E}(|g(X_n)-g(0)|) &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| > R}}) \ &leq mathbb{E}(|g(X_n)-g(0)| 1_{{|X_n| leq R}}) + 2 epsilon. end{align*}$$



The random variable $X_n sim N(0,1/n)$ equals in distribution $U/sqrt{n}$ for any $U sim N(0,1)$. Hence,



$$mathbb{E}(|g(X_n)-g(0)|) leq mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right]+ 2 epsilon tag{3}$$



Noting that



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } leq 2 sup_{|y| leq R} |g(y)| < infty$$



and



$$ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } xrightarrow{n to infty} 0$$



by the continuity of $g$, we conclude from the dominated convergence theorem that



$$lim_{n to infty} mathbb{E} left[ left| g left( frac{U}{sqrt{n}} right) - g(0) right| 1_{|U|/sqrt{n} leq R } right] = 0,$$



and so $(3)$ gives



$$lim_{n to infty} mathbb{E}(|g(X_n)-g(0)|) =0.$$



This implies, in particular, that



$$|mathbb{E}g(X_n)-g(0)| leq mathbb{E}(|g(X_n)-g(0)|) to 0,$$



i.e. $mathbb{E}g(X_n) to g(0)$.







share|cite|improve this answer














share|cite|improve this answer



share|cite|improve this answer








edited Nov 30 '18 at 9:30

answered Nov 28 '18 at 20:23 by saz











