Expected value of the $k$th order statistic of Uniform$(0, r)$ for $r<1$

Suppose that we draw $X_1, \ldots, X_n$ independently and uniformly from $(0,r)$, and let $X_{(k)}$ denote the $k$th smallest number drawn.



Denoting the pdf of $X_{(k)}$ by $f_k$, I know that
$$
f_{k}(x) = \frac{n}{r} \binom{n-1}{k-1}(x/r)^{k-1}(1-x/r)^{n-k}.
$$



Questions




  • How can I find the expected value of $X_{(k)}$?


I know that $\mathbb{E}[X_{(k)}] = \int_0^r x\,f_k(x)\,dx$, but I don't know how to evaluate this integral. Intuitively, I believe the answer should be $\mathbb{E}[X_{(k)}]=\frac{k}{n+1}r$, but I don't have a proof.




  • How concentrated is $X_{(k)}$ around its expectation? More precisely, I would like to know an upper bound on $\Pr[|X_{(k)}-\mathbb{E}[X_{(k)}]| > \epsilon]$ for given $\epsilon$.
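
For what it's worth, a quick Monte Carlo check is consistent with the conjectured mean; here is a minimal simulation sketch (assuming NumPy; the values of $n$, $k$, $r$, and the repetition count are illustrative, and this is a sanity check, not a proof):

    import numpy as np

    rng = np.random.default_rng(0)
    n, k, r = 10, 3, 0.5          # illustrative sample size, order index, interval length
    m = 200_000                   # number of Monte Carlo repetitions

    samples = rng.uniform(0.0, r, size=(m, n))
    x_k = np.sort(samples, axis=1)[:, k - 1]   # k-th smallest draw in each repetition

    print(x_k.mean())             # empirical estimate of E[X_(k)]
    print(k / (n + 1) * r)        # conjectured value k/(n+1) * r = 3/11 * 0.5

    eps = 0.05                    # an illustrative deviation, for the second question
    print(np.mean(np.abs(x_k - x_k.mean()) > eps))  # empirical tail probability
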
Tags: probability, statistics

asked Dec 30 '18 at 3:02 by Jeff Cooper

  • Hint for $\mathbb{E}[X_{(k)}]$: observe that $Y_{(k)} = X_{(k)}/r$ has a probability density function which is proportional to $y^{k-1}(1-y)^{n-k}$ for $0 < y < 1$. What distribution does $Y_{(k)}$ follow? Then, use the fact that $r$ is a known constant to find $\mathbb{E}[X_{(k)}]$.
    – Clarinetist, Dec 30 '18 at 3:09

  • @Clarinetist Thank you. It follows the Beta distribution, and after writing it down I realized that it indeed implies $\mathbb{E}[X_{(k)}] = \frac{k}{n+1}r$. Do you have any suggestions for the second question?
    – Jeff Cooper, Dec 30 '18 at 3:52

  • For the second question, you can start off with Chebyshev's inequality.
    – StubbornAtom, Dec 30 '18 at 6:15

  • Just a side remark: in cases like this, first solve the special case $Y_1,\dots,Y_n$ with $r=1$, and afterwards take $X_i=rY_i$. This reduces the chance of mistakes with the (awkward) parameter $r$.
    – drhab, Dec 30 '18 at 9:27
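
For completeness, the hint in the first comment works out as follows (a standard Beta-function computation, added here as a sketch): normalizing the density of $Y_{(k)}$ gives
$$
\mathbb{E}[Y_{(k)}]
= \frac{\int_0^1 y\cdot y^{k-1}(1-y)^{n-k}\,dy}{\int_0^1 y^{k-1}(1-y)^{n-k}\,dy}
= \frac{B(k+1,\,n-k+1)}{B(k,\,n-k+1)}
= \frac{k}{n+1},
$$
so $\mathbb{E}[X_{(k)}] = r\,\mathbb{E}[Y_{(k)}] = \frac{k}{n+1}\,r$; that is, $Y_{(k)}\sim\mathrm{Beta}(k,\,n-k+1)$, as identified in the comments.
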
2 Answers

Here is an intuitive proof by contradiction for the first question, without any Beta distribution. I will work with the case $r=1$, since for general $r$ you can just rescale.



So suppose, without loss of generality, that $\mathbb{E}[X_{(k)}] > \frac{k}{n+1}$. This means that if you repeatedly draw $n$ points on $[0, 1]$ uniformly at random, say $m$ times, then as $m \to \infty$ you will see points more concentrated to the right of $\mathbb{E}[X_{(k)}]$ than to its left. But this contradicts the assumption of uniform sampling on $[0, 1]$.



For the second question, you can get various bounds through the 2nd, 3rd, 4th, etc. moments of the Beta distribution, or through its moment generating function, as listed on the Wikipedia page.

answered Dec 30 '18 at 6:06 by John Jiang

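As a concrete instance of the moment approach (a sketch added for illustration, using only the Beta variance): since $X_{(k)}/r \sim \mathrm{Beta}(k,\,n-k+1)$,
$$
\operatorname{Var}[X_{(k)}] = \frac{r^2\,k(n-k+1)}{(n+1)^2(n+2)},
\qquad\text{so Chebyshev's inequality gives}\qquad
\Pr\bigl[\,|X_{(k)}-\mathbb{E}[X_{(k)}]| > \epsilon\,\bigr] \le \frac{r^2\,k(n-k+1)}{(n+1)^2(n+2)\,\epsilon^2}.
$$
For exponential tails, one standard route is the Dvoretzky–Kiefer–Wolfowitz inequality for the empirical CDF, which controls all order statistics simultaneously.
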
  • Thanks John. For the second question, I would prefer the strongest possible concentration bound, which I assume should be obtained via the moment generating function, but I have no experience doing that. In particular, the equations I end up with seem very complicated and hard to deal with. Is there a standard way to achieve a simple exponential concentration bound? I do not care about constants.
    – Jeff Cooper, Dec 30 '18 at 20:10

Drawing $X_1,\dots,X_n$ independently and uniformly from $(0,2\pi)$ can be done like this:

Choose $Z_1,\dots, Z_n, Z_{n+1}$ independently and uniformly on the circle $S_1=\{(x,y)\mid x^2+y^2=1\}$.

Then let $X_i$ be the length of the arc that connects $Z_{n+1}$ to $Z_i$ anticlockwise.

The circle is split into $n+1$ disjoint arcs, and the symmetry of the setup shows that the lengths of these arcs have equal distribution, hence equal expectation.

That gives expectation $\frac{k}{n+1}\cdot2\pi$ for the length of the arc corresponding to $X_{(k)}$.
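
In symbols (a short restatement of the argument above, writing $L_1,\dots,L_{n+1}$ for the successive arc lengths): $X_{(k)} = L_1+\cdots+L_k$, the $L_i$ are exchangeable, and $L_1+\cdots+L_{n+1}=2\pi$, so
$$
\mathbb{E}[X_{(k)}] = k\,\mathbb{E}[L_1] = \frac{k}{n+1}\cdot 2\pi;
$$
rescaling from $(0,2\pi)$ to $(0,r)$ then gives $\mathbb{E}[X_{(k)}]=\frac{k}{n+1}\,r$.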

answered Dec 31 '18 at 10:01 by drhab