Finding the regression line given the mean, correlation and standard deviation of $x$ and $y$.


























So we have $100$ observations of $(x, y)$.
The mean of $x$ is $1.06$, and the mean of $y$ is $3$.
The standard deviation of $x$ is $0.52$, and that of $y$ is $1.13$.
The correlation between $x$ and $y$ is $0.89$.



In the question we are told to:



• Estimate the linear regression line of the regression of $Y$ on $X$ and the standard deviation of the errors.



• Estimate the regression line when we regress $X$, as the dependent variable, on $Y$, and obtain an estimate of the standard deviation of the errors.



• Are the two regression lines the same? If not, then explain why not.



• For the regression of $Y$ on $X$, suppose that we wish to predict the
dependent variable $y$ at $x = x^* = 0.7$. Obtain the prediction, as
well as the standard error of the prediction.



• Obtain the standard deviation of the prediction error and hence obtain a $95\%$ prediction interval for $y$ at the given $x = x^*$.



Now I thought we were supposed to generate $100$ data points, assuming $x$ and $y$ follow a normal distribution with the given means and standard deviations, and then use Stata to run the regression and find the prediction interval, etc.



But the lecturer told me this was not the case, and I was wondering whether there is another way to solve it. I'm thinking of some kind of derivation/calculation using the information above, but I have no idea where to start.














































(Mathematics Stack Exchange, tagged statistics) asked Aug 28 '13 at 11:28 by Raditz; edited Jul 7 '16 at 16:33 by Zain Patel






















2 Answers



















I've found estimates for $\beta_1$ and $\beta_0$ by modifying the formulas used for their estimation: the numerator can be turned into $n \times \mathrm{cov}(x,y)$, and we can find $\mathrm{cov}(x,y)$ from $\mathrm{corr}(x,y)$ and the standard deviations of $x$ and $y$.



The problem now is how to find the standard deviation of the errors and of the prediction errors.
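For concreteness, that route can be sketched numerically with the summary statistics from the question. The last step uses the standard identity $s_e^2 = \frac{n-1}{n-2}\,s_y^2(1-r^2)$ for the residual variance, which is an assumption brought in here, not something derived in this answer:

```python
import math

# Summary statistics from the question
n, xbar, ybar = 100, 1.06, 3.0
sx, sy, r = 0.52, 1.13, 0.89

b1 = r * sy / sx        # slope: cov(x,y)/var(x), with cov(x,y) = r*sx*sy
b0 = ybar - b1 * xbar   # intercept: the fitted line passes through (xbar, ybar)

# Error s.d. via the standard identity (assumed, not derived above):
# SSE = (n-1)*sy^2*(1-r^2), and s_e^2 = SSE/(n-2)
s_e = math.sqrt((n - 1) * sy**2 * (1 - r**2) / (n - 2))

print(round(b1, 4), round(b0, 4), round(s_e, 4))
```

With the given numbers this puts the fitted line near $\hat y \approx 0.95 + 1.93x$.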







































            Using matrix notation, you get



$\hat{\beta} = (X'X)^{-1}X'Y \\
\operatorname{Var}(\hat{\beta}) = (X'X)^{-1}X'\sigma_{y}^{2}X(X'X)^{-1} = (X'X)^{-1}\sigma_{y}^{2}$



            So for the simple linear regression, this will be



\begin{align}
X &= (j, x), \quad X' = (j, x)' \\
X'X &=
\begin{bmatrix}
n & \sum_{k=1}^{n} x_{k} \\
\sum_{k=1}^{n} x_{k} & \sum_{k=1}^{n} x_{k}^{2}
\end{bmatrix} \\
(X'X)^{-1}\sigma_{y}^{2} &=
\begin{bmatrix}
\sum_{k=1}^{n} x_{k}^{2} & -\sum_{k=1}^{n} x_{k} \\
-\sum_{k=1}^{n} x_{k} & n
\end{bmatrix}
\frac{\sigma_{y}^{2}}{n\sum_{k=1}^{n} x_{k}^{2} - \left(\sum_{k=1}^{n} x_{k}\right)^{2}} \\
&=
\begin{bmatrix}
\sum_{k=1}^{n} x_{k}^{2} & -\sum_{k=1}^{n} x_{k} \\
-\sum_{k=1}^{n} x_{k} & n
\end{bmatrix}
\frac{\sigma_{y}^{2}}{n\sum_{k=1}^{n} x_{k}^{2} - (n\bar{X})^{2}} \\
&=
\begin{bmatrix}
\frac{\sigma_{y}^{2}\sum_{k=1}^{n} x_{k}^{2}}{nS^{2}_{X}} & -\frac{\sigma_{y}^{2}\sum_{k=1}^{n} x_{k}}{nS^{2}_{X}} \\
-\frac{\sigma_{y}^{2}\sum_{k=1}^{n} x_{k}}{nS^{2}_{X}} & \frac{\sigma_{y}^{2}}{S^{2}_{X}}
\end{bmatrix} \\
&=
\begin{bmatrix}
\frac{\sigma_{y}^{2}\left(S^{2}_{X}+n\bar{X}^{2}\right)}{nS^{2}_{X}} & -\frac{\sigma_{y}^{2}\sum_{k=1}^{n} x_{k}}{nS^{2}_{X}} \\
-\frac{\sigma_{y}^{2}\sum_{k=1}^{n} x_{k}}{nS^{2}_{X}} & \frac{\sigma_{y}^{2}}{S^{2}_{X}}
\end{bmatrix}
\end{align}

where $S^{2}_{X} = \sum_{k=1}^{n}(x_{k}-\bar{X})^{2}$, so that the determinant $n\sum_{k=1}^{n} x_{k}^{2} - (n\bar{X})^{2} = nS^{2}_{X}$.
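The cofactor inversion above can be sanity-checked numerically. This sketch (NumPy assumed; the simulated $x$ values are arbitrary) compares the closed form against a direct matrix inverse and verifies that the determinant equals $nS_X^2$ with $S_X^2 = \sum_k (x_k - \bar X)^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(1.0, 0.5, n)          # arbitrary simulated regressor

X = np.column_stack([np.ones(n), x]) # design matrix (j, x)
XtX_inv = np.linalg.inv(X.T @ X)

# Closed form from the cofactor expansion in the answer
det = n * np.sum(x**2) - np.sum(x)**2
closed = np.array([[np.sum(x**2), -np.sum(x)],
                   [-np.sum(x),   n]]) / det

Sx2 = np.sum((x - x.mean())**2)      # sum of squared deviations

assert np.allclose(XtX_inv, closed)  # the 2x2 inverse matches
assert np.isclose(det, n * Sx2)      # n*sum(x^2) - (sum x)^2 = n*S_X^2
```

Multiplying `closed` by $\sigma_y^2$ then gives $\operatorname{Var}(\hat\beta)$ entry by entry.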



So if you have the s.d. for $X$, you just need to find $\sum_{k=1}^{n} x_{k}^{2}$ and you will have everything needed to calculate the variance of the estimates.



Just manipulate the variance formula a little to get it: $\sum_{k=1}^{n} x_{k}^{2} = S_{X}^{2} + n\bar{X}^{2}$.



If you are predicting in-sample, you get $\frac{\sum_{k=1}^{n} x_{k}^{2}}{S^{2}_{X}} + \frac{\sigma^{2}_{Y}}{n}$.



If you are predicting beyond the sample range, you get $\frac{\sum_{k=1}^{n} x_{k}^{2}}{S^{2}_{X}} + \frac{\sigma^{2}_{Y}}{n} + \sigma^{2}_{Y}$.
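For comparison, the usual textbook form of the prediction-error variance for a single new observation at $x^*$ is $s_e^2\left(1 + \frac{1}{n} + \frac{(x^*-\bar x)^2}{S_{X}^{2}}\right)$. The sketch below applies that formula to the numbers in the question; treat the choice of formula (and the residual-variance identity it uses) as an assumption, not as part of this answer:

```python
import math

# Summary statistics from the question
n, xbar, ybar = 100, 1.06, 3.0
sx, sy, r = 0.52, 1.13, 0.89
x_star = 0.7

b1 = r * sy / sx                    # slope from the correlation and s.d.'s
b0 = ybar - b1 * xbar               # line passes through the means
y_hat = b0 + b1 * x_star            # point prediction at x* = 0.7

s_e2 = (n - 1) * sy**2 * (1 - r**2) / (n - 2)   # residual variance (n-2 df)
Sxx = (n - 1) * sx**2                           # sum of squared deviations of x

# Textbook prediction-error variance for a single new observation at x*
var_pred = s_e2 * (1 + 1/n + (x_star - xbar)**2 / Sxx)
half = 1.984 * math.sqrt(var_pred)  # t_{0.975, 98} is roughly 1.984

print(round(y_hat, 3), round(y_hat - half, 3), round(y_hat + half, 3))
```

This prints the point prediction followed by the endpoints of an approximate $95\%$ prediction interval.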































































Answer 1: answered Aug 29 '13 at 1:33 by Raditz; edited Aug 29 '13 at 2:18 by Stefan4024































Answer 2: answered Jul 29 '18 at 23:41 by Sergio Andrade; edited Jul 30 '18 at 1:53





























