Why does $\operatorname{Var}(X) = E[X^2] - (E[X])^2$?
$\operatorname{Var}(X) = E[X^2] - (E[X])^2$
I have seen and understand (mathematically) the proof of this identity. What I want to understand is: intuitively, why is it true? What does this formula tell us? From the formula, we see that subtracting the square of the expected value of $X$ from the expected value of $X^2$ gives a measure of dispersion in the data (or, in the case of the standard deviation, the square root of this value does).
So it seems there is some linkage between the expected values of $X^2$ and $X$. How do I make sense of this formula? For example, the formula
$$ \sigma^2 = \frac{1}{n} \sum_{i = 1}^n (x_i - \bar{x})^2 $$
makes perfect intuitive sense: it simply gives us the average of the squared deviations from the mean. What does the other formula tell us?
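As a quick numeric sanity check (not part of the original question; the data below are made up for illustration), both formulas give the same number on any finite dataset treated as uniformly distributed over its points:

```python
# Numeric sanity check (made-up data): treat the data points as a uniform
# random variable, so E[.] is just an average over the points.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(xs)

mean = sum(xs) / n                    # E[X]
mean_sq = sum(x * x for x in xs) / n  # E[X^2]

# (1/n) * sum of (x_i - mean)^2  vs.  E[X^2] - (E[X])^2
var_deviation = sum((x - mean) ** 2 for x in xs) / n
var_shortcut = mean_sq - mean ** 2

assert var_deviation == var_shortcut == 4.0
```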
Tags: probability, statistics, variance
Comment by Did (Dec 4 '18 at 23:51): But... this is... a... definition, no?
asked Dec 4 '18 at 22:45 by WorldGov; edited Dec 4 '18 at 22:50 by Foobaz John
4 Answers
The other formula tells you exactly the same thing as the one you have given in terms of $x$, $x^2$ and $n$. You say you understand that formula, so I assume you also see that the variance is just the average of all the squared deviations.
Now, $\mathbb{E}(X)$ is just the average of all the $x_i$'s, which is to say that it is their mean.
Let us now define a deviation using the expectation operator:
$$D = X-\mathbb{E}(X)$$
and the squared deviation is
$$D^2 = (X-\mathbb{E}(X))^2.$$
Now that we have the deviation, let's find the variance. Using the definition of variance mentioned above, you should be able to see that
$$\operatorname{Var}(X) = \mathbb{E}(D^2).$$
Since $\mathbb{E}(X)$ is the average value of $X$, the equation above is just the average of the squared deviations. Substituting the value of $D^2$, we get
$$\operatorname{Var}(X) = \mathbb{E}\big[(X-\mathbb{E}(X))^2\big] = \mathbb{E}\big[X^2 - 2X\,\mathbb{E}(X) + \mathbb{E}(X)^2\big] = \mathbb{E}(X^2) - 2\,\mathbb{E}(X)^2 + \mathbb{E}(X)^2 = \mathbb{E}(X^2) - \mathbb{E}(X)^2.$$
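The algebra above can be verified on a concrete distribution; here is a sketch using a fair six-sided die (my example, not the answerer's), with exact rational arithmetic:

```python
from fractions import Fraction

# Fair six-sided die: P(X = k) = 1/6 for k = 1..6, in exact arithmetic.
outcomes = range(1, 7)
p = Fraction(1, 6)

E_X = sum(k * p for k in outcomes)       # E(X)   = 7/2
E_X2 = sum(k * k * p for k in outcomes)  # E(X^2) = 91/6

# Average squared deviation (the definition) vs. the shortcut formula.
var_def = sum((k - E_X) ** 2 * p for k in outcomes)
var_shortcut = E_X2 - E_X ** 2

assert var_def == var_shortcut == Fraction(35, 12)
```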
Hope this helps.
answered Dec 5 '18 at 0:02 by user601297
Easy! Expand the definition. Variance is the mean squared deviation, i.e., $V(X) = E\big((X-\mu)^2\big)$. Now:
$$ (X-\mu)^2 = X^2 - 2X\mu + \mu^2 $$
and use the fact that $E(\cdot)$ is linear and that $\mu$ (the mean) is a constant.
The shortcut computes the same thing, but as the difference between the mean of the squares and the square of the mean.
answered Dec 4 '18 at 22:50 by Sean Roberson
Comment by Zacky (Dec 4 '18 at 22:57): How can one prove that the expected value is a linear function?
Comment by Sean Roberson (Dec 4 '18 at 22:59): It follows from writing it as a sum: $$E(kX + Y) = \sum \big(kx\,P(X = x) + y\,P(Y = y)\big) = k\sum x\,P(X = x) + \sum y\,P(Y = y)$$
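Linearity can also be seen numerically; this sketch (my illustration, with arbitrary made-up samples) checks $E(kX+Y) = kE(X) + E(Y)$ for empirical averages, where it holds by simply rearranging the finite sum:

```python
import random

random.seed(0)
k = 3.0

# Arbitrary made-up samples; empirical averages play the role of E(.).
xs = [random.uniform(0.0, 1.0) for _ in range(1000)]
ys = [random.uniform(0.0, 2.0) for _ in range(1000)]
n = len(xs)

lhs = sum(k * x + y for x, y in zip(xs, ys)) / n  # average of kX + Y
rhs = k * sum(xs) / n + sum(ys) / n               # k*avg(X) + avg(Y)

# Equal up to floating-point rounding, by rearranging the finite sum.
assert abs(lhs - rhs) < 1e-9
```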
Comment by qbert (Dec 4 '18 at 23:50): Just to add to this (and take it with a grain of salt, since I don't know probability): that this is a good definition of variance follows from wanting a sense of how far you expect values of your random variable to be from the mean. One might naively choose the absolute value, but squaring is better as a smooth operation.
Some time ago, a professor showed me a right triangle whose sides encode these quantities. The formula you reported can be seen as an application of the Pythagorean theorem:
$$P = \mathbb{E}[X^2] = \operatorname{Var}[X] + \mathbb{E}^2[X].$$
Here, $P = \mathbb{E}[X^2]$ (the second uncentered moment of $X$) is read as "the power" of $X$. Indeed, there is a physical explanation. In physics, energy and power are related to the square of some quantity (e.g. $X$ can be velocity for kinetic energy, current for Joule's law, etc.).
Suppose these quantities are random (indeed, $X$ is a random variable). Then the power $P$ is the sum of two contributions:
- the square of the expected value of $X$;
- its variance (i.e. how much it varies about the expected value).
It is clear that if $X$ is not random, then $\operatorname{Var}[X] = 0$ and $\mathbb{E}^2[X] = X^2$, so that
$$P = X^2,$$
which is a typical physical definition of energy/power. When randomness is present, we must use the whole formula
$$P = \mathbb{E}[X^2] = \operatorname{Var}[X] + \mathbb{E}^2[X]$$
to evaluate the power of the signal.
As a final remark, note that $P$, $\operatorname{Var}[X]$ and $\mathbb{E}^2[X]$ are the squares of the side lengths of the triangle, not the lengths themselves: $\sqrt{P}$ is the hypotenuse, while $|\mathbb{E}[X]|$ and the standard deviation $\sqrt{\operatorname{Var}[X]}$ are the legs.
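The power decomposition can be illustrated numerically; in this sketch (my illustration; the "signal" is an invented noisy constant) the empirical power splits into variance plus squared mean:

```python
import random

random.seed(1)

# Invented "signal": constant level 3 plus zero-mean Gaussian noise.
signal = [3.0 + random.gauss(0.0, 0.5) for _ in range(10_000)]
n = len(signal)

power = sum(s * s for s in signal) / n          # E[X^2]
mean = sum(signal) / n                          # E[X]
var = sum((s - mean) ** 2 for s in signal) / n  # Var[X]

# Pythagorean decomposition: E[X^2] = Var[X] + (E[X])^2
assert abs(power - (var + mean ** 2)) < 1e-8
```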
edited Dec 4 '18 at 23:52; answered Dec 4 '18 at 23:47 by the_candyman
Comment by Sean Roberson (Dec 5 '18 at 0:50): +1, I love this interpretation! I never saw it before.
One intuitive way of measuring the variation of $X$ would be to look at how far, on average, $X$ is from its mean $E(X)=\mu$; that is, to compute $E(X-\mu)$. However, $E(X-\mu) = E(X) - \mu = 0$ for every random variable, because the positive and negative deviations cancel, so instead we use $E\big((X-\mu)^{2}\big)$.
To add, the formula you gave above, $\frac{1}{n}\sum_{i=1}^{n}(x_{i}-\bar{x})^2$, is what you would use when you have finitely many data points; there is nothing random once you have your data. $\operatorname{Var}(X)$ is for a random variable, which can take on finitely many values, countably infinitely many values, or values in an interval.
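The cancellation that rules out $E(X-\mu)$ as a measure of spread is easy to see numerically (a small sketch with made-up data):

```python
# Made-up data, symmetric about its mean of 4.0.
xs = [1.0, 2.0, 6.0, 7.0]
n = len(xs)
mu = sum(xs) / n

# Signed deviations always average to exactly zero...
mean_dev = sum(x - mu for x in xs) / n
assert mean_dev == 0.0

# ...while squared deviations give the (population) variance.
var = sum((x - mu) ** 2 for x in xs) / n
assert var == 6.5
```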
edited Dec 4 '18 at 23:18; answered Dec 4 '18 at 23:10 by Live Free or π Hard