How to prove this identity for discrete random variables? [closed]
How can one prove that $E(\xi\,E(\eta\mid G)) = E(\eta\,E(\xi\mid G))$, where $\xi$, $\eta$ and $G$ are discrete random variables, assuming both sides exist?
probability-theory random-variables expected-value
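Before any proof, a quick numerical sanity check can make the claim concrete. The sketch below is illustrative only and is not part of the original post: it builds an arbitrary, made-up finite joint pmf for $(\xi,\eta,G)$, computes both sides as plain sums, and checks that they agree. All names and values are assumptions chosen for the example.

```python
# Minimal sanity check (illustrative only, not from the original question):
# build an arbitrary finite joint pmf for (xi, eta, G) and verify numerically
# that E(xi * E(eta|G)) equals E(eta * E(xi|G)).
from itertools import product
import random

random.seed(0)

xi_vals, eta_vals, g_vals = [0, 1, 2], [-1, 1], [0, 1]   # arbitrary supports

# Random joint pmf p(x, e, g), normalised to sum to 1.
weights = {k: random.random() for k in product(xi_vals, eta_vals, g_vals)}
total = sum(weights.values())
pmf = {k: w / total for k, w in weights.items()}

def cond_exp(index, g):
    """E[coordinate #index | G = g], computed from the joint pmf."""
    num = sum(p * k[index] for k, p in pmf.items() if k[2] == g)
    den = sum(p for k, p in pmf.items() if k[2] == g)
    return num / den

E_xi_given = {g: cond_exp(0, g) for g in g_vals}
E_eta_given = {g: cond_exp(1, g) for g in g_vals}

lhs = sum(p * x * E_eta_given[g] for (x, e, g), p in pmf.items())  # E(xi E(eta|G))
rhs = sum(p * e * E_xi_given[g] for (x, e, g), p in pmf.items())   # E(eta E(xi|G))

print(lhs, rhs)
assert abs(lhs - rhs) < 1e-12   # both sides agree up to rounding
```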
closed as off-topic by Kavi Rama Murthy, Christopher, Davide Giraudo, Adrian Keister, supinf Nov 20 at 14:10
This question appears to be off-topic. The users who voted to close gave this specific reason:
- "This question is missing context or other details: Please improve the question by providing additional context, which ideally includes your thoughts on the problem and any attempts you have made to solve it. This information helps others identify where you have difficulties and helps them write answers appropriate to your experience level." – Kavi Rama Murthy, Christopher, Davide Giraudo, Adrian Keister, supinf
If this question can be reworded to fit the rules in the help center, please edit the question.
edited Nov 20 at 8:29
asked Nov 20 at 8:02
anykk
By linearity of expectation, you can change it to $E(\xi)\,E(\eta\mid G) = E(\eta)\,E(\xi\mid G)$. I suspect that that's easier to work with. That being said, what is $G$? I can't imagine that this is true if, for instance, $G$ is $\eta=0$.
– Arthur
Nov 20 at 8:10
@Arthur, $G$ is a discrete random variable too.
– anykk
Nov 20 at 8:29
2 Answers
In this answer $G$ must be looked at as a $\sigma$-algebra (possibly the one generated by a random variable).
The characteristic property of $\mathbb E[\xi\mid G]$ is that it satisfies: $$\int_A\xi(\omega)\,P(d\omega)=\int_A\mathbb E[\xi\mid G](\omega)\,P(d\omega)\text{ whenever }A\text{ is }G\text{-measurable}$$
Or equivalently: $$\mathbb E[\xi\mathbf 1_A]=\mathbb E[\mathbb E[\xi\mid G]\mathbf 1_A]\text{ whenever }A\text{ is }G\text{-measurable}$$
This can be expanded to the more general statement that: $$\mathbb E[\xi\psi]=\mathbb E[\mathbb E[\xi\mid G]\psi]\text{ whenever }\psi\text{ is }G\text{-measurable}$$
Now note that $\mathbb E[\eta\mid G]$ is by definition $G$-measurable, so we are allowed to conclude that: $$\mathbb E[\xi\,\mathbb E[\eta\mid G]]=\mathbb E[\mathbb E[\xi\mid G]\,\mathbb E[\eta\mid G]]\tag1$$
Similarly we have: $$\mathbb E[\eta\,\mathbb E[\xi\mid G]]=\mathbb E[\mathbb E[\eta\mid G]\,\mathbb E[\xi\mid G]]\tag2$$
Now note that $(1)$ and $(2)$ have the same right-hand side, which proves the identity.
edited Nov 20 at 8:46
answered Nov 20 at 8:30
drhab
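As a supplementary sketch (not part of the original answer), assume the $\sigma$-algebra $G$ is generated by a discrete random variable taking countably many values $g_1, g_2, \dots$, as in the question. Then a $G$-measurable $\psi$ has the form $\psi=f(G)$ for some function $f$, and the step from indicators to general $\psi$ can be written out directly, provided the sums converge well enough to interchange summation and expectation:
$$\mathbb E[\xi\psi]=\sum_i f(g_i)\,\mathbb E\left[\xi\,\mathbf 1_{\{G=g_i\}}\right]=\sum_i f(g_i)\,\mathbb E\left[\mathbb E[\xi\mid G]\,\mathbf 1_{\{G=g_i\}}\right]=\mathbb E\left[\mathbb E[\xi\mid G]\,\psi\right]$$
The middle equality applies the defining property to each event $A=\{G=g_i\}$.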
Thank you, I'll meditate on it.
– anykk
Nov 20 at 8:51
Can you explain to me why the integral and the next line are equivalent? I can't see it. And how can we expand it to $\mathbb E[\xi\psi]$?
– anykk
Nov 26 at 18:34
By definition $\int_A X(\omega)\,P(d\omega)=\int X(\omega)\,\mathbf 1_A(\omega)\,P(d\omega)$, and $\mathbb E[X\mathbf 1_A]$ is just another notation for the RHS. It can be expanded by first proving it for simple functions, then for measurable functions that are limits of simple functions. It goes a bit too far to handle this in a comment. I reckon that you must have at your disposal some mathematical material that handles this stuff, don't you?
– drhab
Nov 26 at 18:48
No, I don't. I will learn it later at university. I think the second proof is simpler than yours, but yours is stronger.
– anykk
Nov 26 at 19:26
Then where did you come across the question that you posed? Indeed, the other answer (I upvoted it) is more direct. I tried to reveal something of the background of the equalities that you find there.
– drhab
Nov 26 at 19:28
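Expanding slightly on the comment above (a rough sketch of the standard argument, added here for reference): once the identity holds for indicators, linearity gives it for simple $G$-measurable $\psi=\sum_{k=1}^n c_k\mathbf 1_{A_k}$ with each $A_k\in G$,
$$\mathbb E\Big[\xi\sum_k c_k\mathbf 1_{A_k}\Big]=\sum_k c_k\,\mathbb E[\xi\mathbf 1_{A_k}]=\sum_k c_k\,\mathbb E\big[\mathbb E[\xi\mid G]\mathbf 1_{A_k}\big]=\mathbb E\Big[\mathbb E[\xi\mid G]\sum_k c_k\mathbf 1_{A_k}\Big]$$
and monotone convergence then extends it to nonnegative, and finally to integrable, $G$-measurable $\psi$ approximated by simple functions.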
From the law of total expectation,
\begin{align*}
E\left(\xi E\left(\eta|G\right)\right) & =E\left(E\left(\xi E\left(\eta|G\right)|G\right)\right)\\
 & =E\left(E\left(\eta|G\right)E\left(\xi|G\right)\right)\\
 & =E\left(E\left(\eta E\left(\xi|G\right)|G\right)\right)\\
 & =E\left(\eta E\left(\xi|G\right)\right)
\end{align*}
answered Nov 20 at 8:31
hopeless
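A note on the middle steps (added for clarity, a sketch): the second and third equalities use the "taking out what is known" property. Since $E(\eta|G)$ is a function of $G$, say $h(G)$, it can be pulled out of an expectation conditioned on $G$:
$$E\left(\xi E\left(\eta|G\right)|G\right)=h(G)\,E\left(\xi|G\right)=E\left(\eta|G\right)E\left(\xi|G\right)$$
and symmetrically for $E\left(\eta E\left(\xi|G\right)|G\right)$. The first and last equalities in the chain are the law of total expectation (tower property).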