Prove that the sum and the absolute difference of 2 Bernoulli(0.5) random variables are not independent
Let $X$ and $Y$ be independent $\text{Bernoulli}(0.5)$ random variables. Let $W = X + Y$ and $T = |X - Y|$. Show that $W$ and $T$ are not independent.
I know that I have to show that $P(W, T)$ is not equal to $P(W)P(T)$, but finding the joint distribution is hard. Please help.
probability self-study independence bernoulli-distribution
Re: "finding the joint distribution is hard:" have you made a table? Label the rows with values of $X$, the columns with values of $Y$, and in the cells put the values of $T,$ $W,$ and the associated probabilities. Collect your results into a new table with rows labeled with $T$ and columns labeled with $W:$ put the total probabilities into the entries. That depicts the entire joint distribution of $(T,W).$ You can then draw your conclusion with a visual inspection. No operation is any more difficult than computing $1/2 \times 1/2.$
– whuber♦
Nov 27 at 23:19
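The table whuber describes can also be built mechanically. A minimal sketch in Python (variable names are my own), enumerating the four equally likely $(x, y)$ outcomes and tabulating $(T, W)$ with exact probabilities:

```python
from itertools import product
from fractions import Fraction

# Enumerate the four equally likely (x, y) outcomes and tabulate (T, W).
joint = {}  # (t, w) -> probability
for x, y in product([0, 1], repeat=2):
    t, w = abs(x - y), x + y
    joint[(t, w)] = joint.get((t, w), Fraction(0)) + Fraction(1, 4)

for (t, w), p in sorted(joint.items()):
    print(f"T={t}, W={w}: P = {p}")
```

Only three $(T, W)$ pairs receive positive probability, which is exactly the table one draws by hand.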
I get it now. I was looking for an elegant, mathematical expression for the joint distribution, but I now realize that I can just enumerate the sample space and the probabilities easily. Thanks, @whuber.
– MSE
Nov 28 at 0:35
asked Nov 27 at 22:25
MSE
2 Answers
The product of the marginal distributions is defined on $\{0,1,2\} \times \{0,1\}$: you can plug in any of the $6$ possible pairs and get a nonzero number out.
However, the joint distribution is supported on a smaller set:
$$
\{(0,0)\} \cup \{(1,1)\} \cup \{(2,0)\}.
$$
To disprove independence, take any $(w,t)$ pair not in the set above and plug it into $P(W,T)$ and $P(W)P(T)$. You will see that, for that particular pair,
$$
P(W,T) = 0 \neq P(W)P(T).
$$
Alternatively, because you're dealing with a small space, you can just go ahead and compute every probability and check every possible pair.
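The argument above can be checked numerically. A small sketch in Python (the joint pmf is hard-coded from enumerating the four $(X, Y)$ outcomes), comparing $P(W,T)$ with $P(W)P(T)$ at the pair $(w,t) = (0,1)$, which lies outside the support:

```python
from fractions import Fraction

# Joint pmf of (W, T), obtained by enumerating the four (X, Y) outcomes.
joint = {(0, 0): Fraction(1, 4), (1, 1): Fraction(1, 2), (2, 0): Fraction(1, 4)}

def marginal_w(w):
    return sum(p for (wi, t), p in joint.items() if wi == w)

def marginal_t(t):
    return sum(p for (w, ti), p in joint.items() if ti == t)

# (w, t) = (0, 1) is outside the support of the joint pmf, yet both
# marginals assign it positive probability.
w, t = 0, 1
p_joint = joint.get((w, t), Fraction(0))
p_product = marginal_w(w) * marginal_t(t)
print(p_joint, p_product)  # 0 vs 1/8
```

Since $0 \neq 1/8$, independence fails at this single pair.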
When $T = 0$, $W = 0$ or $2$; when $T = 1$, $W = 1$. So $T$ and $W$ are not independent.
See Independence of $X+Y$ and $X-Y$
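This observation amounts to saying the conditional distribution of $W$ changes with $T$. A quick sketch in Python (joint pmf hard-coded from the enumeration) that computes $P(W \mid T)$ for each value of $T$:

```python
from fractions import Fraction

# Joint pmf of (W, T) and marginal of T, from enumerating the four (X, Y) outcomes.
joint = {(0, 0): Fraction(1, 4), (1, 1): Fraction(1, 2), (2, 0): Fraction(1, 4)}
p_t = {0: Fraction(1, 2), 1: Fraction(1, 2)}

# Conditional distribution of W given each value of T.
cond_w = {
    t: {w: joint.get((w, t), Fraction(0)) / p_t[t] for w in (0, 1, 2)}
    for t in (0, 1)
}
print(cond_w[0])  # given T=0: W is 0 or 2, each with probability 1/2
print(cond_w[1])  # given T=1: W is 1 with probability 1
```

The two conditional distributions differ, so $W$ and $T$ cannot be independent.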
I want to mark your solution as correct, too! Thanks, @user158565.
– MSE
Nov 28 at 0:39
answered Nov 27 at 22:48
Taylor
answered Nov 27 at 22:28
user158565