Density of $(X,Y)$ when $X:=\sqrt{-2\log U}\cos(2\pi V)$, $Y:=\sqrt{-2\log U}\sin(2\pi V)$, with $U,V\sim\mathcal U(0,1)$
Let $U,V \sim \mathcal U(0,1)$ be two independent uniformly distributed random variables, and define
$$X:=\sqrt{-2\log U}\,\cos(2\pi V),\qquad Y:=\sqrt{-2\log U}\,\sin(2\pi V).$$
How can I determine the density of the distribution of $(X,Y)$?
I know that $\frac{Y}{X}=\tan(2\pi V)$ and $\frac{X^2}{\log U}+\frac{Y^2}{\log U}=-2$, but I don't know if this helps here.
probability probability-theory measure-theory probability-distributions
asked Nov 13 at 18:54 by user610431, edited Nov 13 at 19:04
Do you know the distribution (specifically the CDF) of $-\log(U)$? If you do, then you can get the distribution of $\sqrt{-2\log(U)}$, and then the rest of the problem is conceptual. – Ian, Nov 13 at 18:57
@Ian No I don't. – user610431, Nov 13 at 19:00
Well, work on that part first. It isn't hard. – Ian, Nov 13 at 19:01
I guess you mean that $U$ and $V$ are uniformly distributed (and independent), not $X$ and $Y$. – Alejandro Nasif Salum, Nov 13 at 19:03
Possible duplicate of Proof of the Box-Muller method. – StubbornAtom, Nov 14 at 15:52
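Regarding the hint about $-\log(U)$: a short derivation of that step, assuming only $U\sim\mathcal U(0,1)$, goes as follows. For $t\ge 0$,
$$P(-\log U\le t)=P(U\ge e^{-t})=1-e^{-t},$$
so $-\log U\sim\mathrm{Exp}(1)$, hence $-2\log U\sim\chi^2_2$, and $R:=\sqrt{-2\log U}$ has density $f_R(r)=r\,e^{-r^2/2}$ for $r>0$ (a Rayleigh distribution).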
2 Answers
You mean $U,V$ are uniform. Then $X$ and $Y$ each have $N(0,1)$ marginals and are clearly uncorrelated. That does not, however, prove them independent (though in fact they are). To find the joint density properly, use the change-of-variables (transformation) formula, i.e. find the Jacobian.
answered Nov 13 at 19:04 by Richard Martin
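As a quick numerical sanity check of the claim about the marginals, here is a minimal simulation sketch (assuming NumPy and SciPy are available; the sample size and seed are arbitrary choices, not from the answer):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000

# Box-Muller transform: U, V independent Uniform(0,1)
u = 1.0 - rng.uniform(size=n)   # values in (0, 1], avoids log(0)
v = rng.uniform(size=n)
r = np.sqrt(-2.0 * np.log(u))
x = r * np.cos(2.0 * np.pi * v)
y = r * np.sin(2.0 * np.pi * v)

# Marginals should look standard normal ...
print(stats.kstest(x, "norm"))  # should not reject N(0,1); p-value typically not small
print(stats.kstest(y, "norm"))

# ... and X, Y should be empirically uncorrelated
print(np.corrcoef(x, y)[0, 1])  # close to 0
```

Of course, passing such a check is evidence, not a proof; the Jacobian computation below is what actually establishes the density.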
Using the relations you mentioned, one can solve for $U$ and $V$ in terms of $X$ and $Y$; say $U=g(X,Y)$ and $V=h(X,Y)$.
Then, by the change-of-variables theorem,
$$f_{XY}(x,y)=f_{UV}\bigl(g(x,y),h(x,y)\bigr)\cdot\lvert J(x,y)\rvert,$$
where $f_{UV}$ is the joint PDF of $U$ and $V$, which by independence and uniformity is
$$f_{UV}(u,v)=\begin{cases}1 & 0<u<1 \text{ and } 0<v<1,\\ 0 & \text{otherwise},\end{cases}$$
and $J$ is the Jacobian determinant of $(u,v)$ with respect to $(x,y)$, that is
$$J(x,y)=\begin{vmatrix} \frac{\partial g}{\partial x}(x,y) & \frac{\partial g}{\partial y}(x,y)\\ \frac{\partial h}{\partial x}(x,y) & \frac{\partial h}{\partial y}(x,y)\end{vmatrix}.$$
answered Nov 13 at 19:15 by Alejandro Nasif Salum
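For concreteness, a sketch of how this computation can be carried out, taking $\theta(x,y)\in(0,2\pi)$ to be the polar angle of $(x,y)$ and ignoring the measure-zero set where $(X,Y)$ lies on the non-negative $x$-axis: from $X^2+Y^2=-2\log U$ one gets $U=g(X,Y)=e^{-(X^2+Y^2)/2}$, and $V=h(X,Y)=\frac{1}{2\pi}\theta(X,Y)$. Away from the excluded half-line,
$$\frac{\partial g}{\partial x}=-x\,e^{-(x^2+y^2)/2},\quad \frac{\partial g}{\partial y}=-y\,e^{-(x^2+y^2)/2},\quad \frac{\partial h}{\partial x}=\frac{1}{2\pi}\,\frac{-y}{x^2+y^2},\quad \frac{\partial h}{\partial y}=\frac{1}{2\pi}\,\frac{x}{x^2+y^2},$$
so
$$J(x,y)=-\frac{1}{2\pi}e^{-(x^2+y^2)/2},\qquad \lvert J(x,y)\rvert=\frac{1}{2\pi}e^{-(x^2+y^2)/2}.$$
Since $0<g(x,y)<1$ and $0<h(x,y)<1$ on that set, the change-of-variables formula gives
$$f_{XY}(x,y)=\frac{1}{2\pi}e^{-(x^2+y^2)/2},$$
the standard bivariate normal density, so $X$ and $Y$ are independent $N(0,1)$.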