Combination of Poisson and binomial distribution
I'm working on the following problem:




Each time you flip a certain coin, heads appears with probability $p$. Suppose that you flip the coin a random number $N$ of times, where $N$ has the Poisson distribution with parameter $\lambda$ and is independent of the outcomes of the flips. Find the distributions of the numbers $X$ and $Y$ of the resulting heads and tails, respectively, and show that $X$ and $Y$ are independent.




What I tried is conditioning on the value of $N$:
\begin{eqnarray}
\mathbb{P}(X=x) & = & \sum_{k=0}^{\infty}\mathbb{P}(X=x \mid N=k)\,\mathbb{P}(N=k)\\
& = & \sum_{k=0}^{\infty}\binom{k}{x}p^x(1-p)^{k-x}\frac{\lambda^ke^{-\lambda}}{k!}\\
& = & \sum_{k=x}^{\infty}\binom{k}{x}p^x(1-p)^{k-x}\frac{\lambda^ke^{-\lambda}}{k!}.
\end{eqnarray}
Similarly, for $Y$ I found $$\mathbb{P}(Y=y)=\sum_{k=y}^{\infty}\binom{k}{y}p^{k-y}(1-p)^y\frac{\lambda^ke^{-\lambda}}{k!}.$$
I tried to work this out, but I didn't seem to get anywhere. The answer should be that $X \sim \mathrm{Pois}(\lambda p)$, and by symmetry we would then have $Y \sim \mathrm{Pois}(\lambda(1-p))$.



Can anyone provide some help on how to get from where I am to $X \sim \mathrm{Pois}(\lambda p)$? Thanks in advance.
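As a quick sanity check (an illustration, not a proof, with arbitrarily chosen $\lambda = 4$ and $p = 0.3$), one can simulate the experiment and compare the sample mean and variance of the head count with $\lambda p$; for a Poisson variable, mean and variance coincide. Since Python's standard library has no Poisson sampler, the sketch below uses Knuth's classical algorithm:

```python
import math
import random

# Monte Carlo sanity check (illustration, not a proof) that the number of
# heads X is approximately Poisson(lam * p) when N ~ Poisson(lam) and each
# of the N flips lands heads with probability p.

def sample_poisson(lam, rng):
    """Knuth's algorithm: multiply uniforms until the product drops below e^{-lam}."""
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(0)
lam, p, trials = 4.0, 0.3, 200_000

heads = [
    sum(rng.random() < p for _ in range(sample_poisson(lam, rng)))
    for _ in range(trials)
]

mean = sum(heads) / trials
var = sum((x - mean) ** 2 for x in heads) / trials
# For Poisson(lam * p), both the mean and the variance equal lam * p = 1.2.
print(round(mean, 2), round(var, 2))
```

Both printed numbers should land close to $1.2$, consistent with (but of course not proving) $X \sim \mathrm{Pois}(\lambda p)$.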
  • By the way, I know how to show $X$ and $Y$ are independent given $X \sim \mathrm{Pois}(\lambda p)$ and $Y \sim \mathrm{Pois}(\lambda(1-p))$, so it's not necessary to answer that subquestion.
    – Václav Mordvinov, Nov 6 '17 at 19:23
probability probability-distributions
edited Nov 6 '17 at 19:24
Václav Mordvinov
asked Nov 6 '17 at 19:11
Václav Mordvinov
1 Answer
I would suggest using moment-generating functions (MGFs): a simpler, faster proof. Namely, you have, for $t\in\mathbb{R}$,
$$\begin{align}
\mathbb{E} e^{tX}
&= \mathbb{E}[ \mathbb{E}[ e^{tX} \mid N ] ]
\stackrel{(\dagger)}{=} \mathbb{E}[ (1-p+pe^{t})^N ]\\
&= \mathbb{E}[ e^{N\ln(1-p+pe^{t})} ]
\stackrel{(\ddagger)}{=} \exp(\lambda(e^{\ln(1-p+pe^{t})}-1))\\
&= \exp(\lambda((1-p+pe^{t})-1))\\
&= \exp(\lambda p(e^{t}-1))
\end{align}$$

where $(\dagger)$ uses the expression of the MGF of a Binomial distribution with parameters $N$ and $p$, and $(\ddagger)$ that of the MGF of a Poisson distribution with parameter $\lambda$ (applied at the argument $t'\stackrel{\rm def}{=}\ln(1-p+pe^{t})$).

In the end, you get that, for every $t\in\mathbb{R}$,
$$
\mathbb{E} e^{tX} = \exp(\lambda p(e^{t}-1)) \tag{$\ast$}
$$

which is the MGF of a Poisson distribution with parameter $\lambda p$. As the MGF characterizes the distribution (when it exists), we have the result.
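The MGF identity can be spot-checked numerically (an illustration only; the parameters $\lambda = 4$, $p = 0.3$, $t = 0.7$ are arbitrary) by summing $\mathbb{E}[e^{tX}\mid N=k]\,\mathbb{P}(N=k)=(1-p+pe^{t})^k\,\lambda^k e^{-\lambda}/k!$ over $k$ and comparing with $\exp(\lambda p(e^{t}-1))$:

```python
import math

# Spot-check of the MGF identity E[e^{tX}] = exp(lam*p*(e^t - 1)):
# sum E[e^{tX} | N=k] * P(N=k) = sum_k (1-p+p*e^t)^k * lam^k * e^{-lam} / k!.
# Each term is built iteratively (term_k = term_{k-1} * c / k) to avoid
# evaluating huge powers and factorials directly.
lam, p, t = 4.0, 0.3, 0.7

c = (1 - p + p * math.exp(t)) * lam  # ratio between consecutive numerators
term = math.exp(-lam)                # k = 0 term
lhs = term
for k in range(1, 200):              # truncate the series; the tail is negligible
    term *= c / k
    lhs += term

rhs = math.exp(lam * p * (math.exp(t) - 1))
assert abs(lhs - rhs) < 1e-9
print("MGF identity holds numerically")
```

The truncated series and the closed form agree to well within floating-point tolerance.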





However, if you want to finish your own computation, here is how it goes. I assume $p\neq 1$; otherwise the answer is trivial.
\begin{align}
\mathbb{P}\{X=n\} &= \sum_{k=n}^\infty \binom{k}{n}p^n(1-p)^{k-n} \frac{\lambda^k e^{-\lambda}}{k!}\\
&= e^{-\lambda}\frac{p^n}{(1-p)^n}\sum_{k=n}^\infty \binom{k}{n}(1-p)^{k} \frac{\lambda^k}{k!}\\
&= e^{-\lambda}\frac{p^n}{(1-p)^n}\sum_{k=n}^\infty \frac{k!}{n!(k-n)!}(1-p)^{k} \frac{\lambda^k}{k!}\\
&= e^{-\lambda}\frac{p^n}{n!(1-p)^n}\sum_{k=n}^\infty \frac{1}{(k-n)!}(1-p)^{k} \lambda^k\\
&= e^{-\lambda}\frac{p^n}{n!(1-p)^n}\sum_{\ell=0}^\infty \frac{1}{\ell!}(1-p)^{\ell+n} \lambda^{\ell+n}\\
&= e^{-\lambda}\frac{(\lambda p)^n}{n!}\sum_{\ell=0}^\infty \frac{(1-p)^{\ell} \lambda^{\ell}}{\ell!}\\
&= e^{-\lambda}\frac{(\lambda p)^n}{n!}e^{\lambda(1-p)}
= \boxed{e^{-\lambda p}\frac{(\lambda p)^n}{n!}}
\end{align}

and you get the probability mass function of a Poisson r.v. with parameter $\lambda p$, as desired.
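The boxed identity can also be verified numerically for a few values of $n$ (an illustration, not part of the proof; the parameters $\lambda = 4$, $p = 0.3$ are arbitrary), truncating the series at a point where the discarded tail is negligible:

```python
import math

# Numeric check (not a proof) of the identity
#   sum_{k>=n} C(k,n) p^n (1-p)^{k-n} lam^k e^{-lam} / k!  =  e^{-lam p} (lam p)^n / n!
# for n = 0..5. The series is truncated after 120 terms; the discarded tail
# is astronomically small, and k stays small enough that float(k!) is finite.
lam, p = 4.0, 0.3

for n in range(6):
    lhs = sum(
        math.comb(k, n) * p**n * (1 - p)**(k - n)
        * lam**k * math.exp(-lam) / math.factorial(k)
        for k in range(n, n + 120)
    )
    rhs = math.exp(-lam * p) * (lam * p)**n / math.factorial(n)
    assert abs(lhs - rhs) < 1e-12

print("PMF identity verified for n = 0..5")
```

Each truncated sum matches the Poisson$(\lambda p)$ probability mass to within floating-point tolerance.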
  • Thanks for this answer, but I have never seen moment-generating functions before. Would you know a more elementary approach?
    – Václav Mordvinov, Nov 6 '17 at 19:37






  • Yes. I can update my answer to show how to finish your computation.
    – Clement C., Nov 6 '17 at 19:40










  • @VáclavMordvinov Updated.
    – Clement C., Nov 6 '17 at 19:46






  • What you call the MGF is usually referred to as the characteristic function, being related to the MGF by $\phi_X(t)=M_X(it)$, where $\phi$ is the CF and $M$ the MGF (of some RV). Or perhaps it was a typo, as you start out with $E(e^{itX})$ but end with the expression for the MGF, without any $i$.
    – LoveTooNap29, Nov 6 '17 at 19:54








  • @LoveTooNap29 I know what the CF is, and its relation to the MGF. I used the MGF for simplicity here, to avoid taking the $N$-th power of a complex number. It'd have made things less nice.
    – Clement C., Nov 6 '17 at 19:56
edited Nov 24 '18 at 15:42
answered Nov 6 '17 at 19:35
Clement C.