Simplifying a likelihood function
I'm trying to simplify the following equation:
$\log L(\theta|M)=\sum_{d=1}^{D}\log\bigg((1-\alpha)\exp(-\epsilon_b)\frac{\epsilon_b^{B}}{B!}\exp(-\epsilon_s)\frac{\epsilon_s^{S}}{S!}+\alpha(1-\delta)\exp(-(\epsilon_b+\mu))\frac{(\epsilon_b+\mu)^{B}}{B!}\exp(-\epsilon_s)\frac{\epsilon_s^{S}}{S!}+\alpha\delta\exp(-\epsilon_b)\frac{\epsilon_b^{B}}{B!}\exp(-(\epsilon_s+\mu))\frac{(\epsilon_s+\mu)^{S}}{S!}\bigg)$
They have dropped the constant term $-\log(B_d!\,S_d!)$ and simplified the equation into the following factorization:
$\log L(\theta|M)=\sum_{d=1}^{D}\bigg(-\epsilon_b-\epsilon_s+M_d\big(\log(x_b)+\log(x_s)\big)+B\log(\mu+\epsilon_b)+S\log(\mu+\epsilon_s)\bigg)+\sum_{d=1}^{D}\log\bigg((1-\alpha)x_s^{S-M_d}x_b^{B-M_d}+\alpha(1-\delta)\exp(-\mu)\,x_s^{S-M_d}x_b^{-M_d}+\alpha\delta\exp(-\mu)\,x_b^{B-M_d}x_s^{-M_d}\bigg)$
where $M_d=\min(B,S)+\frac{\max(B,S)}{2}$, $x_s=\frac{\epsilon_s}{\epsilon_s+\mu}$, and $x_b=\frac{\epsilon_b}{\epsilon_b+\mu}$.
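Before asking, I checked numerically that the two expressions really do agree for a single day $d$ once the dropped constant $\log(B_d!\,S_d!)$ is added back. Here is a minimal sketch in Python; all the parameter values are made up purely for illustration and none of them come from the article:

    from math import exp, log, factorial

    # Illustrative values only -- not taken from the article.
    alpha, delta = 0.3, 0.4
    eps_b, eps_s, mu = 20.0, 15.0, 10.0
    B, S = 28, 12  # buys and sells on day d

    def poisson_pmf(k, lam):
        # plain Poisson pmf; fine for counts this small
        return exp(-lam) * lam ** k / factorial(k)

    # Original mixture likelihood for one day.
    L_day = ((1 - alpha) * poisson_pmf(B, eps_b) * poisson_pmf(S, eps_s)
             + alpha * (1 - delta) * poisson_pmf(B, eps_b + mu) * poisson_pmf(S, eps_s)
             + alpha * delta * poisson_pmf(B, eps_b) * poisson_pmf(S, eps_s + mu))

    # Add back the dropped constant log(B! S!) before comparing.
    lhs = log(L_day) + log(factorial(B)) + log(factorial(S))

    # Factorized form from the article.
    M = min(B, S) + max(B, S) / 2
    x_b = eps_b / (eps_b + mu)
    x_s = eps_s / (eps_s + mu)
    rhs = (-eps_b - eps_s
           + M * (log(x_b) + log(x_s))
           + B * log(mu + eps_b) + S * log(mu + eps_s)
           + log((1 - alpha) * x_s ** (S - M) * x_b ** (B - M)
                 + alpha * (1 - delta) * exp(-mu) * x_s ** (S - M) * x_b ** (-M)
                 + alpha * delta * exp(-mu) * x_b ** (B - M) * x_s ** (-M)))

    print(lhs, rhs)  # the two agree to floating-point precision

The two numbers agree to floating-point precision, so the identity itself holds; it is the algebra in between that I cannot reproduce.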
Unfortunately, I do not see how the simplification happened. I would be really thankful if somebody could describe it step by step.
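The only step I can guess at (this is my own attempt, not taken from the article) is to pull the factor $(\epsilon_b+\mu)^{B}(\epsilon_s+\mu)^{S}$ out of each Poisson numerator, using $\epsilon_b=x_b(\epsilon_b+\mu)$ and $\epsilon_s=x_s(\epsilon_s+\mu)$. For the first mixture component this gives

$$\exp(-\epsilon_b)\,\epsilon_b^{B}\,\exp(-\epsilon_s)\,\epsilon_s^{S}=\exp(-\epsilon_b-\epsilon_s)\,x_b^{B}\,x_s^{S}\,(\epsilon_b+\mu)^{B}(\epsilon_s+\mu)^{S},$$

but I do not see how the common factor $x_b^{M_d}x_s^{M_d}$ and the remaining bracket emerge from there.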
If there is anything else you need to know, please do not hesitate to ask.
You can find the whole article, which presents the simplification, on the following page:
https://cran.r-project.org/web/packages/pinbasic/vignettes/pinbasicVignette.html#general_pin_framework
Thank you for your help!
statistics maximum-likelihood
Welcome to MSE, Hasan. I kindly suggest that you type out what's written in the links (in LaTeX/MathJax: just put dollar signs around your expressions!); otherwise I'm afraid your post will not be received well by the community. Thank you.
– Math_QED
Nov 14 at 14:18
Thank you for the help Math_QED. Is the question now clear?
– Hasan
Nov 15 at 10:02
The question is great now. Well done. I voted to reopen it. If everything is all right, you might get an answer soon.
– Math_QED
Nov 15 at 18:09
edited Nov 15 at 10:06
asked Nov 13 at 16:29
Hasan