Consistency of an estimator - a Bernoulli-Poisson mixture
I have difficulties with the consistency part of the following question.
The question: Let $Y_1, \ldots, Y_n$ be a random sample from a Poisson distribution with parameter $\lambda > 0$. One observes only $W_i = \mathbb{1}_{Y_i>0}$ for $i = 1, \ldots, n$. Compute the likelihood associated with the sample $(W_1, \ldots, W_n)$ and the MLE of $\lambda$. Show that it is consistent in probability. Is it consistent in MSE (mean squared error)?
My solution
1) The MLE part:
For each $i$, the distribution of $W_i$ is Bernoulli with parameter $p = \mathbb{P}(Y_i > 0) = \sum\limits_{k=1}^{\infty}\frac{e^{-\lambda}\lambda^k}{k!} = 1 - e^{-\lambda}$. So the likelihood of the sample $(W_1, \ldots, W_n)$ is:
\begin{align}
L(W \vert p) = \prod\limits_{i=1}^n p^{w_i}(1-p)^{1-w_i} = p^{\sum\limits_{i=1}^n w_i}(1-p)^{n - \sum\limits_{i=1}^n w_i} = p^X(1-p)^{n-X}
\end{align}
where $X = \sum\limits_{i=1}^n w_i$ is the number of successes in $n$ trials. The log-likelihood is:
\begin{align}
\ell(W \vert p) = X \log(p) + (n-X) \log(1-p)
\end{align}
which is maximised at $\hat{p} = \frac{X}{n} = \frac{1}{n}\sum\limits_{i=1}^n w_i = \overline{W}$, the sample mean of the $W_i$. By the invariance of the MLE under the reparametrisation $p = 1 - e^{-\lambda}$, the MLE of $\lambda$ is $\hat{\lambda} = -\log(1 - \overline{W})$.
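(To sanity-check the formula, here is a minimal simulation sketch of my own; numpy, the seed, and the value of lam_true are arbitrary choices, not part of the question.)

    # Simulate a censored Poisson sample and compute the MLE derived above.
    import numpy as np

    rng = np.random.default_rng(0)
    lam_true = 1.5   # hypothetical true value of lambda
    n = 10_000

    y = rng.poisson(lam_true, size=n)
    w = (y > 0).astype(float)        # observed indicators W_i = 1{Y_i > 0}

    w_bar = w.mean()                 # \hat{p} = \overline{W}
    lam_hat = -np.log(1.0 - w_bar)   # \hat{\lambda} = -log(1 - \overline{W})
    print(lam_true, lam_hat)         # lam_hat should land close to 1.5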
2) The consistency in probability part:
We need to show that for all $\epsilon > 0$, the probability $\mathbb{P}(\vert\hat{\lambda} - \lambda \vert > \epsilon) \rightarrow 0$ as $n \rightarrow \infty$. We can write this as:
\begin{align}
\mathbb{P}(\vert\hat{\lambda} - \lambda \vert > \epsilon) = \mathbb{P}(\hat{\lambda} - \lambda > \epsilon) + \mathbb{P}(\lambda - \hat{\lambda} > \epsilon) = \mathbb{P}\bigl(X > n(1-e^{-\epsilon - \lambda})\bigr) + \mathbb{P}\bigl(X < n(1-e^{\epsilon - \lambda})\bigr)
\end{align}
and now we can use the fact that $X$ has a Binomial$(n,p)$ distribution. But it is unclear to me how to proceed from here. Is there perhaps an easier approach that I am missing?
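(As a numerical illustration only — it proves nothing — here is a rough Monte Carlo check of my own that the exceedance probability shrinks with $n$; eps, the grid of sample sizes, and the replication count are arbitrary choices.)

    # Estimate P(|lam_hat - lambda| > eps) by simulation for growing n.
    import numpy as np

    rng = np.random.default_rng(1)
    lam_true, eps, reps = 1.5, 0.1, 2000

    for n in (50, 200, 1000, 5000):
        y = rng.poisson(lam_true, size=(reps, n))
        w_bar = (y > 0).mean(axis=1)          # \overline{W} for each replication
        # If every W_i = 1 then \overline{W} = 1 and the MLE is +infinity;
        # mask those replications (they occur with probability p^n).
        ok = w_bar < 1.0
        lam_hat = -np.log(1.0 - w_bar[ok])
        exceed = np.mean(np.abs(lam_hat - lam_true) > eps)
        print(n, exceed)                      # should decrease toward 0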
How to solve the last part of the question, the consistency in MSE, is also unclear to me. Any help would be appreciated!
probability-theory statistics probability-distributions