Consistency of an estimator - a Bernoulli-Poisson mixture
I have difficulties with the consistency part of the following question.



The question: Let $Y_1, \ldots, Y_n$ be a random sample from a Poisson distribution with parameter $\lambda > 0$. One observes only $W_i = \mathbb{1}_{Y_i>0}$ for $i = 1, \ldots, n$. Compute the likelihood associated with the sample $(W_1, \ldots, W_n)$ and the MLE of $\lambda$. Show that it is consistent in probability. Is it consistent in MSE (mean squared error)?



My solution



1) The MLE part:

For each $i$, the distribution of $W_i$ is Bernoulli with parameter $p = \mathbb{P}(Y_i > 0) = \sum\limits_{k=1}^{\infty}\frac{e^{-\lambda}\lambda^k}{k!} = 1 - e^{-\lambda}$. So the likelihood of the sample $(W_1, \ldots, W_n)$ is:
\begin{align}
L(W \vert p) = \prod\limits_{i=1}^n p^{w_i}(1-p)^{1-w_i} = p^{\sum\limits_{i=1}^n w_i}(1-p)^{n - \sum\limits_{i=1}^n w_i} = p^X(1-p)^{n-X}
\end{align}

where $X = \sum\limits_{i=1}^n w_i$ is the number of successes in $n$ trials. The log-likelihood is:
\begin{align}
\ell(W \vert p) = X \log(p) + (n-X) \log(1-p)
\end{align}

which is maximised at $\hat{p} = \frac{X}{n} = \frac{1}{n}\sum\limits_{i=1}^n w_i = \overline{W}$, the sample mean of the $W_i$. Inverting $p = 1 - e^{-\lambda}$, the MLE of $\lambda$ is then $\hat{\lambda} = -\log(1 - \overline{W})$.
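
As a quick sanity check (my own addition, not part of the original question), here is a minimal Python sketch, assuming numpy is available, that simulates the censored sample and evaluates $\hat{\lambda}$ for growing $n$; the true value $\lambda = 1.5$ is hypothetical, chosen only for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 1.5  # hypothetical true parameter, for illustration only

    for n in [100, 10_000, 1_000_000]:
        y = rng.poisson(lam, size=n)    # latent Poisson sample Y_1, ..., Y_n
        w = (y > 0)                     # observed indicators W_i = 1{Y_i > 0}
        w_bar = w.mean()                # MLE of p = 1 - e^{-lambda}
        lam_hat = -np.log(1.0 - w_bar)  # MLE of lambda
        print(n, lam_hat)

The printed estimates settling near $1.5$ as $n$ grows is exactly the behaviour that part 2 asks us to prove.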



2) The consistency in probability part:



We need to show that for all $\epsilon > 0$, the probability $\mathbb{P}(\vert\hat{\lambda} - \lambda\vert > \epsilon) \rightarrow 0$ as $n \rightarrow \infty$. We can write this as:
\begin{align}
\mathbb{P}(\vert\hat{\lambda} - \lambda\vert > \epsilon) = \mathbb{P}(\hat{\lambda} - \lambda > \epsilon) + \mathbb{P}(\lambda - \hat{\lambda} > \epsilon) = \mathbb{P}\big(X > n(1-e^{-\epsilon - \lambda})\big) + \mathbb{P}\big(X < n(1-e^{\epsilon - \lambda})\big)
\end{align}

and now we can use the fact that $X$ has a Binomial$(n,p)$ distribution. But it is unclear to me how to proceed from here. Is there perhaps an easier approach that I am missing?
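
To get a feel for why these two tail probabilities vanish, here is a small numerical sketch (my own addition, assuming scipy is available) that evaluates them exactly using the Binomial$(n,p)$ CDF; the values $\lambda = 1.5$ and $\epsilon = 0.1$ are hypothetical, chosen for illustration:

    import numpy as np
    from scipy.stats import binom

    lam, eps = 1.5, 0.1
    p = 1.0 - np.exp(-lam)  # success probability of each W_i

    for n in [100, 1_000, 10_000]:
        # P(X > n(1 - e^{-eps-lam})), i.e. the event {lambda_hat - lambda > eps}
        upper = binom.sf(n * (1.0 - np.exp(-eps - lam)), n, p)
        # P(X < n(1 - e^{eps-lam})), i.e. the event {lambda - lambda_hat > eps}
        lower = binom.cdf(n * (1.0 - np.exp(eps - lam)), n, p)
        print(n, upper + lower)

Both cut-offs are separated from the mean $np = n(1 - e^{-\lambda})$ by a fixed multiple of $n$, and the printed sums shrink rapidly towards zero as $n$ grows.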



How to solve the last part of the question (consistency in MSE) is also unclear to me. Any help would be appreciated!
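
For the MSE part, a crude Monte Carlo sketch (again my own addition, assuming numpy) may help frame the issue. One thing I noticed while writing it: $\hat{\lambda} = -\log(1-\overline{W})$ is $+\infty$ on the event $\overline{W} = 1$, which has probability $p^n > 0$ for every $n$, so that event has to be handled explicitly:

    import numpy as np

    rng = np.random.default_rng(1)
    lam, reps = 1.5, 100_000  # hypothetical values, for illustration only

    for n in [10, 100, 1_000]:
        # W-bar over many independent replications, via X ~ Binomial(n, p)
        w_bar = rng.binomial(n, 1.0 - np.exp(-lam), size=reps) / n
        inf_mask = (w_bar == 1.0)  # lambda_hat = +inf here, an event of prob. p^n > 0
        lam_hat = -np.log(1.0 - w_bar[~inf_mask])
        mse_finite = np.mean((lam_hat - lam) ** 2)  # MSE restricted to the finite event
        print(n, inf_mask.mean(), mse_finite)

A finite Monte Carlo average can never show the contribution of the infinite event, so the printed numbers only describe the estimator conditionally on $\overline{W} < 1$.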
probability-theory statistics probability-distributions

edited Nov 16 at 22:39
asked Nov 16 at 18:34
Mateusz Eggink