Prove that the sum and the absolute difference of 2 Bernoulli(0.5) random variables are not independent





up vote 1 down vote favorite
Let $X$ and $Y$ be independent $Bernoulli(0.5)$ random variables. Let $W = X + Y$ and $T = |X - Y|$. Show that $W$ and $T$ are not independent.



I know that I have to show that $P(W = w, T = t)$ is not equal to $P(W = w)\,P(T = t)$ for some pair of values $(w, t)$, but finding the joint distribution is hard. Please help.










  • Re: "finding the joint distribution is hard": have you made a table? Label the rows with values of $X$, the columns with values of $Y$, and in the cells put the values of $T,$ $W,$ and the associated probabilities. Collect your results into a new table with rows labeled with $T$ and columns labeled with $W$: put the total probabilities into the entries. That depicts the entire joint distribution of $(T,W).$ You can then draw your conclusion with a visual inspection. No operation is any more difficult than computing $1/2 \times 1/2.$
    – whuber
    Nov 27 at 23:19

  • I get it now. I was looking for an elegant, mathematical expression for the joint distribution, but I now realize that I can just enumerate the sample space and the probabilities easily. Thanks, @whuber.
    – MSE
    Nov 28 at 0:35
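The two tables the comment describes can be sketched in a few lines of Python (a quick illustration, not part of the original thread): first a row per $(X, Y)$ outcome, then the $(T, W)$ cells with their total probabilities.

```python
from itertools import product
from collections import Counter

# Step 1: one row per (X, Y) outcome, with W = X + Y and T = |X - Y|.
rows = [(x, y, x + y, abs(x - y)) for x, y in product([0, 1], repeat=2)]
for x, y, w, t in rows:
    print(f"X={x} Y={y} -> W={w} T={t}  P=1/4")

# Step 2: collect the total probability mass in each (T, W) cell.
table = Counter((t, w) for _, _, w, t in rows)
for (t, w), count in sorted(table.items()):
    print(f"P(T={t}, W={w}) = {count}/4")
```

Only three of the six possible $(T, W)$ cells receive any probability, which is already enough to see the dependence by inspection.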


















probability self-study independence bernoulli-distribution






asked Nov 27 at 22:25 by MSE


2 Answers

















up vote 1 down vote accepted










The product of the marginal distributions is defined on $\{0,1,2\} \times \{0,1\}$: you can plug in any of the $6$ possible pairs and get a nonzero number out.

However, the joint distribution is supported on a smaller set of $(w, t)$ pairs:
$$
\{(0,0)\} \cup \{(1,1)\} \cup \{(2,0)\}.
$$

To disprove independence, take any $(w, t)$ pair not in the above and plug it into $P(W,T)$ and $P(W)P(T)$. You will see that, for that particular pair,
$$
P(W = w, T = t) = 0 \neq P(W = w)\,P(T = t).
$$

Alternatively, because you're dealing with such a small space, you can just compute every probability and check every possible pair.
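The exhaustive check suggested in the last sentence can be sketched as follows (an illustrative snippet, not part of the original answer): enumerate the joint distribution, form the marginals, and compare the product of marginals with the joint at every pair.

```python
from itertools import product
from collections import Counter

# Enumerate the four equally likely (X, Y) outcomes and tabulate (W, T).
joint = Counter()
for x, y in product([0, 1], repeat=2):
    w, t = x + y, abs(x - y)
    joint[(w, t)] += 0.25

# Marginal distributions of W and T.
pw, pt = Counter(), Counter()
for (w, t), p in joint.items():
    pw[w] += p
    pt[t] += p

# Pairs where the joint probability differs from the product of marginals.
# Any single such pair already disproves independence.
counterexamples = [(w, t) for w in pw for t in pt
                   if joint[(w, t)] != pw[w] * pt[t]]
print(counterexamples)
```

All probabilities here are exact binary fractions, so the `!=` comparison on floats is safe in this particular example.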






answered Nov 27 at 22:48 by Taylor
up vote 2 down vote













When $T = 0$, $W$ is $0$ or $2$; when $T = 1$, $W = 1$. For instance, $P(W = 1 \mid T = 0) = 0$ while $P(W = 1) = 1/2$, so knowing $T$ changes the distribution of $W$, and $T$ and $W$ are not independent.
See Independence of $X+Y$ and $X-Y$
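The conditional-distribution argument above can be verified directly by enumeration (a sketch for illustration, not part of the original answer): the distribution of $W$ given $T = 0$ differs from the one given $T = 1$.

```python
from itertools import product

# Tabulate (W, T) over the four equally likely (X, Y) draws.
outcomes = [(x + y, abs(x - y)) for x, y in product([0, 1], repeat=2)]

def cond_w_given_t(t):
    """Conditional distribution of W given T = t, by counting outcomes."""
    matching = [w for w, tt in outcomes if tt == t]
    return {w: matching.count(w) / len(matching) for w in sorted(set(matching))}

print(cond_w_given_t(0))  # {0: 0.5, 2: 0.5}
print(cond_w_given_t(1))  # {1: 1.0}
```

Since the two conditional distributions differ, $W$ and $T$ cannot be independent.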






answered Nov 27 at 22:28 by user158565
  • I want to mark your solution as correct, too! Thanks, @user158565.
    – MSE
    Nov 28 at 0:39










