Projection of Gaussian distribution along a vector.























Can anyone help me understand how to compute the projection of a 2D Gaussian distribution along a vector? I intuitively expect the projection to be a 1D Gaussian, but I want to be sure. Can someone show me a proof, or point me to one, that a 2D Gaussian projected along a vector gives a 1D Gaussian on that line?



E.g., consider a Gaussian $\mathbf{X} \sim N(\mu, \Sigma)$ where $\mu = [3,2]^T$ and $\Sigma = \begin{bmatrix} 4 & 0 \\ 0 & 7 \end{bmatrix}$: what is the projection along the vector $v = 2i + 4j$?



Any help would be much appreciated! Thanks.




































      probability normal-distribution
















asked Aug 5 '15 at 16:34 by rrr































          3 Answers
          3




































See https://en.wikipedia.org/wiki/Multivariate_normal_distribution, where it is stated that a multivariate distribution is multivariate normal if and only if every linear combination of the variables is normally distributed. If I understand correctly, your "projection" defines the linear combination of the variables that you are interested in, so it is indeed normal. Let me know if you meant something else by "projection", though.






answered Aug 5 '15 at 16:59 by user2566092
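
As a quick numerical illustration of this characterization (a minimal sketch assuming numpy), one can sample the question's Gaussian, form the combination $v^\top X$, and check that its sample skewness and excess kurtosis are near zero, as they would be for a normal:

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([3.0, 2.0])
    Sigma = np.array([[4.0, 0.0],
                      [0.0, 7.0]])
    v = np.array([2.0, 4.0])

    X = rng.multivariate_normal(mu, Sigma, size=200_000)  # draws of the 2D Gaussian
    y = X @ v                                             # the linear combination v^T X

    z = (y - y.mean()) / y.std()                          # standardize
    print("skewness:", (z**3).mean())                     # ~0 for a normal
    print("excess kurtosis:", (z**4).mean() - 3.0)        # ~0 for a normal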





















• Thanks for the reply. Yes, I understand it's a linear combination of the variables, but I don't quite see how to get to the equivalent 1D Gaussian for the example I gave. What is that linear combination for the example I stated? Could you walk me through the steps to get the linear combination for the given direction vector? I am fairly new to statistics, so I would appreciate a few pointers on how to proceed with the actual analysis.
– rrr, Aug 5 '15 at 19:17































In general, in $\mathbb{R}^n$, given a matrix $V$ whose columns $V^j$ are vectors in $\mathbb{R}^n$, the orthogonal projection onto the subspace spanned by the $V^j$ is $V(V^tV)^{-1}V^t$ (let's assume the $V^j$ are linearly independent, so there is no issue with matrix rank). That is, for any vector $b$, its orthogonal projection onto the subspace spanned by the $V^j$ is
$$p^{V}(b) = [V(V^tV)^{-1}V^t]\,b.$$
The same works for a Gaussian vector: to project onto a subspace of $\mathbb{R}^n$, one just needs to write down the projection matrix, and then
$$p^{V}(X) = [V(V^tV)^{-1}V^t]\,X.$$
Back to your example: the subspace is spanned by a single vector $v \in \mathbb{R}^2$, so the projection matrix is
$$p^{v} = v(v^{t}v)^{-1}v^{t} = \begin{bmatrix}2 \\ 4\end{bmatrix}\left([2,4]\begin{bmatrix}2 \\ 4\end{bmatrix}\right)^{-1}[2,4] = \frac{1}{5}\begin{bmatrix}1 & 2 \\ 2 & 4\end{bmatrix}.$$
Finally,
$$p^{v}(X) = \frac{1}{5}\begin{bmatrix}1 & 2 \\ 2 & 4\end{bmatrix}\begin{bmatrix}X_1 \\ X_2\end{bmatrix}.$$
This is a linear transformation of $X$, so you can easily calculate its expectation and covariance.






answered Dec 18 '16 at 1:12 by ctNGUYEN
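
A minimal numpy sketch of this projection-matrix route, using the question's numbers (numpy assumed):

    import numpy as np

    v = np.array([[2.0], [4.0]])             # column vector spanning the subspace
    P = v @ np.linalg.inv(v.T @ v) @ v.T     # v (v^t v)^{-1} v^t
    print(P)                                 # (1/5) [[1, 2], [2, 4]]

    mu = np.array([3.0, 2.0])
    Sigma = np.array([[4.0, 0.0],
                      [0.0, 7.0]])

    print(P @ mu)                            # projected mean P mu = [1.4, 2.8]
    print(P @ Sigma @ P.T)                   # projected covariance P Sigma P^t = [[1.28, 2.56], [2.56, 5.12]]

The projected covariance $P\Sigma P^t$ has rank 1, which is exactly the sense in which the projected distribution is a 1D Gaussian living on the line spanned by $v$.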









































Let $\mathbf{x}\sim\mathcal{N}(\mu_x, \Sigma_x)$ be an $n$-dimensional Gaussian random vector. Then, if $y=\mathbf{v}^\top\mathbf{x}$, where $\mathbf{v}\in\Bbb{R}^n$, it holds that
$$
y\sim\mathcal{N}(\mu_y, \sigma_y^2),
$$
where $\mu_y=\mathbf{v}^\top\mu_x$ and $\sigma_y^2=\mathbf{v}^\top\Sigma_x\mathbf{v}$, since
$$
\mu_y=\Bbb{E}[y]=\Bbb{E}[\mathbf{v}^\top\mathbf{x}]=\mathbf{v}^\top\Bbb{E}[\mathbf{x}]=\mathbf{v}^\top\mu_x
$$
and
$$
\sigma_y^2 = \Bbb{E}[(y-\mu_y)^2]
= \Bbb{E}[(\mathbf{v}^\top\mathbf{x}-\mathbf{v}^\top\mu_x)^2]
= \Bbb{E}[(\mathbf{v}^\top(\mathbf{x}-\mu_x))^2]
= \Bbb{E}[\mathbf{v}^\top(\mathbf{x}-\mu_x)\,\mathbf{v}^\top(\mathbf{x}-\mu_x)]
= \Bbb{E}[\mathbf{v}^\top(\mathbf{x}-\mu_x)(\mathbf{x}-\mu_x)^\top\mathbf{v}]
= \mathbf{v}^\top\Bbb{E}[(\mathbf{x}-\mu_x)(\mathbf{x}-\mu_x)^\top]\mathbf{v}
= \mathbf{v}^\top\Sigma_x\mathbf{v}.
$$






answered Dec 22 '17 at 14:08 by nullgeppetto
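
Applying these formulas to the question's example is then immediate; here is a minimal numpy sketch (numpy assumed), which normalizes $v$ to a unit vector so that $y$ is the signed coordinate along the line, as the comments below point out:

    import numpy as np

    mu = np.array([3.0, 2.0])
    Sigma = np.array([[4.0, 0.0],
                      [0.0, 7.0]])
    v = np.array([2.0, 4.0])
    u = v / np.linalg.norm(v)        # unit vector along v

    mu_y = u @ mu                    # v^T mu with v normalized: 14/sqrt(20) ≈ 3.13
    var_y = u @ Sigma @ u            # v^T Sigma v with v normalized: 128/20 = 6.4
    print(mu_y, var_y)

    # Monte Carlo cross-check of the closed form
    rng = np.random.default_rng(1)
    y = rng.multivariate_normal(mu, Sigma, size=200_000) @ u
    print(y.mean(), y.var())         # ≈ 3.13, ≈ 6.4

So the projection of the question's Gaussian onto the direction of $v$ is $y \sim \mathcal{N}(14/\sqrt{20},\, 6.4)$.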





















• I think you are calculating the inner product, not the projection. The projection is $\mathbf{v}^\top\mathbf{x}/\|\mathbf{v}\|$.
– Albert Chen, Sep 26 at 3:37










• @nullgeppetto probably assumes that $\mathbf{v}$ is a unit vector.
– fuji, Nov 6 at 12:13










