Projection of a Gaussian distribution along a vector
Can anyone help me understand how to compute the projection of a 2D Gaussian distribution along a vector? I intuitively expect that the projection will be a 1D Gaussian, but I want to be sure. Can someone explain, show a proof, or direct me to a proof that a 2D Gaussian projected along a vector gives a 1D Gaussian on that line?
E.g., consider a Gaussian $\mathbf{X} \sim N(\mu,\Sigma)$ where $\mu = [3,2]^T$ and $\Sigma = \begin{bmatrix} 4 & 0 \\ 0 & 7 \end{bmatrix}$. What is the projection along the vector $v = 2i + 4j$?
Any help would be much appreciated! Thanks.
Tags: probability, normal-distribution
asked Aug 5 '15 at 16:34 – rrr
3 Answers
See https://en.wikipedia.org/wiki/Multivariate_normal_distribution, where it is stated that a distribution is multivariate normal if and only if every linear combination of its variables is normally distributed. If I understand correctly, your "projection" defines a linear combination of the variables that you are interested in, so that is indeed normal. Let me know if you meant something else by "projection", though.
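For a quick numerical sanity check of this claim, here is a minimal sketch (assuming NumPy and SciPy are available; the numbers come from the example in the question) that samples the 2D Gaussian and tests the linear combination $v^T X$ for normality:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu = np.array([3.0, 2.0])
Sigma = np.array([[4.0, 0.0], [0.0, 7.0]])
v = np.array([2.0, 4.0])

# Draw samples of X ~ N(mu, Sigma) and form the linear combination y = v^T X.
X = rng.multivariate_normal(mu, Sigma, size=100_000)
y = X @ v

# Theory predicts y ~ N(v^T mu, v^T Sigma v) = N(14, 128).
print(y.mean(), y.var())           # should be close to 14 and 128
print(stats.normaltest(y).pvalue)  # should not be small for truly normal data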
answered Aug 5 '15 at 16:59 – user2566092
Thanks for the reply. Yes, I understand it's a linear combination of the variables, but I do not quite understand how to get to the equivalent 1D Gaussian for the example case I showed. What I am trying to understand is what that linear combination is for the example I stated. Could you perhaps help me with the steps I need to follow to get the linear combination for the given direction vector? I am fairly new to statistics, so I would appreciate some pointers on how to proceed with the actual analysis.
– rrr, Aug 5 '15 at 19:17
In general, in $\mathbb{R}^n$, given a matrix $V$ whose columns $V^j$ are vectors in $\mathbb{R}^n$, the orthogonal projection onto the subspace spanned by the $V^j$ is $V(V^tV)^{-1}V^t$ (assume the $V^j$ are linearly independent, so $V^tV$ is invertible). That is, for any vector $b$, its orthogonal projection onto the subspace spanned by the $V^j$ is
$$p^{V}(b) = [V(V^tV)^{-1}V^t]b.$$
The same holds for a Gaussian vector: to project onto a subspace of $\mathbb{R}^n$, one just needs to write down the projection matrix, and then
$$p^{V}(X) = [V(V^tV)^{-1}V^t]X.$$
Back to your example: the subspace is spanned by the single vector $v \in \mathbb{R}^2$, so its projection matrix is
$$p^{v} = v(v^{t}v)^{-1}v^{t} = \begin{bmatrix}2 \\ 4\end{bmatrix}\left([2\;\;4]\begin{bmatrix}2 \\ 4\end{bmatrix}\right)^{-1}[2\;\;4] = \frac{1}{5}\begin{bmatrix}1 & 2 \\ 2 & 4\end{bmatrix}.$$
Finally,
$$p^{v}(X) = \frac{1}{5}\begin{bmatrix}1 & 2 \\ 2 & 4\end{bmatrix}\begin{bmatrix}X_1 \\ X_2\end{bmatrix}.$$
This is a linear transformation of $X$, so you can easily calculate its expectation and covariance.
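To make that last step concrete, here is a minimal NumPy sketch (my own illustration of the computation, using the question's numbers) that builds the projection matrix and applies the linear-transformation rules $\Bbb{E}[PX] = P\mu$ and $\mathrm{Cov}(PX) = P\Sigma P^\top$:

```python
import numpy as np

mu = np.array([3.0, 2.0])
Sigma = np.array([[4.0, 0.0], [0.0, 7.0]])
v = np.array([2.0, 4.0])

# Orthogonal projection matrix onto span{v}: P = v v^T / (v^T v).
P = np.outer(v, v) / (v @ v)  # [[0.2, 0.4], [0.4, 0.8]] = (1/5) [[1, 2], [2, 4]]

# p^v(X) = P X is Gaussian with mean P mu and covariance P Sigma P^T.
mean_proj = P @ mu            # [1.4, 2.8]
cov_proj = P @ Sigma @ P.T    # [[1.28, 2.56], [2.56, 5.12]]
print(mean_proj)
print(cov_proj)
```

The covariance $P\Sigma P^\top$ has rank one, which is exactly the statement that the projected distribution is a 1D Gaussian living on the line spanned by $v$.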
answered Dec 18 '16 at 1:12 – ctNGUYEN
Let $\mathbf{x}\sim\mathcal{N}(\mu_x, \Sigma_x)$ be an $n$-dimensional Gaussian random vector. Then, if $y=\mathbf{v}^\top\mathbf{x}$, where $\mathbf{v}\in\Bbb{R}^n$, it holds that
$$
y\sim\mathcal{N}(\mu_y, \sigma_y^2),
$$
where $\mu_y=\mathbf{v}^\top\mu_x$ and $\sigma_y^2=\mathbf{v}^\top\Sigma_x\mathbf{v}$, since
$$
\mu_y=\Bbb{E}[y]=\Bbb{E}[\mathbf{v}^\top\mathbf{x}]=\mathbf{v}^\top\Bbb{E}[\mathbf{x}]=\mathbf{v}^\top\mu_x
$$
and
$$
\sigma_y^2 = \Bbb{E}[(y-\mu_y)^2]
= \Bbb{E}[(\mathbf{v}^\top\mathbf{x}-\mathbf{v}^\top\mu_x)^2]
= \Bbb{E}[(\mathbf{v}^\top(\mathbf{x}-\mu_x))^2]
= \Bbb{E}[\mathbf{v}^\top(\mathbf{x}-\mu_x)\,\mathbf{v}^\top(\mathbf{x}-\mu_x)]
= \Bbb{E}[\mathbf{v}^\top(\mathbf{x}-\mu_x)(\mathbf{x}-\mu_x)^\top\mathbf{v}]
= \mathbf{v}^\top\Bbb{E}[(\mathbf{x}-\mu_x)(\mathbf{x}-\mu_x)^\top]\mathbf{v}
= \mathbf{v}^\top\Sigma_x\mathbf{v}.
$$
(The step replacing the second factor $\mathbf{v}^\top(\mathbf{x}-\mu_x)$ by its transpose uses the fact that it is a scalar, so it equals its own transpose.)
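Applying this to the example in the question (a short sketch, assuming NumPy; note that $\mathbf{v}$ is normalized to a unit vector here so that $y$ is the scalar projection along $v$, a point the comments below discuss):

```python
import numpy as np

mu_x = np.array([3.0, 2.0])
Sigma_x = np.array([[4.0, 0.0], [0.0, 7.0]])

# Unit vector along v = 2i + 4j, so y = u^T x is the scalar projection onto v.
v = np.array([2.0, 4.0])
u = v / np.linalg.norm(v)

mu_y = u @ mu_x          # = 14/sqrt(20) ≈ 3.13
var_y = u @ Sigma_x @ u  # = 128/20 = 6.4
print(mu_y, var_y)       # y ~ N(3.13, 6.4)
```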
answered Dec 22 '17 at 14:08 – nullgeppetto
I think you are calculating the inner product, not the projection. The (scalar) projection is $v^\top x / \|v\|$.
– Albert Chen, Sep 26 at 3:37
@nullgeppetto probably assumes that $\mathbf{v}$ is a unit vector.
– fuji, Nov 6 at 12:13