'Sign' of normalized eigenvector for singular value decomposition
I'm working on an SV decomposition script in Python. I am getting incorrect results because of the 'indeterminacy' associated with normalizing the singular vectors.
I understand that the sign of the vectors does not matter in terms of their behaviour as eigenvectors, but it does give incorrect results for SV decomposition. My example is this matrix:
$$
A = \begin{bmatrix}
3 & 2 \\
1 & -1 \\
\end{bmatrix}
$$
When I use numpy.linalg.eig to calculate the normalized eigenvectors that I use as singular vectors, some of them have the opposite sign to the singular vectors returned by numpy.linalg.svd (i.e. the two are negatives of each other). If I understand correctly, both should be valid normalized eigenvectors. In all other respects my algorithm's results are the same as NumPy's.
When I expand the factorisation, much of the time mine is incorrect while NumPy's always is. I believe the problem is that numpy.linalg.eig just happens to return the 'wrongly signed' eigenvectors.
For singular value decomposition, is there any easy/deterministic way to check which 'sign' your singular vectors need to have?
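For concreteness, a minimal sketch of the comparison (illustrative only, assuming NumPy; not my full script):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [1.0, -1.0]])

# Right singular vectors as eigenvectors of A^T A
eigvals, V_eig = np.linalg.eig(A.T @ A)
order = np.argsort(eigvals)[::-1]   # descending, to match SVD ordering
V_eig = V_eig[:, order]

# NumPy's reference SVD
U, S, Vt = np.linalg.svd(A)

print(V_eig)   # columns agree with Vt.T only up to a factor of +/-1
print(Vt.T)
```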
linear-algebra eigenvalues-eigenvectors matrix-decomposition svd
asked Mar 18 '16 at 4:16
Christiaan Swanepoel
Are all entries of opposite sign? Then take into account that if $v$ is an eigenvector, then so is $-v$.
– Friedrich Philipp
Mar 18 '16 at 4:40
No, not all of them - I can't find any pattern for when they are different. I understand that, but how can I check whether it is $v$ or $-v$ I need for a correct decomposition?
– Christiaan Swanepoel
Mar 18 '16 at 4:45
2 Answers
Eigenvectors are not unique: multiplying by any nonzero constant, including $-1$ (which simply flips the sign), gives another valid eigenvector, and normalization fixes only the length, not the sign.
See: https://stackoverflow.com/questions/18152052/matlab-eig-returns-inverted-signs-sometimes
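One deterministic way to remove the ambiguity (a sketch, assuming NumPy and, for simplicity, that all singular values are nonzero): fix $V$ and $\Sigma$ from the eigendecomposition of $A^T A$, then derive $U$ from them via $u_i = A v_i / \sigma_i$, which follows from $A v_i = \sigma_i u_i$ and pins down the relative signs automatically.

```python
import numpy as np

def svd_via_eig(A):
    """SVD built from the eigendecomposition of A^T A. The signs are
    consistent because U is derived from V via u_i = A v_i / sigma_i.
    Assumes all singular values are nonzero."""
    eigvals, V = np.linalg.eigh(A.T @ A)   # eigh: symmetric input, real output
    order = np.argsort(eigvals)[::-1]      # sort singular values descending
    sigma = np.sqrt(eigvals[order])
    V = V[:, order]
    U = A @ V / sigma                      # divides column i by sigma_i
    return U, sigma, V.T

A = np.array([[3.0, 2.0], [1.0, -1.0]])
U, S, Vt = svd_via_eig(A)
print(np.allclose(A, U @ np.diag(S) @ Vt))  # True: signs are consistent
```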
answered Aug 2 '17 at 20:42
alex
If you have a matrix $A \in \mathbb{R}^{m \times n}$, then the singular vectors are unique only up to sign, because orthonormalization (e.g. Gram-Schmidt) only forces the columns to be orthogonal unit vectors. As long as the two factorizations you come up with both reconstruct $A$ with residual norm $0$, you're most likely fine. That is,
$$ \| A - U_{1} \Sigma_{1} V_{1}^{T} \| = 0, \qquad \| A - U_{2} \Sigma_{2} V_{2}^{T} \| = 0, $$
or, realistically, within machine precision.
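A quick way to run this check with NumPy (a sketch; it flips one matched singular pair to show that both sign choices reconstruct $A$):

```python
import numpy as np

A = np.array([[3.0, 2.0], [1.0, -1.0]])
U, S, Vt = np.linalg.svd(A)

# Flipping a matched pair (u_i, v_i) yields another valid SVD,
# since (-u_i)(-v_i)^T = u_i v_i^T.
U2, Vt2 = U.copy(), Vt.copy()
U2[:, 0] *= -1
Vt2[0, :] *= -1

for u, vt in [(U, Vt), (U2, Vt2)]:
    print(np.linalg.norm(A - u @ np.diag(S) @ vt))  # ~1e-15 in both cases
```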
answered Oct 17 at 20:35
Ryan Howe