'Sign' of normalized eigenvector for singular value decomposition











I'm working on a singular value decomposition (SVD) script in Python. I am getting incorrect results because of the sign indeterminacy of the normalized singular vectors.



I understand that the sign of the vectors does not matter in terms of their behaviour as eigenvectors, but an inconsistent choice of signs does produce an incorrect SVD. My example is this matrix:



$$
A = \begin{bmatrix}
3 & 2\\
1 & -1\\
\end{bmatrix}
$$



When I use numpy.linalg.eig to calculate the normalized eigenvectors for the singular vectors, some of them have the opposite sign to the singular vectors returned by numpy.linalg.svd (i.e. they are negatives of each other). If I understand correctly, both should be valid normalized eigenvectors. In all other respects my algorithm's results are the same as NumPy's.



When I expand the factorisation, mine is often incorrect while NumPy's always is. I believe the problem is that numpy.linalg.eig just happens to return the 'wrongly signed' eigenvectors.



For singular value decomposition, is there any easy/deterministic way to check which 'sign' your singular vectors need to have?
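One deterministic approach (a sketch, not taken from the question's code) is to compute only $V$ from the eigendecomposition of $A^T A$ and then derive each left singular vector as $u_i = A v_i / \sigma_i$, so the signs of the columns of $U$ are forced to be consistent with $V$ instead of coming from a second, independently signed eigendecomposition. A minimal NumPy sketch, assuming $A$ has full column rank so that no $\sigma_i$ is zero:

```python
import numpy as np

def svd_via_eig(A):
    """SVD of A via the eigendecomposition of A^T A.

    The sign ambiguity disappears because U is derived from V
    through u_i = A v_i / sigma_i, rather than from a separate
    eigendecomposition of A A^T.
    """
    # Eigendecomposition of the symmetric matrix A^T A (eigh returns
    # eigenvalues in ascending order for symmetric input).
    eigvals, V = np.linalg.eigh(A.T @ A)
    # Reorder so singular values come out descending, as in np.linalg.svd.
    order = np.argsort(eigvals)[::-1]
    eigvals, V = eigvals[order], V[:, order]
    sigma = np.sqrt(np.clip(eigvals, 0.0, None))
    # Derive each left singular vector from the right one; this pins
    # down a consistent sign for every column of U automatically.
    U = (A @ V) / sigma
    return U, sigma, V

A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
U, s, V = svd_via_eig(A)
print(np.allclose(A, U @ np.diag(s) @ V.T))  # prints True
```

For rank-deficient matrices the division by $\sigma_i$ fails, and the missing columns of $U$ would have to be completed to an orthonormal basis; the sketch above deliberately ignores that case.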










  • Are all entries of opposite sign? Then take into account that when $v$ is an eigenvector, $-v$ is one as well.
    – Friedrich Philipp
    Mar 18 '16 at 4:40










  • No, not all of them - I can't find any pattern for when they are different. I understand that, but how can I check whether it is $v$ or $-v$ I need for a correct decomposition?
    – Christiaan Swanepoel
    Mar 18 '16 at 4:45















linear-algebra eigenvalues-eigenvectors matrix-decomposition svd






asked Mar 18 '16 at 4:16









Christiaan Swanepoel













2 Answers






Eigenvectors are not unique: multiplying by any nonzero constant, including $-1$ (which simply flips the sign), gives another valid eigenvector.

See: https://stackoverflow.com/questions/18152052/matlab-eig-returns-inverted-signs-sometimes
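A quick numerical check of this point, using the symmetric matrix $A^T A$ built from the question's example:

```python
import numpy as np

# If S v = lam * v, then S (-v) = lam * (-v): both signs are valid.
A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
S = A.T @ A                      # symmetric, so eigenpairs are real
lam, V = np.linalg.eigh(S)
v = V[:, 0]
print(np.allclose(S @ v, lam[0] * v))    # prints True
print(np.allclose(S @ -v, lam[0] * -v))  # prints True: -v is equally valid
```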







          answered Aug 2 '17 at 20:42









          alex























If you have a matrix $A \in \mathbb{R}^{m \times n}$, then the singular vectors are unique only up to sign, because Gram–Schmidt is simply trying to produce orthogonal columns. As long as the two factorizations you come up with both reconstruct $A$ with zero error, you're fine. That is,

$$ \| A - U_{1} \Sigma_{1} V_{1}^{T} \| = 0, \qquad \| A - U_{2} \Sigma_{2} V_{2}^{T} \| = 0, $$

or realistically within machine precision.
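This acceptance test can be run directly; for example, checking NumPy's own SVD of the question's matrix against the Frobenius norm:

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
U, s, Vt = np.linalg.svd(A)  # note: svd returns V^T, not V
# Frobenius norm of the reconstruction error: zero up to machine precision.
residual = np.linalg.norm(A - U @ np.diag(s) @ Vt)
print(residual < 1e-12)  # prints True
```

The same check, applied to a hand-rolled decomposition, tells you whether your particular combination of signs is one of the valid ones.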







                  answered Oct 17 at 20:35









                  Ryan Howe































                      Thanks for contributing an answer to Mathematics Stack Exchange!