SVD - reconstruction from U,S,V











I am learning some linear algebra for image compression and I am stuck at this point:

Suppose I have a matrix $R$,

$$R = \begin{bmatrix}
5 & 7\\
2 & 1\end{bmatrix}$$

Then I compute the covariance matrix s.t. $$\Sigma = \frac12 R^TR$$
and I perform SVD with a Matlab function s.t. $[U, S, V] = \mathrm{svd}(\Sigma)$.

I can see that $USV^T = \Sigma$ (Matlab's `svd` returns $V$, so the reconstruction is $USV^T$), but how can I solve this equation below for $R$:
$$\Sigma = \frac12 R^TR$$
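As a quick numerical sanity check of this setup (a sketch in NumPy rather than Matlab; note that `np.linalg.svd` returns $V^T$ where Matlab returns $V$):

```python
import numpy as np

R = np.array([[5., 7.],
              [2., 1.]])
Sigma = 0.5 * R.T @ R   # [[14.5, 18.5], [18.5, 25.0]]

# NumPy hands back V^T directly; Matlab's svd returns V itself.
U, s, Vt = np.linalg.svd(Sigma)

# The factors reconstruct Sigma (up to rounding): Sigma = U S V^T
assert np.allclose(U @ np.diag(s) @ Vt, Sigma)
```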










  • You can't: for $R = \begin{pmatrix} 1&0\\0&1\end{pmatrix}$ and for $R = \begin{pmatrix}0&1\\1&0\end{pmatrix}$, you have $\Sigma = \frac{1}{2}\begin{pmatrix}1&0\\0&1\end{pmatrix}$.
    – Charles Madeline, Nov 17 at 9:29

  • @CharlesMadeline That is true; I just did the maths on paper and I see your point. What I was wondering about is: given $\Sigma = \frac12 R^TR$, how can I solve for $R$?
    – michcs, Nov 17 at 9:47

  • @michcs The comment just explained with a counterexample that you can't in general.
    – Jean-Claude Arbaut, Nov 17 at 9:49

  • @Jean-ClaudeArbaut Thank you. If I understand this correctly: if I perform an SVD on a covariance matrix to get $U, S, V$, I can only reconstruct the covariance matrix, but not the $R$ I used in $\Sigma = \frac12 R^TR$.
    – michcs, Nov 17 at 9:52

  • Yes, that's right. Actually it has nothing to do with the SVD: once you compute $\Sigma = \frac12 R^TR$, $R$ is lost, whatever you do on $\Sigma$, since the mapping $R \to R^TR$ is not injective.
    – Jean-Claude Arbaut, Nov 17 at 9:52
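The counterexample in the comments is easy to verify numerically; a minimal NumPy sketch (the thread uses Matlab, Python here purely for illustration):

```python
import numpy as np

# Two distinct matrices R ...
R1 = np.eye(2)                       # identity
R2 = np.array([[0., 1.],
               [1., 0.]])            # row swap / permutation

# ... produce the exact same Sigma = (1/2) R^T R,
# so no procedure applied to Sigma alone can recover R.
S1 = 0.5 * R1.T @ R1
S2 = 0.5 * R2.T @ R2
assert np.allclose(S1, S2)           # both equal (1/2) * I
```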

















linear-algebra matrices svd






edited Nov 17 at 9:29 by Parcly Taxel
asked Nov 17 at 9:24 by michcs

1 Answer
















Given $\Sigma = \frac{1}{2}R^TR$, you first let $\Sigma = LL^T$. Find $L$ by Cholesky factorization.

Now $\left(\frac{1}{\sqrt{2}}\right)^2(R^T)(R^T)^T = LL^T$. Can you continue from here?
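A NumPy sketch of the hinted construction (an illustration, not necessarily the answerer's intended final step): taking $\hat R = \sqrt{2}\,L^T$ gives $\frac12 \hat R^T \hat R = LL^T = \Sigma$, i.e. *a* valid solution, though generally not the original $R$:

```python
import numpy as np

R = np.array([[5., 7.],
              [2., 1.]])
Sigma = 0.5 * R.T @ R                      # [[14.5, 18.5], [18.5, 25.0]]

L = np.linalg.cholesky(Sigma)              # Sigma = L L^T, L lower triangular

R_hat = np.sqrt(2) * L.T                   # one candidate solution

# R_hat satisfies the equation ...
assert np.allclose(0.5 * R_hat.T @ R_hat, Sigma)
# ... but is upper triangular, hence not the original R.
assert not np.allclose(R_hat, R)
```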






  • The solution is not unique though, and quite probably not the original $R$.
    – Qidi, Nov 17 at 10:03

  • Thank you. With the $R$ above, I computed $\Sigma$ to be $$\begin{bmatrix} 14.5 & 18.5\\ 18.5 & 25\end{bmatrix}$$ and the Cholesky factorization gives $L$ as $$\begin{bmatrix} 3.8079 & 0\\ 4.8583 & 1.1813\end{bmatrix}$$ From $\left(\frac{1}{\sqrt{2}}\right)^2(R^T)(R^T)^T = LL^T$, it seems I am back at the same problem of $R^TR$; can you give me a little more guidance please? Thank you.
    – michcs, Nov 19 at 7:49

  • Additionally, I have noticed that my toy example $R$ is not a Hermitian positive definite matrix. Suppose I find the closest approximation to make it positive definite; how can I proceed?
    – michcs, Nov 19 at 8:08
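On the last comment: one common recipe for projecting a square matrix onto the positive semidefinite cone (symmetrize, then clip negative eigenvalues; this is an assumption beyond the thread, in the spirit of the standard nearest-PSD projection) could look like:

```python
import numpy as np

def nearest_psd(A):
    """Hypothetical helper: project A onto the symmetric PSD cone
    by symmetrizing, then clipping negative eigenvalues to zero."""
    B = (A + A.T) / 2                      # symmetric part
    w, Q = np.linalg.eigh(B)               # eigendecomposition (B symmetric)
    return Q @ np.diag(np.clip(w, 0.0, None)) @ Q.T

R = np.array([[5., 7.],
              [2., 1.]])                   # the question's R, not symmetric
P = nearest_psd(R)

assert np.allclose(P, P.T)                 # symmetric
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)  # PSD up to rounding
```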













answered Nov 17 at 9:42 by tonychow0929







