SVD - reconstruction from U,S,V
I am learning some linear algebra for image compression and I am stuck at this point:
Suppose I have a matrix $R$,
$$ \begin{bmatrix}
5 & 7\\
2 & 1
\end{bmatrix} $$
Then I compute the covariance matrix s.t. $$\Sigma = \frac{1}{2}R^TR$$
And I performed SVD with a MATLAB function, s.t. $[U, S, V] = \mathrm{svd}(\Sigma)$.
I can see that $USV^T = \Sigma$, but how can I solve this equation below for $R$:
$\Sigma = \frac{1}{2}R^TR$
linear-algebra matrices svd
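For reference, the setup above can be mirrored in NumPy (a sketch, assuming `numpy` stands in for MATLAB here; note that `np.linalg.svd` returns $V^T$ where MATLAB's `svd` returns $V$):

```python
import numpy as np

# the example matrix R from the question
R = np.array([[5.0, 7.0],
              [2.0, 1.0]])

Sigma = 0.5 * R.T @ R            # Sigma = (1/2) R^T R
U, s, Vt = np.linalg.svd(Sigma)  # NumPy returns V^T; MATLAB's svd returns V

# U * diag(s) * V^T reconstructs Sigma
print(np.allclose(U @ np.diag(s) @ Vt, Sigma))  # prints True
```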
You can't: for $R = \begin{pmatrix} 1&0\\0&1 \end{pmatrix}$ and for $R = \begin{pmatrix} 0&1\\1&0 \end{pmatrix}$, you have $\Sigma = \frac{1}{2}\begin{pmatrix} 1&0\\0&1 \end{pmatrix}$.
– Charles Madeline
Nov 17 at 9:29
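The counterexample in this comment is easy to check numerically (a NumPy sketch):

```python
import numpy as np

R1 = np.array([[1.0, 0.0], [0.0, 1.0]])  # identity
R2 = np.array([[0.0, 1.0], [1.0, 0.0]])  # permutation

# both produce the same Sigma = (1/2) I, so Sigma cannot determine R
S1 = 0.5 * R1.T @ R1
S2 = 0.5 * R2.T @ R2
print(np.allclose(S1, S2))  # prints True
```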
@CharlesMadeline That is true; I just did the maths on paper and I see your point. What I was wondering is: given $\Sigma = \frac{1}{2}R^TR$, how can I solve for $R$?
– michcs
Nov 17 at 9:47
@michcs This comment just explained, with a counterexample, that you can't in general.
– Jean-Claude Arbaut
Nov 17 at 9:49
@Jean-ClaudeArbaut Thank you. If I understand this correctly: if I perform an SVD on a covariance matrix to get $U, S, V$, I can only reconstruct the covariance matrix, but not the $R$ I used in $\Sigma = \frac{1}{2}R^TR$.
– michcs
Nov 17 at 9:52
Yes, that's right. Actually it has nothing to do with the SVD: once you compute $\Sigma = \frac{1}{2}R^TR$, $R$ is lost, whatever you do to $\Sigma$, since the mapping $R \to R^TR$ is not injective.
– Jean-Claude Arbaut
Nov 17 at 9:52
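The non-injectivity holds quite generally: for any orthogonal $Q$, $(QR)^T(QR) = R^TQ^TQR = R^TR$, so every $QR$ yields the same $\Sigma$. A quick NumPy check (sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[5.0, 7.0], [2.0, 1.0]])

# build a random orthogonal Q via QR factorization
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))
R2 = Q @ R  # a different matrix...

# ...with the same Sigma = (1/2) R^T R
print(np.allclose(0.5 * R2.T @ R2, 0.5 * R.T @ R))  # prints True
```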
edited Nov 17 at 9:29
Parcly Taxel
asked Nov 17 at 9:24
michcs
1 Answer
Given $\Sigma = \frac{1}{2}R^TR$, first let $\Sigma = LL^T$ and find $L$ by Cholesky factorization.
Now $\left(\frac{1}{\sqrt{2}}\right)^2 (R^T)(R^T)^T = LL^T$. Can you continue from here?
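To make the hint concrete: matching $\frac{1}{\sqrt{2}}R^T$ with $L$ suggests $\hat{R} = \sqrt{2}\,L^T$ as one (triangular) solution. A NumPy sketch of this Cholesky-based reconstruction:

```python
import numpy as np

R = np.array([[5.0, 7.0], [2.0, 1.0]])
Sigma = 0.5 * R.T @ R

L = np.linalg.cholesky(Sigma)  # Sigma = L L^T, L lower triangular
R_hat = np.sqrt(2.0) * L.T     # candidate solution

print(np.allclose(0.5 * R_hat.T @ R_hat, Sigma))  # prints True: a valid solution
print(np.allclose(R_hat, R))                      # prints False: not the original R
```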
The solution is not unique though, and quite probably not the original $R$.
– Qidi
Nov 17 at 10:03
Thank you. With the $R$ above, I computed $\Sigma$ to be $$\begin{bmatrix} 14.5 & 18.5\\ 18.5 & 25 \end{bmatrix}$$ and the Cholesky factorization gives $L$ as $$\begin{bmatrix} 3.8079 & 0\\ 4.8583 & 1.1813 \end{bmatrix}$$ From $\left(\frac{1}{\sqrt{2}}\right)^2(R^T)(R^T)^T = LL^T$, it seems like I am back at the same problem of $R^TR$; can you give me a little more guidance, please? Thank you.
– michcs
Nov 19 at 7:49
Additionally, I have noticed that my toy example $R$ is not a Hermitian positive-definite matrix. Suppose I find the closest approximation that makes it positive definite; how can I proceed?
– michcs
Nov 19 at 8:08
answered Nov 17 at 9:42
tonychow0929