How to deduce the prior covariance from a linear regression
This mainly concerns machine learning and linear regression, but I think my question is still math-related, so I am posting it here. I have a linear regression of the form $$t_i = w_0 x_i + w_1 + \epsilon = -1.5x_i - 0.5 + \epsilon,$$ where $\epsilon \sim \mathcal{N}(0,\sigma)$ with $\sigma = 0.3$. My issue is to deduce from this the distribution of the prior, that is, $p(\mathbf{w}) \sim \mathcal{N}(\mathbf{w}_\mu, \Sigma_w)$. I am going to set the mean $\mathbf{w}_\mu = 0$, since I want to induce a so-called "sceptical prior". My issue is that I don't know what to select as my $\Sigma_w$; the easiest choice would be a diagonal matrix with $\sigma = 0.3$ on the diagonal, but what arguments do I have for making that choice?
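To make the question concrete, here is a minimal Python sketch of the setup I have in mind, using the standard conjugate Gaussian update for Bayesian linear regression. The isotropic prior $\Sigma_w = \alpha^{-1}I$ and the value $\alpha = 2$ are placeholders for illustration; that choice of $\Sigma_w$ is exactly what I am unsure about.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 0.3                      # noise std given above
w_true = np.array([-1.5, -0.5])  # true (w0, w1)

# Placeholder prior covariance: isotropic, Sigma_w = alpha^{-1} I.
alpha = 2.0                      # prior precision (an assumed value)
Sigma_w = np.eye(2) / alpha

# Generate synthetic data from the model t = w0*x + w1 + eps.
N = 20
x = rng.uniform(-1, 1, N)
Phi = np.column_stack([x, np.ones(N)])      # design matrix [x, 1]
t = Phi @ w_true + rng.normal(0, sigma, N)  # noisy targets

# Posterior over w given the zero-mean Gaussian prior:
#   S_N^{-1} = Sigma_w^{-1} + Phi^T Phi / sigma^2
#   m_N      = S_N Phi^T t / sigma^2
S_N = np.linalg.inv(np.linalg.inv(Sigma_w) + Phi.T @ Phi / sigma**2)
m_N = S_N @ Phi.T @ t / sigma**2

print("posterior mean:", m_N)  # should be near (-1.5, -0.5)
```

Any reasonable $\Sigma_w$ makes this run; my question is how to justify one value over another.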
covariance
...