Gibbs sampler for multiple linear regression
The following function performs multiple linear regression parameter estimation via MCMC using Gibbs sampling. The point of this code is to demonstrate how to do it with conjugate distributions, so that the method can easily be plugged into LAMBDA.
The function assumes that:
1) the B coefficients are normally distributed, ~N(prior_mean, prior_s2), and thus have a normal prior. Large prior variances produce suitably "diffuse" priors.
2) the model precision (the inverse of the error variance) is gamma distributed, Ga(alpha, beta), with an associated prior. Small values of alpha and beta (< .001) produce a suitably "diffuse" prior. Note that Matlab's parameterization of the gamma distribution differs a bit from the standard one: Matlab uses Ga(alpha, 1/beta), i.e. a scale parameter that is the inverse of the standard rate parameter.
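Assumptions 1) and 2) yield conjugate full conditionals, so each Gibbs step is a direct draw: the coefficients from a (multivariate) normal, and the precision from a gamma distribution. A minimal Python/NumPy sketch of such a sampler follows; the function name gibbs_lm and all defaults are illustrative, not the author's code. Note that NumPy's gamma, like Matlab's, takes a scale parameter, hence the 1/rate below.

```python
import numpy as np

def gibbs_lm(X, y, n_iter=5000, prior_mean=0.0, prior_s2=1e6,
             alpha=1e-3, beta=1e-3, seed=0):
    """Gibbs sampler for y = X @ B + e, e ~ N(0, 1/tau).

    Assumed priors, matching the description above:
      B_j ~ N(prior_mean, prior_s2), independently
      tau ~ Ga(alpha, rate=beta), where tau = 1/variance
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    prior_prec = 1.0 / prior_s2
    tau = 1.0
    draws_B = np.empty((n_iter, p))
    draws_s2 = np.empty(n_iter)
    for t in range(n_iter):
        # B | tau, y : multivariate normal (conjugate update)
        V = np.linalg.inv(tau * XtX + prior_prec * np.eye(p))
        m = V @ (tau * Xty + prior_prec * prior_mean * np.ones(p))
        B = rng.multivariate_normal(m, V)
        # tau | B, y : gamma (conjugate update); NumPy wants shape, scale
        resid = y - X @ B
        tau = rng.gamma(alpha + n / 2.0, 1.0 / (beta + resid @ resid / 2.0))
        draws_B[t] = B
        draws_s2[t] = 1.0 / tau
    return draws_B, draws_s2
```

With the diffuse defaults, the posterior means of the saved draws should track the least-squares estimates.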
The R code contains two versions of Bayesian linear regression. The first (univariate.all) uses the multivariate normal distribution to sample the regression coefficients jointly; the second (univariate.vat) samples one variable at a time. The two versions are nearly identical: both allow a flexible range of priors for B, and both allow the prior variance to be specified either by shape/scale (e.g. gamma, scaled inverse chi-squared) or by mean/variance. Output includes saved parameter draws, which can be fed into CODA, and the log(Bayes factor) for use in model selection.
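The variable-at-a-time strategy replaces the joint multivariate normal draw with p univariate normal draws, each conditioned on the current values of the other coefficients. A sketch of one such update in Python (the helper draw_coef_vat is hypothetical, named here only for illustration):

```python
import numpy as np

def draw_coef_vat(j, B, X, y, tau, prior_mean=0.0, prior_s2=1e6,
                  rng=np.random.default_rng()):
    """One variable-at-a-time Gibbs update for coefficient j.

    Conditions on all other coefficients and draws B_j from its
    univariate normal full conditional, assuming
    B_j ~ N(prior_mean, prior_s2) and error precision tau.
    """
    xj = X[:, j]
    # partial residual: remove the contribution of every column but j
    r = y - X @ B + xj * B[j]
    prec = tau * xj @ xj + 1.0 / prior_s2       # conditional precision
    mean = (tau * xj @ r + prior_mean / prior_s2) / prec
    return rng.normal(mean, np.sqrt(1.0 / prec))
```

A full sweep calls this for j = 0, ..., p-1, updating B in place, then redraws tau. This trades the matrix inversion of the joint draw for p cheap scalar draws, at the cost of slower mixing when coefficients are strongly correlated.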