Tutorial: Bayesian linear regression model (Python, Java, C/C++, Haskell)

2019 Midterm, due at noon on Feb 23

Please submit your solutions by email to me, or place them in my mailbox on the second floor of Old Chemistry.

Pr. 1.1 Consider the data set in Lab 1, the Advertisement.csv data set.

1. Compute the leave-one-out cross-validation error for a Gaussian process regression model as well as for a Bayesian linear regression model.
2. Plot the predictive posterior distribution when observations 1, 50, 100, 150 are respectively left out of the training set and you are asked to predict their response.
3. Use a bootstrap procedure to output confidence intervals when observations 1, 50, 100, 150 are respectively left out of the training set and you are asked to use ordinary least squares as your regression method.

Pr. 1.2 Write out the EM update steps for a mixture of multinomials model. Specifically, consider the likelihood

f(x_1, \ldots, x_n; \pi, \{\theta_1, \ldots, \theta_7\}) = \prod_{i=1}^{n} \left[ \sum_{k=1}^{7} \pi_k f(x_i; \theta_k) \right],

where x takes values 1, \ldots, 4 (there are four categories) and

f(x = c; \theta) = \theta_c, \qquad \theta_c \ge 0 \ \text{and} \ \sum_c \theta_c = 1.

Pr. 1.3 Given the classification data set in Lab 2, run regularized logistic regression versus an SVM and compare classification accuracy on a train-test split.

Pr. 1.4 Show that the EM algorithm does not decrease the likelihood value at each step.

Pr. 1.5 Sketch how the Least Angle Regression procedure implements a form of sparse regression.

Pr. 1.6 Run the sklearn.mixture program with 8 components. Output the probability assignment for each observation. Explain the difference between a hard assignment and a soft assignment for each observation.
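For Pr. 1.1(1), a minimal sketch using scikit-learn's LeaveOneOut splitter. The column names TV, radio, newspaper, and sales are assumptions about Advertisement.csv (they match the classic Advertising data) and may need adjusting.

# Pr. 1.1(1): leave-one-out CV error for a Gaussian process regression
# model and a Bayesian linear regression model.
import pandas as pd
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import LeaveOneOut, cross_val_score

df = pd.read_csv("Advertisement.csv")
X = df[["TV", "radio", "newspaper"]].to_numpy()  # assumed column names
y = df["sales"].to_numpy()                        # assumed response column

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
blr = BayesianRidge()

for name, model in [("GP regression", gp), ("Bayesian linear", blr)]:
    # cross_val_score returns one negative squared error per held-out point
    mse = -cross_val_score(model, X, y, cv=LeaveOneOut(),
                           scoring="neg_mean_squared_error")
    print(f"{name}: LOO-CV MSE = {mse.mean():.4f}")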
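For Pr. 1.1(2)-(3), a sketch that continues from the block above (it reuses X, y, and gp). The GP model is used for the posterior plot; the Bayesian linear model would work the same way via predict(..., return_std=True). Reading observation numbers as 1-based row indices is an assumption.

# Pr. 1.1(2)-(3): predictive posterior for each held-out observation, and
# a bootstrap percentile interval for the corresponding OLS prediction.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
for i in [0, 49, 99, 149]:  # observations 1, 50, 100, 150 (0-based rows)
    mask = np.ones(len(y), dtype=bool)
    mask[i] = False
    X_tr, y_tr, x_star = X[mask], y[mask], X[i:i + 1]

    # (2) Gaussian predictive posterior at the held-out point
    mu, sd = gp.fit(X_tr, y_tr).predict(x_star, return_std=True)
    grid = np.linspace(mu[0] - 4 * sd[0], mu[0] + 4 * sd[0], 200)
    plt.plot(grid, norm.pdf(grid, mu[0], sd[0]), label=f"obs {i + 1}")

    # (3) bootstrap percentile interval for the OLS prediction
    preds = []
    for _ in range(1000):
        idx = rng.integers(0, len(y_tr), len(y_tr))  # resample with replacement
        preds.append(LinearRegression().fit(X_tr[idx], y_tr[idx])
                     .predict(x_star)[0])
    lo, hi = np.percentile(preds, [2.5, 97.5])
    print(f"obs {i + 1}: 95% bootstrap CI for OLS prediction = ({lo:.2f}, {hi:.2f})")

plt.xlabel("response"); plt.ylabel("predictive density"); plt.legend()
plt.show()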
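For Pr. 1.2, the updates follow the standard mixture-model pattern. Writing \gamma_{ik} for the responsibility of component k for observation i, one EM iteration from current parameters (\pi^t, \theta^t) is:

E-step:

\gamma_{ik} = \frac{\pi_k^t \, \theta^t_{k, x_i}}{\sum_{j=1}^{7} \pi_j^t \, \theta^t_{j, x_i}}

M-step:

\pi_k^{t+1} = \frac{1}{n} \sum_{i=1}^{n} \gamma_{ik},
\qquad
\theta_{k,c}^{t+1} = \frac{\sum_{i=1}^{n} \gamma_{ik} \, \mathbf{1}\{x_i = c\}}{\sum_{i=1}^{n} \gamma_{ik}}.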
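For Pr. 1.3, a minimal sketch of the comparison. The file name lab2.csv and the "label" column are placeholders for the Lab 2 data set.

# Pr. 1.3: L2-regularized logistic regression vs. an RBF-kernel SVM,
# compared by accuracy on a single train-test split.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

df = pd.read_csv("lab2.csv")                      # placeholder file name
X, y = df.drop(columns="label"), df["label"]      # placeholder label column
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    # LogisticRegression applies an L2 penalty by default; C sets its strength
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(C=1.0)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    print(name, "test accuracy:", model.fit(X_tr, y_tr).score(X_te, y_te))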
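For Pr. 1.4, the usual argument runs through the evidence lower bound; a sketch. For any distribution q over the latent assignments z, Jensen's inequality gives

\log p(x; \theta) = \log \sum_z q(z) \frac{p(x, z; \theta)}{q(z)} \ge \sum_z q(z) \log \frac{p(x, z; \theta)}{q(z)} =: \mathcal{L}(q, \theta),

with equality when q(z) = p(z \mid x; \theta). The E-step sets q^t(z) = p(z \mid x; \theta^t), making the bound tight at \theta^t; the M-step maximizes \mathcal{L}(q^t, \cdot) over \theta. Chaining the two,

\log p(x; \theta^{t+1}) \ge \mathcal{L}(q^t, \theta^{t+1}) \ge \mathcal{L}(q^t, \theta^t) = \log p(x; \theta^t).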
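For Pr. 1.5, one way to see the sparsity is to plot the LARS coefficient path: variables enter the active set one at a time, so early points on the path have most coefficients exactly zero. This sketch uses synthetic data, since no specific data set is named.

# Pr. 1.5: LARS coefficient path on synthetic data with 3 informative
# features out of 10.
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import lars_path

X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)
alphas, active, coefs = lars_path(X, y, method="lar")
plt.plot(alphas, coefs.T)   # each coefficient leaves zero at a different step
plt.gca().invert_xaxis()    # path runs from large alpha (all zero) to small
plt.xlabel("alpha"); plt.ylabel("coefficient")
plt.show()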
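For Pr. 1.6, a minimal sketch of hard versus soft assignments with sklearn.mixture.GaussianMixture; the synthetic data stand in for whichever data set the course intends.

# Pr. 1.6: fit an 8-component Gaussian mixture and contrast predict
# (hard assignment) with predict_proba (soft assignment).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))   # placeholder data

gm = GaussianMixture(n_components=8, random_state=0).fit(X)
hard = gm.predict(X)            # hard: a single component label per observation
soft = gm.predict_proba(X)      # soft: a probability over all 8 components
print(hard[:5])
print(soft[:5].round(3))        # each row sums to 1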
