
THE UNIVERSITY OF HONG KONG
DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE
STAT6011/7611/8305 COMPUTATIONAL STATISTICS (2019 Fall)
Assignment 3, due on November 28

All numerical computation MUST be conducted in Python; attach the Python code.

1. Consider an integral.
(a) Plot the integrand function over the range (−2, 5).
(b) Use the Gauss–Legendre, Chebyshev (first kind), Chebyshev (second kind), and Jacobi quadratures, each with 10 nodes and weights, to approximate the integral. Present the nodes, the weights, and the approximation results.

2. Use the dataset q2.csv. The observed data y = (y_1, ..., y_n) are from a mixture of normal distributions, i.e., Y_i ~ Σ_{j=1}^k ω_j f_j(y), i = 1, ..., n = 1000, where each f_j is a normal density N(μ_j, σ_j²), ω_j is the mixing probability, and Σ_{j=1}^k ω_j = 1. Consider the complete data (y_i, u_i), where the missing datum u_i indicates which component y_i is from.
(a) Write out the complete-data likelihood.
(b) Derive the marginal distribution of y_i.
(c) Suppose that we know k = 2, σ_1² = σ_2² = 1 (j = 1, 2), and ω_1 = ω_2 = 0.5, but the μ_j's are unknown. Derive the Q(μ | μ^(0)) function in the E step, and derive the estimators {μ_j^(1)} given the previous-step values {μ_j^(0)} in the M step.
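The E- and M-step updates derived in part (c) can be sketched as a short sample-based EM loop. This is a minimal illustration, not a solution: the starting values and the simulated stand-in data (true means −1 and 1, mimicking the setting later in part (d)) are assumptions, since q2.csv itself is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def em_two_means(y, mu_init=(0.0, 2.0), tol=1e-8, max_iter=500):
    """Sample-based EM for the mixture 0.5*N(mu1, 1) + 0.5*N(mu2, 1)."""
    mu1, mu2 = mu_init
    for it in range(max_iter):
        # E step: posterior probability that y_i comes from component 1
        p1 = 0.5 * norm.pdf(y, mu1, 1.0)
        p2 = 0.5 * norm.pdf(y, mu2, 1.0)
        w = p1 / (p1 + p2)
        # M step: the weighted means maximize Q(mu | mu^(0))
        new1 = np.sum(w * y) / np.sum(w)
        new2 = np.sum((1 - w) * y) / np.sum(1 - w)
        converged = abs(new1 - mu1) + abs(new2 - mu2) < tol
        mu1, mu2 = new1, new2
        if converged:
            break
    return mu1, mu2, it + 1

# Stand-in data (hypothetical; replace with the y column read from q2.csv)
rng = np.random.default_rng(0)
n = 1000
y = np.where(rng.random(n) < 0.5, rng.normal(-1, 1, n), rng.normal(1, 1, n))
mu1, mu2, n_iter = em_two_means(y)
print(mu1, mu2, n_iter)
```

Recording `n_iter` alongside the estimates is convenient for the convergence comparisons requested in parts (c)–(f).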
Use the (sample-based) EM algorithm to estimate μ_j.
(d) Repeat (c) using population EM, i.e., taking the expectation of Y_i based on its true mixture density f(y) = 0.5 N(−1, 1) + 0.5 N(1, 1), where the expectation can be computed using Monte Carlo (MCEM). Comment on the results in (c) and (d).
(e) Suppose that we know k = 2 and σ_1² = σ_2² = 1 (j = 1, 2), but the μ_j's and ω_j's are unknown. Treating the u_i's as missing data, derive the Q(ω, μ | ω^(0), μ^(0)) function in the E step, and derive the estimators in closed form, i.e., the iterative equations between {ω_j^(1), μ_j^(1)} and {ω_j^(0), μ_j^(0)} in the M step. Use the (sample-based) EM algorithm to estimate μ_j and ω_j.
(f) Repeat (e) using population EM, i.e., taking the expectation of Y_i based on its true mixture density f(y) = ω_1 N(−1, 1) + ω_2 N(1, 1), where the expectation can be computed using Gauss–Hermite quadrature. Comment on the results in (c)–(f): does knowing the true weights help convergence or not (e.g., how many iterations are needed for convergence)?

3. Use the EM algorithm to estimate the parameters in the random-effects model, for i = 1, ..., I and j = 1, ..., J,
    Y_ij = β_0 + β_1 x_ij + u_i + ε_ij,
where u_i and ε_ij are drawn from normal distributions with variances σ_u² and σ², respectively. The unknown parameter vector is θ = (β_0, β_1, σ_u², σ²)^T.
(a) Write out the complete-data likelihood.
(b) Derive the Q-function and the M step of the EM algorithm.
(c) Conduct simulations as follows. Set the parameters β_0 = 0.5, β_1 = 1, σ_u = 1, σ = 1, I = 100, and J = 2. For each dataset, simulate x_ij from Uniform(0, 1), simulate ε_ij and u_i from the corresponding normal distributions, and then obtain y_ij. Use the EM algorithm to obtain the parameter estimates from each simulated dataset. Repeat the simulation process 1000 times and present the bias (averaged over the 1000 simulations) and the standard deviation for θ. Comment on your findings.
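For problem 1(b), the nodes and weights of all four rules can be obtained directly from NumPy/SciPy rather than derived by hand. A sketch, assuming a placeholder integrand g(x) = exp(−x²) on (−1, 1) since the assignment's actual integrand is not reproduced here; for the Chebyshev rules, the built-in weight function is divided back out so that all four rules target the same plain integral ∫ g(x) dx:

```python
import numpy as np
from scipy.special import roots_chebyt, roots_chebyu, roots_jacobi

def g(x):
    # Placeholder integrand; substitute the assignment's function here
    return np.exp(-x**2)

n = 10

# Gauss-Legendre: approximates the integral of g(x) dx on [-1, 1] directly
x_le, w_le = np.polynomial.legendre.leggauss(n)
approx_legendre = np.sum(w_le * g(x_le))

# Chebyshev, 1st kind: built-in weight 1/sqrt(1 - x^2), so multiply it out
x_c1, w_c1 = roots_chebyt(n)
approx_cheb1 = np.sum(w_c1 * g(x_c1) * np.sqrt(1 - x_c1**2))

# Chebyshev, 2nd kind: built-in weight sqrt(1 - x^2), so divide it out
x_c2, w_c2 = roots_chebyu(n)
approx_cheb2 = np.sum(w_c2 * g(x_c2) / np.sqrt(1 - x_c2**2))

# Jacobi with (alpha, beta) = (0, 0), which coincides with Gauss-Legendre;
# other (alpha, beta) choices give genuinely different Jacobi rules
x_ja, w_ja = roots_jacobi(n, 0.0, 0.0)
approx_jacobi = np.sum(w_ja * g(x_ja))

for name, x, w, val in [("Legendre", x_le, w_le, approx_legendre),
                        ("Chebyshev-1", x_c1, w_c1, approx_cheb1),
                        ("Chebyshev-2", x_c2, w_c2, approx_cheb2),
                        ("Jacobi(0,0)", x_ja, w_ja, approx_jacobi)]:
    print(name)
    print("  nodes  :", np.round(x, 6))
    print("  weights:", np.round(w, 6))
    print("  approx :", val)
```

If the assignment's integral runs over a range other than (−1, 1), a linear change of variables onto [−1, 1] is needed before applying these rules.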
