MCMC Algorithms: The Metropolis-Hastings (MHs) Algorithm (Matlab Code)

1. Problem:

An MH step with invariant distribution p(x) and proposal distribution q(x*|x) involves sampling a candidate value x* given the current value x according to q(x*|x). The Markov chain then moves to x* with acceptance probability A(x, x*) = min{1, [p(x) q(x*|x)]^(-1) p(x*) q(x|x*)}; otherwise it remains at x. The pseudocode is shown in Figure 1, while the following figures show the results of running the MHs algorithm with a Gaussian proposal distribution q(x*|x(i)) = N(x(i), 10) and a bimodal target distribution p(x) ∝ 0.3 exp(−0.2x²) + 0.7 exp(−0.2(x − 10)²) for 10000 iterations. As expected, the histogram of the samples approximates the target distribution.
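The MH step described above can be sketched in NumPy (a minimal port of the idea; the original post uses Matlab, and here sig=10 is the proposal standard deviation, matching the Matlab code below):

```python
import numpy as np

rng = np.random.default_rng(0)

def p(x):
    # unnormalized bimodal target from the text
    return 0.3 * np.exp(-0.2 * x**2) + 0.7 * np.exp(-0.2 * (x - 10.0)**2)

def q_pdf(xp, x, sig):
    # density of the Gaussian proposal q(xp | x) = N(x, sig^2)
    return np.exp(-0.5 * ((xp - x) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

def mh_chain(n=10000, x0=0.0, sig=10.0):
    X = np.empty(n)
    X[0] = x0
    for i in range(1, n):
        x = X[i - 1]
        xs = rng.normal(x, sig)  # candidate x* ~ q(. | x)
        # general MH acceptance probability A(x, x*)
        a = min(1.0, p(xs) * q_pdf(x, xs, sig) / (p(x) * q_pdf(xs, x, sig)))
        X[i] = xs if rng.uniform() < a else x
    return X

X = mh_chain()
```

Note that p(x) only needs to be known up to a normalizing constant, since the constant cancels in the acceptance ratio.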

2. Pseudo code:

[Figure 1: pseudocode of the MHs algorithm]

3. MHs cases:

Case 1: General acceptance probability:
Acceptance probability: A(x, x*) = min{1, [p(x*) q(x|x*)] / [p(x) q(x*|x)]}
Case 2: Metropolis algorithm:
Symmetric proposal: q(x*|x) = q(x|x*) ⇒ A(x, x*) = min{1, p(x*) / p(x)}
Case 3: Independent sampler:
Proposal independent of the current state: q(x*|x) = q(x*) ⇒ A(x, x*) = min{1, [p(x*) q(x)] / [p(x) q(x*)]}

4. Matlab code:

% Metropolis(-Hastings) algorithm
% target pdf p(x): known (up to a constant) but we cannot sample from it directly.
% proposal pdf q(x*|x) = N(x, sig^2) with sig = 10: easy to sample from.
%% 
clc
clear; 

X(1)=0; 
N=1e4;
p = @(x) 0.3*exp(-0.2*x.^2) + 0.7*exp(-0.2*(x-10).^2); 
dx=0.5; xx=-10:dx:20; fp=p(xx); plot(xx,fp) % plot the true p(x)
%% MH algorithm
sig=10; % standard deviation of the Gaussian proposal
for i=1:N-1
    u=rand;
    x=X(i); 
    xs=normrnd(x,sig); % new sample xs based on existing x from proposal pdf.
    pxs=p(xs);
    px=p(x); 
    qxs=normpdf(xs,x,sig);
    qx=normpdf(x,xs,sig); % get p,q.
     if u<min(1,pxs*qx/(px*qxs))  % case 1: general MH acceptance (see pseudo code)
%    if u<min(1,pxs/px)           % case 2: Metropolis (symmetric q cancels)
%    if u<min(1,pxs*qx/(px*qxs))  % case 3: independent sampler: also draw xs from a
%                                 % fixed q(x*) and evaluate qxs=q(xs), qx=q(x)
        X(i+1)=xs;
    else
        X(i+1)=x; 
    end
end
% compare pdf of the simulation result with true pdf.
N0=1;  close all; figure; % N0 = burn-in cutoff (e.g. N/5 to discard warm-up samples)
nb=histc(X(N0+1:N),xx); 
bar(xx+dx/2,nb/(N-N0)/dx); % plot samples.
A=sum(fp)*dx; 
hold on; plot(xx,fp/A,'r') % compare.
% figure(2); plot(N0+1:N,X(N0+1:N)) % plot the traces of x.

% compare cdf with true cdf.
F1(1)=0;
F2(1)=0;
for i=2:length(xx) 
  F1(i)=F1(i-1)+nb(i)/(N-N0); 
  F2(i)=F2(i-1)+fp(i)*dx/A;
end

figure
plot(xx,[F1' F2'])
max(abs(F1-F2)) % maximum CDF discrepancy: a Kolmogorov-Smirnov-style measure of accuracy.
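The histogram/CDF comparison above can be sketched in NumPy as well (a hypothetical port using the same grid and the Metropolis acceptance of case 2; the max CDF gap plays the role of max(F1-F2) in the Matlab code):

```python
import numpy as np

rng = np.random.default_rng(0)

def p(x):
    return 0.3 * np.exp(-0.2 * x**2) + 0.7 * np.exp(-0.2 * (x - 10.0)**2)

# run a Metropolis chain (symmetric Gaussian proposal, so q cancels)
N, sig = 10000, 10.0
X = np.empty(N)
X[0] = 0.0
for i in range(1, N):
    xs = rng.normal(X[i - 1], sig)
    X[i] = xs if rng.uniform() < min(1.0, p(xs) / p(X[i - 1])) else X[i - 1]

# empirical vs. true CDF on the same grid as the Matlab code
dx = 0.5
xx = np.arange(-10.0, 20.0 + dx, dx)
fp = p(xx)
A = fp.sum() * dx                             # normalizing constant (Riemann sum)
F_true = np.cumsum(fp * dx / A)
counts, _ = np.histogram(X, bins=np.append(xx, xx[-1] + dx))
F_emp = np.cumsum(counts / N)
ks = np.max(np.abs(F_emp - F_true))           # KS-style CDF discrepancy
```

With 10000 samples the discrepancy is small, confirming that the sample histogram tracks the target distribution.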

5. Results:

Result of Case 1:

[Figure 2: sampling result for Case 1]

Result of Case 2:

[Figure 3: sampling result for Case 2]

Result of Case 3:

[Figure 4: sampling result for Case 3]


References:

  1. C. Andrieu, N. de Freitas, A. Doucet, and M. I. Jordan, "An Introduction to MCMC for Machine Learning", Machine Learning, 50:5-43, 2003.
  2. http://www2.mae.ufl.edu/haftka/vvuq/lectures/MCMC-Metropolis-practice.pptx
