MATLAB Implementation of Particle Swarm Optimization (PSO)

PSO (Particle Swarm Optimization) is a population-based stochastic optimization technique.
The particle swarm algorithm mimics the swarming behavior of insects, herds, bird flocks, and fish schools. These groups search for food cooperatively, and each member continually adjusts its search pattern by learning from its own experience and from the experience of the other members.
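The update rule behind this behavior can be sketched in Python (a minimal illustration of one PSO iteration; the function name `pso_step` and its parameters are my own, not from the post — the full MATLAB program follows below):

```python
import random

def pso_step(pop, vel, pbest, gbest, w=0.7298, c1=2.0, c2=2.0,
             vmax=1.0, lo=-5.0, hi=5.0):
    """One PSO iteration: each particle blends inertia, its personal
    memory (pbest), and the swarm's memory (gbest), then velocity and
    position are clamped to their allowed ranges."""
    for i in range(len(pop)):
        r1, r2 = random.random(), random.random()
        # velocity update: inertia + cognitive + social components
        vel[i] = [max(-vmax, min(vmax,
                      w*v + c1*r1*(pb - x) + c2*r2*(gb - x)))
                  for v, x, pb, gb in zip(vel[i], pop[i], pbest[i], gbest)]
        # position update, clamped to the search box [lo, hi]
        pop[i] = [max(lo, min(hi, x + v)) for x, v in zip(pop[i], vel[i])]
    return pop, vel
```

After each step, the caller compares the new fitness values against the personal and global bests and updates them, exactly as the MATLAB loop below does.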

For an overview, see: http://www.omegaxyz.com/2017/05/04/introductionofpso/

For the Python version, see: http://www.omegaxyz.com/2018/01/12/python_pso/

More content at omegaxyz.com

MATLAB code:

%------ Initialization --------------------------------------------------
clear all;
clc;
format long;
%------ Set the initial conditions ----------------------------------------------
c1=2;                  % acceleration (learning) factor 1
c2=2;                  % acceleration (learning) factor 2
w=0.7298;              % inertia weight
MaxDT=200;             % maximum number of iterations
% D=2;                 % dimension of the search space (number of unknowns)
N=20;                  % swarm size (number of particles)
%eps=10^(-6);          % precision threshold (useful when the minimum is known)
Vmax=1;                % velocity bounds
Vmin=-1;
popmax=5;              % position bounds
popmin=-5;
%------ Initialize the particles (position and velocity ranges can be limited here) ------------
for i=1:N
    pop(i,:)=popmin+(popmax-popmin)*rand(1,2);  % random initial position
    V(i,:)=rand(1,2);                           % random initial velocity
    fitness(i)=ackley(pop(i,:));                % initial fitness
end
%------ Compute each particle's fitness and initialize pbest and gbest ----------------------
[fitnessgbest,bestindex]=min(fitness);
gbest=pop(bestindex,:);        % global best position
pbest=pop;                     % personal best positions
fitnesspbest=fitness;          % personal best fitness values

for i=1:MaxDT
    for j=1:N
        % velocity update: inertia + cognitive + social components
        V(j,:)=w*V(j,:)+c1*rand*(pbest(j,:)-pop(j,:))+c2*rand*(gbest-pop(j,:));
        V(j,find(V(j,:)>Vmax))=Vmax;          % clamp velocity
        V(j,find(V(j,:)<Vmin))=Vmin;
        pop(j,:)=pop(j,:)+V(j,:);             % position update
        pop(j,find(pop(j,:)>popmax))=popmax;  % clamp position
        pop(j,find(pop(j,:)<popmin))=popmin;
%         if rand>0.8                         % optional random restart (mutation)
%             k=ceil(2*rand);
%             pop(j,k)=rand;
%         end
        fitness(j)=ackley(pop(j,:));

        if fitness(j)<fitnesspbest(j)         % update personal best
            pbest(j,:)=pop(j,:);
            fitnesspbest(j)=fitness(j);
        end

        if fitness(j)<fitnessgbest            % update global best
            gbest=pop(j,:);
            fitnessgbest=fitness(j);
        end

    end
    yy(i)=fitnessgbest;                       % record best fitness per generation

end
%------ Plot the result
plot(yy)
title(['Fitness curve, iterations = ' num2str(MaxDT)]);
xlabel('Generation');
ylabel('Fitness');
%------ End of algorithm ---DreamSun GL & HF-----------------------------------

The function being optimized is the Ackley function:

% ackley.m
% Ackley's function, from http://www.cs.vu.nl/~gusz/ecbook/slides/16
% and further shown at: 
% http://clerc.maurice.free.fr/pso/Semi-continuous_challenge/Semi-continuous_challenge.htm
%
% commonly used to test optimization/global minimization problems
%
% f(x)= [ 20 + e ...
%        -20*exp(-0.2*sqrt((1/n)*sum(x.^2,2))) ...
%        -exp((1/n)*sum(cos(2*pi*x),2))];
%
% dimension n = # of columns of input, x1, x2, ..., xn
% each row is processed independently,
% you can feed in matrices of timeXdim no prob
%
% example: cost = ackley([1,2,3;4,5,6])

function [out]=ackley(in)

% dimension n is the number of columns of the input, x1, x2, ..., xn
n=size(in,2);

x=in;
e=exp(1);

out = (20 + e ...
      -20*exp(-0.2*sqrt((1/n).*sum(x.^2,2))) ...
      -exp((1/n).*sum(cos(2*pi*x),2)));
return
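For reference, the same function is easy to check in Python (a sketch of the formula above for a single point rather than a matrix of rows; the global minimum is f(0, ..., 0) = 0, since 20 + e - 20·exp(0) - exp(1) = 0):

```python
import math

def ackley(x):
    """Ackley function for one point x = [x1, ..., xn]."""
    n = len(x)
    s1 = sum(xi * xi for xi in x)                       # sum of squares
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x)    # sum of cosines
    return (20 + math.e
            - 20 * math.exp(-0.2 * math.sqrt(s1 / n))
            - math.exp(s2 / n))
```

Evaluating `ackley([0.0, 0.0])` returns 0, which is what a correct PSO run should approach.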

