[Optimization] Weighted Chimp Optimization Algorithm (WChOA) for Single-Objective Problems: MATLAB Source Code

1. Chimp Optimization Algorithm (ChOA)

Many meta-heuristic algorithms are now used to address problems with numerous variables and high complexity. One of the most popular swarm intelligence-based meta-heuristics is the Chimp Optimization Algorithm (ChOA), inspired by the individual intelligence and sexual motivation of chimps during group hunting. The Weighted ChOA (WChOA) is proposed as an alternative that tackles two main issues in large-scale numerical optimization: slow convergence and trapping in local optima on high-dimensional problems. The main difference between standard ChOA and WChOA is a position-weighted update equation introduced to speed up convergence and avoid local optima; the method also maintains the balance between exploration and exploitation that is crucial in swarm intelligence-based algorithms. WChOA is evaluated under a range of conditions: a classical set of 30 unimodal, multimodal, and fixed-dimension multimodal benchmark functions is used to investigate its strengths and weaknesses, and it is also tested on the IEEE Congress of Evolutionary Computation benchmark functions (CEC-C06, 2019 Competition). To further probe its performance on large-scale and real-world problems, WChOA is examined on 13 high-dimensional and 10 real-world optimization problems. The results show that WChOA outperforms state-of-the-art methods from the literature, such as ChOA, PSO, BBO, WOA, BH, ALO, GA, SCA, and GWO, in terms of convergence speed, likelihood of getting stuck in local minima, exploration, and exploitation.
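The key mechanism named above is the position-weighted update. As a rough illustration only, the MATLAB sketch below shows one way a weighted combination of the four ChOA leaders could be formed; the fitness-based weights and the example values are assumptions made for this sketch, not the exact equation from the paper (see the reference at the end).

% Minimal sketch (assumption, not the paper's exact formula): weighted mean of the
% four leader-guided positions (attacker, barrier, chaser, driver). Standard ChOA
% uses the plain mean; a weighted variant biases the mean toward the better leaders.
X1 = [0.2 1.5]; X2 = [0.5 1.1]; X3 = [0.9 0.7]; X4 = [1.3 0.4]; % example leader-guided positions
f = [3.1 2.4 4.0 5.2];          % corresponding fitness values (minimization, assumed positive)
w = (1 ./ f) / sum(1 ./ f);     % smaller fitness -> larger weight, normalized to sum to 1
X_new = w(1)*X1 + w(2)*X2 + w(3)*X3 + w(4)*X4   % weighted position update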

[Figure 1]

2. Code Excerpt


%___________________________________________________________________%

% Chimp Optimization Algorithm (ChOA) source codes version 1.0   

% By: M. Khishe, M. R. Musavi

% [email protected]

%For more information please refer to the following papers:

% M. Khishe, M. R. Mosavi, "Chimp Optimization Algorithm," Expert Systems

% With Applications, 2020.

% Please note that some files and functions are taken from the GWO algorithm

% such as: Get_Functions_details and PSO.

%  For more information please refer to the following papers:

% Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey Wolf Optimizer. Advances in engineering software, 69, 46-61.            %

%___________________________________________________________________%

% You can simply define your cost in a separate file and load its handle to fobj 

% The initial parameters that you need are:

%__________________________________________

% fobj = @YourCostFunction

% dim = number of your variables

% Max_iteration = maximum number of generations

% SearchAgents_no = number of search agents

% lb=[lb1,lb2,...,lbn] where lbn is the lower bound of variable n

% ub=[ub1,ub2,...,ubn] where ubn is the upper bound of variable n

% If all the variables have equal lower bound you can just

% define lb and ub as two single numbers

%

%__________________________________________
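% Illustrative example (not part of the original package): a custom cost could be
% written in a separate file, e.g. MyCost.m containing
%     function y = MyCost(x)
%         y = sum(x.^2);       % sphere function used here only as a placeholder
%     end
% and loaded with: fobj = @MyCost; dim = 10; lb = -100; ub = 100;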

clear all 

clc

SearchAgents_no=30; % Number of search agents

N=SearchAgents_no;

Function_name='F14'; % Name of the test function that can be from F1 to F23 (Table 3,4,5 in the paper)

Max_iteration=500; % Maximum number of iterations

Max_iter=Max_iteration;

% Load details of the selected benchmark function

[lb,ub,dim,fobj]=Get_Functions_details(Function_name);
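% Each solver below returns the best score found, the corresponding best position,
% and the per-iteration convergence curve that is plotted later.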

[ABest_scoreChimp,ABest_posChimp,Chimp_curve]=Chimp(SearchAgents_no,Max_iteration,lb,ub,dim,fobj);

[WABest_scoreChimp,WABest_posChimp,WChimp_curve]=WChimp(SearchAgents_no,Max_iteration,lb,ub,dim,fobj);

[PSO_gBestScore,PSO_gBest,PSO_cg_curve]=PSO(N,Max_iteration,lb,ub,dim,fobj);

[TACPSO_gBestScore,TACPSO_gBest,TACPSO_cg_curve]=TACPSO(N,Max_iteration,lb,ub,dim,fobj);

[MPSO_gBestScore,MPSO_gBest,MPSO_cg_curve]=MPSO(N,Max_iteration,lb,ub,dim,fobj);

% PSO_cg_curve=PSO(SearchAgents_no,Max_iteration,lb,ub,dim,fobj); % run PSO to compare to results

figure('Position',[500 500 660 290])

%Draw search space

subplot(1,2,1);

func_plot(Function_name);

title('Parameter space')

xlabel('x_1');

ylabel('x_2');

zlabel([Function_name,'( x_1 , x_2 )'])

%Draw objective space

subplot(1,2,2);

semilogy(MPSO_cg_curve,'Color','g')

hold on

semilogy(PSO_cg_curve,'Color','b')

hold on

semilogy(TACPSO_cg_curve,'Color','y')

hold on

semilogy(Chimp_curve,'--r')

hold on

semilogy(WChimp_curve,'r')

title('Objective space')

xlabel('Iteration');

ylabel('Best score obtained so far');

axis tight

grid on

box on

legend('MPSO','PSO','TACPSO','Chimp','WChimp')

img = gcf;  % get the handle of the current figure

print(img, '-dpng', '-r600', './img.png')  % save the figure as a PNG at 600 dpi

display(['The best optimal value of the objective function found by TACPSO is : ', num2str(TACPSO_gBestScore)]);

display(['The best optimal value of the objective function found by PSO is : ', num2str(PSO_gBestScore)]);

display(['The best optimal value of the objective function found by MPSO is : ', num2str(MPSO_gBestScore)]);

display(['The best optimal value of the objective function found by Chimp is : ', num2str(ABest_scoreChimp)]);

display(['The best optimal value of the objective function found by WChimp is : ', num2str(WABest_scoreChimp)]);

3. Simulation Results

[Figure 2: parameter space and objective-space convergence curves produced by the script]

4. References

Khishe, M., and M. R. Mosavi. “Chimp Optimization Algorithm.” Expert Systems with Applications, vol. 149, Elsevier BV, July 2020, p. 113338, doi:10.1016/j.eswa.2020.113338.
