Reading notes: "CARS: Continuous Evolution for Efficient Neural Architecture Search"

Strengths and weaknesses of the three existing classes of search methods

1. Evolutionary Algorithm (EA) based: time-consuming.

2. Reinforcement Learning (RL) based: the search stage is inefficient.

3. Gradient based: lacks diversity in the searched architectures.

PS: Haha, does my current approach count as a fourth category then? Probability-based, would that work? Hahaha.

Innovation / Contribution

1. Propose an efficient EA-based neural architecture search framework with a continuous evolution strategy.

2. Exploit a protection mechanism (pNSGA-III) to avoid the small model trap problem.

Algorithm

Part 1: Parameter Optimization
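My mental model of this part, as a minimal sketch: I'm assuming (as in one-shot NAS) that all individuals share one set of supernet weights, that each individual is a binary mask selecting a sub-network, and that the shared weights are updated by averaging the gradients of a few sampled sub-networks on a mini-batch. The linear model, loss, and all names below are toy placeholders, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_weights, pop_size = 16, 8
W = rng.normal(size=n_weights)                            # shared supernet weights
masks = rng.integers(0, 2, size=(pop_size, n_weights))    # one binary mask per individual

def loss_and_grad(w_sub, x, y):
    """Toy squared loss of a linear sub-network; returns loss and dL/dw_sub."""
    err = x @ w_sub - y
    return 0.5 * float(np.mean(err ** 2)), x.T @ err / len(y)

x = rng.normal(size=(32, n_weights))                      # one mini-batch
y = rng.normal(size=32)

lr, n_sampled = 0.1, 4
sampled = rng.choice(pop_size, size=n_sampled, replace=False)

# Accumulate gradients from the sampled sub-networks, then update W once;
# masked-out weights receive no gradient.
grad = np.zeros_like(W)
for j in sampled:
    _, g = loss_and_grad(W * masks[j], x, y)
    grad += g * masks[j]
W -= lr * grad / n_sampled
```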

Part 2: Architecture Optimization

Considering original NSGA-III:
[Figure: the original NSGA-III selection]
To protect larger models, the authors propose pNSGA-III, which additionally takes the rate of accuracy improvement into consideration and then merges the two resulting Pareto stages.

The integrated algorithm

[Figure: the integrated CARS algorithm]

Key Ideas

1. How to think about the multi-objective task:

Considering multiple complementary objectives, i.e., accuracy, the number of parameters, floating-point operations (FLOPs), energy, and latency, there is no single architecture that surpasses all the others on every objective. Therefore, architectures within the Pareto front are desired.
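A tiny illustration of the Pareto-front idea with two of those objectives (maximize accuracy, minimize parameter count). All numbers are made up for the example.

```python
# Keep every model that no other model beats on both objectives at once.
models = {
    "A": (0.92, 5.0e6),   # (accuracy, parameter count)
    "B": (0.94, 8.0e6),
    "C": (0.90, 9.0e6),   # dominated by both A and B
    "D": (0.95, 20.0e6),
}

def pareto_front(models):
    front = {}
    for name, (acc, params) in models.items():
        dominated = any(
            acc2 >= acc and params2 <= params and (acc2 > acc or params2 < params)
            for other, (acc2, params2) in models.items() if other != name
        )
        if not dominated:
            front[name] = (acc, params)
    return front

print(pareto_front(models))   # A, B, D survive; C is dominated
```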

2. In the algorithm, parameter training and architecture training are completely separated: the network parameters are still trained with gradient descent, while the architecture is optimized only with the EA. The internal connection between the two and the reasoning behind it are worth working through carefully (a toy walk-through follows the figure below).

[Figure]
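To make this separation concrete for myself, a toy end-to-end sketch under my own simplifications: a linear "supernet", binary masks as architectures, bit-flip mutation, and a plain non-dominated filter standing in for pNSGA-III. None of this is the authors' code.

```python
import numpy as np

rng = np.random.default_rng(1)
D, P = 16, 12                                        # weight dim, population size
W = rng.normal(size=D)                               # shared supernet weights
pop = rng.integers(0, 2, size=(P, D))                # one binary mask per individual

true_w = rng.normal(size=D)
x_tr, x_va = rng.normal(size=(64, D)), rng.normal(size=(32, D))
y_tr, y_va = x_tr @ true_w, x_va @ true_w

mse = lambda w, x, y: float(np.mean((x @ w - y) ** 2))

for gen in range(5):                                 # continuous evolution generations
    # Part 1: gradient descent on the shared weights via sampled sub-networks.
    for _ in range(20):
        g = np.zeros(D)
        for j in rng.choice(P, size=4, replace=False):
            err = x_tr @ (W * pop[j]) - y_tr
            g += (x_tr.T @ err / len(y_tr)) * pop[j]
        W -= 0.01 * g / 4

    # Part 2: evolution only; evaluate masks, keep non-dominated ones, mutate.
    losses = np.array([mse(W * m, x_va, y_va) for m in pop])
    sizes = pop.sum(axis=1)
    keep = [i for i in range(P) if not any(
        losses[j] <= losses[i] and sizes[j] <= sizes[i]
        and (losses[j] < losses[i] or sizes[j] < sizes[i])
        for j in range(P) if j != i)]
    parents = pop[keep]
    pop = parents[rng.integers(0, len(parents), size=P)].copy()
    pop ^= (rng.random(pop.shape) < 0.05).astype(pop.dtype)   # bit-flip mutation

print("best validation MSE in final population:",
      min(mse(W * m, x_va, y_va) for m in pop))
```

The point of the toy is only that no gradient ever touches the masks: the architecture side sees the shared weights purely through evaluation.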

3. Maybe I have some misconception about EA methods; the EA used here seems surprisingly fast?

4. The proposal of pNSGA-III is quite interesting:

For the NSGA-III method, the non-dominated sorting algorithm considers two different objectives and selects individuals according to the sorted Pareto stages.

For the proposed pNSGA-III, besides considering the number of parameters and the accuracy, a second non-dominated sorting is conducted that considers the rate of accuracy improvement together with the number of parameters.

Then the two different Pareto stages are merged.
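A rough sketch of how I picture that selection step. I'm simplifying the full NSGA-III ranking (reference points, multiple fronts) down to a single non-dominated filter per objective pair, and the per-individual "accuracy increase" is assumed to be tracked between generations; the numbers are invented.

```python
individuals = [
    # (name, params, accuracy, accuracy_increase)
    ("small-1",  2e6, 0.88, 0.002),
    ("small-2",  3e6, 0.89, 0.003),
    ("large-1", 15e6, 0.87, 0.030),   # still weak, but improving quickly
    ("large-2", 20e6, 0.90, 0.025),
]

def front(pop, key):
    """Non-dominated set under (minimize params, maximize key(ind))."""
    kept = []
    for i, a in enumerate(pop):
        dominated = any(
            b[1] <= a[1] and key(b) >= key(a) and (b[1] < a[1] or key(b) > key(a))
            for j, b in enumerate(pop) if j != i
        )
        if not dominated:
            kept.append(a)
    return kept

front_acc   = front(individuals, key=lambda ind: ind[2])  # params vs accuracy
front_speed = front(individuals, key=lambda ind: ind[3])  # params vs accuracy increase

# Merging the two fronts is the "protection" step.
selected = {ind[0]: ind for ind in front_acc + front_speed}
print(sorted(selected))
```

Here "large-1" would be discarded by the accuracy-only sorting but survives via the accuracy-increase sorting, which is exactly the protection of larger, slower-to-converge models against the small model trap.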
