Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression

Abstract

Bounding box regression is a crucial step in object detection. Among existing methods, the ℓn-norm losses are widely adopted for bounding box regression, but they are not tailored to the evaluation metric, i.e., Intersection over Union (IoU). Recently, IoU loss and generalized IoU (GIoU) loss have been proposed to benefit the IoU metric, but they still suffer from slow convergence and inaccurate regression. In this paper, we propose a Distance-IoU (DIoU) loss that incorporates the normalized distance between the predicted box and the target box, and which converges much faster in training than IoU and GIoU losses. Furthermore, the paper summarizes three geometric factors in bounding box regression, i.e., overlap area, central point distance and aspect ratio, based on which a Complete IoU (CIoU) loss is proposed, leading to faster convergence and better performance. By incorporating DIoU and CIoU losses into state-of-the-art object detection algorithms, e.g., YOLOv3, SSD and Faster R-CNN, we achieve notable performance gains in terms of both the IoU metric and the GIoU metric. Moreover, DIoU can readily be adopted as the criterion in non-maximum suppression (NMS), further boosting performance.
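To make the geometry concrete, here is a minimal plain-Python sketch of the two losses as defined in the paper: L_DIoU = 1 - IoU + rho^2(b, b_gt) / c^2, where rho is the distance between box centers and c is the diagonal length of the smallest enclosing box, and L_CIoU adds the aspect-ratio penalty alpha * v. The (x1, y1, x2, y2) box layout and all names are illustrative, not taken from the repositories linked below.

```python
import math

def diou_ciou_loss(box, gt, use_ciou=False):
    """DIoU loss (or CIoU loss if use_ciou=True) for one pair of boxes.

    Boxes are (x1, y1, x2, y2) with positive width and height; a sketch,
    not the official implementation.
    """
    x1, y1, x2, y2 = box
    gx1, gy1, gx2, gy2 = gt

    # Overlap area and IoU.
    iw = max(0.0, min(x2, gx2) - max(x1, gx1))
    ih = max(0.0, min(y2, gy2) - max(y1, gy1))
    inter = iw * ih
    union = (x2 - x1) * (y2 - y1) + (gx2 - gx1) * (gy2 - gy1) - inter
    iou = inter / union

    # Squared distance rho^2 between the two box centers.
    rho2 = ((x1 + x2) / 2 - (gx1 + gx2) / 2) ** 2 + \
           ((y1 + y2) / 2 - (gy1 + gy2) / 2) ** 2
    # Squared diagonal c^2 of the smallest box enclosing both boxes.
    cw = max(x2, gx2) - min(x1, gx1)
    ch = max(y2, gy2) - min(y1, gy1)
    c2 = cw ** 2 + ch ** 2

    loss = 1 - iou + rho2 / c2          # L_DIoU = 1 - IoU + rho^2 / c^2
    if use_ciou:
        # Aspect-ratio consistency term v and its trade-off weight alpha.
        v = (4 / math.pi ** 2) * (math.atan((gx2 - gx1) / (gy2 - gy1))
                                  - math.atan((x2 - x1) / (y2 - y1))) ** 2
        alpha = v / ((1 - iou) + v)
        loss += alpha * v               # L_CIoU adds the alpha * v penalty
    return loss
```

Note that the rho^2 / c2 penalty directly minimizes the center distance even when the boxes do not overlap, which is why DIoU converges faster than IoU/GIoU losses in the non-overlapping case.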

Contributions:

1. A Distance-IoU loss, i.e., DIoU loss, is proposed for bounding box regression, which has faster convergence than IoU and GIoU losses.

2. A Complete IoU loss, i.e., CIoU loss, is further proposed by considering three geometric measures, i.e., overlap area, central point distance and aspect ratio, which together better describe the regression of rectangular boxes.

3. DIoU is deployed in NMS, and is more robust than the original NMS for suppressing redundant boxes (see the sketch after this list).

4. The proposed methods can be easily incorporated into the state-of-the-art detection algorithms, achieving notable performance gains.
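For contribution 3, DIoU-NMS replaces the plain IoU criterion in greedy NMS with DIoU = IoU - rho^2 / c^2, so a detection is suppressed only when it both overlaps the kept box heavily and has a nearby center; overlapping boxes with distant centers survive, since they likely cover different objects. A minimal sketch, assuming (x1, y1, x2, y2) boxes and illustrative function names:

```python
def pairwise_diou(a, b):
    """DIoU between two (x1, y1, x2, y2) boxes: IoU minus rho^2 / c^2."""
    iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    iou = inter / union
    # Squared center distance and squared enclosing-box diagonal.
    rho2 = ((a[0] + a[2]) / 2 - (b[0] + b[2]) / 2) ** 2 + \
           ((a[1] + a[3]) / 2 - (b[1] + b[3]) / 2) ** 2
    cw = max(a[2], b[2]) - min(a[0], b[0])
    ch = max(a[3], b[3]) - min(a[1], b[1])
    return iou - rho2 / (cw ** 2 + ch ** 2)

def diou_nms(boxes, scores, threshold=0.5):
    """Greedy NMS using DIoU as the suppression criterion; returns kept indices."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        m = order.pop(0)                  # highest-scoring remaining box
        keep.append(m)
        # Suppress only boxes whose DIoU with the kept box exceeds the
        # threshold; high overlap alone is not enough if centers are far apart.
        order = [i for i in order
                 if pairwise_diou(boxes[m], boxes[i]) < threshold]
    return keep
```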

Open-source code:

https://github.com/Zzh-tju/DIoU

https://github.com/generalized-iou/g-darknet  

https://github.com/JaryHuang/awesome_SSD_FPN_GIoU

https://github.com/generalized-iou/Detectron.pytorch
