Improving Object Detection With One Line of Code

This post presents the Soft-NMS algorithm, which improves object-detection accuracy by continuously decaying the scores of overlapping detection boxes instead of simply discarding them. The method requires no extra training and achieves clear gains on standard benchmarks such as PASCAL VOC 2007 and MS COCO.

Paper: https://arxiv.org/abs/1704.04503

GitHub project: https://github.com/bharatsingh430/soft-nms


Non-maximum suppression is an integral part of the object detection pipeline. First, it sorts all detection boxes on the basis of their scores. The detection box M with the maximum score is selected and all other detection boxes with a significant overlap (using a pre-defined threshold) with M are suppressed. This process is recursively applied on the remaining boxes. As per the design of the algorithm, if an object lies within the predefined overlap threshold, it leads to a miss. To this end, we propose Soft-NMS, an algorithm which decays the detection scores of all other objects as a continuous function of their overlap with M. Hence, no object is eliminated in this process. Soft-NMS obtains consistent improvements for the COCO-style mAP metric on standard datasets like PASCAL VOC 2007 (1.7% for both R-FCN and Faster-RCNN) and MS-COCO (1.3% for R-FCN and 1.1% for Faster-RCNN) by just changing the NMS algorithm without any additional hyper-parameters. Further, the computational complexity of Soft-NMS is the same as traditional NMS and hence it can be efficiently implemented. Since Soft-NMS does not require any extra training and is simple to implement, it can be easily integrated into any object detection pipeline. Code for Soft-NMS is publicly available on GitHub: https://github.com/bharatsingh430/soft-nms.
Comments: ICCV 2017 submission
Subjects: Computer Vision and Pattern Recognition (cs.CV)
Cite as: arXiv:1704.04503 [cs.CV]
  (or arXiv:1704.04503v1 [cs.CV] for this version)
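
For reference, the difference between traditional NMS and the two Soft-NMS variants described in the paper can be sketched as a per-box rescoring rule. The helper below is a minimal illustration; the function name `rescore` and its default values are ours, not from the authors' released code:

```python
import math

def rescore(score, iou, Nt=0.3, sigma=0.5, method="gaussian"):
    """Rescore one detection given its overlap `iou` with the current top box M."""
    if method == "hard":                       # traditional NMS: discard above the threshold
        return 0.0 if iou >= Nt else score
    if method == "linear":                     # linear Soft-NMS: decay proportionally to overlap
        return score * (1.0 - iou) if iou >= Nt else score
    return score * math.exp(-(iou ** 2) / sigma)   # Gaussian Soft-NMS: continuous decay
```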

### The Development of Soft-NMS

Soft-NMS (Soft Non-Maximum Suppression) is a refinement of traditional NMS that addresses one of its main weaknesses. Traditional NMS filters detection boxes with a fixed overlap threshold, which can wrongly suppress genuine objects that overlap heavily with a higher-scoring detection. Soft-NMS instead applies a smooth penalty: highly overlapping boxes receive a lower confidence score rather than being removed, so more true positives survive.

#### Background and Development

The idea was introduced by Bodla et al. in 2017 in the paper "Soft-NMS – Improving Object Detection With One Line of Code"[^4]. The paper points out a limitation of traditional NMS: when two bounding boxes have a high IoU (Intersection over Union), one of them is discarded once the preset threshold is exceeded, even if it is actually a true positive. To mitigate this, Soft-NMS rescales the scores of the remaining candidate boxes with a continuous function instead of simply deleting them.

Concretely, Soft-NMS penalizes every box that has a large overlap with the current highest-scoring box. With the Gaussian penalty used in the paper, the update is:

```python
score_i = score_i * exp(-IoU(box_i, box_max) ** 2 / sigma)
```

Here `box_max` is the current highest-scoring bounding box, `box_i` is any remaining candidate box, and `sigma` is a hyper-parameter that controls how quickly the score decays with overlap. This makes it possible to handle several highly overlapping object instances much more gracefully.

#### Subsequent Research

As object detection has advanced, many later works have extended and optimized Soft-NMS. Some studies combine attention mechanisms or learned models to adapt the score-decay strategy; others explore multi-stage suppression pipelines to push performance further. Most of these variants, however, inherit the original Soft-NMS design and refine it for specific settings. Readers who want to follow the latest developments should look at recent CVPR/ECCV papers in addition to the original publication[^5].

Below is a self-contained reference implementation in NumPy (a reimplementation following the paper, not the authors' released code):

```python
import numpy as np

def soft_nms(dets, sigma=0.5, Nt=0.3, threshold=0.001, method=2):
    """Soft-NMS over `dets` = rows of [x1, y1, x2, y2, score]; method 1 = linear,
    2 = Gaussian, otherwise traditional NMS. Returns surviving boxes, rescored."""
    dets = np.asarray(dets, dtype=np.float64).copy()
    keep = []
    while dets.shape[0] > 0:
        m = dets[:, 4].argmax()                      # pick the highest-scoring box M
        M = dets[m].copy()
        keep.append(M)
        dets = np.delete(dets, m, axis=0)
        xx1 = np.maximum(M[0], dets[:, 0]); yy1 = np.maximum(M[1], dets[:, 1])   # IoU of M with
        xx2 = np.minimum(M[2], dets[:, 2]); yy2 = np.minimum(M[3], dets[:, 3])   # every other box
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        area_m = (M[2] - M[0]) * (M[3] - M[1])
        areas = (dets[:, 2] - dets[:, 0]) * (dets[:, 3] - dets[:, 1])
        iou = inter / (area_m + areas - inter)
        if method == 1:                              # linear decay above the overlap threshold
            weight = np.where(iou >= Nt, 1.0 - iou, 1.0)
        elif method == 2:                            # Gaussian decay
            weight = np.exp(-(iou ** 2) / sigma)
        else:                                        # traditional (hard) NMS
            weight = np.where(iou >= Nt, 0.0, 1.0)
        dets[:, 4] *= weight
        dets = dets[dets[:, 4] >= threshold]         # drop boxes whose score fell below threshold
    return np.vstack(keep) if keep else np.zeros((0, 5))
```
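
As a quick sanity check, the implementation above can be run on a couple of overlapping boxes; the coordinates and scores below are made up purely for illustration:

```python
import numpy as np

dets = np.array([
    [100, 100, 210, 210, 0.95],   # highest-scoring detection, kept unchanged
    [105, 105, 215, 215, 0.80],   # heavily overlaps the first box, so its score is decayed
    [300, 300, 400, 400, 0.70],   # far away, score essentially unchanged
])

kept = soft_nms(dets, sigma=0.5, Nt=0.3, threshold=0.001, method=2)
print(kept)   # all three boxes survive; only the second one comes back with a lower score
```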