Shaoqing Ren, Kaiming He, Ross Girshick, and Jian Sun proposed Faster R-CNN in 2015. Structurally, Faster R-CNN integrates feature extraction, proposal generation, bounding box regression (rect refine), and classification into a single network, which substantially improves overall performance, most visibly detection speed.
Faster R-CNN
To date, Faster R-CNN is the most polished product of the R-CNN family and the classic two-stage object detection algorithm.
Processing happens in two main stages:
Stage one finds anchor-based rectangular proposals around candidate objects in the image (a binary classification of background vs. object).
Stage two classifies the objects inside those proposal boxes.
Correspondingly, the network consists of two modules: a deep fully convolutional Region Proposal Network (RPN), which generates candidate regions; and a Fast R-CNN detector, which classifies the candidate regions produced by the RPN and regresses their bounding boxes.
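As a rough illustration of the anchor mechanism the RPN relies on, here is a NumPy sketch that builds the reference anchors for one sliding-window position. The base size, ratios, and scales below are the defaults described in the paper; the function name and exact box arithmetic are this sketch's own assumptions, not any particular implementation's API.

```python
import numpy as np

def generate_anchors(base_size=16, ratios=(0.5, 1.0, 2.0), scales=(8, 16, 32)):
    """Return the 9 reference anchors (x1, y1, x2, y2) centered on one
    sliding-window position: 3 aspect ratios x 3 scales."""
    cx = cy = (base_size - 1) / 2.0  # center of the base window
    area = float(base_size * base_size)
    anchors = []
    for ratio in ratios:
        # Keep the base area constant while changing the aspect ratio.
        w = np.sqrt(area / ratio)
        h = w * ratio
        for scale in scales:
            ws, hs = w * scale, h * scale
            anchors.append([cx - (ws - 1) / 2, cy - (hs - 1) / 2,
                            cx + (ws - 1) / 2, cy + (hs - 1) / 2])
    return np.array(anchors)

anchors = generate_anchors()
print(anchors.shape)  # (9, 4)
```

At every position of the RPN's sliding window these 9 boxes are scored object/background and refined by regression, which is exactly the stage-one processing described above.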
References:
https://zhuanlan.zhihu.com/p/31426458
https://zhuanlan.zhihu.com/p/82185598
https://blog.youkuaiyun.com/fengbingchun/article/details/87195597
https://blog.youkuaiyun.com/weixin_43198141/article/details/90178512
Original paper: https://arxiv.org/abs/1506.01497
Example (from another author):
Setting the details aside for now, this is a deep-learning detection application; let's look at the results first.

* Begin with the end in mind.

The quoted source follows:
%% Forward Collision Warning Using Sensor Fusion
% This example shows how to perform forward collision warning by fusing
% data from vision and radar sensors to track objects in front of the
% vehicle.
%
%% Overview
% Forward collision warning (FCW) is an important feature in driver
% assistance and automated driving systems, where the goal is to provide
% correct, timely, and reliable warnings to the driver before an impending
% collision with the vehicle in front. To achieve the goal, vehicles are
% equipped with forward-facing vision and radar sensors. Sensor fusion is
% required to increase the probability of accurate warnings and minimize
% the probability of false warnings.
%
% For the purposes of this example, a test car (the ego vehicle) was
% equipped with various sensors and their outputs were recorded. The
% sensors used for this example were:
%
% # Vision sensor, which provided lists of observed objects with their
% classification and information about lane boundaries. The object lists
% were reported 10 times per second. Lane boundaries were reported 20
% times per second.
% # Radar sensor with medium and long range modes, which provided lists of
% unclassified observed objects. The object lists were reported 20 times
% per second.
% # IMU, which reported the speed and turn rate of the ego vehicle 20 times
% per second.
% # Video camera, which recorded a video clip of the scene in front of the
% car. Note: This video is not used by the tracker and only serves to
% display the tracking results on video for verification.
%
% The process of providing a forward collision warning comprises the
% following steps:
%
% # Obtain the data from the sensors.
% # Fuse the sensor data to get a list of tracks, i.e., estimated
% positions and velocities of the objects in front of the car.
% # Issue warnings based on the tracks and FCW criteria. The FCW criteria
% are based on the Euro NCAP AEB test procedure and take into account the
% relative distance and relative speed to the object in front of the car.
%
% For more information about tracking multiple objects, see
% <matlab:helpview(fullfile(docroot
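The warning criterion in step 3 of the quoted example (relative distance and relative speed, following the Euro NCAP AEB procedure) can be sketched roughly as follows. This is a hedged Python illustration, not MathWorks code; the function name and the threshold constants `ttc_threshold_s` and `min_gap_m` are illustrative assumptions.

```python
def fcw_warning(rel_distance_m, rel_speed_mps,
                ttc_threshold_s=2.5, min_gap_m=2.0):
    """Very simplified forward-collision-warning check.

    rel_speed_mps is the closing speed (positive when the gap is shrinking).
    Warn when the gap is already below a minimum, or when the object is
    closing and the time-to-collision drops below the threshold.
    """
    if rel_distance_m <= min_gap_m:
        return True               # already dangerously close
    if rel_speed_mps <= 0:
        return False              # not closing: no warning
    ttc = rel_distance_m / rel_speed_mps
    return ttc < ttc_threshold_s

print(fcw_warning(20.0, 10.0))  # TTC = 2.0 s -> True
print(fcw_warning(50.0, 10.0))  # TTC = 5.0 s -> False
```

In the real example the distance and closing speed come from the fused vision/radar tracks of step 2, so the quality of the fusion directly bounds the reliability of the warning.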