Paper Title

Asynchronous Tracking-by-Detection on Adaptive Time Surfaces for Event-based Object Tracking

Paper Authors

Haosheng Chen, Qiangqiang Wu, Yanjie Liang, Xinbo Gao, Hanzi Wang

Paper Abstract

Event cameras, which are asynchronous bio-inspired vision sensors, have shown great potential in a variety of situations, such as fast motion and low illumination scenes. However, most event-based object tracking methods are designed for scenarios with untextured objects and uncluttered backgrounds, and few of them support bounding box-based object tracking. The main idea behind this work is to propose an asynchronous Event-based Tracking-by-Detection (ETD) method for generic bounding box-based object tracking. To achieve this goal, we present an Adaptive Time-Surface with Linear Time Decay (ATSLTD) event-to-frame conversion algorithm, which asynchronously and effectively warps the spatio-temporal information of asynchronous retinal events into a sequence of ATSLTD frames with clear object contours. We feed the sequence of ATSLTD frames to the proposed ETD method to perform accurate and efficient object tracking, which leverages the high temporal resolution property of event cameras. We compare the proposed ETD method with seven popular object tracking methods, which are based on conventional cameras or event cameras, and with two variants of ETD. The experimental results show the superiority of the proposed ETD method in handling various challenging environments.
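
To make the time-surface idea in the abstract concrete, the snippet below is a minimal sketch of a generic time surface with linear time decay: each pixel stores the age of its most recent event and fades linearly to zero over a window tau. It does not reproduce the paper's adaptive windowing or the full ATSLTD algorithm; the function name `linear_decay_time_surface` and the parameters `t_ref` and `tau` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def linear_decay_time_surface(events, height, width, t_ref, tau):
    """Render a slice of events as a time surface with linear time decay.

    events: iterable of (x, y, t) tuples (pixel column, row, timestamp in s).
    t_ref:  reference time, e.g. the timestamp of the newest event in the slice.
    tau:    decay window; a pixel fades linearly to 0 over tau seconds.
    """
    # Timestamp of the most recent event seen at each pixel (-inf = never fired).
    last_t = np.full((height, width), -np.inf)
    for x, y, t in events:
        if t <= t_ref and t > last_t[y, x]:
            last_t[y, x] = t

    # Linear decay: 1 for an event exactly at t_ref, 0 once it is older than tau.
    return np.clip(1.0 - (t_ref - last_t) / tau, 0.0, 1.0)

# Toy usage: three events on a 4x4 sensor, frame referenced at t = 1.0 s.
events = [(0, 0, 0.2), (1, 2, 0.7), (3, 3, 0.95)]
frame = linear_decay_time_surface(events, height=4, width=4, t_ref=1.0, tau=0.5)
print(frame)  # only the two recent events leave non-zero responses (0.4 and 0.9)
```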
