Paper Title

Exploiting Inter-Frame Regional Correlation for Efficient Action Recognition

Paper Authors

Yuecong Xu, Jianfei Yang, Kezhi Mao, Jianxiong Yin, Simon See

Paper Abstract

Temporal feature extraction is an important issue in video-based action recognition. Optical flow is a popular method for extracting temporal features, and it produces excellent performance thanks to its capacity to capture pixel-level correlation information between consecutive frames. However, such pixel-level correlation is extracted at the cost of high computational complexity and large storage resources. In this paper, we propose a novel temporal feature extraction method, named Attentive Correlated Temporal Feature (ACTF), which explores inter-frame correlation within a certain region. The proposed ACTF exploits both bilinear and linear correlation between successive frames at the regional level. Our method has the advantage of achieving performance comparable to or better than optical flow-based methods while avoiding the introduction of optical flow. Experimental results demonstrate that our proposed method achieves state-of-the-art performance of 96.3% on the UCF101 and 76.3% on the HMDB51 benchmark datasets.
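The abstract describes ACTF only at a high level. The PyTorch sketch below is a hypothetical illustration of the general idea of regional (rather than pixel-level) inter-frame correlation, not the authors' implementation: the pooling grid, the element-wise product as the bilinear term, the temporal difference as the linear term, and the 1x1-convolution fusion (the `RegionalCorrelation` module, `regions`, and `fuse` names) are all assumptions, and the attentive weighting of the full ACTF module is omitted.

```python
# Hypothetical sketch of regional inter-frame correlation (not the paper's code).
# Assumes per-frame CNN feature maps of shape (B, C, H, W); region size,
# pooling choice, and fusion are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RegionalCorrelation(nn.Module):
    def __init__(self, channels: int, regions: int = 7):
        super().__init__()
        self.regions = regions  # pool each frame into a regions x regions grid
        # 1x1 conv to fuse the bilinear and linear correlation terms
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, feat_t: torch.Tensor, feat_t1: torch.Tensor) -> torch.Tensor:
        # Regional features: average-pool spatial maps into a coarse grid,
        # so correlation is computed per region rather than per pixel.
        r_t = F.adaptive_avg_pool2d(feat_t, self.regions)    # (B, C, R, R)
        r_t1 = F.adaptive_avg_pool2d(feat_t1, self.regions)  # (B, C, R, R)

        # Bilinear (multiplicative) correlation between corresponding regions.
        bilinear = r_t * r_t1                                 # (B, C, R, R)
        # Linear correlation, modelled here as the temporal difference.
        linear = r_t1 - r_t                                   # (B, C, R, R)

        # Fuse both correlation terms into a single temporal feature map.
        return self.fuse(torch.cat([bilinear, linear], dim=1))


if __name__ == "__main__":
    frames = torch.randn(2, 8, 256, 14, 14)  # (batch, time, C, H, W)
    module = RegionalCorrelation(channels=256)
    # Correlate each pair of consecutive frames.
    temporal = [module(frames[:, i], frames[:, i + 1])
                for i in range(frames.size(1) - 1)]
    print(temporal[0].shape)  # torch.Size([2, 256, 7, 7])
```

Because the correlation is aggregated over a coarse regional grid instead of every pixel pair, such a module avoids the heavy cost of computing and storing optical flow, which is the trade-off the abstract highlights.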
