Paper Title

EASNet: Searching Elastic and Accurate Network Architecture for Stereo Matching

Authors

Qiang Wang, Shaohuai Shi, Kaiyong Zhao, Xiaowen Chu

Abstract

Recent advanced studies have spent considerable human effort on optimizing network architectures for stereo matching, but they hardly achieve both high accuracy and fast inference speed. To ease the workload of network design, neural architecture search (NAS) has been applied with great success to various sparse prediction tasks, such as image classification and object detection. However, existing NAS studies on dense prediction tasks, especially stereo matching, still cannot be efficiently and effectively deployed on devices of different computing capabilities. To this end, we propose to train an elastic and accurate network for stereo matching (EASNet) that supports various 3D architectural settings on devices with different computing capabilities. Given the deployment latency constraint of the target device, we can quickly extract a sub-network from the full EASNet without additional training, while the accuracy of the sub-network is still maintained. Extensive experiments show that our EASNet outperforms both state-of-the-art human-designed and NAS-based architectures on the Scene Flow and MPI Sintel datasets in terms of model accuracy and inference speed. In particular, deployed on an inference GPU, EASNet achieves a new SOTA EPE of 0.73 on the Scene Flow dataset within 100 ms, which is 4.5$\times$ faster than LEAStereo while producing a better-quality model.
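The extraction step described above, i.e., picking a deployment-ready sub-network from the trained elastic supernet under a latency budget with no retraining, follows the once-for-all style of NAS. Below is a minimal sketch of that idea; the search-space fields (`depth`, `width`, `scale`), the toy latency model, and all function names are illustrative assumptions rather than the paper's actual implementation.

```python
import random

# Hypothetical elastic search space in the spirit of EASNet's 3D architectural
# settings. The field names and value ranges are illustrative assumptions.
SEARCH_SPACE = {
    "depth": [2, 3, 4],         # e.g., number of 3D conv layers per stage
    "width": [0.5, 0.75, 1.0],  # e.g., channel expansion ratio
    "scale": [3, 4],            # e.g., number of cost-volume scales
}


def sample_config(rng: random.Random) -> dict:
    """Sample one sub-network configuration from the elastic space."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}


def predicted_latency_ms(config: dict) -> float:
    """Toy latency model. A real system would measure latency on the target
    device or use a pre-built latency lookup table."""
    return 20.0 * config["depth"] * config["width"] * config["scale"]


def extract_subnet(latency_budget_ms: float, n_trials: int = 1000, seed: int = 0) -> dict:
    """Randomly search for the largest sub-network fitting the latency budget.

    No retraining is needed: the chosen configuration only selects which
    shared supernet weights to activate at inference time. Predicted latency
    is used here as a crude proxy for sub-network capacity.
    """
    rng = random.Random(seed)
    best_cfg, best_latency = None, -1.0
    for _ in range(n_trials):
        cfg = sample_config(rng)
        latency = predicted_latency_ms(cfg)
        if latency <= latency_budget_ms and latency > best_latency:
            best_cfg, best_latency = cfg, latency
    return best_cfg


if __name__ == "__main__":
    # e.g., the 100 ms inference-GPU budget mentioned in the abstract
    print(extract_subnet(latency_budget_ms=100.0))
```

A real deployment would replace `predicted_latency_ms` with latencies measured on the target GPU and rank feasible candidates by validation EPE rather than by capacity alone.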
