Paper Title

Force-Aware Interface via Electromyography for Natural VR/AR Interaction

Paper Authors

Yunxiang Zhang, Benjamin Liang, Boyuan Chen, Paul Torrens, S. Farokh Atashzar, Dahua Lin, Qi Sun

Paper Abstract

While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
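
To make the described pipeline concrete, below is a minimal illustrative sketch of how per-finger force regression from windowed, multi-channel surface EMG could be set up. This is not the authors' published architecture: the channel count (8), window length (256 samples), 1D-CNN layout, and the class name `EMGForceDecoder` are all assumptions chosen only to show the shape of such a decoder.

```python
# Hypothetical sketch: regress per-finger forces from windowed multi-channel surface EMG.
# NOT the paper's model; channel count, window size, and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class EMGForceDecoder(nn.Module):
    """Maps an EMG window (channels x samples) to one continuous force value per finger."""
    def __init__(self, num_channels: int = 8, num_fingers: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(num_channels, 32, kernel_size=9, padding=4),  # temporal filtering per channel group
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                                # collapse the time axis
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, num_fingers),                             # one force estimate per finger
        )

    def forward(self, emg: torch.Tensor) -> torch.Tensor:
        # emg: (batch, num_channels, window_len), e.g. band-pass filtered and rectified
        return self.head(self.features(emg))

# Toy usage: a batch of 4 random EMG windows -> 4 x 5 predicted finger forces.
model = EMGForceDecoder()
forces = model(torch.randn(4, 8, 256))
print(forces.shape)  # torch.Size([4, 5])
```

In a real-time setting, such a decoder would be applied to a sliding window of the EMG stream and, per the abstract, fine-tuned with a small amount of calibration data for each new user.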
