Paper Title

CTrGAN: Cycle Transformers GAN for Gait Transfer

Paper Authors

Shahar Mahpod, Noam Gaash, Hay Hoffman, Gil Ben-Artzi

Paper Abstract

We introduce a novel approach for gait transfer from unconstrained videos in-the-wild. In contrast to motion transfer, the objective here is not to imitate the source's motions by the target, but rather to replace the walking source with the target while transferring the target's typical gait. Our approach can be trained only once with multiple sources and is able to transfer the gait of the target from unseen sources, eliminating the need to retrain for each new source independently. Furthermore, we propose a novel metric for gait transfer, based on gait recognition models, that enables quantifying the quality of the transferred gait, and we show that existing techniques yield a discrepancy that can be easily detected. We introduce Cycle Transformers GAN (CTrGAN), which consists of a decoder and an encoder, both Transformers, where the attention is on the temporal domain between complete images rather than the spatial domain between patches. Using a widely-used gait recognition dataset, we demonstrate that our approach is capable of producing over an order of magnitude more realistic personalized gaits than existing methods, even when used with sources that were not available during training. As part of our solution, we present a detector that determines whether a video is real or generated by our model.
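The abstract's central architectural point is that CTrGAN's attention operates in the temporal domain, between embeddings of complete images, rather than in the spatial domain between patches of a single image (as in ViT-style models). The sketch below is a minimal illustration of that idea only, not the paper's implementation; it assumes PyTorch, and all module names and hyperparameters (FrameEncoder, TemporalTransformer, embed_dim, the toy CNN backbone) are hypothetical.

```python
# Illustrative sketch (not the authors' code), assuming PyTorch.
# Idea from the abstract: self-attention across frames in time, where each
# token embeds a *complete frame*, rather than across spatial patches.
import torch
import torch.nn as nn


class FrameEncoder(nn.Module):
    """Maps each full frame to a single embedding vector (one token per frame)."""

    def __init__(self, in_channels: int = 3, embed_dim: int = 256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dimensions to 1x1
        )
        self.proj = nn.Linear(128, embed_dim)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, channels, height, width)
        b, t, c, h, w = frames.shape
        x = self.backbone(frames.reshape(b * t, c, h, w)).flatten(1)  # (b*t, 128)
        return self.proj(x).reshape(b, t, -1)                         # (b, t, embed_dim)


class TemporalTransformer(nn.Module):
    """Self-attention across frames (temporal domain), not across patches."""

    def __init__(self, embed_dim: int = 256, num_heads: int = 4, num_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, frame_tokens: torch.Tensor) -> torch.Tensor:
        # frame_tokens: (batch, time, embed_dim); attention weights relate frames to frames.
        return self.encoder(frame_tokens)


if __name__ == "__main__":
    video = torch.randn(2, 8, 3, 64, 64)    # 2 clips, 8 frames each
    tokens = FrameEncoder()(video)          # (2, 8, 256): one token per frame
    fused = TemporalTransformer()(tokens)   # attention over the 8 frames in time
    print(fused.shape)                      # torch.Size([2, 8, 256])
```

In a patch-based Transformer the token sequence would come from splitting a single frame into patches; here each token summarizes an entire frame, so the attention weights relate frames across time, which is what the gait-transfer setting requires.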
