Paper Title

Unknown Face Presentation Attack Detection via Localised Learning of Multiple Kernels

Paper Authors

Arashloo, Shervin Rahimzadeh

Paper Abstract

The paper studies face spoofing, a.k.a. presentation attack detection (PAD) in the demanding scenarios of unknown types of attack. While earlier studies have revealed the benefits of ensemble methods, and in particular, a multiple kernel learning approach to the problem, one limitation of such techniques is that they typically treat the entire observation space similarly and ignore any variability and local structure inherent to the data. This work studies this aspect of the face presentation attack detection problem in relation to multiple kernel learning in a one-class setting to benefit from intrinsic local structure in bona fide face samples. More concretely, inspired by the success of the one-class Fisher null formalism, we formulate a convex localised multiple kernel learning algorithm by imposing a joint matrix-norm constraint on the collection of local kernel weights and infer locally adaptive weights for zero-shot one-class unseen attack detection. We present a theoretical study of the proposed localised MKL algorithm using Rademacher complexities to characterise its generalisation capability and demonstrate the advantages of the proposed technique over some other options. An assessment of the proposed approach on general object image datasets illustrates its efficacy for abnormality and novelty detection, while the results of the experiments on face PAD datasets verify its potential in detecting unknown/unseen face presentation attacks.
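
The abstract only outlines the approach, so the sketch below is meant purely as an illustration of the general idea of combining multiple base kernels with locally varying weights and scoring test samples with a one-class kernel classifier. It is not the paper's algorithm: the Gaussian base kernels, the soft gating functions, the fixed uniform weight matrix `beta`, and the regularised least-squares one-class scorer are all assumptions introduced here for illustration. The paper instead learns the local kernel weights through a convex optimisation under a joint matrix-norm constraint and builds on the one-class Fisher null formalism.

```python
import numpy as np


def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) base kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def gating_weights(X, centres, tau=1.0):
    """Soft assignment of each sample to local regions around `centres`.

    Hypothetical gating scheme for illustration; the paper instead infers
    locally adaptive kernel weights via convex optimisation under a joint
    matrix-norm constraint.
    """
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / tau)
    return w / w.sum(axis=1, keepdims=True)


def localised_kernel(X, Y, gammas, beta, centres, tau=1.0):
    """Locally weighted combination of base kernels:

        k(x, y) = sum_r sum_m beta[r, m] * g_r(x) * g_r(y) * k_m(x, y)

    where g_r(.) is the gating value for region r and beta holds the
    per-region, per-kernel weights.
    """
    Gx = gating_weights(X, centres, tau)            # (n_x, R)
    Gy = gating_weights(Y, centres, tau)            # (n_y, R)
    K = np.zeros((X.shape[0], Y.shape[0]))
    for m, gamma in enumerate(gammas):
        Km = rbf_kernel(X, Y, gamma)
        for r in range(centres.shape[0]):
            K += beta[r, m] * np.outer(Gx[:, r], Gy[:, r]) * Km
    return K


def fit_one_class(K_train, delta=1e-3):
    """Regress all bona fide training samples onto a common target of 1.

    A kernel regularised least-squares surrogate for a Fisher null-space
    one-class classifier; a sketch, not the paper's exact solver.
    """
    n = K_train.shape[0]
    return np.linalg.solve(K_train + delta * np.eye(n), np.ones(n))


# Hypothetical usage: higher scores indicate closer agreement with the
# bona fide (genuine face) class; low scores flag a potential attack.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 8))                  # bona fide features
X_test = rng.normal(size=(10, 8))                   # unseen test features
centres = X_train[rng.choice(50, size=4, replace=False)]
gammas = [0.1, 1.0, 10.0]                           # base kernel widths
beta = np.full((4, len(gammas)), 1.0 / (4 * len(gammas)))  # uniform weights
K_tr = localised_kernel(X_train, X_train, gammas, beta, centres)
alpha = fit_one_class(K_tr)
scores = localised_kernel(X_test, X_train, gammas, beta, centres) @ alpha
```

The uniform `beta` above is only a placeholder: the point of the paper is precisely to learn these region-wise kernel weights, and its Rademacher-complexity analysis concerns that learned, norm-constrained solution rather than any fixed choice of weights.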
