Paper Title
No Free Lunch for Quantum Machine Learning
Paper Authors
Paper Abstract
The ultimate limits for the quantum machine learning of quantum data are investigated by obtaining a generalisation of the celebrated No Free Lunch (NFL) theorem. We find a lower bound on the quantum risk (the probability that a trained hypothesis is incorrect when presented with a random input) of a quantum learning algorithm trained via pairs of input and output states, when averaged over training pairs and unitaries. The bound is illustrated using a recently introduced QNN architecture.
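As an illustrative sketch only (the notation below is our own assumption and is not taken from the paper): the quantum risk of a trained hypothesis channel h with respect to a target unitary U can be formalised as an average distance between the target and hypothesis outputs over Haar-random input states,

\[
  R_U(h) \;=\; \frac{1}{4}\int d\mu(\psi)\,
    \big\|\, U|\psi\rangle\langle\psi|U^{\dagger} \;-\; h\big(|\psi\rangle\langle\psi|\big) \,\big\|_{1}^{2},
\]

with the NFL-type statement then lower-bounding the average \(\mathbb{E}_{U}\,\mathbb{E}_{S}\big[\,R_U(h_S)\,\big]\) over training sets \(S\) of input–output state pairs and over target unitaries \(U\).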