Paper Title

Near-Optimal Fidelity in Quantum Circuits through Incorporating Efficient Real-time Error Based Heuristics in Compiler Mappings

Authors

Muttakin, Md Nurul

Abstract

To run a quantum program on a real device, the compiler maps logical qubits to physical qubits. This is the most crucial step in compiling a quantum circuit, because the fidelity of the circuit depends heavily on the mapping. However, the qubit mapping problem is NP-complete, so we must resort to heuristics to find high-fidelity mappings. In this paper, we focus on efficient heuristic techniques that incorporate real-time error feedback and device connectivity information to achieve high-fidelity mappings of quantum circuits. We perform an extensive analysis and experimental study based on two baseline algorithms, experimenting with various combinations of error rates and heuristic techniques. As a result, we design elegant techniques that account for all types of real-time error feedback together with connectivity information. We show that our best heuristic performs 1.62x better (on average) than one baseline and 1.934x better (on average) than the other baseline on random benchmarks. Finally, we compare our best heuristic (CAES) with the state-of-the-art heuristic-based mapping algorithm on representative benchmarks and find that CAES performs 1.7x better (on average) than the state of the art in terms of success rate.
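To make the idea of an error-aware qubit mapping concrete, the sketch below shows a simple greedy initial-layout heuristic: logical qubits that participate in the most gates are placed on the best-connected, lowest-error physical qubits. This is only an illustrative toy, not the paper's CAES algorithm; the device topology, error rates, and function names are invented for the example.

```python
# Illustrative sketch (NOT the paper's CAES heuristic): a greedy
# error-aware initial mapping. Busier logical qubits are assigned to
# physical qubits with higher connectivity and lower readout error.
# All device data below is made up for illustration.

def greedy_error_aware_mapping(gate_counts, coupling, readout_error):
    """Map logical qubits (by descending gate count) onto physical
    qubits ranked by connectivity degree, breaking ties by low error."""
    degree = {q: 0 for q in readout_error}
    for a, b in coupling:
        degree[a] += 1
        degree[b] += 1
    # Rank physical qubits: highest degree first, then lowest error.
    physical = sorted(readout_error, key=lambda q: (-degree[q], readout_error[q]))
    # Rank logical qubits by how often they appear in gates.
    logical = sorted(gate_counts, key=gate_counts.get, reverse=True)
    return {lq: physical[i] for i, lq in enumerate(logical)}

# Toy 5-qubit device with a star topology centred on physical qubit 2.
coupling = [(0, 2), (1, 2), (2, 3), (2, 4)]
readout_error = {0: 0.03, 1: 0.02, 2: 0.01, 3: 0.05, 4: 0.04}
gate_counts = {"q0": 7, "q1": 3, "q2": 5}  # logical-qubit usage in the circuit

mapping = greedy_error_aware_mapping(gate_counts, coupling, readout_error)
print(mapping)  # the busiest logical qubit (q0) lands on the hub, physical qubit 2
```

A real compiler pass would go further, e.g. weighting two-qubit gate errors along coupling edges and re-querying calibration data at run time, which is the kind of real-time error feedback the paper studies.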
