Paper Title

Incorporating Multi-armed Bandit with Local Search for MaxSAT

Paper Authors

Jiongzhi Zheng, Kun He, Jianrong Zhou, Yan Jin, Chu-Min Li, Felip Manyà

Paper Abstract

Partial MaxSAT (PMS) and Weighted PMS (WPMS) are two practical generalizations of the MaxSAT problem. In this paper, we propose a local search algorithm for these problems, called BandHS, which applies two multi-armed bandits to guide the search directions when escaping local optima. One bandit is combined with all the soft clauses to help the algorithm select appropriate soft clauses to satisfy, and the other is combined with all the literals in hard clauses to help the algorithm select appropriate literals to satisfy the hard clauses. These two bandits can improve the algorithm's search ability in both feasible and infeasible solution spaces. We further propose an initialization method for (W)PMS that prioritizes both unit and binary clauses when producing the initial solutions. Extensive experiments demonstrate the excellent performance and generalization capability of our proposed methods, which greatly boost the state-of-the-art local search algorithm SATLike3.0 and the state-of-the-art SAT-based incomplete solver NuWLS-c.
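
The abstract only describes the bandit mechanism at a high level. As a rough, hedged illustration of the general idea, the sketch below shows a UCB1-style bandit whose arms are soft clauses: when the search escapes a local optimum, it picks one currently falsified soft clause to try to satisfy, and is later rewarded according to the improvement observed. This is not the authors' BandHS implementation; every name in it (SoftClauseBandit, select, update, the reward definition) is a hypothetical placeholder.

```python
# Illustrative sketch only (assumed names and reward), not the BandHS code.
import math
import random


class SoftClauseBandit:
    """Treat each soft clause as an arm; balance exploration and exploitation."""

    def __init__(self, num_soft_clauses: int, exploration: float = 1.4):
        self.counts = [0] * num_soft_clauses    # times each clause was chosen
        self.values = [0.0] * num_soft_clauses  # running average reward per clause
        self.total_pulls = 0
        self.c = exploration

    def select(self, falsified: list[int]) -> int:
        """Pick one currently falsified soft clause to satisfy (UCB1 rule)."""
        # Try each candidate at least once before applying the UCB formula.
        untried = [i for i in falsified if self.counts[i] == 0]
        if untried:
            return random.choice(untried)
        return max(
            falsified,
            key=lambda i: self.values[i]
            + self.c * math.sqrt(math.log(self.total_pulls) / self.counts[i]),
        )

    def update(self, clause: int, reward: float) -> None:
        """Feed back a reward, e.g. the gain in total satisfied soft weight."""
        self.counts[clause] += 1
        self.total_pulls += 1
        self.values[clause] += (reward - self.values[clause]) / self.counts[clause]


# Hypothetical usage inside a local-search loop:
# bandit = SoftClauseBandit(num_soft_clauses=len(soft_clauses))
# clause_id = bandit.select(falsified_soft_clause_ids)
# ... flip a variable that satisfies soft_clauses[clause_id] ...
# bandit.update(clause_id, reward=cost_improvement)
```

The second bandit described in the abstract, over literals of hard clauses, would presumably follow the same select/update pattern with literals as arms; the paper itself should be consulted for the actual arm and reward definitions.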
