Paper Title

A Context-Integrated Transformer-Based Neural Network for Auction Design

Paper Authors

Zhijian Duan, Jingwu Tang, Yutong Yin, Zhe Feng, Xiang Yan, Manzil Zaheer, Xiaotie Deng

Paper Abstract

One of the central problems in auction design is developing an incentive-compatible mechanism that maximizes the auctioneer's expected revenue. While theoretical approaches have encountered bottlenecks in multi-item auctions, recently, there has been much progress on finding the optimal mechanism through deep learning. However, these works either focus on a fixed set of bidders and items, or restrict the auction to be symmetric. In this work, we overcome such limitations by factoring \emph{public} contextual information of bidders and items into the auction learning framework. We propose $\mathtt{CITransNet}$, a context-integrated transformer-based neural network for optimal auction design, which maintains permutation-equivariance over bids and contexts while being able to find asymmetric solutions. We show by extensive experiments that $\mathtt{CITransNet}$ can recover the known optimal solutions in single-item settings, outperform strong baselines in multi-item auctions, and generalize well to cases other than those in training.
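The abstract describes a network that combines the bid matrix with public bidder and item contexts while staying permutation-equivariant over both bidders and items. The sketch below illustrates that general idea in PyTorch, assuming a simple encoder with separate attention passes along the bidder axis and the item axis; the module name `BidContextEncoder`, the feature dimensions, and the row/column attention layout are illustrative assumptions, not the paper's actual $\mathtt{CITransNet}$ architecture.

```python
import torch
import torch.nn as nn


class BidContextEncoder(nn.Module):
    """Illustrative sketch (not the paper's exact architecture): fuse a bid
    matrix with public bidder/item context vectors, then attend along the
    bidder axis and the item axis so the output is permutation-equivariant
    with respect to both bidders and items."""

    def __init__(self, d_bidder_ctx, d_item_ctx, d_model=64, n_heads=4):
        super().__init__()
        # Project (bid, bidder context, item context) into a shared feature space.
        self.embed = nn.Linear(1 + d_bidder_ctx + d_item_ctx, d_model)
        # Self-attention over bidders (rows) and over items (columns).
        self.row_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, bids, bidder_ctx, item_ctx):
        # bids:       (batch, n, m)            bid of bidder i for item j
        # bidder_ctx: (batch, n, d_bidder_ctx) public bidder features
        # item_ctx:   (batch, m, d_item_ctx)   public item features
        batch, n, m = bids.shape
        x = torch.cat(
            [
                bids.unsqueeze(-1),                             # (batch, n, m, 1)
                bidder_ctx.unsqueeze(2).expand(-1, -1, m, -1),  # (batch, n, m, d_b)
                item_ctx.unsqueeze(1).expand(-1, n, -1, -1),    # (batch, n, m, d_i)
            ],
            dim=-1,
        )
        h = self.embed(x)                                       # (batch, n, m, d_model)

        # Attention across bidders for each item (items folded into the batch dim).
        h_rows = h.permute(0, 2, 1, 3).reshape(batch * m, n, -1)
        h_rows, _ = self.row_attn(h_rows, h_rows, h_rows)
        h = h_rows.reshape(batch, m, n, -1).permute(0, 2, 1, 3)

        # Attention across items for each bidder.
        h_cols = h.reshape(batch * n, m, -1)
        h_cols, _ = self.col_attn(h_cols, h_cols, h_cols)
        h = h_cols.reshape(batch, n, m, -1)

        # Per-(bidder, item) representation; downstream allocation and payment
        # heads (not shown) would map this to auction outcomes.
        return self.out(h)


# Minimal usage example with hypothetical sizes: 8 auctions, 3 bidders, 5 items.
enc = BidContextEncoder(d_bidder_ctx=10, d_item_ctx=10)
reps = enc(torch.rand(8, 3, 5), torch.rand(8, 3, 10), torch.rand(8, 5, 10))
print(reps.shape)  # torch.Size([8, 3, 5, 64])
```

Because every layer either acts pointwise on (bidder, item) cells or attends within a row or column, permuting the bidders or the items permutes the output in the same way, which is the equivariance property the abstract highlights.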
