Paper Title
Knowledge Distillation based Contextual Relevance Matching for E-commerce Product Search

Paper Authors

Ziyang Liu, Chaokun Wang, Hao Feng, Lingfei Wu, Liqun Yang

Abstract
Online relevance matching is an essential task in e-commerce product search: it boosts the utility of the search engine and ensures a smooth user experience. Previous work adopts either classical relevance matching models or Transformer-style models to address it. However, these approaches ignore the inherent bipartite graph structures that are ubiquitous in e-commerce product search logs, and Transformer-style models are too inefficient to deploy online. In this paper, we design an efficient knowledge distillation framework for e-commerce relevance matching that integrates the respective advantages of Transformer-style models and classical relevance matching models. In particular, for the core student model of the framework, we propose a novel method based on $k$-order relevance modeling. Experimental results on large-scale real-world datasets (ranging in size from 6 to 174 million) show that the proposed method significantly improves prediction accuracy with respect to human relevance judgments. We deploy our method on an anonymized online search platform; A/B testing shows that it significantly improves UV-value by 5.7% under price-sort mode.
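The framework's core idea is to distill a heavyweight Transformer-style teacher into a lightweight student that is cheap enough to serve online. As a rough, hedged sketch of the generic distillation objective (the paper's exact loss, temperature, and architectures are not specified here; all function names are illustrative), the student is trained to match the teacher's temperature-softened output distribution:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence KL(teacher || student) between the softened
    teacher and student distributions -- the standard knowledge
    distillation objective (a sketch, not the paper's exact loss)."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

When the student reproduces the teacher's logits exactly, the loss is zero; any disagreement in the softened distributions yields a positive penalty, so in practice this term is combined with a supervised loss on labeled relevance judgments.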
