Paper Title

State-of-the-Art Augmented NLP Transformer models for direct and single-step retrosynthesis

Authors

Igor V. Tetko, Pavel Karpov, Ruud Van Deursen, Guillaume Godin

Abstract

We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds using a text-like representation of chemical reactions (SMILES) and a Natural Language Processing (NLP) neural network Transformer architecture. We showed that data augmentation, a powerful method used in image processing, eliminated the effect of data memorization by neural networks and improved their performance for the prediction of new sequences. This effect was observed when augmentation was applied simultaneously to both the input and the target data. A top-5 accuracy of 84.8% was achieved for the prediction of the largest fragment (thus identifying the principal transformation for classical retrosynthesis) on the USPTO-50k test dataset by combining SMILES augmentation with a beam search algorithm. The same approach provided significantly better results for the prediction of direct reactions from the single-step USPTO-MIT test set. Our model achieved 90.6% top-1 and 96.1% top-5 accuracy on its challenging mixed set and 97% top-5 accuracy on the USPTO-MIT separated set. It also significantly improved results for single-step retrosynthesis on the USPTO-full set for both top-1 and top-10 accuracies. The appearance frequency of the most abundantly generated SMILES was well correlated with the prediction outcome and can be used as a measure of the quality of reaction prediction.
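As a concrete illustration of the SMILES augmentation idea in the abstract, the minimal Python sketch below uses RDKit to enumerate random (non-canonical) SMILES for both the source (product) and target (reactants) sides of a retrosynthesis pair, reflecting the point that augmentation is applied to input and target simultaneously. The helper names and example molecules are hypothetical; this is not the paper's actual augmentation or training pipeline.

```python
# Minimal sketch of SMILES augmentation, assuming RDKit is installed.
from rdkit import Chem


def random_smiles(smiles: str, n: int = 5) -> list:
    """Return up to n distinct random (non-canonical) SMILES for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return []
    # doRandom=True makes RDKit write the SMILES starting from a random atom ordering.
    variants = {Chem.MolToSmiles(mol, canonical=False, doRandom=True) for _ in range(n)}
    return sorted(variants)


def augment_retro_pair(product: str, reactants: str, n: int = 5) -> list:
    """Pair augmented source (product) and target (reactants) strings for training."""
    sources = random_smiles(product, n) or [product]
    targets = random_smiles(reactants, n) or [reactants]
    return list(zip(sources, targets))


if __name__ == "__main__":
    # Hypothetical retrosynthesis example: aspirin as the product,
    # salicylic acid + acetic anhydride as the target reactants.
    pairs = augment_retro_pair(
        "CC(=O)Oc1ccccc1C(=O)O",
        "O=C(O)c1ccccc1O.CC(=O)OC(C)=O",
    )
    for src, tgt in pairs:
        print(src, ">>", tgt)
```

In the same spirit, counting how often the beam search reproduces the same prediction across augmented inputs gives the frequency-based quality measure mentioned at the end of the abstract.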
