Application of Transformers to Chemical Synthesis.


Authors: Shuangxi Gu, Dong Jin, Yuli Liang, Haifeng Wang, Zihao Xiong, Xiaojie Yang, Jie Zeng

Language: English

Classification:

Publication: Switzerland : Molecules (Basel, Switzerland), 2025

Physical description:

Collection: NCBI

ID: 78458

Efficient chemical synthesis is critical for the production of organic chemicals, particularly in the pharmaceutical industry. Leveraging machine learning to predict chemical synthesis and improve development efficiency has become a significant research focus in modern chemistry. Among various machine learning models, the Transformer, a leading model in natural language processing, has revolutionized numerous fields due to its powerful feature-extraction and representation-learning capabilities. Recent applications have demonstrated that Transformer models can also significantly enhance performance in chemical synthesis tasks, particularly in reaction prediction and retrosynthetic planning. This article provides a comprehensive review of the applications and innovations of Transformer models in the qualitative prediction tasks of chemical synthesis, focusing on technical approaches, performance advantages, and the challenges associated with applying the Transformer architecture to chemical reactions. Furthermore, we discuss future directions for improving the applications of Transformer models in chemical synthesis.
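The abstract describes reaction prediction and retrosynthesis as sequence tasks for a Transformer. A common framing (not specific to this article) treats SMILES strings as token sequences and trains an encoder-decoder model to map reactants to products. The sketch below is a minimal, hypothetical illustration of that setup using PyTorch's built-in `nn.Transformer`; the toy character-level vocabulary, model sizes, and example molecules are illustrative assumptions, not details from the reviewed works (real systems use regex-based SMILES tokenizers and much larger models).

```python
# Hypothetical sketch: reaction prediction as SMILES-to-SMILES
# sequence transduction with a Transformer. All sizes are toy values.
import torch
import torch.nn as nn

# Toy character-level SMILES vocabulary (real work uses a
# regex-based SMILES tokenizer covering the full grammar).
VOCAB = ["<pad>", "<bos>", "<eos>", "C", "O", "N", "(", ")", "=", "1", "2"]
stoi = {t: i for i, t in enumerate(VOCAB)}

def encode(smiles: str) -> torch.Tensor:
    """Map a SMILES string to a 1-D tensor of token ids."""
    ids = [stoi["<bos>"]] + [stoi[ch] for ch in smiles] + [stoi["<eos>"]]
    return torch.tensor(ids)

d_model = 32
embed = nn.Embedding(len(VOCAB), d_model)
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       dim_feedforward=64, batch_first=True)
out_proj = nn.Linear(d_model, len(VOCAB))  # logits over SMILES tokens

# Source: reactant SMILES; target: product SMILES (teacher forcing
# and causal masking are omitted for brevity).
src = embed(encode("CC(=O)O").unsqueeze(0))  # acetic acid, 9 tokens
tgt = embed(encode("CC(=O)N").unsqueeze(0))  # acetamide, 9 tokens
logits = out_proj(model(src, tgt))
print(logits.shape)  # (batch, target_length, vocab_size)
```

In practice such a model is trained with cross-entropy over the next product token, and retrosynthesis simply reverses the direction, mapping product SMILES back to reactant SMILES.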

LIBRARY - HUTECH (Ho Chi Minh City University of Technology)

Tel: (028) 36225755 | Email: tt.thuvien@hutech.edu.vn
