MOLGAECL: Molecular Graph Contrastive Learning via Graph Auto-Encoder Pretraining and Fine-Tuning Based on Drug-Drug Interaction Prediction.


Authors: Shi-Hong Chen, Lin-Xuan Hou, Yu Li, Cheng-Gang Mi, Hai-Cheng Yi, Zhu-Hong You, Yang Yuan, Jia Zheng

Language: eng

Classification: 551.5271 Meteorology

Publication: United States: Journal of Chemical Information and Modeling, 2025


Collection: NCBI

ID: 694332

Drug-drug interactions influence drug efficacy and patient prognosis, making their prediction of substantial research value. Some existing methods struggle with sparse networks or lack the capability to integrate data from multiple sources. In this study, we propose MOLGAECL, a novel approach based on graph autoencoder pretraining and molecular graph contrastive learning. First, a large number of unlabeled molecular graphs are pretrained using a graph autoencoder, with graph contrastive learning applied to obtain more accurate drug representations. Subsequently, full-parameter fine-tuning is performed on different data sets to adapt the model to drug interaction prediction tasks. To assess the effectiveness of MOLGAECL, comparison experiments with state-of-the-art methods, fine-tuning comparison experiments, and parameter sensitivity analyses are conducted. Extensive experimental results demonstrate the superior performance of MOLGAECL: it achieves an average improvement of 6.13% in accuracy, 6.14% in AUROC, and 8.16% in AUPRC across all data sets.
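The abstract outlines a two-stage pipeline: graph autoencoder pretraining with a contrastive objective on unlabeled molecular graphs, followed by full-parameter fine-tuning for drug-drug interaction prediction. Below is a minimal PyTorch sketch of the pretraining stage only, written under stated assumptions; the names (GraphAutoEncoder, nt_xent_loss, random_graph), the layer sizes, and the augmentation by feature dropout are illustrative choices, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    # One graph convolution: H' = ReLU(A_hat @ H @ W), with A_hat the adjacency plus self-loops.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_hat):
        return F.relu(a_hat @ self.lin(x))


class GraphAutoEncoder(nn.Module):
    # Two-layer GCN encoder with an inner-product decoder that reconstructs the adjacency matrix.
    def __init__(self, in_dim, hid_dim, emb_dim):
        super().__init__()
        self.enc1 = GCNLayer(in_dim, hid_dim)
        self.enc2 = GCNLayer(hid_dim, emb_dim)

    def encode(self, x, a_hat):
        return self.enc2(self.enc1(x, a_hat), a_hat)

    def forward(self, x, a_hat):
        z = self.encode(x, a_hat)              # per-atom embeddings
        adj_rec = torch.sigmoid(z @ z.t())     # reconstructed adjacency
        return z, adj_rec


def nt_xent_loss(z1, z2, tau=0.5):
    # Simplified NT-Xent: matching graph views are positives, all other pairs in the batch are negatives.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)


def random_graph(n_atoms, feat_dim, p=0.2):
    # Stand-in for a real molecular graph: random atom features and a symmetric adjacency matrix.
    x = torch.randn(n_atoms, feat_dim)
    adj = (torch.rand(n_atoms, n_atoms) < p).float()
    adj = ((adj + adj.t()) > 0).float()
    adj.fill_diagonal_(0)
    return x, adj


model = GraphAutoEncoder(in_dim=16, hid_dim=32, emb_dim=8)
graphs = [random_graph(20, 16) for _ in range(4)]

recon_loss, view1, view2 = 0.0, [], []
for x, adj in graphs:
    a_hat = adj + torch.eye(adj.size(0))       # self-loops; degree normalization omitted for brevity
    z, adj_rec = model(x, a_hat)
    recon_loss = recon_loss + F.binary_cross_entropy(adj_rec, adj)
    # Two stochastic views via feature dropout, mean-pooled into graph-level embeddings.
    view1.append(model.encode(F.dropout(x, 0.2), a_hat).mean(dim=0))
    view2.append(model.encode(F.dropout(x, 0.2), a_hat).mean(dim=0))

contrast_loss = nt_xent_loss(torch.stack(view1), torch.stack(view2))
loss = recon_loss / len(graphs) + contrast_loss    # joint pretraining objective
loss.backward()

The fine-tuning stage described in the abstract would then place a pairwise classification head over two drug-level embeddings and update all parameters on a labeled drug-drug interaction data set; that stage is omitted from this sketch.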
