AttentionMGT-DTA: A multi-modal drug-target affinity prediction using graph transformer and attention mechanism
2024 (English). In: Neural Networks, ISSN 0893-6080, E-ISSN 1879-2782, Vol. 169, p. 623-636. Article in journal (Refereed). Published.
Abstract [en]
The accurate prediction of drug-target affinity (DTA) is a crucial step in drug discovery and design. Traditional wet-lab experiments are expensive and time-consuming. Recently, deep learning methods have achieved notable performance improvements in DTA prediction. However, one challenge for deep learning-based models is obtaining appropriate and accurate representations of drugs and targets; in particular, effective target representations remain under-explored. Another challenge is comprehensively capturing the interaction information between drug-target pairs, which is also important for predicting DTA. In this study, we propose AttentionMGT-DTA, a multi-modal attention-based model for DTA prediction. AttentionMGT-DTA represents drugs and targets by a molecular graph and a binding pocket graph, respectively. Two attention mechanisms are adopted to integrate and exchange information between different protein modalities and between drug-target pairs. The experimental results showed that our proposed model outperformed state-of-the-art baselines on two benchmark datasets. In addition, AttentionMGT-DTA offers high interpretability by modeling the interaction strength between drug atoms and protein residues. Our code is available at https://github.com/JK-Liu7/AttentionMGT-DTA. © 2023 The Author(s)
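To make the drug-target attention idea in the abstract concrete, the following is a minimal, hypothetical PyTorch-style sketch of cross-attention between drug-atom embeddings and binding-pocket residue embeddings, with the attention weights read off as atom-residue interaction strengths. All names, dimensions, and the pooling/readout choices are illustrative assumptions and do not reproduce the authors' implementation in the linked repository.

```python
# Hypothetical sketch (not the authors' code): drug atoms attend to pocket
# residues; the attention map serves as per-pair interaction strengths.
import torch
import torch.nn as nn


class DrugTargetCrossAttention(nn.Module):
    def __init__(self, dim: int = 128, num_heads: int = 4):
        super().__init__()
        # Multi-head attention: queries = drug atoms, keys/values = residues.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Simple affinity readout over pooled drug and protein vectors.
        self.readout = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, atom_emb: torch.Tensor, res_emb: torch.Tensor):
        # atom_emb: (batch, n_atoms, dim)    e.g. graph-transformer atom features
        # res_emb:  (batch, n_residues, dim) e.g. binding-pocket residue features
        ctx, attn_w = self.attn(atom_emb, res_emb, res_emb)  # attn_w: (batch, n_atoms, n_residues)
        drug_vec = ctx.mean(dim=1)        # pool attended drug representation
        prot_vec = res_emb.mean(dim=1)    # pool protein representation
        affinity = self.readout(torch.cat([drug_vec, prot_vec], dim=-1))
        return affinity.squeeze(-1), attn_w


if __name__ == "__main__":
    model = DrugTargetCrossAttention()
    atoms = torch.randn(2, 30, 128)     # 2 drugs, 30 atoms each
    residues = torch.randn(2, 50, 128)  # 2 pockets, 50 residues each
    y_hat, weights = model(atoms, residues)
    print(y_hat.shape, weights.shape)   # torch.Size([2]) torch.Size([2, 30, 50])
```

Under these assumptions, the returned attention map can be visualized per drug-target pair to inspect which atoms and residues drive the predicted affinity, which is the kind of interpretability the abstract describes.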
Place, publisher, year, edition, pages
Oxford: Elsevier, 2024. Vol. 169, p. 623-636
Keywords [en]
Attention mechanism, Drug–target affinity, Graph neural network, Graph transformer, Multi-modal learning
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:hh:diva-52421
DOI: 10.1016/j.neunet.2023.11.018
PubMedID: 37976593
Scopus ID: 2-s2.0-85181262524
OAI: oai:DiVA.org:hh-52421
DiVA id: diva2:1829327
Note
Funding: The National Natural Science Foundation of China (62073231, 62176175, 62172076), National Research Project (2020YFC2006602), Provincial Key Laboratory for Computer Information Processing Technology, Soochow University (KJS2166), Opening Topic Fund of Big Data Intelligent Engineering Laboratory of Jiangsu Province (SDGC2157), Postgraduate Research and Practice Innovation Program of Jiangsu Province, Zhejiang Provincial Natural Science Foundation of China (Grant No. LY23F020003), and the Municipal Government of Quzhou, China (Grant No. 2023D038).
Available from: 2024-01-18 Created: 2024-01-18 Last updated: 2024-01-18 Bibliographically approved