Article

Fine-tuning of BERT Model to Accurately Predict Drug–Target Interactions

1 Department of Bio-AI Convergence, Chungnam National University, Daejeon 34134, Korea
2 College of Pharmacy, Chungnam National University, Daejeon 34134, Korea
3 Department of Computer Convergence, Chungnam National University, Daejeon 34134, Korea
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Pharmaceutics 2022, 14(8), 1710; https://doi.org/10.3390/pharmaceutics14081710
Submission received: 22 July 2022 / Revised: 9 August 2022 / Accepted: 11 August 2022 / Published: 16 August 2022
(This article belongs to the Special Issue Computational Intelligence (CI) Tools in Drug Discovery and Design)

Abstract

Identifying optimal drug candidates is crucial in drug discovery. Researchers in biology and computational science have sought to use machine learning (ML) to efficiently predict drug–target interactions (DTIs). In recent years, motivated by the success of pretrained models in natural language processing (NLP), pretrained models have been developed for chemical compounds and target proteins. This study sought to improve DTI prediction using ChemBERTa, a Bidirectional Encoder Representations from Transformers (BERT) model pretrained on chemical compounds represented in the simplified molecular-input line-entry system (SMILES), together with ProtBert, a model pretrained on target-protein amino acid sequences. The BIOSNAP, DAVIS, and BindingDB databases (DBs) were used, alone or together, for training. The final model, trained with both ChemBERTa and ProtBert on the integrated DBs, achieved the best DTI predictive performance to date in terms of the area under the receiver operating characteristic curve (AUC) and the precision–recall AUC, compared with previous models. Its performance was further verified in a case study on 13 pairs of substrates and the metabolic enzyme cytochrome P450 (CYP), for which the final model afforded excellent DTI prediction. Because real-world interactions between drugs and target proteins are expected to exhibit specific patterns, pretraining with ChemBERTa and ProtBert can capture such patterns; learning them should further enhance DTI accuracy when training employs large, well-balanced datasets that cover all relationships between drugs and target proteins.
Keywords: drug–target interaction; bidirectional encoder representations from transformers (BERT); ChemBERTa; ProtBert; pretrained model; self-supervised learning
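The architecture described in the abstract pairs a compound encoder (ChemBERTa, fed SMILES strings) with a protein encoder (ProtBert, fed amino acid sequences) and combines their embeddings to score an interaction. The following minimal sketch illustrates only the fusion-and-classification step; random vectors stand in for the pretrained embeddings, and the dimensions (768 for a ChemBERTa-style base model, 1024 for a ProtBert-style large model) and the simple concatenate-then-linear head are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the pooled embeddings the pretrained encoders would produce:
# drug_emb   ~ ChemBERTa output for a SMILES string (768-d assumed)
# protein_emb ~ ProtBert output for an amino acid sequence (1024-d assumed)
drug_emb = rng.standard_normal(768)
protein_emb = rng.standard_normal(1024)

def dti_head(drug, protein, W, b):
    """Fusion head: concatenate the two embeddings, apply one linear
    layer and a sigmoid to obtain an interaction probability."""
    x = np.concatenate([drug, protein])
    logit = x @ W + b
    return 1.0 / (1.0 + np.exp(-logit))

# Randomly initialized classifier weights (would be learned by fine-tuning
# on BIOSNAP / DAVIS / BindingDB labels in the actual pipeline).
W = rng.standard_normal(768 + 1024) * 0.01
b = 0.0

p = dti_head(drug_emb, protein_emb, W, b)
print(f"interaction probability: {p:.3f}")
```

In the full pipeline, fine-tuning would update both the head and (optionally) the encoder weights against binary interaction labels, rather than using fixed random embeddings as here.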

Share and Cite

MDPI and ACS Style

Kang, H.; Goo, S.; Lee, H.; Chae, J.-w.; Yun, H.-y.; Jung, S. Fine-tuning of BERT Model to Accurately Predict Drug–Target Interactions. Pharmaceutics 2022, 14, 1710. https://doi.org/10.3390/pharmaceutics14081710

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
