Evaluating the Effectiveness of Transformer Models for Molecular Property Prediction: A Review

Transformer models, originally successful in natural language processing, are now being applied to chemical and biological studies, excelling in areas such as molecular property prediction, materials science, and drug discovery. BERT, a Transformer-based model, has become foundational in cheminformatics, particularly for QSAR (Quantitative Structure-Activity Relationship) modeling and ADMET (Absorption, Distribution, Metabolism, Excretion, and Toxicity) evaluations in drug discovery. However, achieving higher accuracy often requires designing more complex models, which can compromise their interpretability and poses a challenge for researchers who need to understand the reasoning behind the predictions. This trade-off between accuracy and interpretability is a critical obstacle to applying black-box models to real-world problems in cheminformatics. This work compares Transformer-based models with traditional machine learning and deep learning approaches, focusing on both interpretability and performance. The goal is to highlight the strengths and limitations of each method, offering insights into their optimal use in drug discovery and materials science.
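
To make the comparison described above concrete, the sketch below contrasts a classical QSAR baseline (Morgan fingerprints with a random forest) against a Transformer route that embeds SMILES strings with a pretrained BERT-style encoder. This is an illustrative sketch only: the toy molecules, target values, checkpoint name, and hyperparameters are assumptions for demonstration, not the setup used in the paper.

```python
# Illustrative sketch: classical QSAR baseline vs. Transformer SMILES embeddings.
# All data, the checkpoint name, and hyperparameters are assumed for illustration.

import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestRegressor

# Toy SMILES strings with made-up target values (e.g. a solubility-like property).
smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
y = np.array([0.2, -1.1, -2.3, 0.5])

# --- Classical baseline: circular fingerprints + random forest ---------------
def morgan_fp(smi, radius=2, n_bits=2048):
    mol = Chem.MolFromSmiles(smi)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)
    return np.array(fp)

X_fp = np.stack([morgan_fp(s) for s in smiles])
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_fp, y)
print("RF (fingerprints) prediction for CCO:", rf.predict(X_fp[:1]))

# --- Transformer route: SMILES embeddings from a pretrained encoder ----------
# Assumes a SMILES-pretrained BERT-style checkpoint is available; the name
# below is an example choice, not the model evaluated by the authors.
import torch
from transformers import AutoTokenizer, AutoModel

checkpoint = "seyonec/ChemBERTa-zinc-base-v1"  # hypothetical checkpoint choice
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
encoder = AutoModel.from_pretrained(checkpoint)

with torch.no_grad():
    batch = tokenizer(smiles, padding=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state
    # Mean-pool token embeddings into one fixed-size vector per molecule.
    mask = batch["attention_mask"].unsqueeze(-1)
    X_emb = ((hidden * mask).sum(1) / mask.sum(1)).numpy()

# Fitting the same downstream regressor on Transformer embeddings makes the
# accuracy/interpretability comparison explicit: fingerprint bits map back to
# substructures, whereas learned embeddings are harder to attribute.
rf_emb = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_emb, y)
print("RF (embeddings) prediction for CCO:", rf_emb.predict(X_emb[:1]))
```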

Authors:
Alyssa Imani, Bens Pardamean

2024 International Conference on Information Technology and Digital Applications (ICITDA)
