Double-head transformer neural network for molecular property prediction
- PMID: 36823530
- PMCID: PMC9951429
- DOI: 10.1186/s13321-023-00700-4
Abstract
Existing deep-learning methods for molecular property prediction neglect both the generalization ability of nonlinear molecular-feature representations and the reasonable assignment of weights to molecular features, making it difficult to further improve prediction accuracy. To address these problems, this paper proposes an end-to-end double-head transformer neural network (DHTNN) for high-precision molecular property prediction. To match the data distribution of molecular datasets, DHTNN introduces a new activation function, beaf, which greatly improves the generalization ability of the nonlinear representation of molecular features. A residual network is introduced in the molecular encoding part to avoid the gradient explosion problem and ensure that the model converges quickly. A transformer based on double-head attention extracts intrinsic molecular detail features and assigns their weights appropriately, enabling high-accuracy property prediction. Tested on the MoleculeNet [1] benchmark datasets, our model shows significant performance improvements over other state-of-the-art methods.
Keywords: Deep learning; Molecular property prediction; Residual network; Transformer.
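The paper's full architecture is not given in this record, but the abstract's core idea of double-head self-attention followed by a residual (skip) connection can be illustrated with a minimal sketch. All names, dimensions, and the toy input below are assumptions for illustration, not the authors' implementation; it splits each token's features into two halves ("heads"), runs scaled dot-product self-attention in each, concatenates, and adds the input back:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(q, k, v):
    # Scaled dot-product self-attention for one head.
    # q, k, v: lists of d-dimensional vectors (one per token).
    d = len(q[0])
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        weights = softmax(scores)
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out

def two_head_attention_with_residual(x):
    # Split each token's features into two halves ("heads"),
    # attend within each half, concatenate the head outputs,
    # then add the input back -- the residual connection the
    # abstract credits with keeping training stable.
    half = len(x[0]) // 2
    h1 = [t[:half] for t in x]
    h2 = [t[half:] for t in x]
    a1 = attention(h1, h1, h1)  # head 1
    a2 = attention(h2, h2, h2)  # head 2
    concat = [r1 + r2 for r1, r2 in zip(a1, a2)]
    return [[c + xi for c, xi in zip(ct, xt)] for ct, xt in zip(concat, x)]

# Three toy "atom" tokens with 4 features each (hypothetical data).
tokens = [[0.1, 0.2, 0.3, 0.4],
          [0.5, 0.1, 0.2, 0.3],
          [0.2, 0.4, 0.1, 0.5]]
out = two_head_attention_with_residual(tokens)
```

Note the output keeps the input's shape (3 tokens x 4 features), which is what lets such blocks be stacked; real implementations would add learned query/key/value projections and layer normalization.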
© 2023. The Author(s).
Conflict of interest statement
The authors declare that they have no competing interests.
Figures
![Fig. 1](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/9951429/bin/13321_2023_700_Fig1_HTML.gif)
![Fig. 2](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/9951429/bin/13321_2023_700_Fig2_HTML.gif)
![Fig. 3](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/9951429/bin/13321_2023_700_Fig3_HTML.gif)
![Fig. 4](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/9951429/bin/13321_2023_700_Fig4_HTML.gif)
![Fig. 5](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/9951429/bin/13321_2023_700_Fig5_HTML.gif)
![Fig. 6](https://www.ncbi.nlm.nih.gov/pmc/articles/instance/9951429/bin/13321_2023_700_Fig6_HTML.gif)
Similar articles
- CT-based transformer model for non-invasively predicting the Fuhrman nuclear grade of clear cell renal cell carcinoma. Front Oncol. 2022 Sep 28;12:961779. doi:10.3389/fonc.2022.961779. PMID: 36249050. Free PMC article.
- INTransformer: Data augmentation-based contrastive learning by injecting noise into transformer for molecular property prediction. J Mol Graph Model. 2024 May;128:108703. doi:10.1016/j.jmgm.2024.108703. PMID: 38228013.
- ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction. J Cheminform. 2023 Feb 26;15(1):29. doi:10.1186/s13321-023-00698-9. PMID: 36843022. Free PMC article.
- A novel hybrid framework based on temporal convolution network and transformer for network traffic prediction. PLoS One. 2023 Sep 8;18(9):e0288935. doi:10.1371/journal.pone.0288935. PMID: 37682829. Free PMC article.
- Blended fabric with integrated neural network based on attention mechanism qualitative identification method of near infrared spectroscopy. Spectrochim Acta A Mol Biomol Spectrosc. 2022 Aug 5;276:121214. doi:10.1016/j.saa.2022.121214. PMID: 35395464. Review.
Cited by
- Attention is all you need: utilizing attention in AI-enabled drug discovery. Brief Bioinform. 2023 Nov 22;25(1):bbad467. doi:10.1093/bib/bbad467. PMID: 38189543. Free PMC article. Review.
- Meta-learning for transformer-based prediction of potent compounds. Sci Rep. 2023 Sep 26;13(1):16145. doi:10.1038/s41598-023-43046-5. PMID: 37752164. Free PMC article.
References
- Cheng J, Zhang C, Dong L. A geometric-information-enhanced crystal graph network for predicting properties of materials. Commun Mater. 2021;2(1):1–11. doi:10.1038/s43246-021-00194-3.