A transformer architecture for retention time prediction in liquid chromatography mass spectrometry-based proteomics.
Thang V Pham, Vinh V Nguyen, Duong Vu, Alex A Henneman, Robin A Richardson, Sander R Piersma, Connie R Jimenez. Published in: Proteomics (2023)
Accurate retention time prediction is important for spectral library-based analysis in data-independent acquisition mass spectrometry-based proteomics. Deep learning approaches have demonstrated superior performance over traditional machine learning methods for this task. The transformer architecture is a recent development in deep learning that delivers state-of-the-art performance in many fields, such as natural language processing, computer vision, and biology. We assess the performance of the transformer architecture for retention time prediction using datasets from five deep learning models: Prosit, DeepDIA, AutoRT, DeepPhospho, and AlphaPeptDeep. Experimental results on holdout and independent datasets show that the transformer architecture achieves state-of-the-art performance. The software and evaluation datasets are publicly available for future development in the field.
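The abstract does not describe the model internals, but the core operation of any transformer applied to a peptide sequence is self-attention over per-residue embeddings. As an illustration only, the sketch below embeds each amino acid, applies single-head scaled dot-product attention, and mean-pools to a scalar score standing in for a retention time prediction. All names, dimensions, and (random) weights are hypothetical and not taken from the paper's software.

```python
import math
import random

# Hypothetical setup: random embeddings stand in for learned parameters.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
D = 8  # illustrative embedding dimension

random.seed(0)
EMB = {aa: [random.gauss(0, 1) for _ in range(D)] for aa in AMINO_ACIDS}


def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]


def attention(X):
    """Single-head scaled dot-product self-attention.

    For simplicity the query/key/value projections are the identity,
    so each output row is an attention-weighted average of the inputs.
    """
    n = len(X)
    out = []
    for i in range(n):
        scores = [
            sum(a * b for a, b in zip(X[i], X[j])) / math.sqrt(D)
            for j in range(n)
        ]
        w = softmax(scores)
        out.append([sum(w[j] * X[j][d] for j in range(n)) for d in range(D)])
    return out


def predict_rt(peptide):
    """Toy retention-time score: embed residues, attend, mean-pool, sum.

    Uncalibrated and untrained; shown only to make the data flow concrete.
    """
    X = [EMB[aa] for aa in peptide]
    H = attention(X)
    pooled = [sum(h[d] for h in H) / len(H) for d in range(D)]
    return sum(pooled)


print(predict_rt("PEPTIDE"))
```

A trained model would learn the embeddings and projection matrices from retention time data and typically stack several attention layers with feed-forward blocks; this sketch keeps only the attention step that distinguishes transformers from the convolutional and recurrent architectures used by earlier predictors.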
Keyphrases
- deep learning
- mass spectrometry
- liquid chromatography
- machine learning
- artificial intelligence
- convolutional neural network
- high resolution
- high resolution mass spectrometry
- gas chromatography
- high performance liquid chromatography
- capillary electrophoresis
- tandem mass spectrometry
- big data
- simultaneous determination
- data analysis
- solid phase extraction
- label free