Article

A transformer architecture for retention time prediction in liquid chromatography mass spectrometry-based proteomics

Journal

PROTEOMICS
Volume 23, Issue 7-8, pages: -

Publisher

WILEY
DOI: 10.1002/pmic.202200041

Keywords

deep learning; DIA-MS; retention time prediction; spectral library; transformer architecture


Accurate retention time (RT) prediction is important for spectral library-based analysis in data-independent acquisition mass spectrometry-based proteomics. Deep learning approaches have demonstrated superior performance over traditional machine learning methods for this purpose. The transformer architecture is a recent development in deep learning that delivers state-of-the-art performance in many fields, such as natural language processing, computer vision, and biology. We assess the performance of the transformer architecture for RT prediction using datasets from five deep learning models: Prosit, DeepDIA, AutoRT, DeepPhospho, and AlphaPeptDeep. Experimental results on holdout datasets and independent datasets show state-of-the-art performance of the transformer architecture. The software and evaluation datasets are publicly available for future development in the field.
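The core operation of a transformer applied to a peptide is scaled dot-product self-attention over the sequence of residue embeddings, which lets each position attend to every other position before the model regresses a retention time. A minimal NumPy sketch of that attention step is shown below; all names, dimensions, and the random toy input are illustrative assumptions, not taken from the paper or its software.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over residue embeddings.

    X:          (seq_len, d_model) peptide residue embeddings
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices
    Returns:    (seq_len, d_k) context vectors
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len)
    return softmax(scores) @ V

# Toy example: a 7-residue peptide embedded into d_model = 8,
# projected down to d_k = 4 per attention head.
rng = np.random.default_rng(0)
X = rng.standard_normal((7, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (7, 4)
```

In a full RT predictor, several such attention layers (with multiple heads, residual connections, and feed-forward sublayers) would be stacked, and the per-residue outputs pooled into a single scalar retention-time prediction.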
