Large-scale foundation model on single-cell transcriptomics.
Minsheng Hao, Jing Gong, Xin Zeng, Chiming Liu, Yucheng Guo, Xingyi Cheng, Taifeng Wang, Jianzhu Ma, Xuegong Zhang, Le Song
Published in: Nature Methods (2024)
Large pretrained models have become foundation models, leading to breakthroughs in natural language processing and related fields. Developing foundation models that decipher the 'languages' of cells and facilitate biomedical research is promising yet challenging. Here we developed scFoundation, a large pretrained model also named 'xTrimoscFoundationα', with 100 million parameters covering about 20,000 genes, pretrained on over 50 million human single-cell transcriptomic profiles. scFoundation is a large-scale model in terms of the number of trainable parameters, the dimensionality of genes and the volume of training data. Its asymmetric transformer-like architecture and pretraining task design enable it to effectively capture complex contextual relations among genes across a variety of cell types and states. Experiments demonstrated its merit as a foundation model, achieving state-of-the-art performance on a diverse array of single-cell analysis tasks, including gene expression enhancement, tissue drug response prediction, single-cell drug response classification, single-cell perturbation prediction, cell type annotation and gene module inference.
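The abstract describes an asymmetric transformer-like architecture: a heavy encoder attends over the comparatively few genes expressed in each cell, while a lighter decoder reconstructs expression across all ~20,000 genes. Below is a minimal PyTorch sketch of that idea; the class name `AsymmetricSCTransformer`, the layer sizes, the padding-mask treatment of zero genes, and the reconstruction loss are illustrative assumptions, not the released scFoundation implementation.

```python
import torch
import torch.nn as nn


class AsymmetricSCTransformer(nn.Module):
    """Illustrative asymmetric encoder-decoder over single-cell profiles.

    A deep encoder attends only to expressed (nonzero) genes; a shallow
    decoder runs over the full gene axis to reconstruct expression values.
    Sketch only -- not the released scFoundation code.
    """

    def __init__(self, n_genes=19264, d_model=64, n_enc=4, n_dec=1, n_heads=4):
        super().__init__()
        self.gene_emb = nn.Embedding(n_genes, d_model)   # gene identity embedding
        self.value_proj = nn.Linear(1, d_model)          # scalar expression -> vector
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_enc)  # deep stack
        dec_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerEncoder(dec_layer, n_dec)  # shallow stack
        self.head = nn.Linear(d_model, 1)                # per-gene expression readout

    def forward(self, gene_ids, expr):
        # gene_ids: (B, G) int64; expr: (B, G) float, e.g. log-normalized counts
        tok = self.gene_emb(gene_ids) + self.value_proj(expr.unsqueeze(-1))
        padding = expr == 0                   # encoder ignores non-expressed genes
        h = self.encoder(tok, src_key_padding_mask=padding)
        h = self.decoder(h)                   # full-length pass over all genes
        return self.head(h).squeeze(-1)       # (B, G) reconstructed expression


# Toy usage on random sparse profiles
torch.manual_seed(0)
model = AsymmetricSCTransformer(n_genes=100, d_model=32, n_enc=2, n_dec=1, n_heads=4)
gene_ids = torch.arange(100).expand(4, -1)                # 4 cells x 100 genes
expr = torch.rand(4, 100) * (torch.rand(4, 100) > 0.8)    # mostly-zero expression
recon = model(gene_ids, expr)
mask = expr > 0
loss = nn.functional.mse_loss(recon[mask], expr[mask])    # value-recovery objective
```

The asymmetry pays off because single-cell profiles are sparse: most of the roughly 20,000 genes are unexpressed in any given cell, so restricting the full-depth attention stack to expressed genes sharply reduces the quadratic attention cost while the shallow decoder still yields a prediction for every gene.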
Keyphrases
- single cell
- rna seq
- high throughput
- gene expression
- genome wide
- machine learning
- transcription factor
- big data
- high resolution
- artificial intelligence