Advances in foundation models for genomics: A detailed exploration of developments


DOI: 10.23977/acss.2024.080709

Author(s)

Kaizhuang Jing 1, Shuangkai Han 1

Affiliation(s)

1 School of Information Science and Technology, Yunnan Normal University, Kunming, China

Corresponding Author

Shuangkai Han

ABSTRACT

Foundation models (FMs) are a class of deep learning models that originated in natural language processing (NLP) and are trained on large-scale datasets with self-supervised techniques. After pre-training, they can be fine-tuned on labeled data to perform a variety of downstream tasks. FMs have demonstrated outstanding performance across numerous NLP tasks and have also been applied successfully in biology and medicine. However, although multiple FMs tailored specifically to genomics, referred to as genomic foundation models (GFMs), have been developed, a systematic analysis of these models is still lacking. This review surveys the current applications and development of GFMs, analyzes their strengths and weaknesses, and categorizes their underlying principles. Because DNA sequences differ inherently from natural language, designing FMs suited to genomics poses significant challenges. This paper aims to give researchers a detailed analytical report and valuable insights to guide the further development of high-quality GFMs.
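As a toy illustration of the self-supervised pre-training the abstract describes, the sketch below prepares masked-language-model training pairs from a DNA sequence: the sequence is split into overlapping k-mer tokens and a fraction of tokens is hidden, and a model would then be trained to recover them. The function names, the k-mer tokenization scheme, and the masking rate are illustrative assumptions, not details taken from any specific GFM.

```python
import random

def kmer_tokenize(seq, k=3):
    """Split a DNA sequence into overlapping k-mer tokens."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly hide tokens; return the masked input and recovery targets.

    Targets are (position, original_token) pairs. Training a model to
    predict them from the masked input is the self-supervised objective,
    so no labeled data is needed at this stage.
    """
    rng = random.Random(seed)
    masked, targets = list(tokens), []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets.append((i, tok))
            masked[i] = mask_token
    return masked, targets

# One unlabeled genomic sequence yields a (masked input, targets) pair.
tokens = kmer_tokenize("ACGTACGGTCA")
masked, targets = mask_tokens(tokens, mask_rate=0.3)
```

After pre-training on many such pairs, the same model would be fine-tuned on a small labeled set (e.g. promoter vs. non-promoter sequences) for a downstream task.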

KEYWORDS

Genomics, foundation models, large language models, DNA sequences, deep learning

CITE THIS PAPER

Kaizhuang Jing, Shuangkai Han. Advances in foundation models for genomics: A detailed exploration of developments. Advances in Computer, Signals and Systems (2024) Vol. 8: 71-80. DOI: http://dx.doi.org/10.23977/acss.2024.080709.


All published work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 - 2031 Clausius Scientific Press Inc. All Rights Reserved.