
Optimized model architectures for deep learning on genomic data.

Hüseyin Anil Gündüz, René Mreches, Julia Moosbauer, Gary Robertson, Xiao-Yin To, Eric A Franzosa, Curtis Huttenhower, Mina Rezaei, Alice Carolyn McHardy, Bernd Bischl, Philipp C Münch, Martin Binder
Published in: Communications Biology (2024)
The success of deep learning in various applications depends on task-specific architecture design choices, including the types, hyperparameters, and number of layers. In computational biology, there is no consensus on the optimal architecture design, and decisions are often made using insights from more well-established fields such as computer vision. These may not consider the domain-specific characteristics of genome sequences, potentially limiting performance. Here, we present GenomeNet-Architect, a neural architecture design framework that automatically optimizes deep learning models for genome sequence data. It optimizes the overall layout of the architecture, with a search space specifically designed for genomics. Additionally, it optimizes hyperparameters of individual layers and the model training procedure. On a viral classification task, GenomeNet-Architect reduced the read-level misclassification rate by 19%, with 67% faster inference and 83% fewer parameters, and achieved similar contig-level accuracy with ~100 times fewer parameters compared to the best-performing deep learning baselines.
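The abstract describes optimizing both the overall architecture layout (types and number of layers) and per-layer and training hyperparameters over a genomics-specific search space. A minimal sketch of that idea is shown below as naive random search over a hypothetical search space; the parameter names are illustrative and not GenomeNet-Architect's actual API, and the tool itself uses a more sophisticated optimization strategy than random sampling.

```python
import random

# Hypothetical search space loosely mirroring the kinds of choices the
# paper describes (layer types, layer counts, layer and training
# hyperparameters). All names and value ranges here are assumptions.
SEARCH_SPACE = {
    "n_conv_layers": [1, 2, 3, 4],
    "kernel_size": [5, 9, 15, 21],       # motif-scale filters for DNA input
    "n_filters": [64, 128, 256],
    "recurrent_layer": [None, "lstm", "gru"],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def sample_architecture(rng):
    """Draw one candidate configuration from the search space."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def random_search(evaluate, n_trials=20, seed=0):
    """Return the best configuration found by naive random search.

    `evaluate` maps a configuration dict to a validation score
    (higher is better), e.g. read-level classification accuracy.
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample_architecture(rng)
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

In practice `evaluate` would train a model on genome sequence data and score it on held-out reads or contigs; the sketch only illustrates how a structured search space separates architecture-layout choices from training hyperparameters.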
Keyphrases
  • deep learning
  • artificial intelligence
  • convolutional neural network
  • machine learning
  • big data
  • single cell
  • sars cov
  • genome wide
  • copy number