This article explores deep learning model design, drawing inspiration from the omnigenic model and the concept of genetic heterogeneity, to improve schizophrenia prediction from genotype data. It introduces a three-step approach that leverages neural networks to handle genetic interactions efficiently. In the first step, a locally connected network routes input data from variants to their corresponding genes. The second step employs an encoder-decoder to capture relationships among the identified genes. The final model integrates the first two components and adds a parallel branch that accounts for the effects of additional genes; this expansion improves prediction scores by incorporating a larger portion of the genome. Trained models achieved an average AUC of 0.83, surpassing other genotype-trained models and matching approaches based on gene expression data. Tests on held-out sets yielded an average sensitivity of 0.72 and an accuracy of 0.76, consistent with estimates of schizophrenia heritability. Moreover, the study addresses the challenge of genetic heterogeneity by considering diverse population subsets.
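To make the three-step architecture described above concrete, the sketch below shows one plausible way to wire it up. It is a minimal illustration, not the authors' implementation: the variant-to-gene routing is approximated by a dense layer masked with a binary annotation matrix, and the layer sizes, latent dimension, and module names (VariantToGene, GeneEncoderDecoder, SchizophreniaClassifier) are assumptions for demonstration only.

```python
import torch
import torch.nn as nn

class VariantToGene(nn.Module):
    """Step 1 (sketch): locally connected layer in which each variant only
    feeds the gene it is annotated to, enforced by a binary mask."""
    def __init__(self, mask: torch.Tensor):          # mask: (n_variants, n_genes)
        super().__init__()
        self.register_buffer("mask", mask)
        self.weight = nn.Parameter(torch.randn_like(mask) * 0.01)
        self.bias = nn.Parameter(torch.zeros(mask.shape[1]))

    def forward(self, x):                             # x: (batch, n_variants)
        # Masked weights zero out connections between unrelated variants and genes.
        return torch.relu(x @ (self.weight * self.mask) + self.bias)

class GeneEncoderDecoder(nn.Module):
    """Step 2 (sketch): encoder-decoder capturing relationships among gene scores."""
    def __init__(self, n_genes: int, latent: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_genes, latent), nn.ReLU())
        self.decoder = nn.Linear(latent, n_genes)

    def forward(self, g):
        z = self.encoder(g)
        return self.decoder(z), z

class SchizophreniaClassifier(nn.Module):
    """Step 3 (sketch): reuse the variant-to-gene and encoder parts and add a
    parallel branch for additional genes before the final prediction."""
    def __init__(self, mask: torch.Tensor, n_extra_genes: int, latent: int = 32):
        super().__init__()
        self.v2g = VariantToGene(mask)
        self.encoder = nn.Sequential(nn.Linear(mask.shape[1], latent), nn.ReLU())
        self.parallel = nn.Sequential(nn.Linear(n_extra_genes, latent), nn.ReLU())
        self.head = nn.Linear(2 * latent, 1)

    def forward(self, variants, extra_genes):
        core = self.encoder(self.v2g(variants))       # core gene pathway
        side = self.parallel(extra_genes)             # parallel branch for extra genes
        return torch.sigmoid(self.head(torch.cat([core, side], dim=-1)))

# Hypothetical usage with toy dimensions:
# mask = torch.zeros(1000, 50); model = SchizophreniaClassifier(mask, n_extra_genes=200)
# probs = model(torch.randn(8, 1000), torch.randn(8, 200))
```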