Multi-turn dialogue generation is an essential and challenging subtask of text generation in question answering systems. Existing methods focus on extracting latent topic-level relevance or exploiting relevant external background knowledge. However, they tend to overlook the fact that relying too heavily on latent aspects loses key subjective information. Moreover, there is rarely enough relevant external knowledge to reference, nor a knowledge graph with complete entity links. A dependency tree is a special structure that can be extracted from a sentence and covers the sentence's explicit key information. Therefore, in this paper we propose the EAGS model, which combines the subjective pivotal information of the explicit dependency tree with the implicit semantic information of the sentence. EAGS is a knowledge-graph-enabled multi-turn dialogue generation model that requires no extra external knowledge: it not only extracts and builds a dependency knowledge graph from existing sentences, but also enhances the node representations, which are shared with the Bi-GRU word embeddings at each time step at the node semantic level. We store the domain-specific subgraphs built by EAGS, which can be retrieved as an external knowledge graph in future multi-turn dialogue generation tasks. We design a multi-task training approach that enhances the extraction of local semantic and structural features and balances them with global features. Finally, we conduct experiments on the large-scale English Ubuntu multi-turn dialogue community dataset and the English DailyDialog dataset. Experimental results show that our EAGS model performs well on both automatic and human evaluation compared with existing baseline models.
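To make the dependency-graph construction step concrete, the sketch below shows one way a dependency knowledge graph could be extracted from a sentence as (head, relation, child) triples. This is a minimal illustration, not the authors' implementation: the use of spaCy and its "en_core_web_sm" parser is an assumption, since the abstract does not name a parsing tool.

```python
# Minimal sketch (NOT the EAGS implementation) of extracting a
# dependency-based graph from a sentence, as the abstract describes.
# Assumes spaCy is installed with the small English model:
#   pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

def dependency_graph(sentence: str):
    """Return (head, relation, child) triples forming the sentence's
    dependency graph; node labels are the surface tokens."""
    doc = nlp(sentence)
    return [
        (token.head.text, token.dep_, token.text)
        for token in doc
        if token.dep_ != "ROOT"  # the root token has no incoming edge
    ]

if __name__ == "__main__":
    # Example utterance in the spirit of the Ubuntu dialogue domain.
    for triple in dependency_graph("How do I mount a USB drive on Ubuntu?"):
        print(triple)
```

The resulting triples can be merged across utterances into a domain subgraph, which matches the abstract's idea of storing domain-specific subgraphs for retrieval in later dialogue turns.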