Bedrock radioactivity influences the rate and spectrum of mutation.

Nathanaelle Saclier, Patrick Chardon, Florian Malard, Lara Konecny-Dupré, David Eme, Arnaud Bellec, Vincent Breton, Laurent Duret, Tristan Lefébure, Christophe J Douady
Published in: eLife (2020)
All organisms on Earth are exposed to low doses of natural radioactivity, but some habitats are more radioactive than others. Yet, documenting the influence of natural radioactivity on the evolution of biodiversity is challenging. Here, we addressed whether organisms living in naturally more radioactive habitats accumulate more mutations across generations, using 14 species of waterlice living in subterranean habitats with contrasting levels of radioactivity. We found that the mitochondrial and nuclear mutation rates across a waterlouse species' genome increased on average by 60% and 30%, respectively, when radioactivity increased by a factor of three. We also found a positive correlation between the level of radioactivity and the probability of G to T (and complementary C to A) mutations, a hallmark of oxidative stress. We conclude that even low doses of natural bedrock radioactivity influence the mutation rate, possibly through the accumulation of oxidative damage, particularly in the mitochondrial genome.
Keyphrases
  • oxidative stress
  • genome wide
  • dna damage
  • gene expression
  • induced apoptosis
  • heat shock