Frequency of antibiotic application drives rapid evolutionary adaptation of Escherichia coli persistence.
Bram Van den Bergh, Joran Elie Michiels, Tom Wenseleers, Etthel M. Windels, Pieterjan Vanden Boer, Donaat Kestemont, Luc De Meester, Kevin J. Verstrepen, Natalie Verstraeten, Maarten Fauvart, Jan Michiels. Published in: Nature Microbiology (2016)
The evolution of antibiotic resistance is a major threat to society and has been predicted to lead to 10 million casualties annually by 2050 (ref. 1). Further aggravating the problem, multidrug tolerance in bacteria relies not only on the build-up of resistance mutations, but also on some cells epigenetically switching to a non-growing, antibiotic-tolerant 'persister' state (refs 2-6). Yet, despite its importance, we know little about how persistence evolves in the face of antibiotic treatment (ref. 7). Our evolution experiments in Escherichia coli demonstrate that extremely high levels of multidrug tolerance (20-100%) are achieved by single point mutations in any one of several genes and readily emerge under conditions approximating clinical, once-daily dosing schemes. In contrast, reversion to low persistence in the absence of antibiotic treatment is relatively slow and only partially effective. Moreover, and in support of previous mathematical models (refs 8-10), we show that bacterial persistence quickly adapts to drug treatment frequency and that the observed rates of switching to the persister state can be understood in the context of 'bet-hedging' theory. We conclude that persistence is a major component of the evolutionary response to antibiotics that urgently needs to be considered in both diagnostic testing and treatment design in the battle against multidrug tolerance.
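The bet-hedging argument referenced above can be illustrated numerically. In a minimal two-state toy model (not the authors' model; the per-day kill and growth factors below are illustrative assumptions), a fraction p of cells enters the dormant persister state each day; on a treatment day (probability f) only persisters survive, while on a treatment-free day only normal cells grow, by a factor g. Maximizing the long-run (geometric-mean) growth rate then drives the optimal persister fraction toward the treatment frequency f, which is the sense in which observed switching rates "adapt to drug treatment frequency":

```python
import math

def long_run_growth(p: float, f: float, g: float) -> float:
    """Expected log growth per day for persister fraction p.

    With probability f antibiotic is applied and only the persister
    fraction p survives; otherwise normal cells grow by factor g
    while persisters remain dormant (factor 1).
    """
    return (1 - f) * math.log((1 - p) * g + p) + f * math.log(p)

def optimal_persister_fraction(f: float, g: float = 100.0) -> float:
    """Grid-search the persister fraction maximizing long-run growth."""
    grid = [i / 1000 for i in range(1, 1000)]
    return max(grid, key=lambda p: long_run_growth(p, f, g))

# The optimum tracks the treatment frequency (a Kelly-type result):
for f in (0.1, 0.3, 0.5):
    print(f"treatment frequency {f}: optimal fraction "
          f"{optimal_persister_fraction(f):.3f}")
```

For large g the maximizer of (1-f)·ln((1-p)g + p) + f·ln(p) approaches p = f exactly, so a population whose switching rate matches the dosing frequency outcompetes both all-growing and all-dormant strategies under periodic treatment.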