Windows of Selection for Anti-Malarial Drug Treatments
Background
Artemisinin combination therapy (ACT) is an effective, first-line treatment for uncomplicated Plasmodium falciparum malaria and is now widely deployed in most endemic countries. The artemisinin component is extremely fast acting and highly potent but is rapidly eliminated, so all current ACTs contain a second 'partner' drug with a longer half-life. The long half-lives of the partner drugs are clinically beneficial: they may persist at active concentrations for weeks after treatment, providing valuable prophylaxis against infection during this period. However, the rapid elimination of artemisinin means the partner drug is present as a monotherapy during this prophylactic period, which may be a potent force selecting for resistant parasites. This process can drive increasing tolerance to the partner drug and the eventual loss of therapeutic effectiveness, resulting in failure of the ACT. This ability of parasites to evolve resistance to the partner drug is a possible Achilles heel of ACTs.
The putative consequences of long half-life partner drugs driving resistance enter debates on the politics of anti-malarial drug deployment. Mass drug administration (MDA) policies were widely used in the 1950s and 1960s (reviewed by Von Seidlein and Greenwood), but fell from favour because, used in isolation, they had only a transitory effect on malaria transmission and drove drug resistance to high levels. MDA policies are once again under active consideration as part of a comprehensive toolkit for eradicating malaria parasite populations, particularly those believed to have evolved artemisinin resistance. Intermittent preventive treatment (IPT) programmes also deploy anti-malarial drugs and have proven highly effective in protecting vulnerable populations, but as malaria transmission declines it is unclear at what point their short-term clinical benefit is outweighed by the putative longer-term consequences of driving resistance. Anti-malarial drugs used in MDA and IPT programmes drive resistance in two ways: (1) they select for parasites able to survive treatment; and (2) their long half-lives mean they persist for long periods after treatment, selecting for newly acquired resistant parasites able to survive residual drug levels as they emerge from the liver and attempt to establish a viable infection. Mathematical modelling suggests the first force is easy to measure: it is simply, and intuitively, the proportion of existing asexual blood infections that are treated (often called simply the 'drug coverage'). The second force is more difficult to quantify, although, somewhat counter-intuitively, the size of its effect is not affected by local transmission intensity (Box 1 of). The use of drugs in MDA and IPT is almost guaranteed to increase selection for resistance, but to what extent? This paper attempts to quantify this second force, selection due to residual drug levels, through the application of pharmacological modelling of anti-malarial drug treatment.
The genetic process whereby parasites evolve increasing tolerance to the partner drug is usually quantified as a window of selection (WoS). As a specific example, Watkins and Mosobo noted that parasites with the dhfr108 mutation could be observed in patients 15 days after treatment with sulfadoxine-pyrimethamine (SP), whereas wild-type infections were only observed after 50 days, thus implying a WoS of 35 days. Similarly, Sisowath et al. estimated a WoS of 15 days associated with the pfmdr1 D1246Y mutation after lumefantrine treatment. Routine genotyping in clinical trials means such data are readily available and have been used previously to estimate WoS. People in endemic areas often have residual drug levels resulting from previous chemoprophylaxis or direct treatment. This is likely to result in frequent selection windows for most drugs. For example, if five courses of SP are taken per year and, as above, a 35-day WoS is assumed, then there will be 5 × 35 = 175 days per person per year in which persisting drug concentrations are selecting for the dhfr108 mutation. This implies that selection for resistance via WoS may be widespread and intense.
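As a minimal illustration of this arithmetic (a sketch for exposition only, not code from the cited studies; the function names are invented for this example), the clinical WoS is simply the difference between the first days on which resistant and sensitive infections become patent, and multiplying by the number of drug courses gives the person-days of selection per year:

```python
# Minimal sketch of the 'clinical' WoS arithmetic described above.
# The numbers are the SP/dhfr108 example quoted in the text.

def clinical_wos(first_patent_resistant_day, first_patent_sensitive_day):
    """Clinical WoS: how much earlier resistant infections become patent."""
    return first_patent_sensitive_day - first_patent_resistant_day

def selection_days_per_year(courses_per_year, wos_days):
    """Days per person per year spent inside a selection window."""
    return courses_per_year * wos_days

wos = clinical_wos(first_patent_resistant_day=15, first_patent_sensitive_day=50)
print(wos)                                # 35 days
print(selection_days_per_year(5, wos))    # 175 days per person per year
```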
These WoS estimates, obtained from clinical observations, are widely cited and are referred to here as 'clinical' WoS to denote their origin in clinical observations. In fact, the 'true' WoS is the period during which infections bearing the mutation can emerge from the liver (assuming the drugs do not kill the parasites while in the liver stage) and survive to produce a viable infection, while sensitive parasites are killed by residual drug concentrations. Unfortunately, it is impossible to directly observe the 'true' WoS because the 10^5 parasites that emerge from the liver are far below the level of patency. It has, therefore, been widely assumed that the clinical WoS reflects the true window (op cit). However, it is not clear how well the clinical WoS estimates the true WoS, and there are several plausible reasons why it may be a poor estimator, discussed further in Fig. 1. This paper uses the pharmacokinetic–pharmacodynamic (PK/PD) model described previously to simulate the WoS for increasingly resistant infections, with the aim of quantifying how accurately clinical WoS estimate the 'true' window of selection.
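To make the distinction concrete, the sketch below shows the kind of calculation involved. It is a deliberately simplified, single-dose, one-compartment illustration with assumed parameter values, thresholds and function names, not the published PK/PD model: infections emerging from the liver on successive days after treatment are either cleared by residual drug or grow to patency, and the 'true' and 'clinical' WoS are read off from the earliest establishment days and the earliest patency dates, respectively.

```python
# Simplified sketch of how a 'true' versus 'clinical' WoS could be simulated.
# All parameter values, thresholds and function names are illustrative
# assumptions, not those of the published PK/PD model.
import math

C0, HALF_LIFE = 10.0, 4.0      # starting drug concentration (mg/l) and half-life (days)
GROWTH = 10.0                  # assumed ~10-fold multiplication per 48-h asexual cycle
KMAX, HILL = 4.0, 3.0          # assumed maximal kill rate (per day) and dose-response slope
EMERGE, PATENT = 1e5, 1e8      # parasite numbers at liver emergence and at patency

def concentration(day):
    """Single-dose, first-order elimination."""
    return C0 * math.exp(-math.log(2) / HALF_LIFE * day)

def patency_day(emerge_day, ic50, horizon=120):
    """Day a clone emerging on emerge_day becomes patent, or None if it is cleared."""
    n, t = EMERGE, emerge_day
    while t < horizon:
        c = concentration(t)
        kill = KMAX * c**HILL / (c**HILL + ic50**HILL)   # Hill-type drug kill rate
        n *= GROWTH * math.exp(-kill * 2)                # one 48-h cycle of growth vs kill
        t += 2
        if n < 1:
            return None                                  # cleared by residual drug
        if n >= PATENT:
            return t                                     # detectable infection
    return None

def establishment_and_patency(ic50, horizon=120):
    """Earliest emergence day giving a viable infection, and earliest patency date."""
    fates = {d: patency_day(d, ic50, horizon) for d in range(horizon)}
    establish = min(d for d, p in fates.items() if p is not None)
    patent = min(p for p in fates.values() if p is not None)
    return establish, patent

est_s, pat_s = establishment_and_patency(ic50=0.05)   # assumed sensitive IC50 (mg/l)
est_r, pat_r = establishment_and_patency(ic50=0.50)   # assumed ten-fold resistant IC50
print("true WoS:    ", est_s - est_r, "days")         # gap in earliest viable emergence
print("clinical WoS:", pat_s - pat_r, "days")         # gap in earliest patency
```

Even in this toy setting the gap in earliest viable emergence (the 'true' WoS) and the gap in earliest patency (the 'clinical' WoS) need not coincide, which is exactly the discrepancy the full PK/PD simulations are designed to quantify.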
Figure 1.
Potential problems with using patent parasitaemia to estimate 'windows of selection' (WoS). Field studies typically measure the WoS by comparing the times at which different genotypes become detectable ('patent') in patients. As an example, the clinical data of Sisowath et al. [3] showed that resistant parasites bearing the pfmdr1 D1246Y mutation (top row; blue squares) first become patent in patients approximately 20 days after treatment, about 15 days earlier than sensitive parasites (bottom row; green squares). Each new infection was assumed to comprise 10^5 parasites emerging from the liver and to become patent if/when their numbers reached 10^8 parasites (horizontal dashed lines). The black line shows the decrease in drug concentration over time (using a single drug dose for illustration). Lines A and B show two clones that emerge on the same day but grow at different rates because of their differing IC50s, and so become patent several days apart. Here, field estimates based on patent parasitaemia would quantify a WoS of around 5 days when in fact there is none. Lines C1–C3 illustrate how the earliest emerging parasites may not necessarily correspond to the first patent infection. C1 is the earliest emerging clone, but residual drug levels cause an initial drop in its parasite numbers and so C2, which emerges slightly later, is able to become patent sooner. Similarly, by the time C3 emerges from the liver, drug levels have fallen sufficiently that they no longer affect the newly emerged clone, and so C3 becomes patent before both C1 and C2. The 'clinical' picture is therefore misleading: the true order of emergence is C1, C2, C3 but the observed order of patency is C3, C2, C1.