Personalized Anticoagulation
Optimizing Warfarin Management Using Genetics and Simulated Clinical Trials
Background—Clinical trials testing pharmacogenomic-guided warfarin dosing for patients with atrial fibrillation have demonstrated conflicting results. Non–vitamin K antagonist oral anticoagulants are expensive and contraindicated for several conditions. A strategy optimizing anticoagulant selection remains an unmet clinical need.
Methods and Results—Characteristics from 14 206 patients with atrial fibrillation were integrated into a validated warfarin clinical trial simulation framework using iterative Bayesian network modeling and a pharmacokinetic–pharmacodynamic model. Individual dose–response was simulated for each patient under 5 warfarin protocols—a fixed-dose protocol, a clinically guided protocol, and 3 increasingly complex pharmacogenomic-guided protocols. For each protocol, a complexity score was calculated from the variables used to predict warfarin dose and the number of predefined international normalized ratio (INR) thresholds used to adjust the dose. Study outcomes included optimal time in therapeutic range (≥65%) and clinical events. A combination of age and genotype identified different optimal protocols for different subpopulations. A fixed-dose protocol achieved well-controlled INR only in normal responders ≥65 years old, whereas normal responders <65 years old required a clinically guided protocol to achieve well-controlled INR. Sensitive responders of either age group and highly sensitive responders ≥65 years old required pharmacogenomic-guided protocols to achieve well-controlled INR. However, highly sensitive responders <65 years old did not achieve well-controlled INR under any protocol and had higher associated clinical event rates than other subpopulations.
Conclusions—Under the assumptions of this simulation, patients with atrial fibrillation can be triaged to an optimal warfarin therapy protocol by age and genotype. Clinicians should consider alternative anticoagulation therapy for patients with suboptimal outcomes under any warfarin protocol.
Warfarin remains the most commonly prescribed oral anticoagulant for patients with atrial fibrillation (AF) despite increasing use of non–vitamin K antagonist oral anticoagulants (NOACs).1–4 Well-controlled warfarin, as reflected by a minimum time in therapeutic international normalized ratio (INR) range (TTR) between 65% and 75%, or NOACs offer optimal anticoagulation outcomes.5–8 Data from 2 clinical trials demonstrated that 1.58% of patients with well-controlled warfarin (TTR>75%) experienced a major bleeding event, whereas 3.85% of patients with poorly controlled warfarin (TTR<60%) did.9,10 Compared with NOACs, warfarin offers a more affordable, accessible, and potentially superior option for patients with a history of medication nonadherence.11 However, significant challenges to optimal efficacy and safety outcomes persist with warfarin. Warfarin has a narrow therapeutic window and up to a 20-fold interindividual variation in therapeutic dose. As a result, the surrogate outcome most often used in clinical practice, TTR, is often below the optimal range.7,8 Warfarin has also been one of the medications most commonly implicated in adverse events leading to emergency department visits and hospital admissions.12 Despite these challenges, warfarin remains a widely used treatment option.
See Editorial by Goto and Goto
Efforts thus far to optimize therapy have resulted in >50 algorithm-guided warfarin therapy protocols of varying complexity. The most complex protocols require as many as 14 independent variables, including demographic, clinical, and genotypic characteristics, and suggest up to 7 predefined INR thresholds to predict warfarin dose and manage dose titration (Table IA in the Data Supplement).13,14 Some evidence from clinical trials indicated that pharmacogenomic-guided dosing algorithms more accurately predicted therapeutic dose, resulting in a revised warfarin label from the US Food and Drug Administration (Figure 1).15–20 However, the majority of clinical trials offering algorithm-guided protocols were not powered to assess clinical events in diverse populations.21–24 Furthermore, although algorithm-guided protocols showed a modest increase in TTR in several clinical trials, this increase did not necessarily translate to significant reductions in clinical events.24 Recent major clinical trials, such as the Clarification of Optimal Anticoagulation Through Genetics (COAG) and the European Pharmacogenetics of Anticoagulant Therapy (EU-PACT) trials, produced conflicting conclusions on the superiority of fixed-dose standard or clinically guided protocols versus pharmacogenomic-guided protocols for optimizing warfarin therapy.23,24 Therefore, optimization of warfarin therapy through clinically guided or pharmacogenomic-guided protocols remains controversial, and continued efforts to optimize warfarin therapy in patients with AF remain warranted.
In this study, our objective was to study warfarin therapy optimization with TTR as the surrogate end point, ischemic stroke as the efficacy end point, and intracranial hemorrhage as the safety end point in patients with newly diagnosed AF. We executed a prospective, 5-arm simulated clinical trial incorporating fixed-dose control, clinically guided, and 3 pharmacogenomic-guided warfarin therapy protocols. To this end, we enhanced the clinical trial simulation framework previously developed and validated by our group, which allows simultaneous comparative effectiveness research by incorporating predicted clinical events, surrogate outcomes such as TTR, and a complexity score associated with each protocol.25 Our study population is derived from the electronic medical records (EMRs) of patients diagnosed with AF and treated within a large hospital network in eastern Wisconsin and northern Illinois. Patient EMRs were then entered into a simulation platform that both simulates the clinical and demographic characteristics of the study population and predicts the clinical outcomes of different warfarin therapy protocols. Because recent pharmacogenomic-guided clinical trials have lacked evaluation of both TTR and clinical events in the same study, a clinical trial simulation framework provides modeled estimates of differences in clinical events between multiple warfarin therapy protocols, given the assumptions made in our simulation platform. We hypothesized that more complex warfarin therapy protocols would demonstrate improved surrogate, efficacy, and safety outcomes across the entire study population.
Our enhanced clinical trial simulation framework consists of the following 5 scalable components:
longitudinal EMRs of warfarin-receiving populations to train and validate a Bayesian network model (BNM), which is subsequently used to simulate statistically identical EMRs (hereafter, clinical avatars);
an empirical and domain knowledge-based method to set study conditions, length of study, number of parallel clinical trials, protocols associated with each clinical trial arm, and the total number of clinical avatars;
a library of warfarin therapy protocols where each protocol comprises an initial, adjustment, and maintenance dosing algorithm;
an automated warfarin therapy simulator, based on a validated pharmacokinetic/pharmacodynamic model, which generates predicted warfarin concentration and INR26; and
a clinical outcome calculator that generates treatment outcome metrics (eg, TTR, therapeutic dose of warfarin, efficacy, and safety end points).
We conducted these 5 phases throughout the study (Figure 2). Further details of each phase, the analytic methods, and the study materials are provided in the Data Supplement so that other researchers can reproduce the results or replicate the procedure.
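The clinical avatar generation in component 1 can be illustrated by ancestral sampling from a Bayesian network: each variable is drawn in topological order, conditioned on the values of its parents. The sketch below is a minimal, hypothetical illustration; the variables, network structure, and probabilities are placeholders, not the study's trained model.

```python
import random

# Toy conditional probability tables (illustrative placeholders only).
P_AGE_GE_65 = 0.55                                # P(age >= 65)
P_SENSITIVE = {True: 0.40, False: 0.35}           # P(sensitive genotype | age group)

def sample_avatar(rng: random.Random) -> dict:
    """Sample one clinical avatar by drawing each node after its parents."""
    age_ge_65 = rng.random() < P_AGE_GE_65
    sensitive = rng.random() < P_SENSITIVE[age_ge_65]
    return {"age_ge_65": age_ge_65, "sensitive": sensitive}

def sample_cohort(n: int, seed: int = 0) -> list:
    """Generate a reproducible cohort of simulated avatars."""
    rng = random.Random(seed)
    return [sample_avatar(rng) for _ in range(n)]

# One parallel trial's population, matching the study's 15 000 avatars per trial.
cohort = sample_cohort(15000)
```

In the study, the conditional probability tables were learned from the Aurora EMR data rather than fixed by hand, and the network contained many more demographic, clinical, and genotypic nodes.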
We designed a 5-arm, 90-day clinical trial simulation to test the comparative effectiveness of 5 warfarin therapy protocols ranging from less to more complex (Table 1). Each of the 5 protocols is composed of 3 dosing algorithms that calculate initial warfarin dose, adjustment warfarin dose, and maintenance warfarin dose, which titrate warfarin dose based on INR thresholds. Protocols are described in full in the Data Supplement.
Current warfarin therapy protocols differ in the number of demographic, clinical, and genetic variables used to personalize warfarin dose and the number of INR-based thresholds used to adjust the dose. Protocol complexity has important implications for patient satisfaction, protocol adherence, cost-effectiveness, and potentially clinical outcomes.27–30 We quantified the degree of complexity by counting the number of demographic, clinical, and genetic variables used to personalize warfarin dose and the number of INR-based thresholds used to adjust the dose. The sum of these 2 counts was aggregated into a Complexity Score for each protocol. The score was then normalized against the fixed-dose AAA control protocol (Table 2). Such a measure of complexity facilitates comparison of multiple warfarin therapy protocols and provides an additional metric to interpret parity in clinical outcomes.
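The Complexity Score calculation described above reduces to counting and normalizing. The sketch below assumes illustrative variable and threshold counts for each protocol; the actual counts are given in Table 2.

```python
# Hypothetical per-protocol counts (placeholders, not the values in Table 2).
PROTOCOLS = {
    "AAA":   {"dose_variables": 0,  "inr_thresholds": 3},
    "CAA":   {"dose_variables": 6,  "inr_thresholds": 3},
    "PGAA":  {"dose_variables": 8,  "inr_thresholds": 3},
    "PGPGI": {"dose_variables": 10, "inr_thresholds": 5},
    "PGPGA": {"dose_variables": 14, "inr_thresholds": 7},
}

def complexity_scores(protocols: dict, control: str = "AAA") -> dict:
    """Sum dose-predicting variables and INR thresholds, then normalize
    against the fixed-dose control so the control protocol scores 1.0."""
    raw = {name: p["dose_variables"] + p["inr_thresholds"]
           for name, p in protocols.items()}
    baseline = raw[control]
    return {name: score / baseline for name, score in raw.items()}

scores = complexity_scores(PROTOCOLS)
```

With these placeholder counts, the fixed-dose AAA protocol scores 1.0 by construction, and each other protocol's score expresses how many times more complex it is than the control.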
We identified and extracted longitudinal EMRs of patients with AF who were taking warfarin at Aurora Health Care (Aurora) over the period of 2002 to 2012. Patient data were deidentified per Institutional Review Board approval before distribution to the research team. Patient EMR data were then used in a BNM to generate 1 500 000 simulated clinical avatars. The number of clinical avatars was calculated by a power analysis, described in detail in the Data Supplement. The 1 500 000 clinical avatars were randomly sampled and distributed over a total of 100 parallel trials. The required number of parallel trials was calculated by a separate power analysis, described in the Data Supplement. Each parallel trial was composed of 15 000 clinical avatars, and the same clinical avatar population was tested under each of the 5 arms of the study. The details of the cohort identification process and data preprocessing are provided in the Data Supplement.
The primary outcome metric was TTR with therapeutic INR defined as 2.0 to 3.0 over the first 90 days of simulated treatment, measured by Rosendaal linear interpolation.31 Well-controlled INR was defined as TTR ≥65%. Additional thresholds of TTR ≥70% and ≥75% are evaluated in the Data Supplement.
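The Rosendaal method assumes the INR changes linearly between consecutive measurements and credits the fraction of each interval spent inside the therapeutic range. A minimal sketch of this interpolation (illustrative, not the study's implementation):

```python
def rosendaal_ttr(measurements, low=2.0, high=3.0):
    """Time in therapeutic range (%) via Rosendaal linear interpolation.

    measurements: list of (day, inr) pairs sorted by day. Between two
    consecutive measurements, INR is assumed to change linearly; the
    fraction of each interval spent inside [low, high] is accumulated.
    """
    in_range_days = 0.0
    total_days = 0.0
    for (d0, i0), (d1, i1) in zip(measurements, measurements[1:]):
        span = d1 - d0
        if span <= 0:
            continue
        total_days += span
        if i0 == i1:
            in_range_days += span if low <= i0 <= high else 0.0
            continue
        # Parametrize the segment i(t) = i0 + t*(i1 - i0), t in [0, 1],
        # and find the sub-interval of t lying inside [low, high].
        t_low = (low - i0) / (i1 - i0)
        t_high = (high - i0) / (i1 - i0)
        t0, t1 = sorted((t_low, t_high))
        overlap = max(0.0, min(1.0, t1) - max(0.0, t0))
        in_range_days += overlap * span
    return 100.0 * in_range_days / total_days if total_days else 0.0
```

For example, an INR rising linearly from 1.0 on day 0 to 3.0 on day 10 crosses into range at day 5, giving a TTR of 50%.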
Secondary outcome metrics included the safety and efficacy events of intracranial hemorrhage and ischemic stroke. The number and incidence rate of intracranial hemorrhagic events were predicted for each subpopulation based on daily INR values of individual clinical avatars.32 We used a similar daily calculation to estimate the number and incidence rate of ischemic strokes.32 The number of events in each arm was rounded to the nearest integer, whereas the incidence rates were reported per 1000 clinical avatars. The details of event prediction are provided in the Data Supplement. Both primary and secondary outcome metrics were then stratified according to the study populations’ clinical characteristics and warfarin responder status as defined by the US Food and Drug Administration warfarin label based on the combined profile of CYP2C9 and VKORC1 genes (Figure 1).20
To facilitate interpretation and dissemination of the results, an interactive result visualization tool was developed in R and has been made publicly available online.33 Both 30-day and 90-day results are available through this tool. The user can interactively select subpopulations of interest for additional analysis of primary and secondary outcomes. The online interactive results visualization tool can be found at https://ari-cds-ar.shinyapps.io/circcvg_paper_app/.
To validate the BNM against the validation subset of the Aurora EMR data, χ2 analyses were conducted on each discrete variable, and independent t tests were used on continuous variables. The primary outcome metric of the simulated clinical trials was calculated by finding the mean TTR within a given population in each parallel clinical trial of 15 000 clinical avatars. We then averaged these means across the 100 parallel trials of each arm (ie, mean of means). SEM was similarly calculated for each parallel clinical trial, and then SEM was calculated for each arm. Results are expressed as the mean of means for the primary outcome±the arm's SEM unless otherwise specified.
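The mean-of-means aggregation can be sketched as follows. This is a simplified illustration of the approach described above (per-trial means averaged across an arm, with the SEM taken over the per-trial means), not the study's exact code.

```python
from math import sqrt
from statistics import mean, stdev

def arm_summary(trial_ttrs):
    """Summarize one study arm across its parallel trials.

    trial_ttrs: list of per-trial TTR lists (one inner list per parallel
    trial, one TTR value per clinical avatar). Returns the mean of the
    per-trial means and the SEM of those means across trials.
    """
    per_trial_means = [mean(t) for t in trial_ttrs]
    grand_mean = mean(per_trial_means)
    sem = stdev(per_trial_means) / sqrt(len(per_trial_means))
    return grand_mean, sem

# Tiny worked example: 2 parallel trials of 2 avatars each.
grand_mean, sem = arm_summary([[50.0, 60.0], [70.0, 80.0]])
```

In the study itself, each arm comprised 100 parallel trials of 15 000 avatars; the same averaging applies unchanged at that scale.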
Secondary outcome metrics were reported by summing the number of predicted clinical events within a given population in each parallel clinical trial of 15 000 clinical avatars and then finding the average number of clinical events between 100 parallel trials (ie, mean of sums). All statistical analyses were conducted with R.33 Unless otherwise noted, statistical significance was set at P<0.05. Two techniques were used to conduct hypothesis testing across all 100 parallel trials:
We determined the power of each comparison as the number of parallel trials in which the comparison was significant divided by the total number of replications (in this case, 100). All 100 parallel trials were conducted via random selection of clinical avatars; therefore, we conducted a 1-way, repeated-measures ANOVA over the 5 protocols, where the null hypothesis was that all protocols would have the same outcome. This was repeated independently for each of the 100 parallel clinical trials.
We conducted ANOVA on the combined data for all 100 trials via the R package Companion to Applied Regression.34 We used the Tukey test for pairwise comparison of protocols on the combined data for all 100 trials. We compared the outcomes, both TTR and clinical event rates, across all 5 protocols in each subpopulation. A P value of ≤0.05 was used as the level of significance for all statistical methods.
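The empirical power estimate in the first technique (the share of parallel trials whose per-trial ANOVA reached significance) reduces to a simple ratio. A minimal sketch with placeholder P values (the study's actual per-trial P values came from its repeated-measures ANOVAs):

```python
def empirical_power(per_trial_p_values, alpha=0.05):
    """Fraction of parallel trials in which the comparison was significant."""
    significant = sum(1 for p in per_trial_p_values if p < alpha)
    return significant / len(per_trial_p_values)

# Illustrative placeholders: 97 of 100 hypothetical trials significant.
p_values = [0.01] * 97 + [0.20] * 3
power = empirical_power(p_values)
```

This replication-based estimate complements the combined-data ANOVA: a comparison that is significant in nearly every independently sampled trial is robust to avatar sampling variability.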
We identified Aurora patient EMRs indicating prescription anticoagulation agents, including warfarin, over a 10-year period. The longitudinal data set included the EMRs of 157 450 patients who were treated with any of 10 different anticoagulation agents. After data cleaning and quality assurance, the resulting data included 14 206 unique patients with AF prescribed warfarin who were then used for BNM training and validation. BNM of the Aurora patient population resulted in a directed acyclic graph (Figure IA in the Data Supplement) and associated conditional probability tables. Univariate and bivariate variable analysis demonstrated no significant difference between the model and the validation data. As described by the power analysis (Data Supplement), we estimated that 15 000 clinical avatars were necessary for each of the 100 parallel clinical trials. Subsequently, we generated 1 500 000 clinical avatars from the trained and validated BNM.
The study population’s characteristics showed no statistically significant differences between the Aurora warfarin population and the clinical avatar population for either continuous or discrete variables (at the P<0.05 significance level; Table 3).35 The clinical avatar population had an average age of 67.2±14.47 years, with 53.10% female and 46.90% male. Racial characteristics included 95.22% white, 4.19% black, and <1% Asian, American Indian/Alaskan, or Pacific Islander. Stratifying the avatar population according to responder status, based on the combination of CYP2C9 and VKORC1 used on the US Food and Drug Administration warfarin label (Figure 1), demonstrates 61% normal responders, 35% sensitive responders, and 4% highly sensitive responders.
Primary Outcome Metric
Across the whole population, the TTR of the control arm was 57.2%±0.17 (Table 4). Compared with the control arm, increasingly complex treatment protocols demonstrated significantly higher TTR (P<0.05). Only the moderately complex protocol, PGAA, demonstrated well-controlled INR (TTR≥65%), with a TTR of 77.4%±0.14. In addition, results using different TTR thresholds for well-controlled INR, 70% and 75%, are provided in Tables IA and IIA in the Data Supplement.
The largest subpopulation, normal responders ≥65 years old (35%), had significantly higher TTR under all protocols than the whole population. All protocols achieved well-controlled INR, including the least complex control protocol, with a TTR of 81.6%. In contrast, normal responders <65 years old (26%) did not achieve well-controlled INR under the control protocol, with a TTR of 53.4%. However, the clinically guided CAA protocol and the pharmacogenomic-guided PGAA, PGPGI, and PGPGA protocols all demonstrated well-controlled INR. The CAA protocol was the least complex of these protocols and produced a TTR of 80.0%.
A smaller subpopulation of sensitive responders, ≥65 years old (20%), required pharmacogenomic-guided dosing to achieve well-controlled INR. This population had TTR lower than the whole population in all protocols. Only the moderately complex pharmacogenomic-guided protocol, PGAA, had well-controlled INR with a TTR of 74.3%. Similarly, sensitive responders <65 years old (15%) and highly sensitive responders ≥65 years old (2%) achieved well-controlled INR only in the PGAA protocol with TTR of 65.8% and 66.7%, respectively, which was significantly higher than the control arm TTR. However, highly sensitive responders <65 years old (2%) had very poor INR control in the control arm and clinically guided protocol with TTR of 12.9% and 18.9%, respectively. In this subpopulation, despite significantly improved TTR in the pharmacogenomic-guided protocols, the other complex protocols failed to reach TTR ≥65%.
Secondary Outcome Metrics
Comparing any personalized warfarin therapy protocol with the control protocol, there was a significant reduction in both the number of ischemic strokes and intracranial hemorrhagic events in the whole population and within some, but not all, subpopulations stratified by age and genotype. Increasing the complexity of warfarin therapy protocols had a greater impact on reducing the number of intracranial hemorrhages than on reducing ischemic strokes (Tables IIIA and IVA in the Data Supplement).
The number of intracranial hemorrhagic events and the average incidence rate dropped most in the pharmacogenomic-guided protocols compared with the control arm. PGPGA had the lowest number of intracranial hemorrhages, with 78 events and an incidence rate of 5.2 (P<0.05) over 90 days. Ischemic stroke events were also lowest in the pharmacogenomic-guided protocols compared with the less complex clinically guided or control protocols. The number of stroke events fell to 26, for a rate of slightly <2 per 1000 clinical avatars, in both the PGPGI and PGPGA arms (Table 5).
Subpopulations stratified by age and genotype demonstrated significantly different incidence rates of intracranial hemorrhages and ischemic strokes over 90 days. Normal responders, ≥65 years old, had a similar number of events and incidence of events in all protocols regardless of protocol complexity (Figure 3). Normal responders <65 years old saw a reduction in clinical events in more complex protocols compared with the control protocol. The most complex pharmacogenomic-guided protocols, PGPGI and PGPGA, saw the lowest combined number of ischemic stroke and intracranial hemorrhage events (Table 5). However, this was only marginally improved compared with the clinically guided CAA protocol over 90 days (Table 5).
Similarly, sensitive responders ≥65 years old saw the greatest reduction in clinical events in the pharmacogenomic-guided protocols compared with the control protocol. Highly sensitive responders of any age and sensitive responders <65 years old had the greatest reduction in combined intracranial hemorrhages and ischemic strokes in the pharmacogenomic-guided protocols (Figure 3). The greatest reduction in ischemic stroke was seen in the PGPGI and PGPGA arms compared with the control arm. Highly sensitive responders <65 years old fell from a maximum of 9 ischemic strokes per 1000 clinical avatars in the control arm to 2 in both the PGPGI and PGPGA arms over 90 days (P<0.05; Table IIIA in the Data Supplement).
Expanded results can be viewed through an interactive online viewer at https://ari-cds-ar.shinyapps.io/circcvg_paper_app/.
In this 90-day, 5-arm simulated warfarin therapy clinical trial, clinically guided or pharmacogenomic-guided protocols significantly improved primary and secondary outcomes across the whole study population when compared with the least complex control protocol. Furthermore, subpopulations stratified by age and genotype demonstrated significant differences in outcome according to protocol complexity.
In other studies, both responder status and age have independently been shown to predict TTR and the risk of clinical events.36–38 On the basis of simulation results including both TTR and clinical events, matching subpopulations defined by genotype and age with the least complex warfarin therapy protocol could produce optimal outcomes. Providing patients and clinicians with the least complex warfarin therapy could reduce the medical errors associated with medication administration and decrease the challenges involved in implementing complex therapy protocols in everyday clinical settings.27–30 Furthermore, providing patients with the least complex warfarin therapy possible, including once-daily prescriptions and no pill cutting, can increase protocol adherence, increase TTR, and improve patient satisfaction.28
Using a simulation approach to personalize anticoagulation therapy offers several advantages, and similar techniques have been used previously to calculate the cost-effectiveness of different anticoagulation therapies.39 The simulation platform used in this study provides possible outcomes, including a clinical event calculation that can be applied across multiple warfarin therapy protocols for small subpopulations, at a fraction of the time and expense of large clinical trials. In addition, because this approach uses clinical avatars, results can be made publicly available while maintaining Health Insurance Portability and Accountability Act compliance. Furthermore, using clinical avatars allows the same avatar to be tested across multiple protocols, mitigating the counterfactual problem of clinical trials in the simulation setting. The results generated through this simulation approach can be used to refine individual protocols for specific subpopulations. The refined protocols can then be simultaneously evaluated across the entire population to determine the most advantageous clinical trial design.
Normal responders ≥65 years old, representing 35% of the total study population, had both well-controlled INR and low rates of clinical events in the least complex control arm (Table 6).
However, normal responders <65 years old, representing 26% of the study population, demonstrated well-controlled INR and low clinical event rates only under the clinically guided or pharmacogenomic-guided protocols. Optimizing for the least complex protocol would suggest that the clinically guided protocol is best for these patients. Consistent with prior studies, our results show that younger age is associated with lower TTR under some protocols.32 Our results suggest that the AAA, CAA, and PGAA protocols do not appropriately account for the increased warfarin metabolism of younger patients with AF with respect to either initial dose or dose adjustment (days 3–7). The PGPGI and PGPGA protocols, which have more frequent dose adjustment (days 3–7), show similar TTR for patients of all ages. This would suggest that younger patients with AF should have closer INR monitoring and dose adjustment, particularly in the first week after initiating warfarin.
Sensitive and highly sensitive responder patients, accounting for the remaining 39%, derived the greatest benefits from pharmacogenomic-guided warfarin therapy. Within the control arm, these patients demonstrated a significantly higher rate of clinical events compared with normal responders, consistent with a recently reported clinical trial.36 We observed no clinical benefit from increasing complexity beyond the moderately complex pharmacogenomic-guided protocol for these populations. This pharmacogenomic-guided protocol, with pharmacogenomic guidance for the initial dose alone, demonstrated the highest TTR across the entire population and all subpopulations. Similarly, this arm also had significantly lower rates of ischemic stroke and intracranial hemorrhagic events compared with the clinically guided and control arms. The benefits of pharmacogenomic-guided warfarin therapy are consistent with several additional recent studies suggesting that pharmacogenomic-guided warfarin dosing is associated with fewer clinical events.21–23,37 The most complex pharmacogenomic-guided protocols used in arms 4 and 5, which included pharmacogenomic guidance for initial dose and adjustment period, days 3 through 7, demonstrated significantly inferior TTR compared with either the least complex pharmacogenomic-guided arm or the clinically guided arm. These results are consistent with the outcomes from recent warfarin therapy clinical trials comparing pharmacogenomic-guided versus standard best practice and clinically guided dosing.23,24 For patients initiating warfarin therapy, it is important that providers administering warfarin have evidence to support the use of more or less complex warfarin therapy when appropriate.
Sensitive responders <65 years old and highly sensitive responders ≥65 years old, 17% of the population, marginally achieved well-controlled INR only under the PGAA protocol according to a TTR threshold of 65%. This would suggest that either the PGAA protocol or NOACs might be the preferred anticoagulation therapy for these patients, depending on patient and provider preference. A direct comparison between the PGAA protocol and NOACs would provide additional clarity on the optimal strategy. For highly sensitive responders <65 years old (2%), who did not achieve well-controlled INR and had higher associated clinical event rates than other subpopulations under any warfarin protocol, NOACs might be the only preferred anticoagulation therapy option.
Highly sensitive responders, in particular, are rare and account for <4% of our study population, but within our framework, this subpopulation is large enough for statistical analysis. Recent pharmacogenomic-guided dosing clinical trials did not report results for these small subpopulations, and their rarity in the general population could account for the differing clinical trial results.23,24 Varying the minimum threshold of well-controlled INR between 65% and 75% affects the suggested optimal anticoagulation therapy for ≈39% of the study population, including sensitive and highly sensitive responders (Table 6; Tables IA and IIA in the Data Supplement). A direct comparative effectiveness study powered for appropriate subgroup analysis of these patients is necessary to determine whether NOACs are particularly beneficial for them.
These results suggest that age (<65 or ≥65 years) and combination of VKORC1 and CYP2C9 genotype can be used to determine optimal anticoagulation therapy. Patients initiating warfarin therapy could be triaged to the optimal least complex warfarin therapy protocol, or potentially NOACs, using the combination of age and genotype. The possibility of identifying patients who would optimally benefit from more complex pharmacogenomic-guided dosing or NOAC therapy has important cost-saving and clinical outcome implications.
This study is based on a clinical outcome calculation model using a combination of patient EMR data and subsequent mathematical modeling. As such, there are necessary simplifications within the model, and it is potentially limited in its ability to account for all covariates and confounders of warfarin therapy. For example, outcomes for patients with certain rare CYP2C9 variants were excluded. These patients account for ≈1.5% of the US warfarin-receiving population and were not included in the pharmacokinetic/pharmacodynamic simulation, including the *4 and *5 variants noted to be more prevalent in the black population. The simulated clinical trial framework was validated against results from the CoumaGen clinical trial and therefore does not predict how warfarin therapy is executed in daily clinical practice.25 Although INR prediction has been previously validated in a simulation setting, the clinical outcome calculation derived from daily INR values has not been. Furthermore, clinical event prediction is predicated on daily INR values rather than on all real-world variables related to clinical events. Therefore, the distribution of predicted clinical events between protocols and subpopulations is more pertinent than the absolute rates. Minor improvements in clinical outcomes demonstrate statistical significance because of the large study population and the number of parallel simulated clinical trials; however, statistical significance in the setting of a large simulation does not necessarily confer robust evaluation of the null hypothesis, and treatment effects determined to be statistically significant may be too small to be clinically relevant. Simulation results should thus be interpreted with caution and emphasis placed on the distribution of outcomes between subpopulations and arms.
In a 5-arm, 90-day simulated clinical trial, more complex warfarin therapy protocols did not universally improve TTR or reduce clinical events. Subpopulations defined by age (<65 or ≥65 years old) and genotype (combined variants of CYP2C9 and VKORC1) demonstrated significantly different outcomes when clinical and genotype information was included in the warfarin therapy protocol. When considering warfarin therapy for newly diagnosed AF, triaging the patient by age and genotype could optimize the choice of anticoagulation treatment. A prospective study investigating the surrogate outcome, TTR, as well as clinical events could be considered to validate these results.
We thank Dr Maharaj Singh, principal biostatistician at Aurora Research Institute, for his scientific and statistical advice. We also thank Andrew Marek for his assistance in identifying, collecting, and warehousing the electronic medical record (EMR) data used in this study. We also thank Marcus Walz for his technical support and Sandra Kear for editing. Finally, we thank Dr Michael Michalkiewicz for his significant efforts to approve and facilitate the use of Aurora EMR data for this study and for scientific advice.
Sources of Funding
This work was supported by the National Institutes of Health Grant 1R01LM011566.
The Data Supplement is available at http://circgenetics.ahajournals.org/lookup/suppl/doi:10.1161/CIRCGENETICS.117.001804/-/DC1.
- Received March 22, 2017.
- Accepted September 20, 2017.
- © 2017 American Heart Association, Inc.
- 20. US Food and Drug Administration. Warfarin (Coumadin) product labeling. https://www.accessdata.fda.gov/drugsatfda_docs/label/2011/009218s107lbl.pdf. Accessed March 23, 2017.
- 33. R Core Team. R: A Language and Environment for Statistical Computing [Computer Program]. Vienna, Austria: R Foundation for Statistical Computing; 2016.
CLINICAL PERSPECTIVE
The goal of the Precision Medicine Initiative, launched in 2015, was to transition medicine from a one-size-fits-all approach to one that accounts for individual differences in people's genes, environments, and lifestyles. Despite these efforts, the clinical trial framework for translating scientific evidence to the bedside remains largely unchanged. In this article, we suggest a new approach to enhance clinical trial design using electronic health records combined with computer simulation. This method creates a virtual sandbox in which clinicians and investigators can compare various clinical outcomes across diverse patient populations to support the most efficient and effective clinical trial design. This study applies the simulation approach to the thorny issue of anticoagulation therapy with warfarin for stroke prevention in atrial fibrillation. The simulated clinical trial in this study demonstrates that knowing just the age and genotype of a patient optimizes the selection of the appropriate anticoagulation regimen and should be favored over a one-size-fits-all approach. An additional advantage of simulation is that the resulting data can be made public. We therefore introduce readers to the idea of living papers with an embedded online tool that readers can use to navigate the study results by selecting subpopulations or dosing regimens of interest. Although simulation will not replace real-world clinical trials in the foreseeable future, it could help design better clinical trials, bringing us faster toward the bold goals of the Precision Medicine Initiative.