
21st Century Cures Act: Boosting biomedical research, but at what cost?

Co-authored by Andrew Hooper & Nafis Hasan

In a remarkable display of bipartisanship, the Senate passed HR 34 and President Obama signed the 21st Century Cures Act into law on December 13, 2016. The original bill was introduced and sponsored by Rep. Suzanne Bonamici (D-OR) in January 2015 and garnered co-sponsors from both sides of the aisle, including Rep. Lamar Smith (R-TX), Chairman of the House Committee on Science, Space, and Technology. The House approved the original bill in October 2015; after a year on the Senate floor, during which the bill underwent several amendments proposed by both Democrats and Republicans, the Senate approved it on December 6, 2016 and sent it to President Obama to be signed into law.

This law is meant to accelerate drug development and bring cutting-edge treatments to patients, and to overhaul mental health research and treatment, with a strong focus on the opioid crisis sweeping across the nation. The law is also of significant importance to biomedical scientists, as it expands funding for certain fields in line with the Precision Medicine Initiative launched in 2015. More specifically, the Cures Act will fund specific NIH innovation projects: the Precision Medicine Initiative ($1.455 billion through FY 2026), the BRAIN Initiative ($1.511 billion through FY 2026), the Cancer Moonshot ($1.8 billion through FY 2023) and the Regenerative Medicine (stem cell) program ($30 million through FY 2020). In addition, the law will stimulate innovative research by rewarding investigators through “Eureka” prize competitions for “significant advances” or “improving health outcomes”. The law also seeks to promote new researchers through its Next Generation of Researchers Initiative, an attempt to address the postdoc crisis in academia. In response to the underrepresentation of women and minorities in STEM fields, the law contains provisions to attract and retain such scientists in “priority research areas”. Finally, to further encourage early-stage researchers, the law authorizes programs to help repay student loans and raises the cap on the repayment assistance available to researchers.

Besides ensuring funding for biomedical research, this law aims to address privacy concerns raised by experts regarding patient information in the era of precision medicine (for more details, check out our analysis of the precision medicine initiative). Under this law, certificates of confidentiality will be provided to all NIH-funded researchers whose studies involve the collection of sensitive patient information, and the NIH may withhold such information from disclosure under the Freedom of Information Act. At the same time, to make data sharing easier for scientists, the law directs the NIH to cut through the red tape and regulations that obstruct scientists from attending scientific meetings and sharing data.

Despite the generally positive reception of the Cures Act by NIH officials and research scientists, the bill was not without its critics. The principal criticism of the final product is that it constitutes a handout to pharmaceutical and medical device companies by substantially weakening the FDA’s regulatory check on bringing new treatments into the clinic.

For example, Sydney Lupkin and Steven Findlay point to the $192 million worth of lobbying collectively expended by over a hundred pharmaceutical, medical device, and biotech companies on this and related pieces of legislation. The goal of this lobbying, Lupkin and Findlay assert, was to give the FDA “more discretion” in deciding how new drugs and other treatments gain approval for clinical use – presumably saving a great deal of money for the companies that develop them. Adding weight to their assertion is the fact that President Trump is reportedly considering venture capitalist Jim O’Neill for FDA commissioner. Mr. O’Neill is strongly supported by libertarian conservatives who see FDA regulations as inordinately expensive and cumbersome, so it seems reasonable to worry about how Mr. O’Neill would weigh safety against profit in applying his “discretion” as head of the FDA. On the other hand, under a wise and appropriately cautious commissioner with a healthy respect for scientific evidence, we might hope that maintaining high safety standards and reducing the current staggering cost of drug development are not mutually exclusive.

Additionally, Dr. David Gorski writes of one provision of the Cures Act that appears to specifically benefit a stem-cell entrepreneur who invested significantly in a lobbying firm pushing for looser approval standards at the FDA. Once again, it is not unreasonable to suspect that there is room to reduce cost and bureaucratic red tape without adversely impacting safety. And in fairness to the eventual nominee for FDA commissioner, previous commissioners have not been universally praised for their alacrity in getting promising treatments approved efficiently… at least, not within the financial sector. Still, the concerns expressed by medical professionals and regulatory experts over the FDA’s continued intellectual autonomy and ability to uphold rigorous safety standards are quite understandable, given the new administration’s enthusiasm for deregulation.

It appears that this law will also allow pharmaceutical companies to promote off-label uses of their products to insurance companies without holding clinical trials. Additionally, pharma companies can submit “data summaries” instead of detailed clinical trial data to support using products for “new avenues”. It is possible that these provisions were created with the NIH basket trials in mind (discussed in the piece that follows). However, as Dr. Gorski argues, without clinical trial data, off-label use of drugs will be based on “uncontrolled observational studies”, which, while beneficial for pharma companies, are risky for patients from the perspective of patient advocacy groups. These fears are not without evidence: a recent article from STAT describes how the off-label use of Lupron, a sex-hormone suppressor used to treat endometriosis in women and prostate cancer in men, has resulted in a diverse array of health problems in twenty-somethings who received the drug during puberty.

Another “Easter egg”, albeit an unpleasant one, awaits scientists and policy-makers alike. Buried in Title V of the law is a $3.5 billion cut to the Department of Health and Human Services’ Prevention and Public Health Fund, with no explanation offered for the cut. Given the outcry over the lack of public health initiatives in the Precision Medicine Initiative, one is again left to wonder why 21st century cures focus only on treatment and drug development and not on policies directed toward promoting public health and preventing disease.

In conclusion, the implementation of this law will largely depend on the current administration. With the NIH budget for FY2017 still up in the air and the confirmation of nominees still hanging in the balance, this law is far from being implemented. Based on its provisions, it appears that overall biomedical funding will be boosted in particular fields designated as “priority research areas”. However, an observant reader will also notice that the bill gives pharma companies greater latitude to exploit consumers. It therefore remains a question whose priorities (consumers and patients, or investors and corporations) come first, and the answer, in our humble opinion, will be determined by a dialogue between the people and the government.


Precision Medicine: Too Big to Fail?

In January 2015, President Obama announced the launch of the “Precision Medicine Initiative”, proclaiming that it would usher in “a new era of medicine that makes sure new jobs and new industries and new lifesaving treatments for diseases are created right here in the United States.” He added that the promise of the initiative lay in “delivering the right treatments, at the right time, every time to the right person”. The initiative, with bipartisan support in Congress, provided a total investment of $215 million in 2016, shared by the NIH, the FDA and the Office of the National Coordinator for Health Information Technology (ONC), with a large portion of the money ($70 million) awarded to the NCI to “scale up efforts to identify genomic drivers in cancer and apply that knowledge in the development of more effective approaches to cancer treatment”. The initiative does not stop at the genome, as Dr. Francis Collins, Director of the NIH, pointed out in an interview with PBS NewsHour: it is meant to capture environmental exposures, lifestyle choices and habits, and pretty much everything else that can affect one’s health. Given the mass of information that will be generated (the initiative aims to enroll 1 million volunteers in its cohort), it is no surprise that patient privacy and database infrastructure are major concerns in this mammoth undertaking.

In addition to this initiative, the US government launched its “Cancer Moonshot” program a year later, in January 2016. This program, under the leadership of Vice President Joe Biden and advised by an expert panel, the “Cancer Moonshot Task Force”, aims to “make more therapies available to more patients, while also improving our ability to prevent cancer and detect it at an early stage.” Since cancer is widely accepted to be a genetic disease, it seems a fitting poster child for an initiative that aims to cure and prevent disease by tailoring therapy to an individual’s genetic information.

Tied to these two initiatives is the latest approach to clinical trials at the NCI, commonly termed “basket trials”. Encouraged by exceptional case reports in which patients responded to drugs not typically used for their type of cancer, the NCI set out to test drugs against cancers they were not originally developed for; thus the Molecular Analysis for Therapy Choice (MATCH) and the Molecular Profiling-based Assignment of Cancer Therapy (MPACT) trials were incorporated into the Precision Medicine Initiative. The NCI-MATCH trial aims to sequence tumor biopsy specimens from ~6,000 patients to identify mutations expected to respond to the targeted drugs selected for the trial; these drugs are already approved by the FDA for certain cancer types or are being tested in other clinical trials. The MPACT trial, in turn, will compare whether patients with solid tumors fare better on targeted therapy than on non-targeted therapy.

The NCI-MATCH trial explained. Source: National Cancer Institute website.
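To make the matching rule concrete, here is a minimal sketch of mutation-based arm assignment in the spirit of the scheme above; the gene names, arm labels and first-match rule are hypothetical stand-ins, not the actual NCI-MATCH arm list or assignment algorithm.

```python
# Hypothetical basket-trial assignment: patients are routed to a drug arm by
# actionable mutation, not by tumor site. Gene/arm names are illustrative only.

ARMS = {
    "BRAF_V600E": "arm_A_braf_inhibitor",
    "ERBB2_amplification": "arm_B_her2_inhibitor",
}

def assign_arm(patient_variants):
    """Return the first treatment arm whose target lesion the tumor carries."""
    for variant in patient_variants:
        if variant in ARMS:
            return ARMS[variant]
    return None  # no actionable match; the patient goes off-study

print(assign_arm({"TP53_mutation", "BRAF_V600E"}))  # -> arm_A_braf_inhibitor
```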

Despite the initial fanfare, the recently released NCI-MATCH interim analysis report does not paint a pretty picture of the trial’s prospects. While enrollment was higher than expected (795 people registered in the first three months, against a projection of 50 patients per month) and the labs were able to sequence most of the tumors (87%), it was also found that “most of the actual mutation prevalence rates were much lower than expected based on estimates from The Cancer Genome Atlas and other sources”. In fact, the overall expected mutation match rate has been adjusted down to 23% across the study’s 24 treatment arms as it continues.
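Some back-of-envelope arithmetic shows why the adjusted match rate matters; the sketch below assumes, unrealistically, that matched patients spread evenly across arms, whereas the report’s prevalence findings imply that rare-mutation arms will accrue far fewer.

```python
# Rough accrual arithmetic from the interim report's figures.
screened = 6000     # planned tumor biopsies
match_rate = 0.23   # adjusted expected mutation match rate
arms = 24           # treatment arms in the study

matched = screened * match_rate   # ~1380 patients assignable to any arm
per_arm = matched / arms          # ~58 patients per arm, if spread evenly
print(round(matched), round(per_arm, 1))
```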

While no endpoint has yet been reached that would support conclusive remarks about this trial, data from other clinical trials that have taken a similar approach do not look favorable. In the SHIVA trial, a randomized phase II trial carried out in France, 99 patients treated on the basis of identified mutations were compared with 96 patients treated with drugs of their physicians’ choice; median progression-free survival was 2.3 and 2.0 months, respectively. Current clinical data on patients with relapsed cancers, a major focus of the MATCH trial, do not look favorable either. As Dr. Vinay Prasad, a hematologist-oncologist at the Knight Cancer Institute, points out, only 30% of such patients respond to drugs chosen on the basis of biological markers, with a median progression-free survival of 5.7 months. Based on this response rate, he estimated that only about 1.5% of patients with relapsed and refractory solid tumors would benefit from the precision medicine approach.
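One way to reconcile those two figures, though this is our inference rather than Dr. Prasad’s published derivation, is that an overall benefit rate is roughly the fraction of patients matched to a targeted drug multiplied by the response rate among the matched:

```python
# Implied match fraction, inferred (our assumption) from the quoted numbers.
response_rate = 0.30     # responders among biomarker-matched patients
overall_benefit = 0.015  # Prasad's ~1.5% estimate for all comers

implied_match_fraction = overall_benefit / response_rate
print(f"{implied_match_fraction:.1%}")  # ~5% of patients even get a match
```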

In a review of current and past clinical trials that have used the targeted therapy approach, Tannock & Hickman (NEJM, 2016) warn about the limitations of such an approach: the heterogeneity and clonal evolution of cancer cells under targeted therapy, the gap between expected and clinically achievable levels of inhibition of candidate molecules and, of course, the efficacy of such therapies compared to currently available, standard but effective therapies such as aromatase inhibitors for breast cancer. While one can argue that tumor heterogeneity can be countered with combination targeted therapy, the authors point out that “combinations of molecular targeted agents that target different pathways have often resulted in dose reduction because of toxic effects… in a review of 95 doublet combinations in 144 trials, approximately 50% of the combinations could use the full doses that were recommended for use as single agents, whereas other doublets required substantial dose reductions.” Even if intratumoral heterogeneity can be countered with combination targeted therapy, a much-overlooked point in this initiative is the cost of such a treatment strategy, given the exorbitant price of targeted cancer therapy. A socio-economic disparity already exists among cancer patients, and this initiative does little to address how to bridge it. Questions such as how many drugs a patient would have to take, especially for highly heterogeneous tumors such as glioblastoma multiforme, and how that would affect the patient’s standard of living need to be answered before heralding a victory for the precision oncology approach, even if the MATCH trial outcomes are favorable.

In another recent study, Dr. Victor Velculescu and his team at Johns Hopkins showed that sequencing only tumor genetic material can lead to false positives. After analyzing the tumor sequencing data of 815 cancer patients and comparing it with data from the same patients’ healthy tissue, they found that 65% of the genetic changes identified by tumor-only sequencing were unrelated to the cancer and were therefore “false positives”. The team also found that 33% of mutations that are targets of currently available drugs were false positives once the patient’s germline genome was compared with the tumor genome; this affected 48% of the patients in their cohort.
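The core of the matched-normal comparison is simple to sketch: any tumor variant also present in the patient’s germline cannot be a tumor-specific (somatic) event. The coordinates below are made-up placeholders; real pipelines compare VCF files with dedicated tools rather than Python sets.

```python
# Why tumor-only sequencing inflates "actionable" findings: without a matched
# normal, inherited (germline) variants masquerade as somatic mutations.

def somatic_variants(tumor_calls, germline_calls):
    """Keep only variants absent from the patient's matched normal tissue."""
    return tumor_calls - germline_calls

tumor = {("chr7", 140453136, "A", "T"),    # tumor-specific call
         ("chr17", 41244936, "G", "A")}    # also present in healthy tissue
germline = {("chr17", 41244936, "G", "A")}

print(somatic_variants(tumor, germline))   # only the first call survives
```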

This is not the first study of its kind to warn against false positives in the search for disease-causing mutations. Findings from the Exome Aggregation Consortium (ExAC), the largest catalogue of genetic variation in the protein-coding sequence of the human genome, show that of the 54 (on average) “pathogenic” mutations present in an individual’s genome, 41 “occur so frequently in the human population that they aren’t in fact likely to cause severe disease”. This stands in direct contrast with studies that reinforce the idea that many more “oncogenes” remain to be found that could serve as novel drug targets.
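The ExAC-style sanity check can likewise be sketched as a frequency filter: a variant carried by a sizable fraction of healthy people is unlikely to cause severe disease. The variants, annotations and 1% cutoff below are all illustrative assumptions; in practice one would query ExAC/gnomAD allele frequencies directly.

```python
# Flag "pathogenic" calls that are too common in the population to be credible.
COMMON_AF = 0.01  # illustrative threshold: >1% population allele frequency

annotated = [
    {"variant": "GENE1:p.R100W", "claimed": "pathogenic", "exac_af": 0.12},
    {"variant": "GENE2:p.G55D",  "claimed": "pathogenic", "exac_af": 0.0001},
]

plausible = [v for v in annotated if v["exac_af"] < COMMON_AF]
print([v["variant"] for v in plausible])  # only the rare variant remains
```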

The paradigm behind the MATCH trial, and behind the Precision Medicine Initiative in general, seems blind to an obvious aspect of biology: context matters, especially for mutations deemed “carcinogenic”. As outlined in a recent paper by Zhu et al (Cell, 2016) and in the famous “bad luck” paper by Tomasetti and Vogelstein, stem cells and their differing regenerative properties across tissue types appear responsible for the differing rates of carcinogenesis across those tissues, a finding that again buttresses the idea that tissue specificity matters. Indeed, Iorio et al (Cell, 2016) showed just that for the pharmacogenomic interactions of currently available cancer drugs, using patient data from the TCGA and other databases. Taking a big data and machine learning approach, the authors developed a logic-based model to predict the efficacy of any drug, approved or in clinical trials, against the mutation it targets in different cancer types, which is essentially the premise of the MATCH trial. Strikingly, tissue specificity determined the pharmacological agents’ effects on their intended molecular targets; more specifically, only one drug interaction (out of 265 drugs tested) was significant across multiple cancer types, which may temper expectations for the MATCH trial’s outcome. A blanket approach that targets mutations across tissue types without regard to their environments may prove futile in the light of such findings.
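The flavor of that tissue-by-tissue analysis can be sketched as follows: within each cancer type, compare drug sensitivity (e.g., log IC50) between mutant and wild-type tumors. The data and column names are invented, and Iorio et al used a considerably more elaborate ANOVA with covariates, so this is only a toy illustration of the idea.

```python
import pandas as pd
from scipy import stats

# Toy drug-response table: one mutation, two tissues (values invented).
df = pd.DataFrame({
    "tissue":   ["lung"] * 4 + ["colon"] * 4,
    "mutant":   [True, True, False, False] * 2,
    "log_ic50": [-2.1, -1.8, 0.3, 0.5,   0.4, 0.6, 0.2, 0.5],
})

# Test the drug-mutation association separately within each tissue type.
for tissue, grp in df.groupby("tissue"):
    mut = grp.loc[grp["mutant"], "log_ic50"]
    wt = grp.loc[~grp["mutant"], "log_ic50"]
    _, p = stats.ttest_ind(mut, wt)
    print(tissue, round(p, 3))  # the mutation predicts sensitivity in one tissue only
```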

The evidence from all these basic science and clinical studies raises the question of whether precision medicine is doomed to fail. While the gene-centric view of disease etiology has deepened in the years since the completion of the Human Genome Project, does this evidence point to the need for another paradigm in our understanding of cancer and other complex diseases, whose cures have been presumed to lie in genetic aberrations and molecular targets? An even more troubling question, relevant in this era of big data, is whether we actually understand what the data are telling us. As the prominent cancer researcher Dr. Robert Weinberg admits, “while data mining, as it’s now called, occasionally flags one or another highly interesting gene or protein, the use of entire data sets to rationalize how and why a cancer cell behaves as it does is still far beyond our reach”. A strong critic of the initiative, Dr. Michael Joyner of the Mayo Clinic, notes that while “hundreds of genetic risk variants with small effects have been identified”, “for widespread diseases like diabetes, heart disease and most cancers, no clear genetic story has emerged for a vast majority of cases”, and “when higher-risk genetic variants are found, their predictive power is frequently dependent on environment, culture and behavior”.

The success of the Precision Medicine Initiative, and of the precision oncology approach in particular, ultimately rests on whether molecularly targeted therapy can stem deaths from cancer and other complex diseases. Unfortunately, large-scale public health initiatives appear to have done more toward that end (tobacco control, for example, has sharply cut lung cancer incidence, and diet and exercise can cut the risk of progressing from pre-diabetes to diabetes by nearly two-thirds) than targeted therapy has achieved. Yet such public health successes seem to have been overlooked by the Cancer Moonshot panel: in February 2016, right after the program was announced, public health researchers across the country had to urge the Vice President to make prevention a bigger focus of cancer control, rather than just the search for a cure. One would think such an approach would be incorporated into a billion-dollar initiative by default, but this did not seem to be the case, and one must wonder why.

For this huge, publicly-funded initiative to achieve more than lukewarm outcomes and actually become the breakthrough it is promised to be, the Precision Medicine Initiative needs to break free of its gene-centered tunnel vision and incorporate all the factors that affect an individual’s health, such as lifestyle choices and environmental exposures, as Dr. Collins promised it would. The initiative is still in its infancy; changes based on clinical trial and basic science evidence should be made early enough to secure favorable outcomes, so that the government is not forced to stage another public bailout, as it did in 2008 for the failing banks and Wall Street corporations deemed “too big to fail”.