
GSC COMMITTEE & CLUB UPDATES: MARCH 2017

GSC Career Paths Committee

On February 13, 2017, the GSC Career Paths Committee kicked off the year with a workshop on the basics of the GraphPad Prism software. In the past, students had expressed interest in analyzing data with statistical software and graphing the data in a presentable fashion, so the GSC thought a workshop on Prism would not only be very useful but also have a significant impact on students’ research careers. The workshop was kindly led by Dr. Dan Cox, a professor in the Neuroscience department, and took place in the computer room of the Sackler library. It was well received, according to GSC representatives Vaughn Youngblood and Roaya Alqurashi. “(The workshop) was a successful one. The attendees loved how Dr. Cox explained each application you will need to use in Prism with an active learning experience,” Roaya said. Vaughn added, “The Prism workshop was helpful! It taught the fundamentals of using Prism along with how to represent different types of data. Hopefully, we can bring Dr. Cox in for another session with another statistical program like R.” If time permits this year, the Career Paths Committee hopes to hold several more workshops like this with other analysis software packages (R, SAS, etc.).
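For readers curious what that kind of analysis looks like outside a point-and-click package, here is a minimal sketch in Python (chosen only for illustration; the data, group names, and output file name are hypothetical) of the sort of two-group comparison and bar plot Prism is typically used for. The same workflow translates directly to R.

```python
# A minimal sketch, in Python rather than Prism or R, of the kind of analysis
# discussed in the workshop: comparing two groups with an unpaired t-test and
# plotting the result. All data, group names, and the output file name here
# are hypothetical and used purely for illustration.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=10.0, scale=2.0, size=12)  # hypothetical control measurements
treated = rng.normal(loc=13.0, scale=2.0, size=12)  # hypothetical treated measurements

# Unpaired two-sample t-test, a common first analysis for two-group data
t_stat, p_value = stats.ttest_ind(control, treated)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Bar plot with SEM error bars, a typical "presentable" way to graph the data
means = [control.mean(), treated.mean()]
sems = [stats.sem(control), stats.sem(treated)]
plt.bar(["Control", "Treated"], means, yerr=sems, capsize=5)
plt.ylabel("Measurement (arbitrary units)")
plt.title("Hypothetical two-group comparison")
plt.savefig("two_group_comparison.png", dpi=150)
```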

Harder, Better, Stronger, Faster through CRISPR?

After a year-long, intense and bitter dispute over the rights to patent the use of CRISPR-Cas9 gene-editing technology in mammalian cells, the US patent office ruled in February that “there is no interference” and that the patent belongs to the Broad Institute. Needless to say, the effects of this verdict extended beyond the emotions of the scientists locked in the dispute: shares of Editas Medicine, co-founded by Feng Zhang of the Broad, jumped 29% by the close of trading. What does this mean for other biotech companies looking to harness the power of this novel technology? It means they can license the use of the Broad’s patents only if Editas passes on a specific disease-related application, since Zhang holds the patent for applying CRISPR to disease. However, this does not mean that Doudna and Charpentier, the two heavyweights on the losing side of this patent battle, are going to miss out on benefitting from their discoveries. UC Berkeley has already filed a patent for the application of CRISPR technology in all cells, which the Broad is not contesting. Biotechs that had invested in obtaining patents through UC Berkeley may therefore still benefit despite the loss in the patent fight.

The promise of CRISPR-Cas9 for disease applications, while still in its nascent stages, is a real possibility. CRISPR has yet to enter human clinical trials en masse for such purposes, although last week a team of Chinese scientists reported successfully editing normal human embryos. This hasn’t deterred popular science news outlets from speculating whether the era of designer babies is finally within sight. The speculation may have been compounded by the recent release of a report by an international committee convened by the U.S. National Academy of Sciences and the National Academy of Medicine that cautiously endorses germline editing of human embryos sometime in the future, albeit “only for compelling reasons and under strict oversight”. The panel also suggested that genome editing should be undertaken only after much more research on its risks and benefits. The report appears to be a cautious first step toward unlocking the so-far forbidden zone of germline editing, and a move away from the moratorium established in December 2015 by an international group of leading scientists.

Does this imply, then, that Gattaca or similar engineered societies are upon us? It is possible that heritable diseases may one day be cured, but scientists warn us that engineering complex traits such as intelligence is still a pipe dream. These utopian societies are based on the fantasy of engineering the human species to be “harder, better, stronger, faster”; they completely ignore the scientific evidence that such adjectives usually describe traits that depend on an interplay between genetics and environment. While genetics plays a part, the individual’s environment, along with factors such as diet, lifestyle and socioeconomic status, heavily influences such traits. Even making small changes in the genome has proven challenging, as shown by the efforts to “fix” the mitochondrial genome of babies using the three-parent approach.

The question, then, is not whether we can engineer designer babies, but rather whether we should. It helps to note that this obsession with making human beings “better” is rooted in eugenics and racial supremacy, and history is rife with such examples. What the futurists who propagate such ideas miss is that evolutionary change is not meant to make a species more “efficient”, as the word is understood in technological vernacular, but rather to help the organism adapt to a changing environment. This understanding might shed more light on the role of the environment and pull us out of our obsession with genetic determinism. Maybe if we ARE to build better human beings, we should start by fixing our environmental problems: pollution, climate change, deforestation, the threat of extinction to a large variety of organisms, and so on. The promise of genetic engineering should not blind us to what is more important at the moment. As the saying goes, “a bird in the hand is worth two in the bush,” and if we don’t do our part to save the environment, we won’t have any bushes or birds left.

Notes from the Library…Copyright

As both creators and users of copyrighted works, we should all know a little about this topic.  Below, you will find a brief copyright primer.  For answers to copyright questions pertaining to your thesis, see FAQs for Dissertation & Thesis Writers from the Tufts University Libraries Scholarly Communications Team, where you will find answers to questions such as: Can I delay the release of my thesis?  What if I have already published part of my thesis as an article?  What if I want to reuse a graph created by someone else in my thesis?  What is open access?

What is copyright?

Copyright is a set of rights, which give the owner the exclusive right to do or authorize any of the following: reproduce the copyrighted work; prepare derivative works based on the copyrighted work; sell, transfer ownership of, rent, lease or lend copies of the work; publicly display or perform the work (17 U.S. Code § 106).

The authorization for copyright legislation goes back to Article 1, Section 8, Clause 8 of the United States Constitution.  The Copyright Act of 1976 provides the framework for today’s copyright law, which can be found in Title 17 of the U.S. Code.

What does copyright apply to?

Copyright applies to “original works of authorship fixed in any tangible medium of expression” (17 U.S. Code § 102).  This includes literary, pictorial, graphic, and audiovisual works.  Copyright does not extend to ideas, procedures, processes, methods of operation, concepts, principles or discoveries.  Patent, trademark or trade secret policy may apply to these forms of intellectual property.  For more information on these types of IP protection, see What Are Patents, Trademarks, Servicemarks, and Copyrights? and Trade Secret Policy on the U.S. Patent and Trademark Office website.

How long does copyright last?

In general, for works created on or after January 1, 1978, copyright lasts for the life of the author plus 70 years (17 U.S. Code § 302).

Who owns the copyright to my thesis?

You do!  According to the Tufts Policy on Rights and Responsibilities with Respect to Intellectual Property: “Students generally own the copyright to the academic work they produce.  Academic work can include class papers, theses, dissertations…”.

ICYMI: Public Relations and Communications Essentials for Scientists

When it comes to reporting our scientific findings, we are trained to compose manuscripts that are measured, precise and objective. The mainstream media, however, take a very different approach to broadcasting scientific news: headlines designed to grab readers tend to be more sensationalized, and the articles draw broader, more conclusive statements. Each approach is appropriate in its respective arena, and it is important that we as scientists learn to take advantage of mainstream journalism to publicize our discoveries, not only for the reputations of our university and ourselves, but also to share what we have accomplished with the public, whose tax dollars fund most of our work. Enter the Tufts Public Relations Office, a fantastic resource that gives us the opportunity to share our research with the community outside our scientific world. The purpose of this seminar was to inform the Tufts community about how the office works and how best to use it to our advantage.

The seminar focused on how to work with the PR office when you are ready to publish work that you would like to broadcast beyond scientific journals. Kevin Jiang, assistant director of the office, stressed that the earlier you get in touch, the better prepared the office will be to help you. The best time to contact them about publicizing a manuscript is when you are submitting your final revisions to the journal that will publish the paper. You will be asked to share your manuscript with the office so that Kevin and members of his team, who are well versed in reading scientific literature, can familiarize themselves with your work. Soon after, they will meet with you to discuss the details of your study, get a quote, and draft a news release that your PI can edit and approve. From there, the PR Office works to spread the word about your research via prominent blogs and science, local, and potentially national media, depending on your work’s level of impact. The PR Office is also equipped to help you interact with reporters effectively: they can prepare you to talk about your science in layman’s terms so it is more relatable and better understood by the general public.

By sharing your work with more mainstream media, you build your reputation as well as credit your university, your funding agencies, and the tax-paying public. Reach out to the PR Office for more information on communicating your science with the rest of the world and take advantage of the great opportunities they offer that can make you a more visible and effective participant in the science world!

One last tip for those of you interested in improving your science communication skills: keep your eyes peeled for more details on our upcoming joint Dean’s Office / TBBC / GSC event, Sackler Speaks, in April!  This is a competition in which students pitch 3-minute flash talks in front of a panel of judges.  Besides critical feedback on presentation skills, there will also be cash prizes for the winning presentations!

Contacts at the Tufts PR Office, Boston Campus:

Siobhan Gallagher, Deputy Director (Siobhan.gallagher@tufts.edu)

Kevin Jiang, Assistant Director (Kevin.Jiang@tufts.edu)

Lisa Lapoint, Assistant Director (Lisa.Lapoint@tufts.edu)

 

Reflections from AAAS 2017 – Research During the Trump Administration

The theme of this year’s American Association for the Advancement of Science (AAAS) meeting in Boston was “Serving Society through Science Policy.” As we move through the first few months of a new administration, this gathering could not have been more timely. While this conference is diverse with topics ranging from gene editing to criminology, the undercurrent of the meeting was anxiety over what will happen to research under the Trump administration.

What even is science policy? Most readers probably think of it as how much money research gets in the budget, plus the occasional rule change on whether fetal stem cell research can occur. More generally, science policy is the set of federal rules and policies that guide how research is done. It can be split into two general frameworks: policy for science and science for policy, and the two often feed into each other. For example, policy for science provides funds for climate research; the data and conclusions derived from that research could then inform new climate-related policies, which would be science for policy. While science itself is an important input into the whole process, other considerations such as economics, ethics, budgets and public opinion are inputs as well. As a scientist who considers science a method of interpreting the world, my biases had not let me consider non-science inputs to science policy decision-making. It may seem obvious to some, but it was illuminating to realize that these other concerns can be just as valuable and legitimate.

As funding is a major reason scientists are concerned, I was happy to learn a lot about the place of research in the federal budget. Some believe that research in Boston will be fine no matter who is in charge because of all the industry science in the area. It’s true that around two-thirds of research and development is funded by industry; however, industry is mostly concerned with development, and basic research is primarily funded by federal money. The federal budget is divided into mandatory spending and discretionary spending. Mandatory spending does not require Congress to act for its programs to be funded; these include entitlement programs such as Social Security, Medicare and Medicaid. Dr. Josh Shiode, a Senior Government Relations Officer at AAAS, informed us that entitlement programs are considered “third-rail” issues by lawmakers, meaning if you touch them, you die (an electoral death). Discretionary spending, in contrast, requires Congress to actively appropriate funds, and most research and development spending falls into this category.

Due to changing (aging) demographics, the percentage of the budget that goes toward mandatory spending has been steadily increasing. Fifty years ago, we spent around 30% of the federal budget on mandatory programs; now we are up to 70% and climbing. Research and development generally gets around 10-12% of the discretionary budget that remains. Traditionally, increases in discretionary defense spending correspond with parallel increases in nondefense discretionary spending. The Trump administration has proposed increases in military spending, but given likely tax cuts and reluctance to touch entitlement programs, it is unlikely that nondefense discretionary funding will fare well. The good news is that major research programs like the BRAIN Initiative, the Precision Medicine Initiative and the Cancer Moonshot were funded through the bipartisan 21st Century Cures Act during the lame-duck session. While NIH and biomedical research will likely have a diminished profile during this administration, both parties are against Alzheimer’s, diabetes and cancer. What is less clear is how research performed by the EPA and the Department of Agriculture will fare, although initial reports are grim. Finally, repeal of the Affordable Care Act would have rippling effects, as many research universities are also providers of healthcare. It is clear that there will be a shift in culture. Under President Obama, science was elevated and scientists were regularly consulted; as John Holdren, former senior science adviser to President Obama, said: “Trump resists facts he doesn’t like”.
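To make the arithmetic concrete, here is a quick back-of-the-envelope calculation using the rough percentages quoted above (illustrative approximations, not official budget figures) showing how thin a slice of the total federal budget R&D actually occupies:

```python
# Back-of-the-envelope estimate using the rough percentages quoted above;
# these are illustrative approximations, not official budget figures.
mandatory_share = 0.70                       # ~70% of the federal budget is mandatory spending
discretionary_share = 1.0 - mandatory_share  # ~30% is discretionary
rd_of_discretionary = (0.10, 0.12)           # R&D gets roughly 10-12% of discretionary spending

for rd in rd_of_discretionary:
    print(f"R&D at {rd:.0%} of discretionary spending "
          f"= {discretionary_share * rd:.1%} of the total federal budget")

# Prints roughly 3.0% and 3.6%: only a few cents of every federal dollar go to R&D.
```

In other words, under these rough assumptions only about three to four cents of every federal dollar goes to research and development, which is why any squeeze on discretionary spending hits research disproportionately.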

Some scientists are reluctant to get involved in the political theater, believing science should be apolitical. I would argue that science is already political, since science can inform policy and policy can dictate science. What science is, and should be, is nonpartisan: no party has an inherent monopoly on being allies of science and scientific thinking. So what can scientists do? All politics is local and personal. The majority of Americans say they don’t know a scientist; this is an easy thing to work on, so make sure you introduce yourself to others! Visit your lawmakers and let them know that you are funded by federal money. Politicians are most concerned about their own districts, and if you’re a transplant, you likely have connections to more than one district. Try to build a relationship with them by seeing if you could help with anything, and figure out whether you can help your local community by serving on committees. Speaking of committees, know which committees your representatives sit on. When communicating, think carefully about the words you use. Former U.S. Congressman Bart Gordon opined that he never called it climate change; instead, he called it energy independence. While branding may sound like a trivial thing to worry about, targeted storytelling is extremely important. We would love for our data to speak for themselves, but people connect best to stories, especially ones concerning things they can relate to or care about.

If you’re interested in science policy, there are a number of good resources available to get better acquainted. The Engaging Scientists & Engineers in Policy (ESEP) Coalition has a wealth of information and resources on their website (http://science-engage.org/). In fact, they host a local monthly science policy happy hour to network and engage those interested in science policy. If you are interested in learning more about the R&D budget, AAAS has an excellent resource with analyses of federal research and development funding (https://www.aaas.org/program/rd-budget-and-policy-program). There you can also find their data dashboard to look at funding for specific agencies for different periods of time.

 

Notes from the North: Review of Online Course “Scientists Teaching Science”

Scientific graduate programs all over the country do a wonderful job training their students to become critical thinkers who can design experiments, write fellowship grants, write peer-reviewed papers, and grasp complex scientific systems. Nearly all programs, however, struggle to provide career training. Traditionally, skills such as mentoring, teaching, and leadership have been learned by observing others. This has produced many excellent scientists, mentors, teachers, and leaders, but how many more could we have developed had students received directed training? And how much better would our current scientific leaders be had they not had to reinvent the wheel for themselves?

One of the dangers of requiring students to learn through osmosis is that we tend to recapitulate what we see, even if it is not the most effective method. Partly this is because many of us do find observation an effective way of gaining skills and knowledge, but there is also a mentality of initiation: we had to struggle, so the next generation should too. There are many answers to this paucity of career development training, however, in the form of business clubs, career workshops led by student and postdoc associations, and online extracurricular courses.

Some of us at Sackler who are interested in a teaching career have taken advantage of a short course entitled “Scientists Teaching Science,” offered by STEM Education Solutions (http://stem-k20.com/), which teaches best practices in science education based on the latest research on teaching and learning. This is a completely online course that runs about nine weeks, with a different module every week. Depending on the week, the time commitment ranges from about 3 hours for light weeks to as much as 8 hours for heavy weeks (depending on how assiduous a note taker you are when doing the readings and how detailed you are in written assignments).

I found the introduction to the course illustrative and memorable. We were asked to read several articles on how science has traditionally been taught and how active learning has repeatedly been shown to improve learning outcomes; then the instructor, Barbara Houtz, started her own narrated lecture in the traditional “Sage on the Stage” style. My heart immediately sank as I envisioned the next nine weeks writing dense, jargon-filled notes on topics that seemed esoteric and impractical. This was not what I thought I was signing up for! Then she paused and asked, “What are you thinking?”

That’s when the real lecture began. The narrated lectures were fantastic: available 24/7 and provided as both narration and transcript. Methods that make participants stop and think about what they are being told were used liberally to hold our attention, which meant we were shown how to effectively employ the very skills we were being taught, as we were being taught them. The modules covered learning and teaching styles, generating effective assessments, Bloom’s Taxonomy of Learning, writing your teaching philosophy (a part of faculty application materials that I only learned about last year despite years of aspiring to teach), cultural awareness, active learning and inquiry-based teaching, writing course objectives, teaching online, course development, and syllabus compilation. Each module comprised a narrated lecture, readings, and a written assignment or discussion board post. Additional resources were provided on the Virtual Learning Environment, and Barbara Houtz frequently sent out class announcements about recent articles on STEM education and careers for PhDs.

I embarked on this online-only course with a great deal of trepidation. Would I have the self-discipline to keep up with the material? Would I feel comfortable reaching out to the instructor with questions and comments? The answer is that, with the help of an instructor devoted to keeping her participants involved and getting the most out of her course, I was able to gain practical teaching skills in a remarkably short time.

On the Shelf…

For Work


Enjoy Writing your Science Dissertation or Thesis!, Elizabeth Fisher & Richard Thompson

Location: HHSL Book Stacks, Sackler 5, WZ 345 F533e 2014

I am not quite sure whether the title of this book (and exclamatory punctuation) is a command or a promise, but the book does provide advice on all aspects of thesis writing.

For Leisure


Cure: A Journey into the Science of Mind Over Body, Jo Marchant

Location: HHSL Book Stacks, Sackler 5, WB885 M315 2016

Science writer Jo Marchant explores stories and research about the mind-body connection.

Opposites Attract: The Unlikely Marriage of Science & Fiction

Science, as a subject of study, often comes into conflict with other ways of thinking about the world. Religion. Philosophy. Art. The caricature of science as an opponent to these ‘humanitarian’ endeavors obscures the real relationship: symbiotic. In the case of science and literature, science provides fiction with an intriguing playground to muck around in, while fiction gives science a more human voice. This give-and-take between the two is what makes the genre of science fiction so rich, so enduring, and above all, so entertaining.

More often than not, science fiction uses science as a tool to explore other subjects rather than exploring the science itself. It is not the engineering of the 20,000 Leagues submarine or the bioelectricity behind the monster in Frankenstein that makes these books long-standing members of high school reading lists. Readers are not spellbound by Margaret Atwood’s MaddAddam series mainly because of the intricacies of the genetic engineering catastrophe that ended her version of our world. No one likes Star Wars because of its explanations of the physics behind inter-galactic travel. Fiction is not a mirror that reflects science to readers so that they can understand its most basic aspects. Instead, fiction is a prism that refracts science, splitting and expanding it into its ripple effects and societal implications. It bends the bleached starkness of the discipline into a million different shades, spattering dark implications and bright hope for humanity in equal measure.

It is not always a fair coloring. Dystopia walks hand-in-hand with science fiction more often than not. Those stories speak well of the perseverance of the human condition, but often at the cost of vilifying some aspect of science. (Everything becomes a villain if left unchecked long enough, after all.) Still, fiction doesn’t just take from science; it gives as well. Science fiction is always ahead of its time, more audacious in imagining what human hands are capable of creating than what we believe achievable at the moment. With that creative inspiration, our history has shown it is almost inevitable that science fiction becomes science fact, from endeavours as incredible as space travel to tools as mundane as credit cards. As such, science fiction has the privilege of asking not just can we, but also should we, and it usually has the advantage of asking first.

The audacious pushing of boundaries beyond the confines of the contemporary scientific knowledge within science fiction also creates a unique and rich environment for rebellion. Because in that type of story, in an imagined world that both is and is not this real one, what else could be different? Who else could become something more than what they are, or what society tells them they are?

This type of rebellion is what led to the existence of the genre itself. In 1666, the English duchess Margaret Cavendish published The Blazing World, a prose piece often considered one of the first utopian fictions and the precursor to ‘science fiction’ (a term not officially coined until 1926) as we know it today. Cavendish was an anomaly of her time, publishing plays, essays, and prose that tackled philosophy, rhetoric, and fiction, all under her own name instead of anonymously. She was also the first woman to attend a meeting of the Royal Society of London, despite fierce protest, and did not hold back in commenting on and even criticizing the scientific presentations and practices she observed. Her novel dove into discussions tackled by male authors of the period–the conflict between imagination and reason, or philosophy and fiction–but was also groundbreaking in two ways. First, she explored these topics within an alternate universe entirely of her own making, yet one that still used contemporary science of the era; second, her story strikingly centered on herself as the main character, who traveled between the two worlds. In a time when women were not considered capable of studying complex topics such as science, the Duchess of Newcastle used her writing to boldly carve herself a space in which she could defy that notion. In the process, she wrote into existence the first examples of many science fiction tropes still widely used today.

Her actions paved the way for other rebels, such as Mary Shelley, the mother of the first science fiction horror novel, Frankenstein. While a grey, depressing summer and a writing challenge born out of boredom provided an opportunity to craft her terror-filled story, her imagination was ultimately sparked after a firelit evening conversation with the controversial Lord Byron about what life is and how to create it. Despite being supported in her endeavours by her companions and her husband, Shelley ran into criticism upon publishing her work–incidentally most strongly from the publishers who knew the author was a woman–because it challenged the entrenched ideology of God being the only conceivable creator, not Man (or, in her case, Woman). In the deeply religious society of early nineteenth-century England, this was a revolutionary act.

Cavendish and Shelley may have been among the first authors to use science in fiction to challenge the social and moral status quo, but the tradition persisted in the genre throughout the twentieth century. Starting in the 1960s, female authors were among the first to interrogate the definitions, implications, and biases associated with gender, class, and race. Ursula Le Guin’s sci-fi novel The Left Hand of Darkness–with its gender-fluid alien race dissecting what exactly gender and sex mean outside their Western confines–led the charge. This breakthrough was followed by Joanna Russ’ 1975 matriarchal parallel-universes utopian novel The Female Man, and then by Octavia Butler (the only African-American woman publishing in the genre at the time) and her late-1980s space trilogy Xenogenesis, which explored race in addition to sexuality.

These revolutionary works also represent a broader theme within the genre: the influence of contemporary events of the era in which they were written. Science fiction is as much a reflection on the scientific knowledge of the day–and what could come of it–as it is on the historical and political backdrop of the time. Many early science fiction novels from the eighteenth and nineteenth centuries focus on stories of exploration and the technology that allows journeys into lands unknown. Most notable of these are Gulliver’s Travels (Jonathan Swift, 1726), 20,000 Leagues Under the Sea (Jules Verne, 1870), and The Time Machine (H.G. Wells, 1895). Historically, these centuries were flooded with exploration expeditions by European countries, and later the United States and Russia. While discovery for political and economic gain was the main purpose of most 18th century explorations, those carried out in the 19th century were more focused on deepening knowledge of the world, often through scientific observation and analysis. So, it is little wonder that the science fiction of the era reflected that desire to know more about the surrounding environments.

In the early 20th century, the domination of exploration themes in science fiction gave way to playing around in other subject matters–such as technology, biology, and medicine–which would later become genre staples. The early half of the century was one of rapid scientific advancement as much as political upheaval, and the collision of these two jarring phenomena is reflected in the science fiction of the day. It was during this era that some of the seminal works of the genre were produced, including the post-Bolshevik-revolution novel We (Yevgeny Zamyatin, 1924) and the science fiction classics Brave New World (Aldous Huxley, 1932) and 1984 (George Orwell, 1949). These novels each address how uncurbed scientific advances lead to a dystopian political society, and their thematic commonality clearly demonstrates the lasting impact the world wars and fast-paced science had on the public psyche of the time.

While dystopia strongly persisted within science fiction in the middle of the 20th century, the worlds crafted within genre novels did begin to grow a little less dire. As technological development continued to accelerate and started infiltrating daily life in the Western world–thus ‘normalizing’ it–the role of technology in fiction likewise grew, as androids and robots appeared on the genre scene. Authors of the time such as Isaac Asimov and Philip K. Dick couldn’t help but ask–and then answer through their writing–questions pertaining to the human condition in relation to the (imagined) creation and existence of non-human life. This philosophical bent echoed the early origins of the genre, going all the way back to Cavendish’s precursor work, demonstrating how far the genre had progressed.

Glancing back and paying homage is all well and good, but science fiction also found new ways to move forward at the end of the 20th century. In 1979, The Hitchhiker’s Guide to the Galaxy added a little laughter and good humor to the genre, breaking ground for many others to follow, even to this day. The gloom of the war-torn early decades also seems to have worn off, with a revitalization of the previously ‘tired’ utopian sci-fi tradition by Kim Stanley Robinson’s Mars trilogy in the 1990s. This trend of revitalizing and redefining the genre has persisted into recent years, with the semantic alteration by Margaret Atwood, who calls her novels not ‘science’ fiction but speculative fiction. In her MaddAddam series, she reaches for what might be just possible in the realm of science and society, instead of the completely impossible. In some ways, this approach brings about an even more imaginative (and frightening, and wonderful) vision of what the human mind can create when challenged in the perfectly right and wrong ways.

Ultimately, the fiction of science is as elusive and ever-changing as the real thing. It circles itself: thought and action, can and should, might and will and have done. Whether we as scientists today use science fiction as inspiration–or as a warning–only time will tell.

21st Century Cures Act: Boosting biomedical research, but at what cost?

Co-authored by Andrew Hooper & Nafis Hasan

In a remarkable display of bipartisanship, the Senate passed HR 34 and President Obama signed the 21st Century Cures Act into law on December 13, 2016. The original bill was introduced and sponsored by Rep. Suzanne Bonamici (D-OR) in January 2015 and garnered co-sponsors from both sides of the aisle, including the support of Rep. Lamar Smith (R-TX), Chairman of the House Committee on Science, Space, and Technology. The House approved the original bill in October 2015, and after a year on the Senate floor, during which the bill underwent several amendments proposed by both Democrats and Republicans, the Senate approved it on December 6, 2016 and passed it on to President Obama to be signed into law.

The law is meant to accelerate drug development and bring cutting-edge treatments to patients and to revise the current state of mental health research and treatment, with a strong focus on the opioid crisis sweeping across the nation. It is also of significant importance to biomedical scientists, as it will expand funding for certain fields, in line with the Precision Medicine Initiative launched in 2015. More specifically, the Cures Act will provide funding for specific NIH innovation projects such as the Precision Medicine Initiative ($4.5 billion through FY 2026), the BRAIN Initiative ($1.51 billion through FY 2026), the Cancer Moonshot ($1.8 billion through FY 2023) and the Regenerative Medicine (stem cell) program ($30 million through FY 2026). In addition, the law will stimulate innovative research by awarding investigators the Eureka Prize for “significant advances” or “improving health outcomes”. It also seeks to support new researchers through its Next Generation of Researchers Initiative, an attempt to address the postdoc crisis in academia. In response to the lack of women and underrepresented minorities in STEM fields, the law contains provisions to attract and retain such scientists in “priority research areas”.  Finally, to further encourage early-stage researchers, the law authorizes programs to help repay student loans and raises the cap on the repayment assistance available to researchers.

Besides ensuring funding for biomedical research, the law aims to address privacy concerns raised by experts regarding patient information in the era of precision medicine (for more details, check out our analysis of the Precision Medicine Initiative). Under this law, certificates of confidentiality will be provided to all NIH-funded researchers whose studies involve the collection of sensitive patient information. This information will be withheld by the NIH, but can be accessed upon requests filed under the Freedom of Information Act. On the other hand, to make data sharing easier for scientists, the law allows NIH to cut through red tape and regulations that obstruct scientists from attending scientific meetings and sharing data.

Despite the generally positive reception of the Cures Act by NIH officials and research scientists, the bill was not without its critics. The principal criticism of the final product is that it constitutes a handout to pharmaceutical and medical device companies by substantially weakening the FDA’s regulatory check on bringing new treatments into the clinic.

For example, Sydney Lupkin and Steven Findlay point to the $192 million worth of lobbying collectively expended by over a hundred pharmaceutical, medical device, and biotech companies on this and related pieces of legislation. The goal of this lobbying, Lupkin and Findlay assert, was to give the FDA “more discretion” in deciding how new drugs and other treatments gain approval for clinical use – presumably saving a great deal of money for the companies that develop them. Adding weight to their assertion is the fact that President Trump is reportedly considering venture capitalist Jim O’Neill for FDA commissioner. Mr. O’Neill is strongly supported by libertarian conservatives who see FDA regulations as inordinately expensive and cumbersome, so it seems reasonable to worry about how Mr. O’Neill would weigh safety against profit in applying his “discretion” as head of the FDA. On the other hand, under a wise and appropriately cautious commissioner with a healthy respect for scientific evidence, we might hope that maintaining high safety standards and reducing the current staggering cost of drug development are not mutually exclusive.

Additionally, Dr. David Gorski writes of one provision of the Cures Act that appears to specifically benefit a stem-cell entrepreneur who invested significantly in a lobbying firm pushing for looser approval standards at the FDA. Once again, it is not unreasonable to suspect that there is room to reduce cost and bureaucratic red tape without adversely impacting safety. And in fairness to the eventual nominee for FDA commissioner, previous commissioners have not been universally praised for their alacrity in getting promising treatments approved efficiently… at least, not within the financial sector. Still, the concerns expressed by medical professionals and regulatory experts over the FDA’s continued intellectual autonomy and ability to uphold rigorous safety standards are quite understandable, given the new administration’s enthusiasm for deregulation.

It appears that the law will also allow pharmaceutical companies to promote off-label use of their products to insurance companies without holding clinical trials. Additionally, pharma companies can use “data summaries” instead of detailed clinical trial data when promoting products for “new avenues”. It is possible that these provisions were created with the NIH basket trials in mind (details here). However, as Dr. Gorski argues, without clinical trial data, off-label use of drugs will be based on “uncontrolled observational studies”, which, while beneficial for pharma companies, are risky for patients from the perspective of patient advocacy groups. These fears are not without evidence: a recent article from STAT describes how the off-label use of Lupron, a sex-hormone suppressor used to treat endometriosis in women and prostate cancer in men, is resulting in a diverse array of health problems in 20-year-olds who received the drug during puberty.

Another “Easter egg”, albeit an unpleasant one, awaits scientists and policy-makers alike. Buried in Title V of the law is a $3.5 billion cut to the Department of Health and Human Services’ Prevention and Public Health Fund, with no proper explanation given. Given the outcry over the lack of public health initiatives in the Precision Medicine Initiative, one is again left to wonder why 21st-century cures focus only on treatment and drug development and not on policies directed toward promoting public health and preventing disease.

In conclusion, the implementation of this law will largely depend on the current administration. With the NIH budget for FY2017 still up in the air and the confirmation of nominees still hanging in the balance, the law is far from fully implemented. Based on its provisions, it appears that overall biomedical funding will be boosted in particular fields designated as “priority research areas”. However, it should not escape an observant reader that the law also seems to give pharma companies greater opportunity to exploit consumers. It therefore remains a question whose priorities (consumers and patients versus investors and corporations) are being put first, and the answer, in our humble opinion, will be determined by a dialogue between the people and the government.

Sources/Further Reading –