Archive for June 2011
A recent headline in FierceBiotech Research proclaimed: “Scientists crack histamine code that could lead to better allergy relief.” The article goes on to describe how an international research team studied how doxepin, an old drug used to treat allergic reactions and other conditions, binds to the human histamine receptor protein, thereby easing allergy symptoms. The team did this by solving the complex 3-D structure of the human histamine H1 receptor protein. Dr. Simone Weyland, one of the investigators, commented on the importance of this finding, stating, “First generation antihistamines such as doxepin are effective, but not very selective, and because of penetration across the blood-brain barrier, they can cause side-effects such as sedation, dry mouth and arrhythmia. By showing exactly how histamines bind to the H1 receptor at the molecular level, we can design and develop much more targeted treatments.”
Based on this pronouncement, allergy sufferers might assume they will have new, superior medicines to treat their symptoms. Well, this scenario is highly unlikely.
Antihistamines, like diphenhydramine (the active ingredient in Benadryl), have been available for over 60 years. While effective in treating hives, runny nose, itchy eyes, etc., they do cause tolerability issues, including sleepiness. In the 1980s, pharmaceutical companies sought to come up with better antihistamines that produced less drowsiness. This pursuit led to the so-called second-generation agents such as loratadine (brand name: Claritin) and cetirizine (brand name: Zyrtec). These antihistamines have proven to be safe; they are so broadly used that they are now available over-the-counter at any pharmacy or supermarket.
Perhaps loratadine and cetirizine can be improved upon. The research that has yielded a better understanding of the biological interaction between the drug and its site of action in the body would be a good starting point for such a drug discovery program. But it would be surprising to find a company willing to make the investment to do this. The current antihistamines on the market work well enough. To take a new generation of agents from the lab bench through clinical trials to FDA approval would require 12 – 15 years and hundreds of millions of dollars. Is it really economically responsible for a drug company to go down such a path? Wouldn’t it be better for patients if the time and effort needed to produce a third-generation antihistamine were invested in areas of major medical need?
Critics of the pharmaceutical industry have justifiably been concerned that not enough R&D is being devoted to discovering new treatments for Alzheimer’s Disease, diabetes, cancer, drug-resistant infections, etc. In fact, back in the late 1990s, companies might well have capitalized on this kind of scientific breakthrough and looked for a third-generation antihistamine. But the economics of today’s world of medicine won’t support such a research program. The science done in “cracking the histamine code” is of academic interest, and, in theory, it could lead to a new drug that works slightly better than what’s already available. But if the R&D dollars are instead invested in areas of more critical need, we’ll all eventually be better off.
A recent comic strip entitled “Off the Mark” (drawn by Mark Parisi) depicted the following: the Tin Man comes upon Dorothy and the Cowardly Lion in the woods. At their feet are the remains of the Scarecrow. Dorothy, who, along with the Lion, is eating the last bits of straw, says to the horrified Tin Man: “Dr. Oz said to eat more whole wheat…”
Admittedly, this cartoon takes the influence of Dr. Oz to the extreme. But its premise is not too far from the truth. Dr. Oz has become America’s physician. He has the endorsement of Oprah Winfrey as well as his own show that’s on daily (twice a day in some markets!) and he recently won two Daytime Emmy awards, one for best informative talk-show and the other for best talk-show host. His show is watched by millions of people. He is clearly beloved by his followers. He is better known and, when it comes to dispensing medical advice, he is more influential than the US Surgeon General (Dr. Regina Benjamin, in case you might have forgotten).
But this prominence and influence put an onus on him to be especially accurate in his comments and opinions. A few weeks ago, I appeared on Dr. Oz’s show along with Dr. John Abramson, a critic of the pharmaceutical industry. During one segment, Dr. Abramson and I debated the value of statin drugs in preventing heart attacks and strokes: Dr. Abramson took the view that statins are unnecessarily prescribed, while I defended the fact that statins have saved countless lives. Dr. Abramson said that he often took his patients off statins, and I was stunned by this. I turned to Dr. Oz and asked: “Dr. Oz, you’re a cardiologist, how do you prescribe statins?” He replied: “I am usually the one taking them off statins,” at which point the audience broke out into applause.
Dr. Oz then said that he doesn’t have a vendetta against statins and that: “There are people whose lives are saved every day with statins.” However, I am afraid that these latter points were lost on both the studio audience and the millions who watched the show. What many people actually took from Dr. Oz’s pronouncement was that millions of people are needlessly taking statins. This is evident from subsequent summaries of this discussion that now appear on various websites. These summaries highlight the fact that Dr. Oz takes patients off statins and don’t mention his points about how statins do save lives.
Like Dr. Oz, I am a big believer in making lifestyle changes, such as watching your diet or exercising, as a way of warding off disease and preventing the need for medications. But doctors don’t haphazardly prescribe statins to their patients. They prescribe these drugs knowing their patients’ health histories and medical profiles. When a patient hears someone as prominent as Dr. Oz saying that medicines like statins are overprescribed, many are likely to stop taking these drugs – with potentially dire downstream consequences.
Dr. Oz does a great job of warning his audience about things such as the perils of too much sugar in their diet. He is a great teacher of good nutrition and the benefits of eating fruits and vegetables. A lot of what he covers on his show is quite informative. But Dr. Oz is in a much different position than other daytime hosts. His words are gospel to the American public. Thus, he has an enormous responsibility when it comes to commenting on medicines that members of his own profession prescribe. Unfortunately, there will be patients who will act like Dorothy and take his words a bit too far.
Last week, I gave a talk at a course at Drew University in New Jersey. The presentation was about the misconceptions that exist around the role and contributions to medicine made by the pharmaceutical industry. During the Q&A session, a member of the audience asserted that academic institutions were responsible for roughly 50% of the new medicines that are approved annually. This is a provocative assertion – and completely false.
The topic of who is actually responsible for new medicines is not a new one. Concerned that American taxpayers were not sharing enough in the profits of the pharmaceutical industry, Congress commissioned a study in 2001 to determine which top-selling drugs had their origins in work done by the National Institutes of Health (NIH). The Department of Health and Human Services prepared the final report, which showed that, of the 46 drugs that had annual sales of $500 million or more, only three had Federal patent ties. The other 43 drugs were discovered and developed by the pharmaceutical industry with no Federal investment.
Of course, this study is a decade old. Critics of the industry are again challenging the premise that industry is the major driver for the discovery and development of new drugs. A recent article in Nature Biotechnology (June, 2011) entitled “Debate re-ignites on contribution of public research to drug development” provides some valuable data on this topic. Perhaps the most relevant information is taken from a paper that emanated from Boston University and was published in the New England Journal of Medicine (volume 364, 535 – 541, 2011). This work, which focused on “The Role of Public-Sector Research in the Discovery of Drugs and Vaccines,” shows that of the 1,541 drugs approved by the FDA from 1990 – 2009, nearly 10% were rooted in public sector research. Clearly, the public sector is making important contributions in this field. Yet, the fact remains that the private sector is still responsible for 90% of new medicines.
The NIH, as well as other academic institutions and research institutes, plays a critical role in funding important biomedical research that provides broad benefits not just to the pharmaceutical industry, but to society in general. Furthermore, the fact is that the vast majority of basic biological research is done in academia. But one must distinguish between important theoretical work and the application of this work in discovering and developing new medicines. Basic research is not drug discovery. The NIH does a great job in providing basic knowledge and hypotheses about the nature of living systems. However, it is the pharmaceutical industry, both large companies and small biotech firms, that discovers and tests the compounds to prove or disprove these medical hypotheses. Neither can work without the other. A successful academic-industry partnership is crucial in discovering new medicines.
Professors David Nutt (Imperial College London) and Guy Goodwin (Oxford University) have recently authored a report for the European College of Neuropsychopharmacology which sounds the alarm on the elimination of research by pharmaceutical companies in areas such as depression and schizophrenia. They are greatly concerned that companies like GSK, AstraZeneca and Pfizer “see research into better neuropsychiatric drug targets as economically non-viable.” Their fear is that if such research is stopped “we will have a dead space of 20 to 30 years before we can begin to do it again.”
First of all, it is encouraging to hear prominent academics extol the importance of research that occurs in the pharmaceutical industry. Usually, industrial drug R&D is minimized by people like Dr. Marcia Angell, former editor of the New England Journal of Medicine, who promote the view that pharmaceutical companies do little original R&D and instead license all of their drugs from universities and small start-up companies. Others, like Dr. John Abramson, author of Overdosed America, feel that the antidepressants produced by the industry add little value. It is a pleasant change to hear these professors recognize the value of the R&D contributions of pharmaceutical companies.
Nutt and Goodwin are justified in their concerns. While drugs already exist to treat these central nervous system (CNS) disorders, they are far from ideal. However, their rationale as to why pharmaceutical companies are abandoning this research may not be entirely correct. Because the biology of the brain is so complex, it is not unusual for a CNS agent to have side effects such as changes in mood, anxiety, insomnia or even suicidal tendencies. Nutt and Goodwin say the industry is spooked by potential litigation over these types of side effects, but every drug causes side effects. If a pharmaceutical company worried exclusively about adverse reactions, it would never develop any new medicine for any condition. As was said here before, the key to navigating side effects is that everyone involved – the industry, the FDA, the doctors and, most importantly, the patient – understands the risk-benefit profile of a new drug.
Nutt and Goodwin should be applauded for calling attention to the beneficial research done by the pharmaceutical industry; however, the real issue here isn’t fear of litigation. It’s that there are not many compelling targets for industrial scientists to study. Traditionally, academic researchers have helped to fuel the generation of new ideas by doing basic research into how the brain works. This preliminary research yields areas for industrial scientists to probe. But funding of this type of academic research by institutions like the NIH has decreased over the years. Furthermore, the NIH is now diverting hundreds of millions of dollars into doing drug discovery, which is clearly better done by industry.
To avert the “dead space” feared by Nutt and Goodwin, a greater emphasis on basic CNS research is needed. While some companies have gotten out of this area, many are still seeking new pathways to treat CNS diseases. The FDA needs to signal that new medicines resulting from this work will be given high priority in its review process. In addition, the public sector needs to increase funding for academic researchers in the CNS field to plant new seeds that can be developed by industrial scientists.
Ironically, Congress christened the 1990s as the “Decade of the Brain.” Unfortunately, much more than a decade will be needed to make the breakthroughs necessary to relieve the burden of disease caused by neurological disorders.
Critics of the pharmaceutical industry often find fault in the practice of prescribing medicines to seemingly healthy people in an effort to prevent disease. Said critics abandon the term “disease prevention” in favor of “disease mongering.” They contend that the majority of people on preventative medicines are taking drugs unnecessarily. A person who feels pretty healthy may in fact question why he or she should bother with these pills. Taking a medicine for 10 or 20 years to prevent eventually having a heart attack or getting osteoporosis requires a leap of faith from not just these healthy patients, but from regulatory agencies and physicians as well.
A recent report in the European Heart Journal from researchers at the University College London Medical School provides strong evidence of the benefits of taking daily preventative medication. Data from the long-running Whitehall II study, which followed 9,453 civil servants in the UK over 20 years, show that better control of blood pressure and cholesterol levels and reduction in smoking have led to heart attack rates dropping by 74%. Equally impressive is that this drop has occurred in the face of a rise in body mass index (BMI) in this population. The researchers believe that a further 11% drop in heart attacks would be realized if people controlled their weight as well.
The downside to these benefits is that such a preventative regimen requires taking a variety of different pills, which helps drive up costs to patients and health care providers. Furthermore, the combination of drugs could cause problematic drug–drug interactions. Back in 2001, the World Health Organization and the Wellcome Trust convened a meeting of experts to discuss evidence-based and affordable interventions for non-communicable diseases. That meeting was the impetus for the creation of the “Polypill” to reduce heart attacks and strokes. The Polypill contains the four key classes of generic drugs used to prevent heart disease: aspirin (75 mg), lisinopril (10 mg), simvastatin (20 mg) and hydrochlorothiazide (12.5 mg). Clinical studies are now starting to appear that show the Polypill effectively lowers blood pressure and LDL cholesterol. Remarkably, this multi-component pill is well tolerated, with the major side effect being gastrointestinal distress due to the aspirin.
The approval of the Polypill will be a big step in the prevention of heart attacks and strokes not just for wealthier nations but also for the developing world. The ease of use and affordability will be a major help for physicians in controlling this disease. Ideally, the continuum of obesity leading to diabetes leading to heart disease will be dealt with by appropriate diet and exercise. Unfortunately, that message is not resonating broadly enough to people around the world. Preventative medicine is here to stay. Managing it properly is the key to controlling health care costs and maintaining a healthy population.
When you pick up a prescription drug, you’ll find it comes with the drug label. Among other things, this label contains safety information in the form of warnings describing the potential side effects the drug might cause. Obviously, this is a good thing. After all, a patient should know the potential downside of taking any medicine. However, as Gina Kolata pointed out in her article in The New York Times last Sunday (“Side Effects? These Drugs Have a Few”), warning labels keep growing longer, with the AVERAGE label now listing SEVENTY possible adverse reactions. When a label gets this long, it becomes numbing, and chances are patients will disregard it, thereby defeating the purpose of having the warnings.
There’s a seemingly easy fix to this problem. Why not simply list only the major side effects and eliminate the lesser ones that might occur in only 1 out of 10,000 patients? This, unfortunately, is not easily done. The problem is that no drug is 100% safe. It is impossible to design a medicine that will work identically in men and women, young and old, or in racially diverse groups. Drugs behave differently even in highly homogeneous populations. Thus, every drug will cause a problem in a certain percentage of the people who take it. The FDA carefully weighs this risk-benefit balance before approving any new medicine.
But let’s say that the FDA heeds the advice of Ms. Kolata and others and cuts back warning labels to just those of major or serious consequence. Initially, this may be applauded. However, the following scenario is likely. A patient on a recently approved drug will have a rare, but serious, adverse reaction and will be outraged that the FDA didn’t warn him about this possibility. A Senator or Congressman will then go on CNN and be irate that the FDA is exposing his constituents to such dangers. Many will rail against the company that made this drug, saying it clearly only cares about profits and not the safety of sick people who are seeking relief from disease but instead subjects them to dangerous and expensive new products. This will then lead to a major lawsuit where the injured party will seek millions in damages.
This scenario isn’t an exaggeration. Even with the current onerous warning labels, these situations arise. One may find the listing of a drug’s side effects in industry commercials to be ludicrous; however, how else do the FDA and the drug manufacturer prepare the prescribing doctor and patient for any potential consequence of taking a medication? The FDA and pharmaceutical companies need extensive warning labels to show that rigorous efforts are being made to understand the benefits and risks that any medicine possesses. True, they are no doubt protecting themselves from attacks by those who purport to be “protectors” of unsuspecting patients. However, these labels represent a good-faith effort at complete transparency about the risk-benefit profile of a new drug. More is known about newly approved drugs now than at any other time in medical history. This knowledge, however, leads to longer warning labels.
Perhaps the solution lies in having a revised warning label system. One can envision having a set of “serious” warnings, ones that highlight major potential side effects that a patient could experience. Presumably, this would be a limited and focused list (otherwise, why would the FDA approve the drug?). Then there can be a supplementary list of side effects that would include things like headache, nausea, constipation, etc., things that seem to be listed in EVERY warning label. Such a system could focus the doctor and patient on the potential side effects that they would really need to be aware of.
The American Society of Clinical Oncology (ASCO) meeting begins this weekend in Chicago. It promises to be an exciting meeting, with 4,000 clinical studies being disclosed covering a wide variety of topics critical to the future treatment of cancer: new drugs for melanoma, lung cancer and kidney cancer; new biomarkers to identify the cause of one’s cancer; and genetic markers to help design specific treatments for certain cancers. The scientific advances that will be discussed at this meeting dwarf what was known about this disease a decade ago.
In fact, many researchers now believe that, much like the case with AIDS, we are not far from the time when a cancer diagnosis will not be a death sentence but rather a diagnosis of a chronic disease, one that might not necessarily be cured but rather be treated with a variety of medications which will keep one’s cancer in check. As AIDS patients are treated with a combination of drugs, so too, will cancer patients be treated. Getting to this point will be the result of decades of research and billions of dollars in investment. But what a wonderful situation for patients.
The new cancer medications are expensive, with annual costs ranging anywhere from $10,000 to $100,000. Furthermore, multiple drugs will be needed to keep one’s cancer in check; perhaps a patient will be on a drug specifically designed to stop cancer cells from proliferating in her breast, another drug designed to prevent the breast tumor from growing new blood vessels, thereby starving it, and finally a third drug designed to help her own immune system fight the disease. (This is not a fantasy – there are now drugs on the market that do all of this.) And there are not just 10 or 20 such drugs in clinical development – there are close to 1,000! Not all of these will make it through FDA approval and to the market. But the odds are that 100 or more might. These drugs will be specific not for a broad disease such as breast cancer but for a specific subtype of breast cancer – great examples of designed drugs. But these drugs, too, will be expensive.
This welcome news for cancer patients, unfortunately, poses a potential problem for healthcare systems. Conversion of cancer from a fatal disease to a chronic one means that many people (millions?) will be living for 10, 20, 30 years on expensive medications. One can begin to do the math and show the tremendous burden that such costs will add to an already high health care bill for governments and other payers. One can argue that, as the cancer survivor population grows, paying for its medicines will become unsustainable. How does one get to a situation where all patients will be able to get life-saving medications?
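The math referred to above can be sketched in a few lines. Note that this is purely a back-of-the-envelope illustration: the patient count and drugs-per-patient figures below are hypothetical assumptions, and only the $10,000 – $100,000 annual cost range comes from the numbers cited earlier.

```python
# Back-of-the-envelope estimate of the annual drug bill for chronic cancer care.
# All inputs are illustrative assumptions, not real data.
patients = 2_000_000              # hypothetical number of survivors on long-term therapy
drugs_per_patient = 3             # e.g., antiproliferative + anti-angiogenic + immunotherapy
cost_per_drug_per_year = 50_000   # midpoint of the $10,000-$100,000 range cited above

annual_burden = patients * drugs_per_patient * cost_per_drug_per_year
print(f"Estimated annual burden: ${annual_burden:,}")
# -> Estimated annual burden: $300,000,000,000
```

Even with these deliberately rough inputs, the total lands in the hundreds of billions of dollars per year, which is exactly the sustainability problem described above.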
The companies that will be successful in this field will be the ones that can “bundle” the medications needed to treat certain cancers. Thus, it will behoove companies to develop, either internally or through alliances, the best drugs to treat certain cancers. By having these multiple agents in hand, one can envision a scenario where a company works with healthcare providers to deliver triple therapies at a reasonable cost. There is evidence that such positioning is already happening. BMS and Roche have recently joined forces to combat melanoma. They have agreed to fund clinical studies that combine BMS’s recently approved melanoma drug, ipilimumab, with Roche’s late-stage clinical compound, vemurafenib. Both attack melanoma in different but potentially complementary ways. Theoretically, the combined therapy should be far more effective than either is alone. Perhaps it is naïve to assume that, should this combined therapy be successful, its cost would not simply be the sum of the two drugs priced individually but rather something lower. Nevertheless, one would expect more such joint ventures in the future, and as multiple cancer treatments emerge, competition may help drive down prices.
The “War on Cancer” has been waged for decades. Now that it appears that the control of this disease is possible, it is imperative to find business models which allow for patients to be treated with medications and protocols that don’t bankrupt the healthcare system.