Friday, December 13, 2013

A Shot Across the Bows for SCIENCE, NATURE and CELL

In a recent article in the Guardian newspaper, Nobel laureate Randy Schekman castigated the ‘holy trinity’ of prestige journals for their editorial practices (1). He also criticized the academic promotions process for putting too much emphasis on publication in these journals, an emphasis the journals in turn exploit for their own ends of increased circulation and profit. While there is a lot of excellent science published in these venues, the prestige journal system has two major flaws. The first, pointed out by Schekman, is that these journals want articles on timely and ‘sexy’ topics only; other equally good science is ignored. The second is that the editorial process really isn’t peer review. Much of the decision making for each journal lies with a small cohort of admittedly very bright, usually young, full-time staff editors (sometimes decried as ‘failed postdocs’). This is especially true of the Nature stable of journals and is quite at odds with the more traditional approach of journals based in scientific societies, where the editors are distinguished investigators in their own right and serve on a part-time basis.

Schekman advocates publishing in open-access journals and ignoring the prestige journals. The trouble with that is that the expanding universe of online journals includes a lot of junk, as a recent experiment showed (2). Personally I have more faith in some of the old-line conventional journals in the biomedical field that have a long track record of publishing solid science. Despite the emphasis on publication in premiere journals, my experience is that good science published in good mid-level journals eventually gets recognition.



(2) http://www.sciencemag.org/content/342/6154/60.full

Thursday, December 5, 2013

Peer Review: ‘Herding’ Behavior Versus ‘Gut’ Instincts in Science

An interesting article in NATURE scrutinizes the peer review process via a computer model (1). The authors compare how rapidly a field accepts a hypothesis when peer review is based solely on objective analysis of data versus when reviewers include subjective feelings about the validity of the hypothesis. They find that convergence is more rapid in the first case. Rapid convergence has an element of herd behavior; this may be valuable when the hypothesis under consideration is indeed correct. However, if it is not, then ‘herding’ can lead to premature acceptance of false concepts.
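To make the herding dynamic concrete, here is a minimal toy simulation in Python (my own illustrative sketch, not the model used in the NATURE paper; the update rule, weights and thresholds are invented assumptions). Each round a new study yields noisy evidence about a hypothesis that is in fact false; a reviewer's verdict mixes an objective reading of the data with the community's current belief, and the consensus drifts toward the stream of verdicts:

import random

def run_community(true_effect, herding_weight, n_rounds=200, seed=0):
    # Toy model of consensus formation. herding_weight = 0 means purely
    # data-driven review; higher values lean on the current consensus.
    rng = random.Random(seed)
    belief = 0.5                                     # community starts undecided
    for _ in range(n_rounds):
        evidence = true_effect + rng.gauss(0, 1.0)   # noisy study result
        data_score = 1.0 if evidence > 0.5 else 0.0  # objective read of the data
        verdict = (1 - herding_weight) * data_score + herding_weight * belief
        accepted = verdict > 0.5
        # consensus drifts toward the stream of accept/reject decisions
        belief += 0.05 * ((1.0 if accepted else 0.0) - belief)
    return belief

# The hypothesis is actually false (true_effect = 0): as reliance on the
# consensus grows, more runs lock in on the false claim.
for w in (0.0, 0.7, 0.95):
    finals = [run_community(0.0, w, seed=s) for s in range(100)]
    wrong = sum(b > 0.5 for b in finals) / len(finals)
    print("herding weight", w, "-> fraction endorsing the false claim:", wrong)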

The NATURE article reflects a problem that is very familiar to scientists reviewing journal articles (or grant proposals). In the current ideal model of peer review one is supposed to confine oneself to the data and methods presented in the submitted article. If these all seem consistent then one should recommend acceptance of the article by the journal. However, one often has the ‘gut’ feeling, based on long experience, that something isn’t right, that the results can’t be valid given the nature of the experiment or the methodology used. But you can’t just come out and state this in your review! In these cases reviewers often hunt for some excuse to reject the article, but are sometimes forced into reluctant acceptance. If more subjectivity were allowed in peer review, it might reduce the frequency of papers, especially high-profile ones, whose data cannot be reproduced upon subsequent analysis (2). This type of subjective evaluation does take place, in hallway conversations at meetings or in the university cafeteria, but perhaps there should be more of a place for it in the formal review process. Eventually flawed research is revealed, especially if the topic is quite important. However, reputations are not made by repeating the work of others, and surely there are many observations and concepts of more modest importance that persist in the literature even though they are basically incorrect.

(1) http://www.nature.com/news/peer-reviewers-urged-to-speak-their-minds-1.14302


(2) http://blogs.nature.com/news/2011/09/reliability_of_new_drug_target.html


Friday, November 22, 2013

tDCS and the First Glimpses of Human Enhancement Technology


Recently I have become increasingly aware of an emerging emphasis on using advances in biotechnology not just to cure disease but to improve the capabilities of healthy people. This ‘human enhancement’ phenomenon is cropping up in all sorts of places, ranging from the military experimenting with ‘exoskeletons’ to give soldiers extra strength, to supposed lifestyle-enhancing ‘nutraceuticals’, to the first attempts to increase longevity with drugs such as rapamycin. An interesting example was recently reported in a NY Times Magazine article, “Jump-Starter Kits for the Mind” (1). This involved transcranial direct current stimulation (tDCS), a technique where small currents are applied to regions of the brain to improve memory or executive function. The idea is that the current stimulates neural tracts and such ‘exercise’ increases function, reflecting the well-established neurobiological concept of Long Term Potentiation. As with its near relative, transcranial magnetic stimulation (TMS), a number of studies in ageing patients with impaired mental function have claimed to detect benefits from tDCS. This has prompted investigators to examine the effects of tDCS in healthy people and, as described in the Times article, some positive results have been found. Apparently, since the technology is so simple, this has also prompted handy do-it-yourselfers to download how-to videos from YouTube and make tDCS devices for themselves. Since there have not yet been large-scale, blinded, controlled trials of tDCS or TMS, the jury is still out on whether they really work. However, the enthusiasm with which they have been adopted by scientists and by some members of the public illustrates the tremendous interest in the whole human enhancement thrust.

Friday, November 15, 2013

FDA Doesn’t Make Sense Dealing With an Antisense Medication for Duchenne Muscular Dystrophy


Duchenne Muscular Dystrophy (DMD) is every parent’s nightmare. It’s an X-linked genetic disease that condemns young boys to paralysis and premature death. At present no therapy is available. However, recent research has indicated that certain types of antisense oligonucleotides can partially correct the genetic defect and at least slow the course of the disease. A small-scale clinical trial of an antisense molecule called Eteplirsen was recently completed by the biotech company Sarepta. The trial involved only twelve boys, but showed encouraging results. Sarepta then asked the FDA for ‘accelerated approval’ of the drug. This is a relatively new mechanism whereby the FDA can conditionally approve new drugs for which there is a major unmet need, as is the case in DMD. If the drug fails in subsequent larger-scale trials it would then lose approval. However, the FDA declined and has insisted on a full-scale placebo-controlled trial of Eteplirsen prior to approval (1).

These decisions are always difficult. Clearly it is the duty of the FDA to make sure that approved medications actually work and are not a threat because of toxicity. However, stringent requirements for ‘classic’ clinical trials can keep good drugs out of the hands of needy patients for years. Similar situations have cropped up in the cancer therapy area where patients have pleaded for promising new drugs before these agents had completed formal clinical trials (2).
There is a lot of interest currently in adaptive clinical trials, where advanced Bayesian statistics can be used to modulate trial design as information accrues, rather than being stuck with a rigid trial framework based on initial assumptions. One would think that the ‘accelerated approval’ process could be linked to trials of that type.
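For the curious, here is a minimal sketch in Python of the Bayesian updating at the heart of such adaptive designs, assuming a simple two-arm trial with a binary response; the interim numbers and decision thresholds below are purely illustrative, not anything the FDA actually prescribes:

import random

def posterior_prob_better(s_t, n_t, s_c, n_c, draws=20000, seed=1):
    # P(treatment response rate > control) under flat Beta(1,1) priors,
    # estimated by Monte Carlo sampling from the two posteriors.
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        p_t = rng.betavariate(1 + s_t, 1 + n_t - s_t)
        p_c = rng.betavariate(1 + s_c, 1 + n_c - s_c)
        wins += p_t > p_c
    return wins / draws

# Hypothetical interim look: 12 of 20 responders on treatment, 6 of 20 on control.
p = posterior_prob_better(12, 20, 6, 20)
print("P(treatment > control) =", round(p, 3))
# An adaptive rule might then stop early or re-weight enrollment:
if p > 0.975:
    print("stop for efficacy")
elif p < 0.10:
    print("stop for futility")
else:
    print("continue, perhaps randomizing more patients to the better arm")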

In this case it seems that the FDA erred unnecessarily on the side of caution. While it is important to protect patients against possible toxicities of new drugs, in the case of Eteplirsen there was no evidence of toxic effects among the boys treated. Thus it seems likely that little harm would be done by letting additional patients be treated while more was being learned about the drug.

(1)

(2)

Friday, November 8, 2013

Faith and Skepticism in Nanomedicine


As one who has intermittently toiled in the field of nanoparticle-mediated drug delivery, I am bemused by the uncritical, almost reverent acceptance by the news media of each new publication on nanomedicine that appears in a decent scientific journal. Two recent examples: a post on the Economist Babbage site enthuses about peptide-coated nanoworms designed to detect elevated protease activities in certain disease states by releasing peptides that can be detected in the urine, while another on the CEN website lauds work using drug-bearing polymers to suppress inflammation in CNS microglial cells. This is not to criticize the scientists who did the work or the studies themselves; they are certainly interesting science. However, as is often the case in the nanomedicine area, these very early stage investigations are hailed as breakthroughs that will inevitably result in important advances in clinical medicine. Not likely!

Over the years I have seen hundreds of novel and interesting strategies involving use of nanotechnology for diagnosis or therapy come crashing to a halt as they encounter the complexities of real-world medicine. Yet the breathless, awestruck acceptance of new developments in this field continues. Apparently there is a robust mythology about nanomedicine that is widely accepted. However, a bit more skepticism would probably be good for the field in the long run. 


Wednesday, October 30, 2013

Don’t Mess With Basic Science!


In this week’s NATURE, Daniel Sarewitz, a well-known science policy guru, states that the rightful place of science is in service to society. While it is hard to argue against the idea that one of the main goals of research should be to benefit health care, the economy, or other aspects of societal well-being, there is an unsettling underlying theme to Sarewitz’s commentary. In this article he outlines several attempts by the federal government to more effectively harness basic research to national goals, including the NIH’s National Center for Advancing Translational Sciences, the DOE’s Advanced Research Projects Agency-Energy, and a new National Additive Manufacturing Institute. Sarewitz contrasts these goal-oriented efforts with what he feels has been the bloated and wasteful state of basic research, especially basic biomedical research, in this country. This theme of devaluing undirected basic research has been prominent in Sarewitz’s writings over the last few years.

While the NATURE commentary makes many valid points, it is fundamentally flawed because it ignores one of the key aspects of science: the unpredictable ‘Black Swan’ nature of basic research. Certainly there is merit in coupling many aspects of science to societal goals and priorities. Much research is rather mundane and consists of filling in gaps in the knowledge base. Nonetheless, now and then truly unique and unexpected insights emerge that change the entire scientific paradigm. One might point to the discovery of RNA interference by biologists or of exoplanets by astronomers as contemporary examples. If much of the nation’s basic research effort is put in harness to short- to medium-term technological goals, will such fundamental breakthroughs continue to emerge? While much of our investment in R&D should be directed toward pragmatic goals, it will still be essential to maintain a substantial core of unfettered, undirected basic research.





Monday, October 14, 2013

Don't Keep Clinical Trial Adverse Events Data Secret!


Drug company attempts to suppress open dissemination of Clinical Study Reports (CSRs) by the European Medicines Agency are quite shortsighted. In addition to the obvious public benefit of providing comprehensive safety data to academic researchers, the transparent approach would be of value to the drug companies themselves. One of the major factors in the high cost of new drugs is the expense involved in failed late-stage clinical trials. In many cases several companies will conduct trials on similar medications, with each trial veiled by corporate secrecy. In contrast, prompt dissemination of full CSRs could prevent late entrants into a therapeutic field from making the same mistakes in trial design as earlier entrants. While this might be perceived as rather negative for the pioneer in any particular case, over time the advantages would average out to the benefit of the entire industry. Surely steps can be taken to protect key intellectual property involved in clinical trials while still disseminating important information on adverse events. The EMA is very forward looking on this issue; one would hope that the FDA would follow, but don’t count on it!

Thursday, October 3, 2013

Scientist? Public Servant? You Lose!


Scientists are public servants. Whether employed by a public institution or a private one, most likely much of a scientist’s research support comes from public funds, and the essence of a scientist’s job is to contribute new knowledge to the public domain. To carry out this mission effectively a scientist needs to make a sustained effort over a long period of time. Now comes the sequester, and then the shutdown! Much of the infrastructure for doing science in this country is now in disarray and many careers are at risk.

What kind of a message does this send to young scientists or indeed to any young person who aspires to a career in public service? The message clearly is that you are a pawn that can be manipulated by powerful interests and that your contributions are not valued. In our current bizarre political and economic system the only thing that counts is money!

A breakthrough for siRNA-based therapy?


A report in The Lancet this week describes an important milestone in the evolution of siRNA oligonucleotides as therapeutic agents (1). PCSK9 is a protein that binds to and causes degradation of the Low Density Lipoprotein receptor (LDL-R), which is involved in cholesterol regulation. Patients with loss-of-function mutations in PCSK9 have increased LDL-R and very low levels of blood cholesterol, thus validating the importance of this protein as a target. Researchers at Alnylam Pharmaceuticals have developed a siRNA that triggers destruction of the messenger RNA for PCSK9 and have tested it in a small clinical trial. The siRNA was delivered to liver cells (where PCSK9 is made) using a lipid-based nanoparticle. The study showed a substantial reduction in PCSK9 levels and corresponding decreases in serum cholesterol. Effects were seen with doses of siRNA in the 0.15 mg/kg range, quite a low dose for this type of molecule.

The eventual intended use of the PCSK9 siRNA would be in patients refractory to statins, the drugs commonly used to reduce cholesterol. Apparently there are quite a few such patients since by some estimates there is a $4B yearly market in this area. The siRNA drug will have competition from anti-PCSK9 monoclonal antibodies that are undergoing clinical testing by other companies.

Single stranded antisense oligonucleotides can also trigger destruction of messenger RNA, although by a different mechanism than siRNA. There are competing views on whether antisense or siRNA offers the superior approach for new drug development. An antisense oligonucleotide was recently approved by the FDA (2). Interestingly, this agent is also designed to regulate cholesterol in statin-refractory patients but works by reducing expression of a key component of LDL itself.

In both of these cases, the target of the oligonucleotide drug was in the liver. This highlights a key problem in the development of antisense or siRNA as therapeutic agents. It is very difficult to deliver adequate amounts of these types of molecules to any tissue other than the liver.  While there are certainly many liver-associated diseases that might be approached with this technology, it would be a great step forward if efficient delivery to other tissues could be attained. 


Thursday, September 19, 2013

Super-Bugs Kill Thousands


The CDC report that drug-resistant bacteria cause over two million illnesses and over twenty thousand deaths per year in the US is very troubling, especially since much of this is preventable. The recent increasing prevalence of carbapenem-resistant Enterobacteriaceae (CRE) is truly frightening, since the carbapenems are the drugs of last resort in many cases. There are many reasons for the increasing frequency of resistant strains of bacteria, but two of the major contributors could readily be avoided. First and foremost is the overuse of antibiotics in agriculture. The CDC has traced numerous examples of resistant bacteria to livestock maintained on antibiotics to promote growth. There is increased public awareness of this, accompanied by demand for antibiotic-free meats and dairy products. However, the overwhelming proportion of commercial livestock production in this country still relies on antibiotics. A second key contributor is inappropriate use of antibiotics by physicians. Many common illnesses have a viral causation and are thus unaffected by antibiotics. However, many patients demand antibiotic treatment for common respiratory and intestinal diseases even when it is not medically warranted, and physicians tend to acquiesce. More rapid gene-based tests to distinguish viral from bacterial disease should help to alleviate this problem. In the meantime there is an urgent need for new antibiotics that will kill bugs that have become resistant to older drugs. Unfortunately the pharmaceutical industry is not investing in this area because of the relatively poor profit picture in the antibiotics field.

Thursday, September 5, 2013

The Global Population Explosion: Can It Be Stopped Before We Self-Destruct?


Two really interesting books on global population growth were recently published, one by Stephen Emmott (1) and one by Alan Weisman (2). After seeing reviews online I look forward to reading both. It is about time someone clearly stated that we need not merely to stabilize global population but to dramatically reduce it in order to prevent total environmental disaster. Unfortunately the trends are not encouraging. The UN has just revised some of its global population projections upward. Moreover, the much-vaunted ‘demographic transition’, whereby increased wealth leads to lower fertility, is showing some strain. Some very recent data indicate that in China and elsewhere, higher-income women are having more rather than fewer offspring. It is hard to see how voluntary measures to spread the use of contraception will really blunt the enormous momentum of current population trends. The projections for population growth in certain less developed areas such as Africa are truly frightening and will be accompanied by increased consumption, resource depletion and environmental degradation. The really sad thing is that the US, which should know better, continues to pursue economic policies that emphasize rapid growth, based partly on an immigration-driven rapid population increase. We need to start thinking about new economic models that do not require constant growth (and constantly increasing environmental destruction) in order to attain a decent lifestyle for most people. A few economists have started to address this task (3).

(1) 10 Billion. Stephen Emmott, Allen Lane 2013. ISBN: 9780141976327

(2) Countdown: Our Last, Best Hope for a Future on Earth? Alan Weisman, Little, Brown 2013. ISBN: 9780316097758

(3) Prosperity Without Growth. Tim Jackson, Earthscan 2011. ISBN: 9781849713238


Thursday, August 29, 2013

Misguided choices? How the NIH decides to prioritize research areas is a strange and mysterious process.


This week’s SCIENCE magazine notes that the NIH is committing $17M to a program to evaluate the role of extracellular RNA (exRNA). Obviously a variety of RNA types, including miRNA, siRNA, piRNA and lncRNA as well as conventional messenger RNA, play key roles within mammalian cells. In certain lower organisms like nematodes there is good evidence for cell-to-cell transfer of functional siRNA. However, in mammals the exRNA story is very muddy indeed. While cells shed various RNAs enclosed in membranous structures called exosomes, there is no evidence that this material has any function whatsoever (cells shed lots of stuff, most of which is just debris). So how did exRNA get to be an NIH funding priority? What will be the contribution to human health?

This is somewhat reminiscent of the recent commitment of $100M in federal funds to the ‘Brain Activity Map’ (BAM). The initial energy and organization for this concept came from non-neuroscientists and from private groups such as the Kavli Foundation. So how did the BAM get to be an NIH priority? (http://scienceforthefuture.blogspot.com/2013/04/the-brain-activity-map-bam-pluses-and.html)

These episodes illustrate the byzantine process by which funding priorities are set at the NIH and presumably at other federal funding agencies. The usual process is that NIH staffers seek advice from certain scientists about future funding needs. The scientists consulted are often ones who have devoted a lot of time to service on NIH study sections (grant review groups) and who are therefore well known to the staffers. No doubt the consultants are good scientists; however, they may not be representative of their fields and they certainly have their own interests to pursue. It is well known in the academic community that the surest way to get a research grant is to be involved in writing the RFP (Request for Proposals) in that area. Thus funding priority decisions are made in a murky, nontransparent manner somewhat reminiscent of the old-style politics of ‘smoke filled rooms’.

Surely in this age of near instantaneous communication there must be a better way to set funding priorities. For example, why not let the NIH convene panels of experts in various areas and give them a day or so to make initial recommendations for new areas to fund? The recommendations, as well as the names of the panel members, could be posted on the internet and the larger scientific community allowed to comment. At some point a decision would need to be made by NIH staff, but at least broader input would be achieved. Additionally, the entire scientific community, not just a few ‘insiders’, would know that an area of research was under consideration for increased funding. It’s time for a little sunlight to penetrate the darkness of NIH prioritization.

Thursday, August 8, 2013

Ageing, Human Enhancement and the Economy: Work and Sex at 120.


Today’s NY Times published an opinion piece by the columnist Charles Blow that discussed the prospects for radical increases in human lifespan and the possible implications for society (1). This was partly driven by the recent online publication of a report (2) from the Pew Foundation that surveyed Americans’ attitudes toward old age and the possible radical extension of life. Interestingly, most people did not express interest in living much beyond 90 or so: a ripe old age, but nothing exceptional. In his article Mr. Blow briefly mentioned some of the economic, ethical and societal problems associated with advanced old age. Many of the online comments appended to the Blow article expressed concern about increasing the number of frail, sickly elderly people.
In my view this column and most of the comments have it all wrong. They visualize increasing numbers of decrepit elders acting as a drain on society. They fail to anticipate the accelerating wave of ‘human enhancement’ technology that will allow people not only to live far longer but also to be healthier, stronger and smarter in their advanced years than most middle-aged humans are today. Gene therapy, stem cell technology, advanced neuropharmacology, physical and mental prostheses: all of these are converging to allow a re-engineering of the human organism. Clearly there are key issues about who will be able to access these advances and what impact they will have on our economy and society. Will it be only the very rich, or will many people be able to enjoy the benefits of human enhancement? What will be the effects on our economy and our social structures? By the way, the Pew Foundation’s finding that most people do not want to live much beyond current lifespans will likely go out the window as people start to see smart, vigorous, sexually active 120-year-olds!
To me the most worrisome prospect is that advances in human enhancement will be arriving on the scene just as another technological wave is cresting. Ever-smarter and more capable machines (think IBM’s Watson coupled with a very sophisticated robot) will be doing more and more of the work of the economy.  Thus we may see large numbers of very healthy, vigorous people who make no contribution to the production of goods or services. How will society deal with this?
These interesting themes regarding ageing, technology and the economy will be developed at length in future posts on this site.



Tuesday, August 6, 2013

Scientific Reproducibility, Hype, and the Glut of Ph.D.s


There is an interesting conjunction of articles in this week’s NATURE. One opinion piece, from an idealistic young graduate student, deplores the fact that scientists must promote their work as being medically, economically or socially relevant, to an extent verging on ‘hype’ (1). In the same issue a news feature reports that the NIH is considering verification rules for some of the research it supports, partially because of many comments from the pharmaceutical industry that much academic research cannot be reproduced (2). Finally, the issue contains an obituary of the Nobel prize-winning physicist Kenneth Wilson (3).

Here is what relates these three articles: two are examples, and one a counterexample, of the consequences of the overexpansion of contemporary science. Scientists oversell the pragmatic ramifications of their work largely because funding agencies require them to do so. For example, all biomedical investigators must deal with the NIH requirement to explain the ‘significance’ of projected research as well as with the agency’s current emphasis on ‘translational’ research. Other funding entities in the US have similar stipulations, and the situation may be even worse in Canada and the UK, where there is increasing emphasis on the commercial ramifications of research. However, as many historians of science have shown, the greatest impacts often flow from unfettered basic research rather than from work intended to address specific medical or technical problems. The career of Prof. Wilson is a good example. After being hired by Cornell in 1963 he did not publish a paper until 1969. Then in 1971 he published theoretical work that revolutionized areas of physics ranging from sub-atomic particles to fluid mechanics, earning himself the Nobel. Today it is hard to imagine anyone lasting six years in an academic position without multiple publications (preferably ‘translational’ ones!).

Delving further, why are the funding agencies so insistent on research that claims immediate pragmatic ‘significance’? The reason is that the agencies must drum up political support for their enormous budgets, and it’s much easier to sell a Senator or Congressman on curing, say, prostate cancer than on elucidating some obscure molecular interaction. While society should generously support fundamental scientific research, both for its own sake and for its long-term practical benefits, the level of support must be ‘right-sized’ to the state of the economy and the level of development of science itself. If maintaining public support for science means a constant process of overselling its short-term payoff, then perhaps the bloated science establishment needs some trimming.

One of the major problems, however, is that universities have used the relatively generous science budgets of the last few decades to train an enormous cohort of Ph.D.s who now must struggle for research funding. The current rather obscene degree of ultra-competition drives investigators to publish results prematurely, leading to increasing concerns about reproducibility. As discussed previously on this blog (4), curtailing the production of Ph.D.s would be immensely helpful in striking the right balance between reasonable levels of publicly supported science funding and the size of the scientific work force. Fewer, better-quality Ph.D.s may actually produce a greater amount of high-quality science than hordes of ill-trained Ph.D.s, many from third-rate institutions.




Tuesday, July 23, 2013

Glaxo in China: Another Step in Big Pharma’s Race to the Bottom



GlaxoSmithKline has been much in the news lately. Following a scandal in which Glaxo executives apparently bribed Chinese physicians and hospital officials to promote the use of GSK drugs, it now turns out that there are also major problems at GSK’s shiny new research center in Shanghai, as recently described in the NY Times. Apparently key pre-clinical studies in animals were not reported (misplaced? suppressed?) before an important new drug went into clinical trials. Ozanezumab, a monoclonal antibody for treatment of neurological diseases, was being developed at the Shanghai facility when problems emerged during an internal audit in 2011 that has only recently become public. Evaluating animal studies prior to initiation of clinical trials is crucial to protecting patients in the trials against potential harmful effects of the new drug. An excerpt from the Times article highlights the issue: “If that’s true, it’s a mortal sin in research requirements,” said Arthur L. Caplan, the head of the division of medical ethics at NYU Langone Medical Center. “No one could approve human trials without having that information available, scientifically or ethically. That’s kind of a Rock-of-Gibraltar-sized ethics violation.” As bad as it is, this is not the first scandal to hit GSK’s China operations. A few months ago the head of GSK R&D in China was fired for misrepresenting data in an article published in Nature Medicine.

So is this just an isolated case? I doubt it. Over the last decade or so big Pharma has sought to maximize profits by reducing expenditures for research, personnel, and materials.  To a considerable extent this has been done by seeking lower cost alternatives in China and other less developed countries. Thus research staff at sites in the US and Europe have been cut while new sites have been created in Asia. Production of the ingredients to make existing drugs has been outsourced to companies in India and elsewhere. The problem with this is that it is not only costs that are being cut, but quality as well.

As described elsewhere on this blog (1), there have been numerous problems with drugs produced by foreign manufacturers for the US market. Now it is emerging that research results from big Pharma’s outsourced labs can’t be trusted either. It is not that Asian researchers can’t do good science; more and more outstanding work is emanating from academic laboratories in China, India, Taiwan, Singapore and other Asian countries. However, when commerce enters the picture, scientific probity seems to go out the window.

It makes one wonder whether we should entrust our future needs for important new drugs to the current profit driven system represented by the big pharmaceutical companies. There are other models for drug development (2) including the public-private partnerships that have been so successful in developing drugs for malaria and other neglected diseases. Some new approaches are clearly needed.


Tuesday, July 16, 2013

Mitochondrial replacement- disease therapy or a first step toward human enhancement through the germ line?


As reported in a recent commentary in Nature, the UK has decided to allow human trials of mitochondrial replacement. These would involve a small group of women who have genetic defects in their mitochondria that could be transmitted to their offspring. The procedure entails transferring the nucleus of the egg of the affected woman to a healthy enucleated egg from a donor, with subsequent in vitro fertilization by the male partner. Simply put, it results in a ‘three-parent baby’, with the nuclear genome provided by the two parents and the mitochondrial DNA provided by the donor. The positive aspect is that it would allow women with mitochondrial defects to have normal children who share most of their genome. The mitochondrial genes account for only a tiny fraction of the total genome, and thus children born in this manner would be genetically very similar to those conceived normally. The aspect of concern is that this really is re-engineering of the human germ line, since the donor mitochondrial DNA could be passed on indefinitely through female offspring. On an ethical basis there has been great reluctance to engage in modification of the human germ line; this would be a first step.

While the proposed trials are directed toward preventing offspring with genetic defects one could easily visualize a ‘slippery slope’ effect here. There is increasing momentum for using various pharmacological and genetic techniques to enhance human capabilities (I will be posting much more about this soon). Since mitochondria can contribute to endurance and athletic performance (1), it will be tempting to use this relatively simple technology to ‘improve’ one’s offspring with, for example, mitochondria from an outstanding female athlete.  Clearly stringent monitoring will be needed.


(1) http://physiolgenomics.physiology.org/content/43/13/789.full


Tuesday, June 25, 2013

The Supremes protect generic drug producers- but what about patients?



The US has increasingly turned to generic drugs as a key tool in constraining its outlandish health care costs. But are these drugs as safe and effective as brand-name medications? The Supreme Court apparently thinks so: in a recent decision (Mutual Pharmaceutical Co. v. Bartlett) it ruled that makers of generic drugs cannot be sued under state law for adverse reactions to their products. The underlying concept is that since the original branded drug has been approved by the FDA, and since the generic manufacturer is providing a product ‘equivalent’ to the original drug, the manufacturer cannot be held responsible for harm caused by the drug.

Generic drugs now account for 80% of prescriptions and are often 80-90% less expensive than the original branded drug (1). Generics entered the US market under the 1984 Hatch-Waxman Act, which provides a rapid approval process (the Abbreviated New Drug Application, ANDA) that is far less demanding than the process for approval of the original drug. The generic has to demonstrate ‘bioequivalence’, which is usually interpreted as attaining blood levels comparable to those of the original drug. However, it is important to realize that the FDA essentially depends on the generic drug producers to provide the data used in the evaluation. Currently the FDA lacks the resources to extensively test all submitted drugs or to inspect all of the factories that produce these agents. Thus, if a generic drug company manipulates the data submitted for approval, or if its manufacturing practices deviate from accepted standards, these problems may not be discovered for a long time. This may be a particular problem with foreign generic producers, who now account for a substantial share of the US market but whose operations are difficult for the FDA to monitor. There is a long history of problems with generic drugs, whether lack of efficacy because of insufficient or degraded active ingredient or the presence of harmful contaminants. An especially egregious situation involved Ranbaxy Laboratories, an Indian firm that is one of the largest generic manufacturers. Over a period of decades Ranbaxy engaged in outright fraud to obtain approval for its drugs, both in the US and especially in many less developed countries with weak pharmaceutical regulations (2).
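For concreteness, the standard statistical criterion for bioequivalence is that the 90% confidence interval for the geometric mean ratio of test to reference exposure (AUC and Cmax) fall within 80-125%. Below is a minimal Python sketch of that calculation on invented crossover data, using a simple paired analysis rather than the full crossover ANOVA a real submission would employ:

import math
import statistics

def bioequivalence_90ci(test_auc, ref_auc, t_crit):
    # 90% CI for the geometric mean ratio test/reference from paired
    # (crossover) data; equivalence requires the CI to lie in 0.80-1.25.
    diffs = [math.log(t) - math.log(r) for t, r in zip(test_auc, ref_auc)]
    mean = statistics.mean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(len(diffs))
    return math.exp(mean - t_crit * se), math.exp(mean + t_crit * se)

# Invented AUC values (ng*h/mL) for 12 subjects dosed with both products.
ref  = [100, 95, 110, 120, 90, 105, 98, 115, 102, 108, 93, 111]
test = [ 98, 99, 104, 118, 93, 101, 95, 120, 100, 104, 96, 107]
t_crit = 1.796          # two-sided 90% t critical value, 11 degrees of freedom
lo, hi = bioequivalence_90ci(test, ref, t_crit)
verdict = "bioequivalent" if lo >= 0.80 and hi <= 1.25 else "not shown"
print("90%% CI for geometric mean ratio: %.3f-%.3f -> %s" % (lo, hi, verdict))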

In this light the Mutual decision seems very problematic. One of the key constraints that keeps pharmaceutical companies on the straight and narrow is the fear of litigation. The Supreme Court thus seems to be opening the door for generic manufacturers to cut back on monitoring the quality of their products and their manufacturing processes. Inevitably this will have harmful effects on patients.