Friday, March 16, 2012

John, Paul and Eroom

Catching up on “In the Pipeline” over spring break, I followed Derek Lowe’s pointer to yet another article in Nature Reviews Drug Discovery attempting to explain the declining productivity of drug R&D.

Rather than start from scratch, the team from Sanford C. Bernstein (an investment research company) summarizes many of the previous diagnoses, noting the incommensurability of the earlier prescriptions:
They include: the FDA’s ‘Critical Path Initiative’; a series of prescient papers by Horrobin, arguing that bottom-up science has been a disappointing distraction; an article by Ruffolo focused mainly on regulatory and organizational barriers; a history of the rise and fall of medical innovation in the twentieth century by Le Fanu; an analysis of the organizational challenges in biotechnology innovation by Pisano†; critiques by Young and by Hopkins et al., of the view that high-affinity binding of a single target by a lead compound is the best place from which to start the R&D process; an analysis by Pammolli et al., looking at changes in the mix of projects in ‘easy’ versus ‘difficult’ therapeutic areas; some broad-ranging work by Munos; as well as a handful of other publications.

There is also a problem of scope. If we compare the analyses from the FDA, Garnier, Horrobin, Ruffolo, Le Fanu, Pisano, Young and Pammolli et al., there is limited overlap. In many cases, the different sources blame none of the same countervailing forces. This suggests that a more integrated explanation is required.
† The book by Gary Pisano (2006) of Harvard Business School is a favorite here at KGI — I think I’ve seen it on more shelves than any other.

The main point of the article is to introduce a so-called “Eroom’s Law,” in which the number of new FDA-approved therapies (per $ billion of R&D spending) falls by half every nine years. It is intended to be the inverse of the Moore’s Law that has driven IT improvements for more than 40 years, in which the number of transistors per chip doubles every two years.
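Both laws are simple exponentials running in opposite directions, and a quick back-of-the-envelope sketch makes the asymmetry concrete. (The halving and doubling periods below are the ones quoted above; the horizon lengths are just illustrative, not figures from the paper.)

```python
def exponential_change(years: float, period: float, factor: float) -> float:
    """Cumulative multiplicative change after `years`,
    given one `factor` applied per `period` years."""
    return factor ** (years / period)

# Eroom's Law: approvals per R&D dollar halve every 9 years.
eroom_60yr = exponential_change(60, 9, 0.5)   # roughly a 100-fold decline

# Moore's Law: transistors per chip double every 2 years.
moore_40yr = exponential_change(40, 2, 2.0)   # 2**20, about a million-fold gain

print(f"R&D efficiency after 60 years: {eroom_60yr:.4f}x")
print(f"Transistor count after 40 years: {moore_40yr:,.0f}x")
```

The contrast is the point: over comparable spans, one industry gained six orders of magnitude while the other lost two.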

Eroom’s Law was the major new gimmick of the Scannell et al. article, and Lowe devoted an entire posting to it. I’m far from convinced: unlike Moore’s Law, there’s no real explanation of the causal mechanisms, even post hoc. Instead, it seems more like Eroom’s Syndrome: a collection of symptoms with multiple root causes.

The four (possibly five) causes that the authors identified:
  • “the ‘better than the Beatles’ problem;
  • the ‘cautious regulator’ problem;
  • the ‘throw money at it’ tendency; and
  • the ‘basic research–brute force’ bias.
There may also be some contribution from a fifth factor, termed ‘the low-hanging fruit’ problem.”

Some of these are dog-bites-man stories. The excess of caution by the FDA is like the weather — everyone complains about it but nobody does anything about it. Even though they mostly dismiss it, the “low-hanging fruit” argument is also an old one.

Interestingly, they argue that “throw money” is easily reversed: if, in effect, big pharma R&D shops had become fat, dumb and happy (my term not theirs), a prescription of lean and mean can be applied with little impact on output. If true, this would certainly help reverse the trend.

It’s the two remaining ideas that seemed novel to me.

Better than the Beatles

This was, I thought, the cleverest argument, both in its identification and in its framing:
Imagine how hard it would be to achieve commercial success with new pop songs if any new song had to be better than the Beatles, if the entire Beatles catalogue was available for free, and if people did not get bored with old Beatles records. We suggest something similar applies to the discovery and development of new drugs. Yesterday's blockbuster is today's generic. An ever-improving back catalogue of approved medicines increases the complexity of the development process for new drugs, and raises the evidential hurdles for approval, adoption and reimbursement. It deters R&D in some areas, crowds R&D activity into hard-to-treat diseases and reduces the economic value of as-yet-undiscovered drugs. The problem is progressive and intractable.
As they point out, content-based IP loses its novelty, and so people seek out new things. After everyone saw Gone with the Wind, there was room for Casablanca, On The Waterfront, The Sound of Music, and a couple of Godfather movies. In resource-based industries, by contrast, the consumption of the old holdings (e.g. coal from a mine) makes the remaining holdings more valuable.

They point to a specific area — anti-ulcerants — as an example where drug discovery is held back by the back catalog. Two families of (now generic) solutions are already available, and while the third approach “would probably be safe and effective,” it is unlikely that any healthcare reimbursement system would pay for a new class of patented medicines, except for those rare cases not treated by the first two.

Their conclusion is even more depressing: “This general problem applies in diabetes, hypertension, cholesterol management and many other indications.” They also note that some of the decline of R&D productivity may be because pharma companies shifted from crowded (but high approval rate) therapeutic areas to less crowded (but lower approval rate) areas.

Basic Research/Brute Force

In this argument, the authors contend that the whole attempt to solve health problems through basic research and a molecular understanding of disease has been a disappointment, if not an outright failure. Coupled with the molecular approach, and the brute-force screening used to find new molecules, is the assumption that the best way to treat a disease is with a single molecule that binds to a single target.

Based on his industry experience, Lowe shows great sympathy for this argument:
This gets us back to a topic that's come up around here several times: whether the entire target-based molecular-biology-driven style of drug discovery (which has been the norm since roughly the early 1980s) has been a dead end. Personally, I tend to think of it in terms of hubris and nemesis. We convinced ourselves that we were smarter than we really were.
I lack both the science and the industry experience to assess the validity of this complaint, except to note that this critique is obviously not universally shared. It would be nice to think that this article would start an honest conversation, but my guess is that too much money (private, government, university) has been invested in molecular approaches for this idea to gain traction until the evidence is undeniable.


The issue of declining R&D efficiency is a huge one for the industry, and also for investors in pharma companies big and small. The authors list a number of reasons why they are optimistic about the next five years:
Flat to declining R&D costs, as well as a bolus of oncology drugs, more orphan drugs and 'biosimilars as BLAs', might put an end to Eroom's Law at an industry level. Whether this improves things enough to provide decent financial returns on the industry's R&D investment is a different question. Financial markets don't think so. Industry executives do.
Their final idea is for each big pharma company to create a Chief Dead Drug Officer (CDDO), who is incentivized to analyze the failure of R&D investments and report to the board and the public about the reasons behind Eroom’s Law. A provocative idea — but I’m not holding my breath.


Gary Pisano, Science Business: The Promise, the Reality, and the Future of Biotech, Boston: Harvard Business School Press, 2006.

Jack W. Scannell, Alex Blanckley, Helen Boldon & Brian Warrington, “Diagnosing the decline in pharmaceutical R&D efficiency,” Nature Reviews Drug Discovery 11 (March 2012): 191-200. DOI: 10.1038/nrd3681
