Blog: How to lie with (FDA) statistics
The FDA approved 27 new drugs in 2013. Is this a downward trend? Is it an upward trend? Does it suggest that the pharmaceutical industry is failing, or that genomics is finally paying dividends? We revisit Darrell Huff’s 1954 classic How to Lie with Statistics for insights into these pressing questions.
The headline of a recent Associated Press release read “The Big Story, New drug approvals from FDA declined in 2013.” (1) Indeed, the FDA approved only 27 New Molecular Entities (NMEs) in 2013, compared to 39 in 2012. This discouraging news was widely reported in mainstream and business media.
The AP release quotes FDA officials as saying that “the tally of innovative medications approved last year is in line with the historical trend.” The official FDA report was more circumspect, stating that “CDER approved 27 NMEs in 2013, which is similar to average totals of other years from this time period. For instance from 2004 through 2012, CDER has averaged about 26 NME approvals per year. In 2012, CDER approved 39 NMEs, but this was an unusually high number….” (2).
The Associated Press repeated the FDA’s analysis, but then stated that “Experts attribute the recent uptick to a combination of factors: a stable, well-funded FDA and a newly established research model among drug makers…” A commentator on the website of the National Venture Capital Association, extolling the importance of regulatory reforms championed by the Association, echoed this view, writing: “…we believe that the upward trend in drug approvals over the past several years has increased investor confidence in the FDA's review and approval process” (3).
Reading the conflicting interpretations of the FDA data, I was reminded of Darrell Huff’s classic book How to Lie with Statistics (W.W. Norton, 1954) and his statement that “The secret language of statistics, so appealing in a fact-minded culture, is employed to sensationalize, inflate, confuse, and oversimplify.” Huff’s book illustrated the subjectivity that can be introduced by selective data analysis, graphical representation, and the conflation of correlations with trends or causation. For the record, the FDA has reported historical data (4) on the number of new drugs (New Chemical Entities and Biologicals) as shown in the attached figures. The top panel shows the raw data with three possible “trends” that could be inferred from selectively chosen linear regressions. The first, 1970-2013, might be interpreted as representing the “post-thalidomide” era of drug development. The second, 1996-2013, might be interpreted as representing the post-PDUFA (Prescription Drug User Fee Act of 1992) era of the FDA; and the third, 2005-2013 (or the 2004-2013 time frame described by the FDA), might be used to assess recent trends. In the tradition of How to Lie with Statistics, each of these arbitrary analyses provides the basis for empirical, though not statistically significant, narratives suggesting that drug development trends are either positive (green) or negative (red).
Using standard regression models to evaluate long-term trends is even more confusing. The bottom panel shows the application of various analytical formulations including linear regression (purple); 5-, 10-, 12-, 15-, and 20-year moving averages (blue); and 4th- and 5th-order polynomial curve fits (orange). The 10-year moving average is shown in bold with ± SD of the residuals shaded. It is apparent that analysis of the drug approval data alone does not conveniently support either a “declining” or an “upward” narrative.
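The sensitivity of a “trend” to the chosen time window is easy to demonstrate. The sketch below fits ordinary least-squares lines over different windows of an annual approval series and computes a trailing 10-year moving average. The year-by-year counts here are illustrative stand-ins (only the 2012 and 2013 values, 39 and 27, come from the text above; the rest are hypothetical), so the point is the method, not the numbers:

```python
import numpy as np

# Illustrative annual NME approval counts. Only 2012 (39) and 2013 (27)
# are taken from the post; the other values are hypothetical.
years = np.arange(1996, 2014)
approvals = np.array([53, 39, 30, 35, 27, 24, 17, 21,
                      36, 20, 22, 18, 24, 26, 21, 30, 39, 27])

def window_slope(start, end):
    """Slope (approvals per year) of an OLS fit restricted to [start, end]."""
    mask = (years >= start) & (years <= end)
    slope, _intercept = np.polyfit(years[mask], approvals[mask], 1)
    return slope

# The same series can support opposite narratives depending on the window:
print(f"1996-2013 slope: {window_slope(1996, 2013):+.2f}")
print(f"2005-2013 slope: {window_slope(2005, 2013):+.2f}")

# Trailing 10-year moving average, analogous to the bolded curve
# described in the bottom panel.
w = 10
moving_avg = np.convolve(approvals, np.ones(w) / w, mode="valid")
print("10-year moving average:", np.round(moving_avg, 1))
```

With these particular (made-up) values, the full-window slope is negative while the 2005-2013 slope is positive, which is exactly the kind of window-dependent storytelling Huff warned about.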
Can anything be learned from the FDA data?
These data are seemingly consistent with Jurgen Drews’ 1995 prediction (5) that the number of new pharmaceutical products then in development was not sufficient to replace the number coming off patent and sustain continued growth of the biopharmaceutical industry. While each of the regression models in the bottom panel appears to show growth in the number of NMEs approved through the 1970s and early 1990s, this trend seems to be absent after 1995.
Less defensible is the statement in the AP release that “FDA drug approvals are watched closely by analysts as both a barometer of industry innovation and the federal government's efficiency in reviewing new therapies.” On the surface, it seems obvious to make a causal connection between innovation in industry, the efficiency of the review process, and the number of NMEs that result. Indeed, Drews’ concern about the inadequacy of the product pipeline is often referred to as the “innovation gap.”
Yet, as Huff warned, “flaws in assumptions of causality are not always so easy to spot, especially when the relationship seems to make a lot of sense or when it pleases a popular prejudice.” Is the assumption of causality in this instance flawed? We think it might be.
It is recognized that not all technological innovations have a sensible, sustaining effect on product development. Many technological innovations are incompatible with the capabilities and conventions of established industries. Such innovations may initially be disruptive to product development processes and markets until businesses adapt to new technological requirements and commercial opportunities. Disruptive innovations may require substantial periods of time to mature before they contribute to the development of competitive products.
In this context, short-term changes in the number of NMEs reflect only incremental or sustaining innovations, not the kind of radical and disruptive innovations embodied in genomics and other “omic” technologies that have emerged in recent decades. In fact, a 2001 report titled The Fruits of Genomics predicted that genomic innovations in the pharmaceutical industry would initially have a negative impact on the timelines and cost of drug development (6). The report’s model predicted that it would not be until after 2010 that these innovations would begin to produce increasing numbers of NMEs.
Is there evidence to support the prediction that this “newly established research model” is finally producing increasing numbers of NMEs? I would suggest that you first read “How to Lie with Statistics,” then come to your own conclusions.
Fred Ledley is Director of the Center for Integration of Science and Industry at Bentley University, and Professor of Natural & Applied Science and Management.
5 – Drews, Jürgen, and Stefan Ryser. "Innovation deficit in the pharmaceutical industry." Drug Information Journal 30.1 (1996): 97-108.
6 – Lehman Brothers. "The fruits of genomics." New York: Lehman Brothers (2001).
Research: Making the biotech IPO work
There were more biotech IPOs in 2012-13 than in any previous IPO window. Will these companies be able to deliver on their promise of new generations of therapeutic products and economic growth? A recent paper from the Center for Integration of Science and Industry provides a prescription for success that will require strategic business models and patience. Listen to the interview on Bloomberg Radio.
Blog: Could Human Genome Sciences have become Standard Oil?
Human Genome Sciences (HGS) was not a company with normal ambitions. At its inception, HGS aspired to dominate not only the field of genomic science, but also emerging markets for regenerative medicines designed to meet the needs of ageing populations. Fred Ledley asks whether HGS could have become the Standard Oil of our generation.
Human Genome Sciences (HGS) was not a company with normal ambitions. From its founding in 1992, HGS aspired to dominate the newly emergent field of genomics. The company’s strategy was not simply to mine the human genome for novel sequence elements that could be refined into drug targets or biological products, but rather to use “omics” approaches to systematize and automate biopharmaceutical development. Moreover, the company’s intent was not only to develop novel therapeutic products, but to discover regenerative medicines that would meet the needs of ageing populations. The company’s ambitions were evident in annual reports that featured Greek gods and saints as metaphors for the company’s quest, along with headlines such as “towards victory over disease” (1998) and “the new face of pharmaceuticals” (2000).
A recent paper from our group (McNamee, L.M., Ledley, F.D. (2013) Assessing the history and value of Human Genome Sciences. J. Com. Biotech. 19) examined the history of HGS, focusing on the relationship between capital investments, the company’s valuation, its accumulation of intellectual property, and its product pipeline. This technical analysis highlighted the failure of markets to value HGS’ intellectual property portfolio or product pipeline, which ultimately caused the company to be acquired by GlaxoSmithKline (GSK) for a “fair value” that was less than the total capital investment in the company.
Given HGS’ early ambitions, however, the more interesting question may be “could HGS have become Standard Oil?” The question is not entirely facetious.
Like HGS, Standard Oil was at the forefront of the science and technological innovations of its era. Standard Oil initially grew to prominence with its patented Frasch-Burton process for fractional distillation, which enabled the company to produce kerosene from the high-sulphur oils being discovered in Ohio. The company continued to invest in research and, in the words of the US Supreme Court, was “…unremitting in their efforts to improve the processes of refining.”
Standard Oil was not the only corporate enterprise to emerge from the inventions and innovations of the 19th and early 20th centuries. Carnegie Steel, later US Steel, was the first US company to license the revolutionary Bessemer Process for steel production, American Tobacco was an early adopter of the automated cigarette rolling, and companies like General Electric, Westinghouse, and AT&T were organized around innovations in electricity and communications. So too, the modern pharmaceutical industry emerged during this era based on advances in organic chemistry and novel technologies such as the tablet press.
While each of these companies was at the forefront of technological innovation, they were also at the forefront of business innovation. Standard Oil famously created a vertically integrated enterprise that funneled oil from its wells into its transportation, refining, and marketing network. Similarly, US Steel grew to control mining, coke and ore production, and steel production, but also built a network of railroads and steamships, while American Tobacco controlled the processing, production, and marketing of tobacco products. The business innovations of the era not only improved efficiencies through vertical integration and scale, but also encompassed mass production as well as fundamental changes in banking practices, management, marketing, labor relations, and business law; not to mention the monopolistic practices that would ultimately cause these companies to be dismembered. In ordering the break-up of Standard Oil in 1911, the Supreme Court noted that the company was “…guided by economic genius of the highest order, sustained by courage, by a keen insight into commercial situations, resulting in the acquisition of great wealth, but at the same time serving to stimulate and increase production, to widely extend the distribution of the products of petroleum at a cost largely below that which would have otherwise prevailed…”
Perhaps HGS could never have become Standard Oil. Perhaps the innovations arising from the “omics” era are not as great as those that empowered the great corporations of the late 19th and early 20th centuries. Perhaps the potential markets for products of these innovations are not as large. While these reservations may be fairly debated, what is clear is that there have been no innovations in the business models for “omics” comparable to those that accompanied the technological innovations of the earlier age.
HGS, and the other companies comprising the genomic sector, adopted business models that were familiar to venture capitalists and the biopharmaceutical industry, and most were eventually acquired and seamlessly integrated into established pharmaceutical companies. While some individuals acquired significant wealth from genomics (HGS’ CEO was briefly biotech’s first billionaire in 2000), it is hard to imagine even admirers of the industry describing its leaders as possessing “economic genius of the highest order.”
It may, in fact, be silly to consider similarities between HGS and Standard Oil; there are simply too many differences. But that is exactly the point. Genomic technologies, and the market opportunities for biopharmaceuticals to manage the health and wellness of ageing populations, are unprecedented. Nevertheless, while HGS avidly embraced innovation in science and technology, it failed to explore comparable innovations in business.
Business models have been described as “a focusing device that mediates between technology development and economic value creation,” and an extensive body of research suggests that successful innovation requires an effective synergy between a company’s technologies and business model. Could HGS have become Standard Oil with critical innovations in business to complement their novel technologies? We will never know.
Fred Ledley is a professor in the Department of Natural and Applied Sciences and Director of the Center for Integration of Science and Industry at Bentley University.
Podcast: Dr. Fred Ledley speaks on Bloomberg Radio about the "Remarkable Year for Biotechs"
Research: What was the value of Human Genome Sciences?
When Human Genome Sciences (HGS) was acquired by GlaxoSmithKline (GSK) in 2012, its “fair value” was $3.6B, less than the $3.9B in capital investments in the company. Does this truly reflect the value of a product (Benlysta™) with billion-dollar potential, HGS’ product pipeline, and >600 patents? A recent paper from the Center for Integration of Science and Industry reviews the history and valuation of HGS and argues that there is a need for greater alignment between the milestones of translational science and measures of corporate value.