Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions
Richard Harris Central 610.72 H3158r 2017
About biomedical research, but the problems extend to many academic research fields.
My own brainfarts, after reading the book:
The core problem is too many academic scientists dividing up the available funding and jobs. Universities hire scientists as professors to train more scientists, who want jobs at those universities. If (WAG) three academic scientists produce ten new scientists every ten years over a 30-year career, and half of those new scientists seek grant-funded research jobs, then the pool of scientists competing for those jobs multiplies by 8/3 each decade - doubling roughly every seven years. Funding for science is growing globally, but not nearly that fast. Funding vacillates in countries like the United States, driven up and down by political and economic pressures.
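Checking that back-of-envelope: under the WAG numbers above (each decade, 3 scientists train 10 new ones, 5 of whom join the job-seeking pool and later train others), the pool multiplies by 8/3 per decade. A few lines of Python turn that growth factor into a doubling time:

```python
import math

# WAG assumptions from the note, not data: per decade, each group of
# 3 scientists trains 10 new ones, half of whom (5) join the pool of
# job-seeking researchers, so the pool grows from 3 to 3 + 5 = 8.
per_decade_growth = (3 + 5) / 3  # pool multiplies by 8/3 each decade

# doubling time T satisfies growth**(T/10) == 2
doubling_time = 10 * math.log(2) / math.log(per_decade_growth)
print(f"doubling time = {doubling_time:.1f} years")  # prints 7.1 years
```

Even with these rough numbers, the pool of job seekers grows several times faster than science funding does.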
This results in poorly trained new scientists, many lacking the statistical and logical training to identify and correct their own errors. It rewards careless boasters who make the biggest unsubstantiated claims, and it rewards paywalled "prestige" journals like Science and Nature that care more about glamor than replication and accuracy. Most articles are wrong, some dangerously so. Many papers are cited carelessly by subsequent authors who have not even read (much less validated) them - I know; one of my papers has been cited more than 200 times.
Science publishing and science employment require radical restructuring. Ward Cunningham's "Federated Wiki" might be a good way to establish bidirectional links between "evolving" papers; no "published" paper stops improving. Erroneous papers are identified, then repaired or orphaned by originating and future authors. Rather than an exponential explosion of poorly supported work, the scientific community revises and develops its corpus of work (with the old revisions watermarked, timestamped, and carefully archived by multiple users of that work). Digital mass storage grows much faster than science does; we still need formatting experts, editors, and reviewers, but we no longer need journals and we no longer need paywalls.
That might help with quality; there will be a broader range, from brilliant multiply-tested community creations to pseudoscience dreck. Dreck believers punish themselves; if that draws away marginal scientists and researchers, it helps deal with the scientific overpopulation problem.
Nonetheless, many good scientists can't find work as full-time scientists, like the professors they wish to emulate. Perhaps 10% can expect academic positions in their own field at the end of their training; in some over-producing fields like astronomy and physics, that might be closer to 3%. How do we relieve the pressure?
The downfall of science is overspecialization; newly trained scientists know how to do one narrow thing, and don't know how to talk about that to the citizens whose taxes pay for most research. ALL scientists should be crosstrained in at least one other field, such as business, engineering, medicine, manufacturing, education, law, politics, journalism, advertising, entertainment, and religion. Many will find fulfillment in their "second marriage", and can help inform those fields. This crosstraining can occur PRIOR to formal scientific training; a managed path for EARNING tuition before and during college can eliminate the crippling loans that many emerge from college with. If we take the debt-driven hurry out of science, we'll have time to do it right.
We also need more volunteer-driven public science. Ideally, everyone on the planet should contribute talent to the process; that will increase scientific productivity and public awareness. How to filter out the dreck? Darwinian selection by economic result. Most science affects (and is affected by) economic output; bad ideas turn into failed products. The trick is designing the scientific network (probably running on an evolved internet substrate) to back-propagate economic value in return for forward-propagating scientific value. Discovering a robust and stable "science economy" won't be easy, but it can evolve from small successes.
Wikipedia came from wiki, which came from HTML and Apple's HyperCard. Wikipedia will never be perfect, but it evolves. When it gets too big, it will split into competing but federated wikis, and users will interact through adaptive interfaces. Computers evolved from mainframes with terminals, to personal computers, to mainframes (server farms) with terminals ("dodopaddle" smartphones) again. Search is currently "mainframed", but with the right tools, personal computers can host search themselves, accessing millions of tiered sources of raw data. This could evolve into the new "science economy" as researchers connect their labs and their own raw data into this widely distributed network.
Professional scientists are isolated from their customers, from their funders, and from effective self-regulation. This does not elevate them into an elite; it plunges them into a poverty-stricken ghetto, and robs the rest of us of the bounty that collaboration would bring.
- p002 $30b/y NIH
- p003 7000 known diseases, 500 have treatments, many marginal
- p004 lab that authenticates cells
- p004 "The much harder challenge is changing the culture and the structure of biomedicine so that scientists don't have to choose between doing it right and keeping their labs and careers afloat."
- p005 "Science progresses by testing ideas indirectly, throwing out the ones that seem wrong ..."
- p007 "Each year about a million biomedical studies are published ..."
- p007 C. Glenn Begley, cancer research at Amgen, codiscovered G-CSF ... 53 groundbreaking papers, could reproduce six. One failure cited 2000 times.
- Similar project by Bayer, 25 percent replication
- p011 2005 John Ioannidis "Why Most Published Research Findings Are False"
- p013 Leonard Freedman: 20% untrustworthy designs, 25% dubious ingredients, 8% poor lab techniques, 18% bad data analysis
- half of preclinical research untrustworthy, costs $28B/y
- p016 FDA's Janet Woodcock, "I think it's a totally chaotic enterprise"
- p017 Johns Hopkins' Arturo Casadevall, more progress 1950-80 than 1980-2010
- p018 2012 Jack Scannell, "Eroom's Law": the steadily worsening state of drug development, approvals per R&D dollar falling since the 1950s.
- improving after 2010
- p021 ASU Tempe Anna Barker, 200 clinical trials for glioblastoma multiforme, every one a failure, 65-80% oncology trials fail.
- p023 2000 transdifferentiation: bone marrow stem cells -> brain, liver, others; hundreds of papers, "a mirage"
- p027 Begley's Six Red Flags for suspect work:
- should be blinded and repeated, no cherry-picking, positive and negative controls, validated ingredients, appropriate statistical tests
- p029 Feynman 1974: "The first principle is that you must not fool yourself -- and you are the easiest person to fool."
- p031 Chemistry, no direct observation, Steven Goodman Stanford, every observation filtered through an instrument
- p036 Carol Greider TERT telomerase enzyme, doesn't work, used in dozens of experiments and papers with false conclusions
- p038 Stuart Firestein, Failure: Why Science Is So Successful; Harris differs: lazy-sloppy isn't bold, just distracting and wasteful
- p047 reproducibility, difference between stirring (rocking vs spinning bar)
- p049 cancer, matrix-dissolving enzymes, no consensus, failed to stop metastasis
- p051 cancer, checkpoint inhibitors and custom-built drugs from rigorous observations and multiple confirmations
- p055 ALS, some deeply flawed experiments used 4 mice. Steve Perrin: 32 treatment, 32 control, $112K per test and per dose, 9 months
- p055 ALS Therapy Development Institute (TDI)
- p057 ALS antibiotic minocycline clinical trial, $20M failure for NIH.
- p059 Sen. Richard Shelby questioning NIH Francis Collins in March 2012 hearing
- p062 80% of Federal grant proposals rejected, so
- p067 TACT: TREAT-NMD Advisory Committee for Therapeutics (NMD = neuromuscular diseases), reviews research proposals
- p071 Misled by Mice
- p074 US > 10M animals a year, most are mice
- p080 mice in top of cage racks near lights, more stressed out and immune-suppressed, mice fear male handlers
- p081 more consistent results from heterogeneous mouse populations
- p082 Malcolm Macleod: scale up mouse experiments before trying on humans
- p084 Emulate (spinoff of Harvard Wyss Institute) builds ersatz organs on "plastic chips" to mimic human organs
- p090 Jack Scannell (U Edinburgh): most drugs are "magic shotguns, not magic bullets"; off-target effects can be useful
- Many (actually most) important discoveries start with human beings in a medical clinic
- p095 Walter Nelson-Rees: most cell lines turn out to be HeLa. A Conspiracy of Cells, book by Michael Gold, 1986
- p096 more than 7000 papers used HEp-2 or Int-407 cells that were actually HeLa, wasting more than $700M.
- p099 Amanda Capes-Davis, R. Ian Freshney, imposter cell list, International Cell Line Authentication Committee.
- p125 "batch effect", samples from test group run one day, from control group on another day.
- p156 Science Exchange, Palo Alto CA
- p156 The Reproducibility Project: Cancer Biology, Brian Nosek
- p157 Center for Open Science, Charlottesville VA
- p158 eLife open journal, editor Randy Schekman at UCB and HHMI
- p165 Arturo Casadevall at JHU Med School: "Most scientists are not trained today on the basics of epistemology or logic ..."
- p173 postdocs, cheap labor pool, perhaps 40,000 in biomedicine, 5 years of heavy time demands for less than $50K
- in 2008, less than 21% end up in tenure track job, trend is downwards
- p174 "high impact factor": a measurement for selling ads and subscriptions, but naive surrogate for quality and significance
- Nature over 40, Cell and Science over 30; calculated by Thomson Reuters
- p175 Carol Greider at JHU: 400 applicants for 1 assistant professorship
- p177 Randy Schekman: "Deans are bean counters. They like a simple number."
- p178 Chinese scientists get cash bonuses for pubs in Science, Nature, Cell, and sell coauthorships for cash.
- p179 federal Office of Research Integrity, about a dozen cases per year.
- p180 Retraction Watch, 2010 hobby of Ivan Oransky and Adam Marcus
- retracted anesthesia papers: Yoshitaka Fujii >180, Joachim Boldt >100
- p182 Ferric Fang UW, Casadevall JHU: 70% of retractions are bad behavior, not simply error; more common in high-profile journals.
- p182 David Allison U.Alab.: some journals demand payment of up to $2100 to publish a letter pointing out others' errors.
- p189 NIH spending doubled between 1998 and 2003, then inflation dropped value 20% over the next decade.
- p189 UCSF gets 3% of its funding from the state of California. Administration judges grant revenue, not work quality.
- p190 "positive words" increased by up to 150x between 1974 and 2014.
- p193 Postdocs arrange "Future of Research" meetings
- p194 "Rescuing US Biomedical Research from Its Systemic Flaws", PNAS 2014
- p211 FDA's Janet Woodcock ... most biomarkers reported in the scientific literature fail.
- p212 ASU Josh LaBaer (editor, Journal of Proteome Research) requires a tested hypothesis, not just an observed correlation
- p214 adaptive trial designs ... hmm, sounds like Texas Sharpshooting to me.
- p217 Steve Goodman, 2013, METRICS Meta-Research Innovation Center at Stanford
- p217 two years after a $$$ study debunked vitamin E for heart disease, half of all articles still cited the original study favorably.
- p224 UVa administration to tenure candidate Brian Nosek: print all your publications and deliver the stack. Volume matters
- p225 ... who reads them ... ?
- p226 ... steer some biomedical research dollars through DOD which is influenced by patient advocacy groups (?)
- p229 Science had no formal board of statistics editors before 2015
- p232 openness badges -> open-data sharing rose from 3% to 38%