
Science's Weakness Is Also Its Strength

One of the beauties of science is that it’s self-correcting — and there are times when its methods and the culture need rethinking, too.

If at first you don't succeed...

Photograph: Hulton Archive/Getty Images

Pride and self-reproach have both been on display in the scientific community in recent weeks. At this year’s meeting of the American Association for the Advancement of Science, for example, some scientists took up pro-science signs and marched in a nearby public square, while others gathered in conference rooms to agonize over a recent proliferation of questionable claims.

The fields suffering most from dubious results are social science and medical research, where critics say too many highly cited findings are evaporating as soon as others try to replicate the original experiments.

Speaking at an AAAS session on reproducibility problems, Dana-Farber Cancer Institute biologist Bill Kaelin blamed the trend on too much illogical and wishful thinking on the part of scientists, and on a scientific culture that values the wrong things. A publish-or-perish mentality is just the beginning of the problem, said Kaelin, a recent recipient of the Lasker Award, medical research’s highest honor short of the Nobel Prize.

Before about 1980, he said, scientists strove to publish papers that proved one thing in multiple ways. Now, he said, top journals want multiple things proven one way. The claims have to be bigger -- loftier. But the evidence can be weaker.

It might sound strange, but this session and others examining all the warts of science showcased the value of the scientific process at least as much as the marches did. One of the beauties and strengths of science is that it can be self-correcting, and there are times when not only the findings need rethinking but the methods and the culture, as well.

The most troubled areas are young sciences. Researchers are still learning how to apply scientific methods to understanding the complexities of human disease and behavior. Medicine has been around for millennia, but until recently it was based more on tradition than on controlled experiments.

Elaborating on his views in an interview in his Boston office, Kaelin said there’s now a bigger pool of scientists and more competition. Funding agencies prefer big promises up front -- which is fine for engineering projects, but not for science, where the results are inherently unknown. “You need to accept a certain failure rate,” he said.

In engineering, failure is bad -- you don’t want rockets blowing up or bridges collapsing. But in science, if there’s a hint that a drug will have anti-cancer properties, for example, and someone tests it and finds it doesn’t work, that’s still valuable. The result may not be what people wanted to see, but it is what they needed to know.

Conversely, if a scientist claims a drug works in some way that it does not, that sets the whole field back and could potentially lead to public harm.

The peer-review system is supposed to guard against bad science, but Kaelin said it’s hard to find enough peers to review the vast number of ambitious, overreaching papers spun out each week. Papers with sweeping claims lead to “reviewer exhaustion,” he said, and often end up getting a pass when the evidence does not support the conclusions.

Worse still, he said, admitting to unexpected effects in a medical paper can be “the kiss of death.” That is, reviewers will reject a paper if there are anomalies in the data the researchers can’t explain. That’s too bad, he says, because those anomalies, if published, might give others the chance to figure out what’s going on.

Critics of social science are coming across similar problems underlying irreproducible claims. No longer do we need to puzzle over the bizarre claim that unscrambling sentences with words such as “old,” “gray,” and “Florida” caused people to walk more slowly than a control group not exposed to old-age-related words. And, as it turns out, contorting your face into smiles may not, as once claimed, make you happier or more amused.

Looking at the big picture, researchers published a paper earlier this year suggesting that by rewarding bold claims over careful research and quantity of papers over quality, social science was setting up a Darwinian selection process by which flimsy scientific methods are more likely to survive and be copied than more solid, careful methods.

Social science and medical research share similar pressures and a similar methodological problem. Both depend on the concept of statistical significance, which is often calculated incorrectly or misinterpreted, said Gerd Gigerenzer, a social scientist at the Max Planck Institute for Human Development in Berlin. An incorrect assessment of statistical significance can give scientists false assurance that they’ve observed a real correlation, when they may have observed nothing but random noise.
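To see how easily random noise can masquerade as a “significant” finding, consider a small simulation -- a hypothetical sketch, not drawn from the column or from Gigerenzer’s work. It runs many imaginary two-group studies in which there is no real effect at all, then counts how many clear the conventional p < 0.05 bar by chance.

```python
# Hypothetical sketch: test pure noise many times and count how often
# the conventional p < 0.05 threshold is crossed by chance alone.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 1000   # imaginary studies, each comparing two groups
n_per_group = 30       # participants per group

false_positives = 0
for _ in range(n_experiments):
    # Both groups come from the same distribution: there is no real effect.
    group_a = rng.normal(0, 1, n_per_group)
    group_b = rng.normal(0, 1, n_per_group)
    _, p = stats.ttest_ind(group_a, group_b)
    if p < 0.05:
        false_positives += 1

print(f"{false_positives} of {n_experiments} null experiments were 'significant'")
# Expect roughly 50, about 5 percent, even though every effect is pure noise.
```

Roughly one in twenty null experiments looks “significant,” which is why a single such result -- especially one picked from many attempted analyses -- is weak evidence on its own.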

While some social science journals have banned the use of statistical significance measures, the same statistical principle works for physicists, who use it to sift data gathered in their detectors for new particles or other physical phenomena.

Physicists had their share of bloopers in the early and mid-20th century, discovering N-rays, mitogenic rays and the Allison effect -- none of which turned out to exist. But they also learned how to be better users of statistical analysis. Physicists also have the benefit of theories that flag findings needing a closer look. All this double- and triple-checking, skepticism, and self-reproach is what allows science to progress toward a more encompassing understanding of the universe -- and makes it worth marching for.

Read more in Faye Flam’s ongoing series about the science of sorting fact from falsehood:

Why Scientific Consensus Is Worth Taking Seriously
How Science Sorts Fact From Alternative Fact

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

To contact the author of this story: Faye Flam at fflam1@bloomberg.net

To contact the editor responsible for this story: Tracy Walsh at twalsh67@bloomberg.net
