A Rasmussen Reports survey of 1,000 American adults conducted in April found that 53 percent of Republicans were willing to take the anti-malarial drug hydroxychloroquine to treat COVID-19, while only 18 percent of Democrats were willing to try it.
No one is surprised to see political polarization around issues of taxation, immigration, welfare or military spending. But it has been remarkable to see such deep partisan divides about basic medical science. And as has become very clear this year, it is especially dangerous during a global pandemic. In 2019, you might have predicted that in some future disease outbreak, liberals would favor an expanded federal role in health care while conservatives would oppose government restrictions on business activity. But could you have anticipated that Democrats would champion masks and Republicans would endorse hydroxychloroquine, rather than vice versa? The utter arbitrariness of how public opinion on scientific questions has fractured along partisan divides reveals something rotten at the core of the national conversation.
So is hydroxychloroquine a miracle drug, or is it worthless snake oil? Researchers have tried hard to demonstrate its value, but the largest and best-designed trials have repeatedly failed to show benefits. Still, so many studies have been published with such variable results that both sides of the political aisle can claim they have science on their side. And therein lies the problem.
An early trial by the iconoclastic microbiologist Didier Raoult reported astonishing success, but his claims crumbled under closer scrutiny. A major U.S. study also indicated benefits, but it wasn't a randomized controlled trial; critics, including Dr. Anthony Fauci, have highlighted its serious methodological flaws. Other large, high-profile trials have failed to find evidence that the drug helps. These findings were conclusive enough that the National Institutes of Health halted a clinical trial and the Food and Drug Administration revoked its emergency use authorization for hydroxychloroquine.
Tragically, another major study reported that hydroxychloroquine actually harms patients, prompting researchers to halt trials for safety reasons. That study appears to have been deeply flawed, and it was ultimately retracted amid allegations of serious data irregularities.
Research misconduct is rare, and a mixture of positive and negative results is common in clinical research. Rarely, if ever, does a single study prove conclusively that a treatment works. This is how science works. A new result isn't a decisive final answer; it is a pebble on the scale in favor of one hypothesis or another. But pundits cherry-pick results and mislead their audiences by telling only half the story.
And the phenomenon certainly isn't isolated to studies about coronavirus.
Last winter, shortly before the pandemic began, we finished writing a book, "Calling Bullshit: The Art of Skepticism in a Data-Driven World." In the book, we acknowledge that we inhabit a world of fake news and hyperpartisan reporting but argue that readers can make sense of their information environments with just a bit of training in critical thinking and quantitative reasoning. While BS increasingly appears clad in the trappings of numbers, statistics, data graphics and mathematical models, one doesn't need an advanced degree in science or mathematics to see through it. What one does need is a willingness to change one's mind in the presence of empirical evidence and a commitment to let science rather than partisan politics serve as the arbiter of truth.
Even the simplest strategies can be effective. For instance, if a claim seems too good or too bad to be true, it probably is. Or if you want to know whether an argument is credible, check the source. Ask yourself who is making the claim. What do they have to gain from making it? What expertise do they have, and what evidence?
Importantly, not all "experts" are created equal, and the best way to evaluate them is to check their writings and their histories. Researchers with M.D. after their names may seem authoritative, but if they also believe that endometriosis, infertility, STIs and miscarriages are caused by demon sperm, skepticism is in order. If a researcher has been thoroughly discredited multiple times in the past and is now promoting conspiracy theories about masks activating viruses and the director of the National Institute of Allergy and Infectious Diseases willfully fueling the coronavirus pandemic, that individual should not be given a platform on hundreds of local news stations.
When anxiety is high and uncertainty is pervasive, even transparently ridiculous theories can spread rapidly. In our new Center for an Informed Public at the University of Washington, we study how bad information spreads through society via traditional and social media. We explore how unsubstantiated rumors explode across the internet, how social networks provide easy targets for malicious actors seeking to spread disinformation and how charlatans peddling false certainty drown out expert scientists accurately relaying the limitations of our knowledge. Even though takedowns and corrections of bad information are helpful, we find that they reach only a small fraction of those exposed to the original falsehoods. A principle known as Brandolini's law summarizes this succinctly: It takes an order of magnitude more effort to refute BS than it does to create it.
During a pandemic, these are literally matters of life and death. Everyone, Democrat and Republican alike, must make an effort to bail against the constant waves of disinformation that threaten our health and our society. Rather than base our actions on the unthinking bonds of party affiliation, we need to make careful decisions grounded in complete information and rigorous logical reasoning to combat our worsening health crisis.