
Good science takes time. Finding a coronavirus miracle isn’t a job for social scientists

The Covid-19 pandemic has highlighted the importance of what social scientists do on a regular basis, and in time, it will matter whether they answered questions well.


The public appetite for more information about Covid-19 is understandably insatiable. Social scientists have been quick to respond. They are writing papers at a record pace, and academic journals are expediting the review process so that these new, exciting results can be published in a timely and newsworthy manner. While I understand the impulse, the rush to publish findings quickly in the midst of the crisis does little for the public and harms the discipline of social science.

Even in normal times, social science suffers from a host of pathologies. Results reported in our leading scientific journals are often unreliable because researchers can be careless, they might selectively report their results, and career incentives could lead them to publish as many exciting results as possible, regardless of validity. A global crisis only exacerbates these problems. Rushing to publish timely results means more carelessness, and the promise of favorable news coverage in a time of crisis further distorts incentives.

I am especially concerned about three trends among social scientists during the Covid-19 pandemic, the first of which is that many of them appear to be rushing their work. Good science takes time. Researchers often spend months collecting, organizing and double-checking their data. They spend more months presenting their findings and gathering feedback from colleagues before they publicly release their results. But many social scientists are already releasing and publicizing studies using Covid-19 data that was collected just days ago, and they are often failing to apply the same level of rigor that they normally would.

For example, several recent studies have asked whether partisan attitudes affect social distancing. One challenge is that it’s difficult to measure social distancing. In one recent study, survey respondents were asked to self-report their social distancing, but people often misreport their beliefs and behaviors in political surveys. Another study used GPS data to measure visits to places of interest like restaurants and movie theaters, but this seems like a poor test of social distancing at a time when many such places are closed (especially in more Democratic places). A second challenge is that even if we find a clear difference between Democratic and Republican behavior, it’s difficult to say whether this difference is explained by political attitudes or other factors. Democrats tend to live in more urban places, where the pandemic has been more severe and local governments have implemented more stringent policies and guidelines; neither of these studies accounted for these alternative explanations.

Another recent study investigated the extent to which watching “Hannity” versus “Tucker Carlson Tonight” may have increased the spread of Covid-19. This is the kind of study that might make one skeptical in normal times. An extra concern now is that the paper was likely written in just a few days. Although the authors write that they used variation in sunset times to estimate the effect of watching “Hannity,” a closer reading suggests that they’re mostly using variation in how much people in different media markets watch television and how much Fox News they watch. Maybe conservative commentators like Sean Hannity have exacerbated the spread of Covid-19, but it’s dangerous for social scientists to publicize these kinds of results before they have been carefully vetted.

Not only are social scientists rushing to write these studies, but academic journals are also rushing to publish them. An editor might typically vet submitted papers, then select experts in the field to review these papers. The reviewers might read a paper carefully and provide feedback. And then the authors would have the opportunity to revise their paper in response to that feedback, and the process would repeat (often multiple times). But for papers related to Covid-19, the typical process is being streamlined. I was recently asked to review three such papers for a scientific journal. Although an editor might normally give me six weeks to complete one review, I was asked to complete three reviews in just one week. Similarly, a political science journal asked me to referee a paper for a rapid-review series related to Covid-19. The editor explicitly stated that my review would not require the detail or length of a normal review; instead, they wanted a simple “accept” or “reject” within five days.

One possible reason for rushing science in the midst of a crisis is that the benefits of quickly getting new information to the public and to policy makers outweigh the potential costs of giving them less reliable information. Perhaps one could make this argument for those studying how to cure or prevent the spread of Covid-19. But most of the work being done by social scientists on Covid-19, while interesting and important, is not urgent. Understanding how political attitudes affect social distancing may be relevant for understanding political psychology, for example, and it might even help us design better solutions in a future pandemic, but it doesn’t significantly benefit society to have this information today.

The second troubling trend is the temptation of social scientists to speak outside their areas of expertise. There is so much we don’t know about Covid-19 and so much uncertainty about how the pandemic will play out that many are tempted to speculate and conduct their own analyses. I’ve recently seen scholars in fields as varied as political philosophy and macroeconomics giving public-health advice and predicting the future trajectory of the pandemic without seriously discussing the limits of their knowledge or the credibility of their assumptions. A legal scholar first predicted 500 deaths in the U.S., then appeared to revise that to 5,000, and most recently revised it again to 50,000. Despite the scholar’s lack of any relevant expertise or experience, these woefully optimistic early projections reportedly influenced decisions in the White House. (Apparently, this egotism is not unique to social science. Julia Gog, a mathematical modeler of infectious diseases, reports having an email folder full of documents titled “my_first_epidemic.xls.”)

The third troubling trend in social science is the temptation to overclaim. I have seen several studies and analyses by social scientists during the pandemic that are interesting, important and relevant to policy. But the studies often make stronger statements than are warranted, then journalists and policy makers run with these statements and overclaim further. Despite the appeal of favorable news coverage, part of our job as social scientists is to reliably convey the uncertainty associated with our estimates and the limitations of our studies.

In one example of these kinds of misleading claims, one study estimated the economic value of the people spared through social-distancing efforts. Essentially, the authors took estimates from epidemiologists of the number of lives that could be saved, then multiplied them by economists’ estimates of the statistical value of a life. By their own admission, the researchers did not consider any of the potential costs of social distancing. Yet, in the concluding sentence of their abstract, they write, “Overall, the analysis suggests that social distancing initiatives and policies in response to the Covid-19 epidemic have substantial economic benefits.” To an economist, this sentence might simply convey that they computed large benefits but did not consider costs. But to a layperson or policy maker, it sounds like they conducted a thorough analysis and concluded that social distancing is, on net, economically beneficial. Not surprisingly, many news outlets have cited this study to support the claim that there is no trade-off between saving lives and economic recovery.

Social scientists have for decades studied questions of great importance for pandemics and beyond: How should we structure our political system to best respond to crises? How should responses be coordinated between local, state and federal governments? How should we implement relief spending to have the greatest economic benefits? How can we best communicate health information to the public and maximize compliance with new norms? To the extent that we have insights to share with policy makers, we should focus much of our energy on that.

The Covid-19 pandemic highlights the importance of what social scientists do on a regular basis, and in time, it will provide new opportunities for us to answer long-standing, policy-relevant questions. And by the time the next crisis comes around, we won’t care whether we answered these questions in a timely or newsworthy manner. We’ll care whether we answered them well. –Bloomberg


