This story was originally published by High Country News and appears here as part of the Climate Desk collaboration.

As the new coronavirus spreads from person to person across the United States, another contagious threat is also escalating: online misinformation. False stories range from speculation that COVID-19 is caused by 5G cellular networks to claims that it’s a bioweapon to the promotion of fake cures such as silver, zinc and teas.

Such inaccuracies are of particular interest to Emma Spiro, a sociologist at the University of Washington and co-founder of the new Center for an Informed Public, a research group focused on how misinformation proliferates. Falsehoods during a crisis are dangerous because they can mislead and confuse the public, something Spiro saw while researching the rumors that arose after the 2013 Boston Marathon bombing. And today’s digital connections appear to be supercharging the rumor mill — for example, multiple versions of the same story can circulate online at the same time, making it difficult for people to distinguish fact from fiction.

Now, with her new center up and running and a small staff dedicated to this research, Spiro is closely monitoring the spread of misinformation during the current coronavirus pandemic and analyzing what factors influence its evolution. High Country News spoke with Spiro by phone in late March, as she stood outside her home on a once-busy street in Seattle, Washington. This interview has been edited for length and clarity.

High Country News: How did you first become interested in studying misinformation?

Emma Spiro: I was pursuing my Ph.D. in sociology, and I was fascinated by the ways in which people — individuals and groups and communities — were starting to use online social networks and changing the way we participate in public discourse, and the ways we build social relationships with people.

At the time, people were just starting to use things like Twitter. So I started thinking about how these new technologies were being used, specifically in the crisis-response space. After a crisis event, we know that there are heightened levels of anxiety, extreme levels of uncertainty as people are trying to figure out what is going on and what to do about it. And they often turn to their social networks, and they turn to their neighbors, and they turn to their family members. And now they were doing that online, so we could see how that was happening.

HCN: How has this shift to online platforms changed the spread of false information?

ES: There’s a long history of studying crisis events and disasters in the social sciences, and often what we would see after events would be information voids, where there wasn’t a lot of information from official sources. As people collectively try to make sense of what's going on, that’s how rumors would arise — as a very natural byproduct of people talking and exchanging information. And we still see that today.

But I would say, today we often don’t have information voids. Now we have information overload. Everybody is on social media, trying to post things, trying to consume things. So there’s sometimes some confusion about what’s official and what’s not. The World Health Organization recently called this (the coronavirus) not only a pandemic, but an “infodemic” — when there’s an overabundance of information, some accurate, some not.

One of the things that we’ve seen just anecdotally … is the way social media can sometimes allow non-experts — scientists with adjacent expertise, for example, rather than epidemiologists — to become central players in some of these online conversations and potentially influence a lot of people, in terms of not only their beliefs but also the actions they take. In this particular case, those actions can make a significant difference in the trajectory of how this (coronavirus epidemic) plays out.

HCN: What factors influence the circulation of these rumors?

ES: Often that information really does try to elicit or arouse our emotions. And so, you know, it gets us worked up in particular ways, gets us very passionate about a particular topic. And that often leads to situations where people are making decisions that are not rationally thought out or intellectually thought out, but that are emotionally driven. Also, when we have multimedia content, we process that information in different ways. And people really respond to visual and video content.

Everybody is vulnerable. Information requires us to spread it, right? It doesn’t just read itself. This is a very participatory kind of phenomenon that we’re trying to study.

HCN: What research is your team doing right now to better understand the spread of coronavirus misinformation?

ES: This past December, we launched a new research center at the University of Washington called the Center for an Informed Public. I don’t think any of us anticipated (that) this is where we would be today, amidst the epicenter of a global pandemic, but we are.

Our mission is to be able to bring together the resources that we have at a world-class public university and address this challenge of … misinformation online, and how that diminishes our trust in democratic institutions — things like science and our government.

HCN: As a Seattle resident, what’s your personal experience in dealing with misinformation in a COVID-19 epicenter?

ES: Everyone on our team always shares stories about interactions with our family and friends and people coming to us and saying, you know, “I heard so-and-so was going to bring the National Guard in and everything, and now it's going to be locked down.” And, “I heard the hand sanitizer doesn’t actually work because it’s antibacterial.”

HCN: How do you respond when people tell you things that seem like obvious falsehoods?

ES: I often will probe a little bit more. I’ll ask, “OK, well, where did that information come from? And how was it produced?” And try to trace a little bit of the provenance of that content — mostly out of curiosity, because I’m interested in where it came from and how people perceive it. But I think that also gets whomever I’m interacting with to ask those same kinds of questions.
