Battling Misinformation in a Post-Pandemic World
April 26, 2025

Assistant Professor Becca Beets on the challenges of communicating about science.
By Jessica Weiss ’05
When COVID-19 hit in 2020, science communication became a matter of life and death. Misinformation spread as rapidly as the virus, leaving the public scrambling for clarity while scientists worked to dispel myths and explain shifting guidance. For Assistant Professor Becca Beets, who was then a Ph.D. student at the University of Wisconsin studying science communication, it was a defining moment—one that shaped her research on uncertainty, trust and how to talk about science.
Beets joined UMD last year and is now teaching a course titled “Misinformation, Society, and Science Communication.” We sat down with her to explore what makes misinformation so tricky, why facts alone don’t always change minds and how we can communicate science more effectively.
What led you to focus on misinformation in science communication?
The pandemic really put it front and center. COVID-19 misinformation was everywhere, and institutions had to respond in real time while also explaining shifting guidance. Everything I’d been studying suddenly became so relevant. One example I studied was the false claim that COVID vaccines could give you COVID. Health organizations tried to correct this by saying, “COVID-19 mRNA vaccines do not contain live virus.” But in our study we found that emphasizing this could actually increase concerns about live virus vaccines. We changed the safety message to: “Contrary to some false claims, mRNA vaccines do not contain live virus. But in other vaccines live viruses are used safely.” And that helped reduce concerns. This showed how correcting misinformation can have unintended effects, making it crucial to think carefully about how we communicate uncertainty.
How do you define misinformation and disinformation, and why is it hard to identify in science?
Misinformation is false or factually incorrect information, while disinformation is misinformation that is intentionally spread. But it’s difficult to prove intent—many people share misinformation without realizing it’s inaccurate. Science adds another layer of complexity because it's always evolving. What was considered true six months ago might change with new evidence, especially in rapidly developing fields. This makes it harder to label something definitively as misinformation when scientific consensus is still forming. For example, several of the students in my class are interested in supplements and treatments that get promoted widely on social media and that claim to positively impact health. But if we have no scientific evidence to back these claims, does that make it misinformation? It’s tricky.
What are some of the ways misinformation can be corrected?
It depends on the timing and type of misinformation. If the false claim is already widespread, we need to focus on debunking it. If it hasn’t gained traction yet, we might use a "prebunking" approach—warning people about common misinformation tactics in advance. There isn’t a one-size-fits-all strategy, but research suggests that avoiding repetition of misinformation, providing alternative explanations and reinforcing correct information with clear sources can help.
How does trust play a role in science communication?
Trust is key. Who we trust as sources of credible information shapes what we believe. If a trusted figure spreads misinformation, their audience is more likely to believe it. Additionally, motivated reasoning—the tendency to process information in a way that aligns with our existing beliefs—makes it difficult to change someone’s mind just by presenting facts. Students often ask, "Why don’t people change their minds when given correct information?" The answer depends on the source, the person's prior beliefs and whether their experiences shape their views on the topic. It's similar to risk perception, whereby people can make very different judgments about a particular risk even when they are well informed. For instance, you could tell someone that 500 doctors say it's safe to get vaccinated, but even the most minor risk may simply be one that person is not willing to take, and that outweighs everything else. And if you don't trust the medical community, then 500 doctors won't necessarily change your views on the risks of vaccination.
What role does uncertainty play in shaping public perception of science?
Science is inherently uncertain. Some areas of science are more settled, while others are still evolving, and different people have different tolerance levels for uncertainty. Communicating uncertainty is important, but we also need to recognize that uncertainty takes many forms. The challenge is finding a balance—being honest about unknowns without eroding public confidence in science. I have a group of students looking at misinformation claims around the wildfires in California and the connections between increasing wildfire activity and climate change. That's very complicated, because while it may be true that climate change plays a role, it’s difficult to proclaim that climate change actually caused any particular fire. Some things seem very straightforward on their face, but when you actually dig into them, they become thornier and more complicated. There is a growing area of research that explores how communicating different types of uncertainty impacts people's attitudes and behaviors.
Misinformation isn’t new, but has it changed in recent years?
Misinformation has always existed, but our media landscape has amplified it. Algorithms push certain narratives, unmoderated content spreads widely and people in positions of power can now reach massive audiences instantly. Whether misinformation is worse now is hard to say, but the conditions for its spread have certainly changed. This makes it even more important to understand why people form certain beliefs and tailor science communication strategies accordingly.
Photo courtesy of Adobe Stock.