Recently, researchers unveiled generative artificial intelligence tools that can recreate a person’s voice after disease or injury has taken it away. For patients who can no longer speak, these systems offer something extraordinary: the ability to communicate again, to participate in daily life, to be heard.

The same technologies, however, are also being used to impersonate people, spread false medical advice, and lend artificial credibility to misinformation.

In other words, AI can give voice. It can also distort truth. Whether it helps or harms depends on who understands the science behind it and who is trained to explain it.

The distinction between help and harm is not just theoretical.

In 2022, a white supremacist cited distorted claims about genetics to justify a racially motivated mass shooting in Buffalo, New York. The shooter's document relied on discredited ideas about biological race, not on modern genetic concepts like ancestry or population structure.

In the aftermath of the shooting, scientists confronted a difficult realization: Even when the field itself rejects a framework, inaccurate claims and outdated interpretations remain easy to find online, where they can appear legitimate or be selectively misused.

This is the distorted environment today’s scientists are entering. 

Yet scientists are trained for a world where data speaks for itself. Where misinformation moves slowly. Where scientific expertise naturally rises above noise. That world is gone.

Today, claims about vaccines, climate science, genetics, and reproductive health spread online at viral speed. Algorithm-driven content and social media personalities increasingly shape what the public believes about science, often with little accountability. Short-form videos, podcasts, and influencer posts now serve as primary sources of health and science information for millions of people, especially younger audiences.

Research consistently shows that much of this content is inaccurate or misleading. Studies of TikTok and Instagram have found that a majority of popular health videos created by non-experts contradict established medical guidance. Influencers have been shown to profit from anti-vaccine and wellness misinformation, leveraging trust built through lifestyle content to promote unverified claims. In these spaces, engagement is rewarded far more reliably than accuracy.

The problem is not that scientists do not care about the impact of their work. It is that we are not trained to engage.

Science communication and policy literacy are still treated as side projects. Optional workshops. Unpaid labor. Activities you pursue only if you have the privilege of time or institutional protection. They are rarely valued in hiring, promotion, or grant review. 

But dismissing public influence as peripheral rather than essential has real consequences.

When scientists are absent from public conversations, misinformation fills the space. During the COVID-19 pandemic, confusion about vaccines and treatments cost lives. Climate misinformation continues to delay action as extreme weather intensifies. Genetic myths shape debates over reproduction, disability, and human difference. In each case, science and the evidence to support it existed. What failed was the system’s ability to communicate it clearly, responsibly, and at the pace the public sphere now demands.

Public understanding of science does not disappear overnight. It erodes gradually, when expertise feels inaccessible, detached, or absent from the places where people are actually forming beliefs.

If science is meant to serve the public, then public engagement cannot be optional. If evidence is meant to inform policy, then scientists must be prepared to participate in policy conversations. Communication should be as foundational to scientific training as statistics or experimental design.

Inclusion matters here too, not as a separate initiative, but as a condition for credibility. People are more likely to engage with science when they recognize themselves in it, when explanations reflect cultural and historical context, and when knowledge is shared with humility rather than authority. Communication failures often land hardest on communities that science has historically ignored or harmed.

Embedding communication and policy fluency into STEM training is not about adding polish. It is about responsibility. Students should graduate knowing not only how to generate knowledge, but how to explain it, contextualize it, and respond when it is misunderstood or misused.

But this cannot be achieved by simply adding more expectations to an already overextended system.

Graduate students juggle caring for family, coursework, research, side hustles, teaching, mentoring, and grant writing. Faculty face mounting administrative and funding pressures. We have seen what happens when new responsibilities are layered onto existing roles without adequate support.

Meaningful change requires structure. Funded teaching positions. Protected time. Mentorship. Clear incentives that recognize public engagement and policy contributions as legitimate scientific work.

Shifting the values of science education is not about producing activists or influencers. It is about producing scientists who can operate in a world where knowledge moves fast, context is easily lost, and the consequences of misunderstanding are real.

Scientists who can explain their work before it is distorted.

Scientists who can engage policymakers before decisions become final.

Scientists who can recognize when silence carries risk.

Science still depends on discovery. But in the world we now live in, it also depends on history, context, and clarity.

JP Flores is a Ph.D. Candidate in Bioinformatics & Computational Biology at UNC Chapel Hill, the co-founder of the nonprofit organization Science For Good, and a Public Voices Fellow on Technology in the Public Interest with The OpEd Project.