
In this installment of her column "The Intersection," Keerthana Ramanathan (SOH '26) explores AI's potential to combat vaccine hesitancy by serving as a vehicle of communication between patients and biomedical researchers.
In just the first few months of 2025, there have already been three measles outbreaks in the United States. Of the 222 cases, 94% have occurred in individuals who were not vaccinated against the dangerous but preventable illness or whose vaccination status is unknown.
But the World Health Organization declared North and South America to be free of measles in 2016. How did we get here?
Vaccine hesitancy, the delay in acceptance or refusal of a safe vaccine despite its availability, presents a significant barrier to public health innovation and progress, according to the World Health Organization. Coupled with the recent executive orders to cut “indirect expenses” in medical research, including research on lifesaving vaccines, this hesitancy leaves the state of public health in the nation alarming.
In an alarmingly regressive step, the Centers for Disease Control and Prevention (CDC), an agency under the Robert F. Kennedy Jr.-run Department of Health and Human Services, just initiated a study examining the debunked link between vaccines and autism. Ironically, the very vaccine that was falsely linked to autism is the measles, mumps and rubella (MMR) vaccine. In the face of active measles outbreaks, such a study will undoubtedly undermine the public’s confidence in vaccines and allow vaccine hesitancy to prevail over scientific evidence.
Public health now faces not one, but two hurdles: a lack of research funding and an administration that is actively working to reinforce anti-vaccine messaging. We must turn to more innovative approaches to counter vaccine hesitancy and promote public trust in science.
A potential approach is rooted in a tool that many may be familiar with: ChatGPT.
ChatGPT is an artificial intelligence (AI) tool and a widely available large language model (LLM), a system specifically designed to process and generate text. Though these tools are heavily contentious in academia and healthcare, the current landscape makes it important to examine their effects, especially in an era when a simple Google search brings up an AI overview.
The public’s ability to interact with LLMs daily and on demand makes them pivotal in discussions surrounding scientific misinformation, which largely fuels vaccine hesitancy. At the same time, AI usage presents its own challenges, as AI tools are not always accurate. LLMs have been shown to reproduce biases and spread misinformation. As we have seen in recent years, misinformation, especially regarding health, can be deadly.
However, if implemented correctly, LLMs can be used on both a personal and a population level to deliver equitable and accessible patient care, make disease risk predictions and integrate the two for public health surveillance. In the sphere of childhood immunizations, AI could translate complex medical research into easy-to-understand terminology for both children and parents, serving as a resource for basic vaccine education.
Going even further, AI can also be used to dismantle vaccine misinformation. AI methods such as sentiment tracking and topic modeling can analyze online content to reveal current trends in hesitancy and misinformation. This understanding is key to informing effective interventions.
Identifying the root causes of hesitancy can help better tailor communication strategies to public needs. For example, many LLMs can replicate a desired tone in a piece of communication. In fact, one study even found that patients preferred AI-generated messages to information provided by doctors because of the empathetic tone these LLMs can adopt.
AI comes with risks. Used incorrectly, it can exacerbate the spread of misinformation and fuel emotional hesitancy. However, if we prioritize ethical guidelines, integrate human expertise into these tools and capitalize on their strengths, AI has the potential to navigate the complexities of vaccine hesitancy and strengthen public confidence in biomedical research.