
Using AI Chatbots To Google Your Symptoms? New Research Says It Can Be Very Dangerous

WHAT'S THE STORY?

Typing your symptoms into an AI chatbot might feel like the fastest route to figuring out what’s wrong with you, but new research suggests it could be a risky shortcut.

A major study reported by the BBC has found that using artificial intelligence tools for medical advice can be “dangerous”, warning that chatbots often deliver inaccurate, inconsistent, and sometimes misleading guidance when people seek help for real health concerns.

The research was led by scientists from the University of Oxford, including teams from the Oxford Internet Institute and the Nuffield Department of Primary Care Health Sciences. Their conclusion is blunt: despite rapid advances, AI is not ready to replace a doctor or even reliably guide patients on what to do next.

“Patients need to be aware that asking a large language model about their symptoms can be dangerous,” Dr Rebecca Payne, a GP and co-author of the study, told the BBC. She warned that AI systems can give incorrect diagnoses and fail to recognise when urgent medical help is needed.


What The Research Found About AI

Researchers asked nearly 1,300 participants to assess different health scenarios, ranging from severe headaches to postnatal exhaustion, and decide what condition the scenario might describe and what action to take. Some participants used AI chatbots, while others relied on traditional routes like consulting a GP.

The results were troubling. According to the BBC report, people using AI were often given a mix of good and bad information, and struggled to tell the difference. While chatbots performed well on standardised medical knowledge tests, they faltered when faced with real-world, messy human health problems.

“The AI might list three possible conditions,” Dr Adam Mahdi, a senior author on the study, told the BBC. “People were then left to guess which one applied. This is exactly where things fall apart.”


Why AI Struggles With Symptoms

Unlike in a doctor’s appointment, people talking to a chatbot don’t present all their symptoms at once. They remember things gradually, miss details, or describe sensations vaguely. The study found that AI responses varied wildly depending on how a question was phrased, making outcomes inconsistent and unreliable.

Lead author Andrew Bean explained that interacting with humans remains a major challenge even for advanced AI models. The concern, researchers say, is that users may trust the confident tone of chatbots without realising how fragile the advice actually is.

A Growing Habit With Real Risks

The findings come at a time when AI use for health is booming. Polling by Mental Health UK, cited in the BBC report, found that more than one in three people now use AI tools to support their mental health or wellbeing.

Experts also warn that chatbots can reproduce long-standing medical biases embedded in their training data. Dr Amber W. Childs of Yale School of Medicine told the BBC that AI is “only as good a diagnostician as seasoned clinicians are, which is not perfect either.”

Should You Stop Using AI Altogether?

Not necessarily. Tech experts say AI could still play a role in healthcare, but with strict safeguards. Dr Bertalan Meskó, editor of The Medical Futurist, told the BBC that health-specific versions of chatbots released by companies like OpenAI and Anthropic may eventually offer safer outcomes.

For now, researchers are clear on one thing: AI can provide information, but it should never be the final authority on your health.
