Can you trust AI such as ChatGPT with health questions?

Can AI help answer medical questions? (Getty) (Vithun Khamsong via Getty Images)

Move over, Dr Google: users are turning to artificial intelligence chatbots such as ChatGPT to answer medical questions. But can you trust AI the way you can a GP?

Almost a third of us (31%) hope that AI can improve our access to healthcare, according to the Office for National Statistics (ONS). In addition, AI has a huge part to play in healthcare thanks to its abilities to spot patterns – for instance, in X-ray scans – and the government is already investing millions in AI for the NHS.

But can you actually rely on a large language model (LLM) to answer medical questions?

Previous research has suggested that up to 39.5% of healthcare professionals hope to rely on LLMs such as ChatGPT to make medical decisions. But new research suggests that consumers should be wary of relying on bots for big health decisions and diagnoses.

What did the new study find?

The researchers found that AI software such as ChatGPT and Google’s Bard – now Gemini – are potentially inaccurate.

The researchers posed 56 common medical questions to the chatbots and had two doctors review the responses. The answers offered by the AI bots were often either inaccurate or incomplete.

AI can help doctors analyse chest X-rays. (Reza Estakhrian via Getty Images)

Just over half of Bard’s answers were accurate (53.6%), with 17.8% inaccurate and 28.6% partially accurate.

With ChatGPT, just over a quarter of answers were accurate (28.6%) while 28.6% were inaccurate and 42.8% were partially accurate.

Dr Andrei Brateanu, of the Cleveland Clinic Foundation in the US, said the study showed that AI tools shouldn’t be used as a substitute for doctors.

“They can be considered as additional resources that, when combined with human expertise, can enhance the overall quality of information provided," he said.

How can AI help patients?

Unlike general-purpose LLMs such as ChatGPT, other AI tools are built for use by clinicians to speed up diagnosis – a support for flesh-and-blood doctors rather than a substitute.

Many hope that AI's ability to sift rapidly through huge datasets could accelerate diagnosis, help discover new drugs and even match patients to empty beds.

AI’s ability to spot patterns can help to triage patients, or predict when beds in intensive care might be available, advocates say.

In the UK, the government has invested £21m to help NHS staff diagnose and treat patients with strokes, cancers and heart conditions more quickly.

Staff are being equipped with decision-making tools which use AI technology to improve diagnosis.

Another tool uses AI to analyse chest X-rays, the tool most commonly used to diagnose lung cancer – helping clinicians to deal with the 600,000 chest X-rays performed each month in England.

The government has previously said: "Artificial intelligence is already transforming the way we deliver healthcare and AI tools are already making a significant impact across the NHS in diagnosing conditions earlier, meaning people can be treated more quickly."
