DrugGPT: new AI tool could help doctors prescribe medicine in England

Drugs are a cornerstone of medicine, but sometimes doctors make mistakes when prescribing them and patients don’t take them properly.

A new AI tool developed at Oxford University aims to tackle both those problems. DrugGPT offers a safety net for clinicians when they prescribe medicines and gives them information that may help their patients better understand why and how to take them.

Doctors and other healthcare professionals who prescribe medicines will be able to get an instant second opinion by entering a patient’s conditions into the chatbot. Prototype versions respond with a list of recommended drugs and flag up possible adverse effects and drug-drug interactions.

“One of the great things is that it then explains why,” said Prof David Clifton, whose team at Oxford’s AI for Healthcare lab led the project.

“It will show you the guidance – the research, flowcharts and references – and why it recommends this particular drug.”

Some doctors already use mainstream generative AI chatbots such as ChatGPT and Google’s Gemini (formerly Bard) to check their diagnoses and write up medical notes or letters. International medical associations have previously advised clinicians not to use those tools, partly because of the risk that the chatbot will give false information, or what technologists refer to as hallucinations.

But Clifton and his colleagues say, in a preprint about DrugGPT’s effectiveness, that it “achieves performances competitive with human experts” in US medical licence exams.

“Imagine if you’re a GP: you’re trying to stay on top of a bazillion different bits of medical guidance which are being updated every year. It’s tough,” said Clifton, who is also a research professor at the National Institute for Health and Care Research (NIHR), which has supported the project.

“But it’s important not to take the human out of the loop. You don’t want the problem of ‘computer says no’. It’s always got to be advice to a human like a co-pilot. It’s a safety net: here’s a recommendation to compare your recommendation against.”

Other research, published in the British Medical Journal, estimates that about 237m medication errors are made every year in England, costing about £98m and more than 1,700 lives. Only about 2% of errors could result in serious harm, the research found, with GPs making the fewest errors and prescribers in care homes making the most.

Patients also make mistakes with medicines. “Nonadherence”, where patients fail to take medication according to a doctor’s instructions, wastes about £300m a year for NHS England, according to the Pharmaceutical Journal.

General practices already use technology such as ScriptSwitch, which reviews medication choices and lets prescribers switch to cheaper alternatives.

Dr Lucy Mackillop, a consultant obstetric physician at Oxford University Hospitals NHS Foundation Trust who has advised Clifton’s team, said the potential advantage of DrugGPT was that it would give busy doctors more information about the drugs they were prescribing.

“If you discuss it with the patient, they are more likely to understand and be compliant with medication, and the medication is therefore more likely overall to work and do the job it’s meant to do,” she said.

Dr Michael Mulholland, vice-chair of the Royal College of GPs, said that in the vast majority of cases, prescriptions were made correctly.

But “doctors are only human and errors can happen, particularly when doctors are working under intense workload and workforce pressures, as GPs and our teams currently are. This is particularly the case with patients who take lots of medications at once, as there will be many different ways the medications may interact with each other.

“We are always open to introducing more sophisticated safety measures that will support us to minimise human error – we just need to ensure that any new tools and systems are robust and that their use is piloted before wider rollout to avoid any unforeseen and unintended consequences.

“Ultimately, the most effective long-lasting solution to delivering safe patient care is to ensure that general practice is adequately funded and staffed with enough GPs and other healthcare professionals working at safe levels.”
