Brummie accents could break AI call handler, police fear

An AI call handler could struggle with Birmingham accents, it's feared - Ingus Kruklitis/iStockphoto

A police force trialling an AI-powered non-emergency call service fears the technology will not understand callers with a Birmingham accent, it has emerged.

West Midlands Police trialled a voice assistant powered by artificial intelligence in an attempt to deal with rising volumes of 101 calls.

The force set out potential risks of the AI, including whether the system, named “Amy101”, would understand local “Brummie” accents.

A document detailing the plan was mistakenly posted online by the office of the West Midlands Police and Crime Commissioner (PCC).

The document, seen by the BBC, was reportedly marked “official sensitive” with warnings that it was “not to be publicly disclosed”.

The findings, which were prepared for an ethical oversight committee that advises the PCC and chief constable, have since been removed.

The report highlights potential problems, including whether the technology could cope with the local accent.

“Bias will naturally occur within the ‘Amy’ system based on accents/localisation – for example can she understand ‘Brummie’ accents? And are they treated with equal weighting to different accents in English?” the document asks.

The force recognised that technologies capable of understanding ordinary language were “not flawless” and might therefore struggle with accents.

If calls were not understood, however, they would be transferred to the queue for a human operator.

AI could prioritise vulnerable callers

The technology, which was based on Amazon’s voice assistant, Alexa, was designed to help the force cope with increasing volumes of calls, and potentially offer new services such as responses in different languages.

Amy101 was expected to handle about 200 calls per day. The project, a two-month proof-of-concept trial, was nationally funded.

The AI also had the ability to prioritise vulnerable callers, the document stated, by looking out for keywords such as those linked to domestic violence.

Those calls would then be handled by the next available human call operator.

Potential issues around safeguarding data were also flagged.

The ethics committee also had a number of questions about Amy101, recorded in its minutes, such as the voice and “gendered name” of the tool. The force responded by arguing that “humanisation” was needed.

The minutes also suggested officers had requested further analysis from Amazon on potential issues “such as regional accent recognition and bias testing”.

West Midlands Police told the BBC the trial began on Dec 19 2023 and had now concluded.

Peter Gillet, the director of commercial services at the force, said: “AI does present some potential opportunities for providing a more efficient and robust service.”

Now that the trial was over, the force would be “sharing the results and outcomes at a national scale”, he added.

AI is already being used in other areas of policing, such as retrospective facial recognition (RFR) software.

In October last year, Chris Philp, the policing minister, wrote to chief constables urging them to double their use of the software over the next six months.

RFR allows authorities to use facial recognition after an event to establish who a person is or whether their image matches other media held on a database.
