Microsoft Copilot blamed by UK police chief for controversial soccer fan ban — incorrect evidence triggered by AI hallucinations

After the successful release of ChatGPT in November 2022, Microsoft made a significant investment in OpenAI and began incorporating its AI technology into its products. Soon after, Microsoft launched Copilot (previously known as Bing Chat) as a competitor to ChatGPT.

Despite Microsoft’s efforts to enhance Microsoft Copilot, it sometimes still makes mistakes, either by providing incorrect information or generating responses that aren’t based in reality.

Copilot, Microsoft’s AI assistant, recently made a strange error. It fabricated details about a football game that never happened, claiming West Ham played against Maccabi Tel Aviv, according to The Verge.

The fabricated fixture found its way into a police intelligence report that flagged the game as potentially dangerous. As a result, West Midlands Police prevented Maccabi Tel Aviv fans from attending the UEFA Europa League match against Aston Villa on November 6 of last year. The decision drew significant anger and criticism among fans, and even caught the attention of Prime Minister Keir Starmer.

West Midlands Police Chief Constable Craig Guildford initially claimed the department hadn’t used AI to create the flawed intelligence report, instead suggesting the mistake came from gathering information from social media. However, Guildford has since admitted that AI was, in fact, used.

"I learned on Friday afternoon that an incorrect result for the West Ham versus Maccabi Tel Aviv game happened because it was generated using Microsoft Copilot."

He also acknowledged that Copilot's fabricated result, despite describing a game that never took place, informed consequential decisions, including the ban preventing Maccabi Tel Aviv fans from attending the match against Aston Villa.

Speaking to Business Insider, a Microsoft spokesperson said:

"Copilot gathers information from various websites and presents it in one place, providing links to the original sources. It also lets users know they're communicating with an AI and suggests they check the sources themselves."

Guildford has since apologized to the parliamentary committee investigating the ban on Maccabi Tel Aviv fans. He initially told the committee that AI had not been used to compile the report that led to the ban, which turned out to be untrue, and he admitted to misleading the committee, a significant concession given the impact the decision had on fans.

"I genuinely believed the match had been found using a Google search before the HAC meeting, and that's what I was told. I had no intention of misleading the Committee."

Microsoft acknowledges that Copilot isn't perfect and can sometimes be wrong. But simply pointing that out does little to prevent such errors from happening again.

2026-01-15 14:09