UK Police Blame Microsoft Copilot for Critical Intelligence Error

▼ Summary
– West Midlands Police admitted that its intelligence report cited a nonexistent West Ham vs. Maccabi Tel Aviv match fabricated by Microsoft’s Copilot AI assistant.
– The erroneous AI-generated report contributed to Israeli fans being banned from an Aston Villa match after authorities deemed it high-risk.
– Chief Constable Craig Guildford confirmed the error originated from Copilot, having previously denied AI was used and blamed social media scraping.
– Microsoft’s Copilot interface includes a warning that it may make mistakes, a point underscored by this high-profile error.
– Independent testing has found Copilot frequently provides incorrect information or fabricates details, and Microsoft did not comment on this specific incident.

The chief constable of a major UK police force has confirmed that a critical error in a football intelligence report stemmed from the use of Microsoft’s Copilot AI assistant. The mistake resulted in Israeli fans being banned from a match, highlighting the serious real-world consequences of relying on unverified AI-generated information. The report cited a violent incident at a nonexistent match between West Ham and Maccabi Tel Aviv, a fabrication that was included in official police intelligence without any fact-checking.
In a letter to the Home Affairs Committee, Chief Constable Craig Guildford of West Midlands Police stated that he had learned the erroneous information originated from the use of Microsoft Copilot. This admission reverses his denial in December, when he attributed the mistake to “social media scraping” and explicitly denied that AI was used to prepare the report. The flawed intelligence was used by the Birmingham Safety Advisory Group, which subsequently categorized a Europa League match between Aston Villa and Maccabi Tel Aviv as “high risk” due to alleged prior incidents.
This designation led to a ban on Maccabi Tel Aviv fans attending the match in November of last year. The advisory group’s decision was based on concerns over “violent clashes and hate crime offences” reportedly linked to a previous Maccabi match in Amsterdam, with the AI-hallucinated West Ham game incorrectly added to the risk assessment. Although Microsoft includes a disclaimer on the Copilot interface warning that it “may make mistakes,” this incident represents a particularly significant failure with tangible consequences for public safety and fan rights.
Independent testing of Copilot has shown a tendency for the AI to generate incorrect information and invent details, a phenomenon commonly known as “hallucination.” In this case, the system fabricated an entire sporting event that was then treated as factual intelligence. The police force’s failure to verify the AI’s output before acting upon it underscores a critical vulnerability in integrating such tools into high-stakes decision-making processes. Microsoft did not provide a comment on why Copilot invented the football match prior to publication.
(Source: The Verge)
