Google Sued Over AI Overview’s False Claim About Musician

Summary
– Canadian fiddler Ashley MacIsaac filed a civil lawsuit against Google, alleging an AI Overview falsely identified him as a convicted sex offender.
– The lawsuit seeks at least $1.5 million in damages and argues Google is liable for the AI-generated output, as it “knew, or ought to have known, that the AI overview was imperfect.”
– MacIsaac learned of the false summary in December 2025 after the Sipekne’katik First Nation confronted him with it and cancelled a concert; the First Nation later issued a public apology.
– Google has not commented on the lawsuit, but a spokesperson previously stated AI Overviews are “dynamic and frequently changing” and that the false summary no longer appears.
– The case tests whether courts will treat Google as liable for defamation from automated AI summaries, similar to liability for a human spokesperson.
Canadian fiddler Ashley MacIsaac has launched a civil lawsuit against Google, claiming an AI Overview falsely labeled him a convicted sex offender. The case is poised to test how courts handle liability for defamatory AI-generated search summaries.
The statement of claim, submitted in February to the Ontario Superior Court of Justice, seeks at least $1.5 million in damages from Google LLC. None of the allegations have been proven in court.
According to the lawsuit, MacIsaac, a Juno Award-winning musician, discovered the false summary in December 2025 when the Sipekne’katik First Nation confronted him about it and subsequently cancelled one of his concerts. The First Nation later issued a public apology.
The filing alleges the AI Overview incorrectly stated MacIsaac had been convicted of sexual assault, internet luring involving a child, and assault causing bodily harm, and falsely claimed he was listed on the national sex offender registry.
The lawsuit argues Google bears responsibility for the output its AI system generated, stating that Google “knew, or ought to have known, that the AI overview was imperfect and could return information that was untrue.” It also claims Google did not admit fault, reach out to MacIsaac, or offer an apology or retraction.
A direct argument about AI liability is made in the filing: “If a human spokesperson made these false allegations on Google’s behalf, a significant award of punitive damages would be warranted. Google should not have lesser liability because the defamatory statements were published by software that Google created and controls.”
MacIsaac emphasized that Google must take responsibility for what AI Overviews display. “This was not a search engine just scanning through things and giving somebody else’s story,” he said.
Google has not commented on the lawsuit. In December, spokesperson Wendy Manton said AI Overviews are “dynamic and frequently changing” and that when the feature misinterprets web content, Google uses those cases to improve its systems. The false summary linking MacIsaac to criminal offenses no longer appears.
AI Overviews can appear in Google search results as AI-generated snapshots with links to more information. Google’s Search Help documentation acknowledges that AI responses may include mistakes. When those summaries display false claims about real people, the consequences can extend beyond a bad search result. In MacIsaac’s case, the lawsuit alleges the AI Overview led to a cancelled concert and reputational harm.
This is not the first time AI-generated content has sparked defamation allegations. In 2023, an Australian mayor threatened legal action after ChatGPT falsely claimed he had been imprisoned for bribery. MacIsaac’s lawsuit targets Google’s AI Overviews directly and argues the product had a defective design.
The case adds to a growing legal question: whether platforms are responsible when automated summaries present false claims as search results. At this stage, the case is in the statement-of-claim phase, and Google has yet to file a response. Until then, the core questions remain unresolved: whether Google will contest liability, how it will characterize AI Overview output, and how the court will treat automated summaries in a defamation claim.
(Source: Search Engine Journal)




