Matthew Bergman, a product liability lawyer who spent years taking on asbestos manufacturers, now sees a different kind of threat. His focus has shifted to companies building AI chatbots, and he is not issuing a mild caution. He is sounding an alarm.
"We are going to have a mass casualty event," Bergman stated in an interview. "It's not a question of if. It's a question of when."
Bergman represents families who allege their children were psychologically harmed by AI companions from the startup Character.AI. One lawsuit involves a 14-year-old boy who died by suicide after exchanging thousands of intimate messages with a chatbot. The attorney says his caseload reveals a pattern: minors forming obsessive, delusional bonds with AI entities, leading to severe psychological breaks and self-harm. Psychiatrists are beginning to document what some call AI-induced psychosis, a phenomenon observed particularly in vulnerable adolescents.
Character.AI, founded by ex-Google researchers, says it takes safety seriously and has implemented measures like suicide prevention prompts and usage notifications for younger users. Bergman dismisses these as surface-level fixes. He argues the core product is designed to foster emotional dependency for engagement, comparing its potential scale of harm to the opioid crisis.
His legal strategy applies product liability principles, alleging defective design and failure to warn. He aims to force the disclosure of internal company data, similar to pivotal cases against tobacco and social media firms. The legal landscape, however, is uncertain, with no clear rules governing the psychological effects of AI companions.
As the technology grows more sophisticated and emotionally convincing, Bergman warns the risk is systemic. With millions of active users, many under 18, he contends the conditions are set for widespread harm. "We don't have a decade to figure this out," he said. "We might not have a year." While Congress debates broader AI rules and lawsuits slowly advance, Bergman's stark prediction hangs over an industry built on simulating human connection.
Source: WebProNews