PANews reported on October 24 that, according to Jinshi, Character Technologies, developer of the AI chatbot tool Character.AI, has been sued by a mother in Florida, USA. The suit alleges that the company designed and marketed a predatory artificial intelligence (AI) chatbot aimed at teenagers, and that inappropriate human-computer interactions with the product drove her teenage child to develop suicidal tendencies and to take his own life in February 2024. The lawsuit further states that Character.AI's technology exploits underage users' diminished decision-making ability, impulse control, and emotional maturity, as well as the psychological dependence that results from their still-developing brains.
Character.AI sued over alleged role in a minor's suicide