PANews reported on October 24, citing Jinshi, that Character Technologies, the developer of an AI chatbot tool, has been sued by a mother in Florida, USA, who alleges the company designed and marketed a predatory artificial intelligence (AI) chatbot aimed at teenagers. The plaintiff accuses Character.AI of fostering suicidal tendencies in her teenage child through inappropriate human-computer interactions, leading to the child's suicide in February 2024. The lawsuit states that the technology behind Character.AI's products is used to exploit underage users' diminished decision-making ability, impulse control, and emotional maturity, as well as the psychological dependence that results from their still-developing brains.