Texas Attorney General Ken Paxton has announced an investigation into the Google-backed AI chatbot startup Character.AI over its privacy and safety practices for minors.
The news comes just days after two Texas families sued the startup and its financial backer Google, alleging that the platform’s AI characters sexually and emotionally abused their school-aged children. According to the lawsuit, the chatbots encouraged the children to engage in self-harm and violence.
“Technology companies are on notice that my office is vigorously enforcing Texas’s strong data privacy laws,” said Paxton in a statement. “These investigations are a critical step toward ensuring that social media and AI companies comply with our laws designed to protect children from exploitation and harm.”
According to Paxton’s office, the companies could be in violation of the Securing Children Online through Parental Empowerment (SCOPE) Act, which requires companies to provide parents with extensive controls to protect their children’s privacy, and the Texas Data Privacy and Security Act (TDPSA), which “imposes strict notice and consent requirements on companies that collect and use minors’ personal data.”
“We are currently reviewing the Attorney General’s announcement,” a Character.AI spokesperson told us. “As a company, we take the safety of our users very seriously. We welcome working with regulators and have recently announced we are launching some of the features referenced in the release, including parental controls.”
Indeed, on Thursday Character.AI promised to prioritize “teen safety” by launching a separate AI model “specifically for our teen users.”
The company also promised to roll out “parental controls” that will give “parents insight into their child’s experience on Character.AI.”
Whether those measures will be enough to stem the tide of highly problematic chatbots hosted on its platform remains to be seen. Futurism has previously identified chatbots on the platform devoted to themes of pedophilia, eating disorders, self-harm, and suicide.
Alongside Character.AI, Paxton is also launching separate investigations into fourteen other companies, including Reddit, Instagram, and Discord.
How far the newly launched investigation will go is unclear. Paxton has repeatedly targeted digital platforms, accusing them of violating safety and privacy laws. In October, he sued TikTok for sharing minors’ personal data.
At the time, TikTok denied the allegations, arguing that it offers “robust safeguards for teens and parents, including Family Pairing, all of which are publicly available.”
Parts of the SCOPE Act were also recently blocked by a Texas judge, who sided with tech groups arguing that the law unlawfully restricts free expression.
Paxton also subpoenaed 404 Media in October, demanding that the publication hand over confidential information about its reporting on a wholly unrelated lawsuit against Google.
The attorney general has a colorful past himself. Last year, the Texas House impeached Paxton after its investigators found he had taken bribes from a real estate investor, exploited the powers of his office, and fired staff members who reported his misconduct, according to the Texas Tribune.
After a suspension of roughly four months, Paxton was acquitted by the Texas Senate on all articles of impeachment, allowing him to return to office.
Paxton was also indicted on state securities fraud charges in 2015. Those charges were dropped in March after he agreed to pay nearly $300,000 in restitution.
Beyond digital platforms, Paxton has sued manufacturers 3M and DuPont for misleading consumers about the safety of their products, as well as Austin’s largest homeless service provider for allegedly being a “common nuisance” in the surrounding neighborhood.
More on Character.AI: Google-Backed AI Startup Announces Plans to Stop Grooming Teenagers