
Character.AI has been sued in the past over some of the messages its chatbots sent to teens

“Companion chatbots” draw criticism from the Tech Justice Law Center and the Social Media Victims Law Center

The Tech Justice Law Center, an advocacy group helping represent the parents of the minor in the suit along with the Social Media Victims Law Center, says it is misleading for Character.AI to advertise its chatbot service as appropriate for young teenagers. “It really belies the lack of emotional development amongst teenagers,” the center’s director said.

Character.AI is among a crop of companies that have developed “companion chatbots,” AI-powered bots that converse by text or voice chat using seemingly human-like personalities and that can be given custom names and avatars, sometimes inspired by famous people like billionaire Elon Musk or singer Billie Eilish.

Users have made millions of bots on the app, some mimicking parents or concepts like “unrequited love” and “the goth.” The services are popular with young users in part because the bots can serve as a form of emotional support, peppering text conversations with encouragement.

“It is simply a terrible harm these defendants and others like them are causing and concealing as a matter of product design, distribution and programming,” the lawsuit states.

Character.AI says it has created a model specifically for teens that reduces the likelihood they will encounter sensitive or suggestive content on the platform.

Character.AI, Its Ties to Google, and the Teen Mental Health Crisis

Character.AI is not owned by Google, but the tech giant has reportedly invested nearly $3 billion to re-hire Character.AI’s founders, Noam Shazeer and Daniel De Freitas, and to license Character.AI technology. Both Shazeer and De Freitas are named in the lawsuit. They did not return requests for comment.

José Castañeda, a Google spokesman, said “user safety is a top concern for us,” adding that the tech giant takes a “cautious and responsible approach” to developing and releasing AI products.

Users are encouraged to keep some emotional distance from the bots. When a user starts texting with one of Character.AI’s millions of possible chatbots, a note under the dialogue box warns that the bot is an artificial intelligence and not a real person, that everything it says should be treated as fiction, and that what it says should not be relied upon as fact or advice.

U.S. Surgeon General Vivek Murthy has warned of a youth mental health crisis, pointing to surveys finding that one in three high school students reported persistent feelings of sadness or hopelessness, a 40% increase over the 10-year period ending in 2019. It’s a trend federal officials believe is being exacerbated by teens’ nonstop use of social media.