FTC scrutinizes OpenAI, Meta, and others on AI companion safety for kids
zdnet.com
ZDNET's key takeaways
- The FTC is investigating seven tech companies building AI companions.
- The probe is exploring safety risks posed to kids and teens.
- Many tech companies offer AI companions to boost user engagement.
The Federal Trade Commission (FTC) is investigating the safety risks posed by AI companions to kids and teenagers, the agency announced Thursday.
The federal regulator issued orders to seven tech companies building consumer-facing AI companions -- Alphabet, Instagram, Meta, OpenAI, Snap, xAI, and Character Technologies (the company behind the chatbot-creation platform Character.ai) -- requiring them to provide information about how their tools are developed and monetized, how they generate responses to human users, and what safety-testing measures are in place to protect underage users.