U.S. senators are calling for an investigation into Meta's AI practices following a report by Reuters.
In the absence of federal regulations governing AI, states have taken the initiative to enact legislation, including bans on using the technology to create child sexual abuse material. However, a recent controversy has emerged regarding Meta Platforms, the parent company of Facebook, and its AI chatbots.
Senator Josh Hawley, chair of the Senate Judiciary Subcommittee on Crime and Counterterrorism, is leading an investigation into Meta Platforms. The inquiry follows the leak of an internal Meta document, more than 200 pages long, that revealed policies permitting AI chatbots to engage children in romantic and otherwise inappropriate conversations.
The document, whose authenticity Meta has confirmed, contains examples and notes that the company says are erroneous and inconsistent with its policies. After being questioned by Reuters, Meta removed the portions of the document that allowed chatbots to flirt and engage in romantic roleplay with children.
The investigation is examining whether Meta’s generative AI products enable exploitation, deception, or other criminal harms to children. Senator Hawley sent a letter to Meta CEO Mark Zuckerberg demanding relevant documents and communications to clarify the decision-making process behind these controversial AI policies.
Lawmakers remain concerned about enforcement and oversight failures, with allegations including inappropriate chatbot behavior, the propagation of racial bias, and false medical advice generated by Meta's AI systems. The probe also scrutinizes whether Meta has misled lawmakers and the public regarding AI safety and compliance measures.
Senator Ron Wyden, a Democrat from Oregon, called the Meta chatbot policies "deeply disturbing and wrong." Senator Marsha Blackburn, a co-sponsor of the Kids Online Safety Act, also supports an investigation into Meta and said the report illustrates the need for online child protection reforms.
The Kids Online Safety Act would impose an explicit "duty of care" on social media companies toward minors who use their products, focusing on platform design and regulation of the companies. The bill passed the Senate but stalled in the U.S. House of Representatives.
The investigation is ongoing, with public hearings and document requests expected as the Senate Judiciary Subcommittee works to hold Meta accountable and examine the broader child-safety implications of AI development and deployment. Senator Peter Welch, a Democrat from Vermont, said the report highlights the need for AI safeguards, especially when children's health and safety are at risk.
- The controversy surrounding Meta Platforms' AI chatbots has prompted Senator Josh Hawley to lead an investigation, focusing on whether the AI products allow for exploitation or criminal harm towards children.
- Senator Ron Wyden has found the Meta chatbot policies to be "deeply disturbing and wrong," and Senator Marsha Blackburn supports the investigation, believing it highlights the need for online child protection reforms.
- The ongoing investigation into Meta Platforms, related to its AI chatbots' interactions with children, also underscores the necessity for safeguards in AI development and deployment, particularly when children's health and safety are at risk.