
Meta Introduces Teen Accounts to Address Growing Data Regulation Demands

September 17, 2024

Meta, the parent company of Instagram, announced on Tuesday the launch of “Teen Accounts,” a new version of Instagram aimed at providing a safer, more controlled experience for users under 16. This move is part of Meta’s ongoing efforts to address widespread concerns about the potential negative impacts of social media on young people’s mental health and well-being.

According to CBS News, Meta will automatically migrate all users under 16 to the new service, which features a variety of safety measures and enhanced parental controls. These updates are designed to ease concerns among parents and policymakers about the exposure of young users to harmful content and interactions. The changes come amid growing scrutiny of how social media platforms affect the mental health of children and teenagers, with lawmakers and parents calling for tighter regulations on tech companies.

In the new Teen Accounts, profiles are set to private by default, so others cannot view a teen’s content unless the teen accepts their follow request. Additionally, messaging features are restricted, and parents can monitor who their children are communicating with. Another key feature silences notifications at night to encourage healthier online habits, and these protections can only be deactivated with parental approval.

“We know parents want to feel confident that their teens can use social media to connect with their friends and explore their interests, without having to worry about unsafe or inappropriate experiences,” Meta said in a statement on Tuesday. Per CBS News, the company emphasized that Teen Accounts are part of its larger effort to reimagine its apps with young users in mind.


Beyond the enhanced parental controls, the new service also gives teens more control over the content they see. Meta introduced a revamped “Explore” feature that allows teens to select topics that interest them, helping them engage with more relevant and age-appropriate content. This is intended to reduce exposure to harmful material that could negatively impact their mental health.

The launch of Instagram Teen Accounts comes as Meta faces legal challenges from U.S. states over its practices regarding young users. According to CBS News, dozens of states filed lawsuits in 2023 accusing the tech giant of deliberately making its platforms addictive for young users to boost profits. The lawsuits also claim Meta violated federal law by collecting data from children under 13 without parental consent.

Meta has denied these allegations, insisting it is committed to creating positive online experiences for teens. The company said it has introduced several tools over the years to enhance safety and provide parents with more control over their children’s social media activities.

One open question is how Meta will enforce the new restrictions. According to Meta, young users who wish to disable any of the safety features must seek permission from their parents. Additionally, parents will be able to see how much time their children spend on the platform, as well as who they are messaging. They can also set limits on when their teens are allowed to use the app.

To verify user ages, Meta has introduced a new process that includes uploading an ID card or using Yoti, a tool that analyzes facial features to estimate whether a person is under or over 18. This is intended to prevent younger users from lying about their age to avoid restrictions.

Source: CBS News