Meta announced today that teens on Facebook and Instagram will face new DM restrictions designed to block unwanted messages.
Instagram already prevented adults over 18 from messaging teens who don't follow them. The new restrictions will apply by default to all users under 16 — and, in certain regions, under 18. Meta said existing users will be notified of the change.
On Messenger, teen users will only be able to receive messages from Facebook friends or people in their contacts.
Meta is also expanding its parental supervision tools by letting parents approve or reject changes teens make to their privacy settings. Previously, guardians were merely notified when a teen modified these settings and could not intervene.
The company said guardians can now block a teen's attempt to make their account public, to change the sensitive content control from "less" to "standard," or to loosen their DM restrictions.
Meta first introduced parental supervision features on Instagram in 2022 to help parents monitor their teens' activity.
Meta said it is also building a feature to prevent minors from seeing inappropriate images in DMs, even those sent by people they are connected to. The company said this feature will work in end-to-end encrypted chats as well and will "discourage" teens from sending such images themselves.
Meta did not say how it will protect teens' privacy while implementing these features, nor did it define what counts as "inappropriate."
Earlier this month, Meta rolled out tools to prevent teens from viewing self-harm and eating disorder content on Facebook and Instagram.
Last month, EU regulators formally requested more information from Meta about its efforts to stop the spread of self-generated child sexual abuse material (SG-CSAM).
Meta is facing a lawsuit in New Mexico state court alleging that it exposed teens to sexual content and recommended underage accounts to predators. In October, more than 40 US states sued the company in a California federal court, alleging that its products harm young users' mental health.
On January 31, executives from TikTok, Snap, Discord, and X will testify before the Senate on child safety.