Meta is introducing changes to Instagram and Facebook Messenger that aim to better protect minors from unwanted contact online, placing greater restrictions on who can message teens while giving parents more control over their children’s security settings. Notably, the company announced that by default, teens under the age of 16 (or under 18 in some countries) will no longer be able to receive messages from, or be added to group chats by, users they don’t follow or aren’t connected with on Instagram and Messenger.
These new updates build upon a series of safeguards that Meta has introduced over the last year as it battles accusations that its algorithms helped turn Facebook and Instagram into a “marketplace for predators in search of children.”
Unlike the previous restrictions, which only prevented adults over 19 from DMing minors who don’t follow them, these new rules will apply to all users regardless of age. Meta says that Instagram users will be notified of the change via a message at the top of their Feed. Teens using supervised accounts will need to request permission from the parent or guardian monitoring their account to change this setting.
Parental supervision tools on Instagram are also being expanded. Instead of simply being notified when their child changes their safety and privacy settings, parents will now be prompted to approve or deny those requests, allowing them to block a teen from switching their profile from private to public, for example.
Meta also says it’s developing a new feature designed to protect users from receiving unwanted or inappropriate images in messages from people they’re already connected with, and to discourage those users from sending such content in the first place. There’s no launch date yet, but Meta says the feature will work in encrypted chats, and that more information will be shared later this year.