Meta Platforms Strengthens Online Safety Measures for Teen Users


Meta Platforms, the parent company of Instagram and Facebook, has announced enhanced safeguards to protect teenage users from unwanted direct messages on its platforms. The move follows mounting regulatory pressure to better protect children on social media networks.

The decision to reinforce safety measures comes in the wake of heightened scrutiny, including testimony in the U.S. Senate by a former Meta employee. The employee alleged that the company was aware of harassment and other risks faced by teens on its platforms but failed to address them adequately.

Effective immediately, teens on Instagram will no longer receive direct messages from users they do not follow or are not already connected to on the platform. Additionally, parental approval will be required for certain settings changes on the app, giving parents more control over their child's online interactions.

On Messenger, users under 16, and those under 18 in certain regions, will receive messages only from Facebook friends or people connected to them through phone contacts. Furthermore, adults over the age of 19 will be unable to send messages to teens who do not follow them.

These measures reflect Meta Platforms' commitment to creating a safer digital space for its young user base. By proactively addressing regulatory concerns and implementing stricter controls, the company aims to prioritize the well-being of teens and ensure a secure online environment as part of responsible digital stewardship.