Meta Introduces Safety Update for Under-18s on Instagram and Messenger
Meta, the parent company of Facebook and Instagram, is rolling out a safety update that will, by default, prevent under-18s from receiving messages from anyone they do not follow or are not connected to. The change is part of Meta’s ongoing effort to provide age-appropriate experiences for teens on its platforms. It extends an existing policy that prevented adults from messaging teenagers who do not follow them; now, no one on Instagram or Facebook Messenger will be able to direct message a teenager who does not follow them.
Meta explained in a blog post that under the new default setting, teens under 16 (or under 18 in certain countries) can only be messaged or added to group chats by people they are already connected to. This should help teens and their parents feel more confident that messages from unknown individuals will not appear in teens’ inboxes.
Improved Parental Control and Supervision Tools
In addition to the safety update, Meta is enhancing its parental control and supervision tools. Parents using these tools will now be able to approve or deny their teenagers’ requests to change default privacy and safety settings; previously, parents were only notified after their child made a change. For instance, if a teenager using supervision attempts to switch their account from private to public, alter their sensitive content control, or change their DM settings to hear from new people, their parent will receive a notification prompting them to approve or deny the request.
These new features aim to facilitate offline conversations between parents and their teens as they navigate their online lives together.
Scrutiny on Social Media Platforms and Young Users
The introduction of these safety updates comes at a time when social media platforms face increased scrutiny over their impact on younger users, with politicians and campaigners calling for stricter measures to protect children and teenagers online. Miriam Cates, a Conservative MP, recently urged Prime Minister Rishi Sunak to consider banning social media and smartphones for those under 16, pointing to the decline in teenage mental health and other concerning issues, including addiction to pornography. In response, Sunak highlighted the Online Safety Act, which aims to protect children online, with regulatory measures being developed by Ofcom, the sector’s regulator.
Meta’s safety update and enhanced parental control features signal the company’s commitment to the well-being of young users and a safer online environment for them. By blocking messages to under-18s from unknown individuals and giving parents more control over their teens’ privacy settings, Meta hopes to foster age-appropriate experiences and prompt important conversations between parents and adolescents about online safety.
Analyst comment
Positive news. The market is likely to respond favorably to Meta’s safety update and enhanced parental control features. These changes address concerns over social media’s impact on young users and demonstrate Meta’s commitment to their well-being. The move may improve public perception of Meta’s platforms and attract more users, particularly parents seeking a safer online environment for their teens.