US social media giant Meta announced on Tuesday (9 January) that it will conceal more content from teenagers on Instagram and Facebook. The move is a response to global regulatory pressure to strengthen child protection measures on the company's social media platforms.
Social media companies such as Meta are facing heightened scrutiny from regulators urging better safeguarding for children. As part of these changes, all teenagers will now be placed under the most stringent content control settings on both Instagram and Facebook, accompanied by limitations on additional search terms within the photo-sharing app, as detailed in a blog post by Meta.
The implementation of these measures aims to create a more secure online environment for teenagers, reducing their exposure to potentially sensitive content or accounts when accessing features such as Search and Explore on Instagram.
The decision follows a lawsuit filed by dozens of US states in October, accusing Meta and its Instagram unit of exacerbating a youth mental health crisis by building addictive features into their social media platforms.
Not limited to the US, Meta is also facing scrutiny in Europe, where the European Commission has taken a keen interest in understanding the measures the social media company employs to protect children from illegal and harmful content.
These developments underscore the growing emphasis on responsible online practices and the need for social media platforms to actively address concerns related to the well-being of young users.