Teens won’t be able to see certain posts on Facebook, Instagram: What Meta’s changes mean


Teens on Facebook and Instagram may soon find that some content that once proliferated on their feeds is no longer visible to them – even if they search for it.

Meta, the parent company of the social media platforms, revealed Tuesday it will begin restricting some of what young users can see on Facebook and Instagram. The announcement comes as the company faces mounting pressure from regulators who claim its social media sites are addictive and harmful to the mental health of younger users.

In a blog post, Meta said the measures, which will roll out in the coming weeks, are designed “to give teens more age-appropriate experiences on our apps.” The protections will make it more difficult for teens to view and search for sensitive content such as suicide, self-harm and eating disorders, according to Meta.

Here’s what to know about the changes:

Meta has come under fire in recent months in both the United States and Europe over allegations that its apps are addictive and have fueled a youth mental health crisis.

In October, more than 40 states filed a lawsuit in federal court claiming that the company profited from advertising revenue by intentionally designing features on Instagram and Facebook to maximize the time teens and children spent on the platforms.

Meta said in a statement at the time that it shares the “commitment to providing teens with safe, positive experiences online.”

“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said.
