Meta has announced that teens on Facebook and Instagram will see less content related to self-harm and eating disorders. The tech giant is set to roll out new safety settings for users under the age of 18.
Meta has developed more than 30 tools and resources to support teens and their parents. The company said in a blog post, "We will start to hide more types of content for teens on Instagram and Facebook, in line with expert guidance."
It added, "We're automatically placing all teens into the most restrictive content control settings on Instagram and Facebook and restricting additional terms in Search on Instagram."
The company said it wants teens to have safe, age-appropriate experiences on the apps.
Meta is also sending teens new notifications prompting them to update their privacy settings on Instagram with a single tap.
The announcement comes as Meta has faced scrutiny in recent months over its platforms' potential impact on teen users.
In November, former Facebook employee turned whistleblower Arturo Bejar testified before a Senate subcommittee that Meta's top executives, including CEO Mark Zuckerberg, ignored warnings for years about harm to teens on its platforms, including Instagram. Bejar raised particular concerns about the sexual harassment of teens by strangers on Instagram, CNN reported.
The same month, court documents revealed that Zuckerberg repeatedly thwarted teen well-being initiatives.
In a separate lawsuit weeks later, Meta was accused of refusing to shut down most accounts belonging to children under the age of 13.
In December, another lawsuit was filed against Meta. It accused the company of creating a "breeding ground" for child predators.
The changes are set to roll out to users under the age of 18 in the coming months.