By: Alena Kuzub
Meta's new age-appropriate content controls can play a crucial role in minimizing mental health risks for teenagers, says a Northeastern University expert who serves on the company's youth safety advisory committee.
Rachel Rodgers, an associate professor of applied psychology at Northeastern who has served on Meta's youth safety advisory committee for the past two years, says the controls should foster better communication between young people and their parents about safe social media use, and encourage platforms to proactively guide teens in beneficial ways.
“It’s terribly important that they have put together these new policies to further restrict what young people are able to see on the platforms,” Rodgers says.
A number of mental health risks play out for younger users on social media, she says, related to social relationships, self-harm, loneliness and body image.
In a recent announcement, Meta revealed that, in line with expert guidance, it will now automatically place all teens into the most restrictive content control settings on Instagram and Facebook, restrict additional search terms on Instagram, and begin hiding more types of content from teens on both platforms.