Instagram announced new restrictions for teen accounts on Tuesday amid mounting controversy over how the social media platform protects younger users.
The photo-sharing app will soon limit content for teens using guidelines similar to the film industry’s PG-13 movie rating. Under the changes, the app will hide or stop recommending posts that include strong language, show drug paraphernalia, or encourage “potentially harmful behaviors,” said Instagram, which is owned by Meta.
The company said it would use “age prediction technology” to prevent teens from avoiding the restrictions, which will be rolled out by the end of the year.
Read more: Instagram Promised to Become Safer For Teens. Researchers Say It’s Not Working.
The changes come as Instagram is under fire for allegedly failing to protect underage users from harmful content. Last week, a report shared exclusively with TIME suggested that nearly 3 in 5 teens between the ages of 13 and 15 had encountered unsafe content or unwanted messages in the past six months. Meta told TIME the report was “deeply subjective” and “relies on a fundamental misunderstanding of how our teen safety tools work.”
And in September, a separate study by online-safety groups and Northeastern University researchers found that more than 40 child safety features promised by Instagram were flawed. Meta called that study “dangerously misleading.”
‘Age-inappropriate content’
The company moved to strengthen protections for younger users last year when it launched “teen accounts,” which blocked users under 18 from certain mature material and made their accounts private by default.
The new changes will block teen users from following accounts that post “age-inappropriate content, or if their name or bio suggests the account is inappropriate for teens,” Instagram said. Teens who already follow such accounts won’t be able to see or interact with their content, and those accounts will also be unable to follow teens, send them messages, or comment on their posts. The restrictions will apply even to celebrities and other widely followed adult accounts that share a single age-inappropriate post, Instagram told NBC News.
Instagram’s AI chatbot will also be updated to prevent it from giving age-inappropriate responses to users. Separately, AI chatbots and the companies behind them have faced legal complaints over allegations that the chatbots help users “explore suicide methods.”
Instagram’s new restrictions will automatically apply to teen users. They will not be able to opt out unless they obtain their parents’ permission.
Read more: ‘Everything I Learned About Suicide, I Learned On Instagram.’
Instagram is also introducing a feature for parents seeking even stricter controls, allowing them to block their teen’s account from viewing, leaving, or receiving comments on posts.
Instagram has been subject to personal injury lawsuits in both state and federal courts over allegations that it harms young people: More than 1,800 plaintiffs filed a lawsuit in Northern California against big-name social media companies, including Instagram and Meta, accusing them of “recklessly ignoring the impact of their products on children’s mental and physical health.” One such lawsuit referred to Instagram as an “addictive, harmful, and at times fatal” platform.
Still, the social media company celebrated the newly announced changes as the “most significant update to teen accounts” since they were introduced in January 2024. The new restrictions will apply to the hundreds of millions of teens who use the app worldwide, though they will first roll out gradually for users in the U.S., U.K., Australia, and Canada.