Instagram Teen Accounts still exposed to sexual content, investigation finds
A limited case study by Accountable Tech shows inappropriate and sexual content easily slips through Instagram's teen safety controls.


Following years of criticism for its handling of the youth mental health crisis, Instagram has invested heavily in beefing up its teen safety features, including an entirely new way for underage users to post, communicate, and scroll on the app. But recent tests of these new safety features suggest they may still not be enough.
According to an investigation conducted by the youth-led nonprofit Design It For Us and watchdog Accountable Tech — later corroborated by Washington Post columnist Geoffrey Fowler — the platform continues to surface age-inappropriate, sexually explicit, and generally harmful content despite content control safeguards.
In the study, "Gen-Z aged" representatives from Design It For Us tested five fake teen accounts on the app's default Teen Account settings over a two-week period. In every case, the youth accounts were recommended sensitive and sexual content. Four of the five accounts were recommended content related to poor body image and eating disorders, and only one account's algorithm surfaced what the nonprofit deemed "educational" content.
The individual algorithms also recommended descriptions of illegal substance use, and sexually explicit posts using trendy, coded language slipped through the filters. Not every protection faltered, however: the platform's built-in restrictions on messaging and tagging held up.
"These findings suggest that Meta has not independently fostered a safe online environment, as it purports to parents and lawmakers," the report writes. "Lawmakers should compel Meta to produce data about Teen Accounts so that regulators and nonprofits can understand over time whether teenagers are actually protected when using Instagram."
In a response to the Washington Post, Meta spokeswoman Liza Crenshaw said the test's limited scope doesn't capture the true impact of the app's safety features. “A manufactured report does not change the fact that tens of millions of teens now have a safer experience thanks to Instagram Teen Accounts. The report is flawed, but even taken at face value, it identified just 61 pieces of content that it deems ‘sensitive,’ less than 0.3 percent of all of the content these researchers would have likely seen during the test.”
Addressing an ongoing, platform-wide issue
A June 2024 experiment by the Wall Street Journal and Northeastern University found that minor-aged accounts were frequently recommended sexually explicit and graphic content within the app's video-centered Reels feed, despite being automatically set to the platform's strictest content settings. The phenomenon was a known algorithmic issue at parent company Meta; according to internal documents, employees conducting safety reviews had flagged it as early as 2021. In response, Instagram representatives said the experiments did not reflect the reality of how young users interact with the app.
At that time, Instagram had yet to launch Teen Accounts, its tentpole safety product, introduced as a more closely monitored way for younger users to exist and post online, with stronger content controls. Minor users are automatically placed into Teen Accounts when signing up for Instagram, which sets their page to private, limits messaging capabilities and the ability to stream live, and filters sensitive content out of feeds and DMs. Teens aged 13 to 15 have even tighter reins on their app usage, and accounts that fall through the cracks are now being spotted and flagged by Meta's in-house AI.
More than 54 million teens have been moved into restricted Teen Accounts since the initial rollout, according to Meta, and the vast majority of users under the age of 16 have kept the default, stricter safety settings. And while the numbers show a positive shift, even Meta CEO Mark Zuckerberg admits there may be limits to how effectively the company can monitor its vast user base and complex algorithm.