Instagram Introduces New Teen Protections Amid Growing Concerns
Instagram is rolling out a new set of protections for teenagers, aimed at improving safety and giving parents more control over their children’s activity on the platform. The “teen accounts,” launching in the UK, US, Canada, and Australia, will turn many privacy settings on by default for users under 18. This includes limiting who can view their content and requiring teens to approve all new followers.
For younger teens, aged 13 to 15, the changes also require parental involvement: they will be able to adjust key settings only by adding a parent or guardian to their account. This move comes as social media companies face increasing pressure to safeguard young users from harmful content.
The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) called Instagram’s initiative a “step in the right direction,” but emphasized the need for proactive measures to stop harmful content from spreading. Rani Govender, NSPCC’s online safety policy manager, said children shouldn’t bear the responsibility for their own safety and urged Instagram to focus on prevention.
Meta, Instagram’s parent company, describes the changes as a way to “better support parents” and offer reassurance about teens’ online safety. However, critics like Ian Russell, who lost his daughter to suicide after she was exposed to harmful content on Instagram, stressed the importance of transparency, stating that the effectiveness of these measures will only be clear once implemented.
The teen accounts will switch privacy settings on automatically, blocking non-followers from viewing teens' content and muting notifications overnight. Parental supervision tools will let parents see who their children are messaging and the topics they are interested in, though parents won't be able to read the messages themselves.
Despite these protections, questions remain about the platform’s ability to enforce its rules. Instagram will rely on age verification tools, including AI detection of teens using adult accounts, to enforce the new system. Platforms are under pressure to comply with the UK’s Online Safety Act, which demands stronger protections for children online.
The success of Instagram’s new features hinges on enforcement, according to social media analyst Matt Navarra, who warned that teens could still find ways to circumvent restrictions. Instagram, Snapchat, and YouTube have all implemented similar protections in recent years, but harmful content remains a widespread issue.
The Online Safety Act requires platforms to remove illegal and dangerous content, such as child sexual abuse material and content promoting suicide or self-harm, though its full implementation is not expected until 2025.
In Australia, Prime Minister Anthony Albanese recently proposed a new age limit for social media, further highlighting the growing global focus on protecting young people online. Instagram's new tools empower parents but fall short of addressing the algorithms and the vast amounts of content that continue to pose risks to teens worldwide.