Instagram has introduced a series of changes aimed at protecting teenagers on its platform, with new “teen accounts” rolling out in the UK, US, Canada, and Australia. These accounts are private by default, meaning that content posted by users under 18 can be seen only by followers they have approved. The move is part of a broader effort by Instagram’s parent company, Meta, to address growing concerns about youth safety online.
Key among the changes is the requirement for teenagers to actively approve all new followers. For users aged 13 to 15, loosening the default privacy settings will require a parent’s or guardian’s consent. Sensitive content controls will also be applied to prevent harmful material from being recommended to teenagers, and notifications for under-16s will be muted overnight to help reduce screen time.
Teen accounts welcomed by NSPCC
Meta’s announcement was welcomed by the UK charity NSPCC, though the charity emphasized the need for more proactive content monitoring to prevent harmful material from reaching teenagers. Ian Russell, who lost his daughter Molly after she was exposed to harmful Instagram content, highlighted the importance of transparency and accountability in ensuring these changes work as intended.
Increased parental control is another focus of the update. Parents will now be able to see who their child has been messaging and what topics they are interested in, though they won’t be able to view message content directly. Instagram will also use AI technology to identify teenagers who have registered adult accounts and move them into teen accounts, with the aim of doing so by early 2025.
These changes come as the UK’s Online Safety Act is set to take effect, requiring platforms to safeguard children or face significant penalties. Ofcom, the UK’s communications regulator, will oversee compliance, ensuring social media companies like Instagram, Snapchat, and YouTube meet stricter child protection standards.