PICTURE PERFECT This undated photo shows a woman holding a smartphone featuring the login screen of social media platform Instagram. IMAGE BY STOCKSNAP VIA PIXABAY
PARIS: Social media giant Meta said on Tuesday it was rolling out a slew of measures to boost the safety of young users on its Instagram platform, the latest firm to address the issue.
Campaigners have long criticized technology giants for failing to protect teenagers from harmful content, and the popularity of Instagram with young people has placed it firmly in the firing line.
Meta, which also owns Facebook and WhatsApp, said parents and guardians would be able to set time limits on children’s scrolling on Instagram.
And young users would now see nudges encouraging them to explore other subjects if they spend too much time looking at content about a single topic.
“It is crucial for us to develop tools that respect the privacy and autonomy of young people while involving parents in the experience,” Meta’s Clotilde Briend said during a media briefing.
Instagram was rocked last year by revelations from whistleblower Frances Haugen that suggested executives were aware the platform could harm the mental health of young users, particularly teenage girls.
Meta has consistently denied the claims, but has since faced a series of tough hearings in the United States Congress and suggestions that regulation could be on the way.
Other apps, including video-sharing platform TikTok, have also been criticized over fears that young people were finding it hard to tear themselves away from the content.
Last week, TikTok announced that young people would get nudges to remind them to take a break from scrolling, similar to an Instagram feature that has already been rolled out.
Also on Tuesday, Meta announced new measures for its virtual reality headsets.
Parents and guardians will be able to block apps, view what their child is looking at on another device and see how long their child is spending with their headset on.