Instagram still posing serious risks to children, campaigners say

BBC News Investigations

Young Instagram users can still be exposed to "serious risks", even if they use the new teen accounts brought in to provide greater protection and control, research by campaigners suggests.
The researchers behind a new report said they were able to set up accounts using false birthdays, and were then shown sexualised content, hateful comments and recommended adult accounts to follow.
Meta, which owns Instagram, says its new accounts have "built-in protections" and that it shares "the goal of keeping teens safe online".
The research, by the online safety charity 5Rights Foundation, is released as Ofcom, the UK regulator, prepares to publish its children's safety codes.
The codes will set out the rules platforms must follow under the Online Safety Act. Platforms will then have three months to show they have systems in place that protect children.
That includes having robust age checks, safer algorithms which do not recommend harmful content, and effective moderation of content.
Instagram teen accounts were launched in September 2024, promising new protections for children and what Meta called "peace of mind for parents".
The accounts are designed to limit who can contact young users and reduce the amount of sensitive content they can see.
Existing users would be moved over to the new accounts, and those signing up for the first time would automatically be given one.
But researchers at the 5Rights Foundation were able to set up a series of fake teen accounts using false birthdays, with no additional checks by the platform.
They found that immediately on sign-up the accounts were offered adult accounts to follow and message.
They claim Instagram's algorithms still promote sexualised imagery, harmful beauty ideals and other negative stereotypes.
The researchers said their teen accounts were also recommended posts full of hateful comments.
The charity also raised concerns about the addictive nature of the app and teens' exposure to sponsored, commercialised content.
"This is not a teen environment," said Baroness Beeban Kidron, founder of 5Rights.
"They are not checking age, they are recommending adults, they are putting them in commercial situations without letting them know, and it is deeply sexualised."
Meta said the accounts "provide built-in protections for teens limiting who's contacting them, the content they can see, and the time spent on our apps".
"Teens in the UK have automatically been moved into these enhanced protections and under-16s need a parent's permission to change them," it added.

In a separate development, BBC News has also learned of the existence of groups on X dedicated to self-harm.
The groups, or "communities" as they are known on the platform, have tens of thousands of members sharing graphic images and videos of self-harm.
Some of the users involved in the groups appear to be children.
"I was absolutely floored to see a community of 65,000 members," said Becca Spinks, an American researcher who discovered the groups.
"It was so graphic, there were people in there taking polls on where they should cut themselves next."
X was approached for comment but did not respond.
However, in a submission to an Ofcom consultation last year, X said: "We have clear rules in place to protect the safety of the service and the people using it."
"In the UK, X is committed to complying with the Online Safety Act," it added.