Facebook delays Instagram app for users 13 and younger

Facebook has paused development of an "Instagram Kids" service that would be tailored for children 13 years old or younger, as the social network increasingly faces questions about the app's effect on young people's mental health.

By Adam Satariano and Ryan Mac
Published: Sep 28, 2021

Friends exchange Instagram QR codes at New York’s Central Park on Aug. 8, 2021. Facebook paused the development of Instagram Kids amid questions about the app’s effect on young people’s mental health. (Jasmine Clarke/The New York Times)

Facebook said Monday that it had paused development of an “Instagram Kids” service that would be tailored for children 13 years old or younger, as the social network increasingly faces questions about the app’s effect on young people’s mental health.

The pullback preceded a congressional hearing this week about internal research conducted by Facebook, and reported in The Wall Street Journal, that showed the company knew of the harmful mental health effects that Instagram was having on teenage girls. The revelations have set off a public relations crisis for the Silicon Valley company and led to a fresh round of calls for new regulation.

Facebook said it still wanted to build an Instagram product intended for children that would have a more “age appropriate experience,” but was postponing the plans in the face of criticism.

“This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today,” Adam Mosseri, the head of Instagram, wrote in a blog post.

The decision to halt the app’s development is a rare reversal for Facebook. In recent years, the social network has become perhaps the world’s most heavily scrutinized corporation, grappling with privacy accusations, hate speech, misinformation and allegations of anti-competitive business practices. Regulators, lawmakers, journalists and civil society groups around the world have criticized the company for its effects on society.

With Instagram Kids, Facebook had argued that young people were using the photo-sharing app anyway, despite age-requirement rules, so it would be better to develop a version more suitable for them. Facebook said the “kids” app was intended for ages 10 to 12 and would require parental permission to join, forgo ads and carry more age-appropriate content and features. Parents would be able to control what accounts their child followed. YouTube, which Google owns, has released a children’s version of its app.

But since BuzzFeed broke the news this year that Facebook was working on the app, the company has faced scrutiny. Policymakers, regulators, child safety groups and consumer rights groups have argued that a children’s version would hook children on the app at a younger age rather than protect them from problems with the service, including predatory grooming, bullying and body shaming.

Mosseri said Monday that “the project leaked way before we knew what it would be” and that the company had “few answers” for the public at the time.

Opposition to Facebook’s plans gained momentum this month when The Journal published articles based on leaked internal documents that showed Facebook knew about many of the harms it was causing. Facebook’s internal research showed that Instagram, in particular, had caused teen girls to feel worse about their bodies and led to increased rates of anxiety and depression, even while company executives publicly tried to minimize the app’s downsides.

On Thursday, Facebook’s global head of safety, Antigone Davis, is scheduled to testify at a Senate Commerce subcommittee hearing titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms.”

Simply pausing Instagram Kids was insufficient, said lawmakers, including Sen. Richard Blumenthal, D-Conn., the chairman of the subcommittee holding Thursday’s hearing. In a statement, he and others said Facebook had “completely forfeited the benefit of the doubt when it comes to protecting young people online and it must completely abandon this project.”

The lawmakers added that stronger regulation was needed. “Time and time again, Facebook has demonstrated the failures of self-regulation, and we know that Congress must step in,” they said.

A children’s version of Instagram would not fix more systemic problems, said Al Mik, a spokesman for 5Rights Foundation, a London group focused on digital rights issues for children. The group published a report in July showing that children as young as 13 were targeted with harmful content within 24 hours of creating an account, including material related to eating disorders, extreme diets, sexualized imagery, body shaming, self-harm and suicide.

“Big Tobacco understood that the younger you got to someone, the easier you could get them addicted to become a lifelong user,” said Doug Peterson, Nebraska’s attorney general. “I see some comparisons to social media platforms.”

In May, attorneys general from 44 states and jurisdictions signed a letter to Facebook’s chief executive, Mark Zuckerberg, asking him to end plans for building an Instagram app for children.

The Instagram revelations have also set off discontent inside Facebook. Last Thursday, during a companywide meeting led by Zuckerberg, employees demanded to see the Instagram research for themselves and asked what executives planned to do about the findings, according to one attendee, who was not authorized to speak publicly.

“Teen suicide rate has increased 20% in the last 4 years,” read one of the top-voted employee questions to Zuckerberg. “It’s proven that Instagram is toxic for teen girls. What is Facebook doing to address this?”

During the meeting, Zuckerberg passed the question to Mosseri, who said the research actually showed that Instagram mostly improved body-image issues for teens, according to the attendee. Those points were publicly reiterated in a company blog post on Sunday.

©2021 New York Times News Service