There's nothing to like in Facebook's plans to hook our kids

If Facebook is to be believed, the planned Instagram Kids app would include controls to ensure that the worst of Instagram is kept out in favor of an antiseptic version fit for children 12 and under

By Greg Bensinger
Published: Oct 2, 2021

Facebook was on Capitol Hill on Thursday to receive its semiregular scolding from Congress about how its services are bad for America.

“Facebook is just like Big Tobacco, pushing a product that they know is harmful,” said Sen. Ed Markey, D-Mass., calling its photo-sharing site “Insta-greed.”

“Our products actually add value and enrich teens’ lives, they enable them to connect with their friends, their family,” insisted Antigone Davis, Facebook's global head of safety, unconvincingly.

After Facebook’s countless contrite appearances before Congress, it doesn’t even make good theater anymore. That’s a shame because the hearing was focused squarely on the most vulnerable users of technology — children.

Ahead of the hearing, Facebook announced it was pausing work on a controversial app designed to hook young people on Instagram.

If Facebook is to be believed, the planned Instagram Kids app would include controls to ensure that the worst of Instagram — body shaming, trolling, bullying, racism, targeted advertising — is kept out in favor of an antiseptic version fit for children 12 and under.

But who can trust Facebook after years of pernicious data harvesting and dissembling about the inner workings of its vaunted News Feed? Time and again, leaks from the company have shown that it ignored signs that its apps sow hate, encourage extremism and widely disperse dangerous misinformation.

Facebook “routinely puts profits ahead of kids’ online safety,” said Sen. Richard Blumenthal, D-Conn. “We now know it chooses the growth of its products over the well-being of our children.”

It’s clear that a pause isn’t good enough — children’s social media apps are simply not ready for prime time. They serve only to build a bridge to the main apps, where the cool, adult stuff happens, and hook ’em young. (My own children avoid the YouTube Kids app like spinach.) And rather than address the systemic troubles with their main sites, the apps foist more responsibility onto parents who don’t have an army of moderators at their service.

Before the Senate, Facebook’s Davis detailed a laundry list of design features, policies and other provisions meant to shield teenagers and younger children from the dangers of its services. Maybe Facebook ought to read that as a sign that its products are a bad idea for children?

Instagram, in particular, is a hub of youth anxiety and mental health problems. The company’s own research indicates that the app exacerbates body image issues for nearly a third of teenage girls experiencing them, according to a recent Wall Street Journal report. Equally troubling is that Facebook appears to have been proceeding without fully and properly consulting child safety experts. Adam Mosseri, the head of Instagram, said the pause will “give us time to work with parents, experts, policymakers and regulators.” Was that not the plan in the first place?

Not that Facebook is likely to heed the expert advice anyway. It forged ahead with its Messenger Kids app despite an outcry from experts who warned the app could be deleterious to children’s health.

“The goal is simply to capture the most users and become the middlemen in our social interactions,” said Priya Kumar, a Pennsylvania State University assistant professor who studies technology’s impact on families. A kids app is also likely to feed Facebook’s ad machine by yielding insights the company can use to serve targeted ads to parents on its main sites.

The companies know that kid versions of their apps will quickly drive children to the main apps, where they can be hit with targeted advertising and fall prey to their data collection schemes, just like everyone else. YouTube agreed to pay $170 million in 2019 to settle allegations it served children under 13 targeted advertising and collected their personal information. That was four years after the rollout of YouTube Kids, which was meant to keep children off the main video streaming site. Not exactly a roaring success.

Facebook’s Messenger Kids app for online chatting permitted some children to join groups with strangers. And the research the company cited to justify the project was conducted primarily with groups and individuals to which Facebook had financial ties, according to Wired.

YouTube’s chief executive, Susan Wojcicki, recently asserted that the video streaming site was “valuable” for teenagers’ mental health, as a means of destigmatizing sensitive issues. But a lesson from The Journal’s Facebook series is that tech companies’ public statements don’t often jibe with their private data.

Mosseri and others say their kids’ products are a necessary salve to an intractable problem: Children may lie about their age to use the apps or simply use their parents’ or friends’ accounts, making it hard to filter out objectionable content. Surely Facebook, which seems to know my innermost thoughts, must have an inkling of who is using its services at any given time.

The truth is that Facebook, YouTube, TikTok and other companies are looking for continual growth. Tapping the elementary school set helps ensure a stable of new users who will graduate quickly to the platforms’ most profitable properties.

That’s why the companies have made no earnest effort to clean up their main apps — there’s just too much money at stake. But when their hand is forced, they quickly find creative ways to get in line with local regulations. A law that took full effect in Britain this month to better protect youngsters prompted a flurry of new privacy measures from the tech giants, including requiring Instagram users to affirm their birth date before using the app.

Without a comprehensive privacy law, the United States has largely left it up to the companies to self-regulate. With the same effort and financial commitment they’ve made to creating (and defending) kid versions of their apps, the social media companies ought to have devised better age-verification systems.

Lawmakers, then, have an obligation to protect our children by mandating better age-verification software, pushing for design changes such as halting autoplay features that can send teenage users down extremist rabbit holes, and requiring more transparency about what data is collected from minors and how it is used. They ought to consider fast-tracking proposals to update the long-in-the-tooth Children’s Online Privacy Protection Act, such as tighter controls on marketing to children.

Mosseri is right about one thing: Facebook and its competitors have created services that are irresistible to teenagers and younger children, and kids will find their way to them by hook or by crook. And his company’s own data shows the harm of allowing them onto its main app.

The outrage is certainly there on Capitol Hill. Let’s hope for our children’s sake it’s not just bluster.

©2021 New York Times News Service
