Facebook Defends Instagram Policies Toward Teens


Facing outrage over its handling of internal research on Instagram's harm to teens, a Facebook executive told Congress that the company is working to protect young people on its platforms. She also disputed how a recent newspaper report characterized what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

From June to August this year, Facebook removed more than 600,000 Instagram accounts that didn’t meet the minimum age requirement of 13, Davis said.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. 

The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some teens devoted to Instagram, the peer pressure generated by the visually focused app led to mental-health and body-image problems and, in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

In her testimony, Davis says Facebook has a history of drawing on its internal research, as well as outside experts and groups, to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Additional reporting by the Associated Press.