Another former Facebook employee will testify before Congress about safety issues at Instagram

Another former Facebook employee is going public with allegations that the company failed to act on its own research showing that young Instagram users were having harmful experiences on the platform. Arturo Bejar, a former Facebook employee and consultant for Instagram, is scheduled to testify before the Senate Judiciary Committee on Tuesday, November 7.

Bejar, who detailed his efforts to raise the alarm internally about Instagram safety issues to The Wall Street Journal, was a Facebook employee between 2009 and 2015 and returned to the company in 2019 to advise Instagram’s well-being team. He told The Journal that internal research showed that more than 20 percent of users younger than 16 “felt worse about themselves after viewing others’ posts” and that 13 percent “experienced unwanted sexual advances in the past seven days.”

Bejar’s disclosures come two years after another former Facebook employee, Frances Haugen, came forward with internal research about Instagram’s harmful effects on teens, along with other disclosures about controversial decisions within the company. Soon after, Instagram paused work on a dedicated app for kids, and dozens of states opened an investigation into the company.

Last week, 41 state attorneys general sued Meta for “harmful and psychologically manipulative product features” that harmed the mental health of its youngest users. According to The Wall Street Journal, Bejar consulted with state officials on their case. Now, he’s set to publicly air his experiences in front of Congress.

“From Arturo’s disclosures, we now know that Mark Zuckerberg, Adam Mosseri, and other Meta executives were personally warned that millions of teens face bullying, eating disorder material, illicit drugs, and sexual exploitation, often within minutes of opening the app,” Senators Marsha Blackburn and Richard Blumenthal, both of whom sit on the judiciary committee, said in a statement. “Rather than address these deadly harms, Facebook continued to hide this information from the public and Congressional oversight, ignored recommendations to protect teens, rolled back safety tools, and dismantled teams responsible for kids’ safety.”

Meta didn’t immediately respond to a request for comment. The company told The Wall Street Journal it disagreed with Bejar’s claims that well-being research wasn’t addressed, and said the company had introduced several safety updates as a result of the work of Bejar and his team.

Source: Engadget