Shou Zi Chew, CEO of TikTok, Linda Yaccarino, CEO of X, and Mark Zuckerberg, CEO of Meta, testify before the Senate Judiciary Committee
CEOs of Meta, TikTok, X, and other companies faced intense questioning from US lawmakers regarding the risks posed to children and teenagers using social media platforms.
On Wednesday, the executives testified before the US Senate Judiciary Committee, drawing frustration from parents and lawmakers who believe the companies are not doing enough to combat online threats to children, including sexual predation and teenage suicide.
“They’re responsible for many of the dangers our children face online,” US Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. “Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”
“Mr Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands,” said Senator Lindsey Graham, referring to Mark Zuckerberg, CEO of Meta, the company that owns Facebook and Instagram. “You have a product that’s killing people.”
Zuckerberg provided testimony alongside X CEO Linda Yaccarino, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew, and Discord CEO Jason Citron.
Yaccarino said X endorsed the STOP CSAM Act, a bill sponsored by Durbin that would hold tech firms responsible for child sexual abuse material and allow victims to sue tech platforms and app stores. Despite several bills addressing child safety, including this one, none have been enacted into law.
X, previously known as Twitter, faced significant backlash following its acquisition by Elon Musk, who relaxed moderation policies. Recently, the platform restricted searches for pop singer Taylor Swift after the circulation of fake sexually explicit images.
TikTok CEO Chew, meanwhile, made his first appearance before US lawmakers since March and faced scrutiny over the app's impact on children's mental health.
“We make careful product design choices to help make our app inhospitable to those seeking to harm teens,” Chew said, adding that TikTok’s community guidelines strictly prohibit anything that puts “teenagers at risk of exploitation or other harm – and we vigorously enforce them”.
During the hearing, the executives highlighted the safety tools already in place on their platforms and emphasized their collaborations with non-profit organizations and law enforcement to safeguard minors.
Ahead of the contentious session, Meta and X announced new child-safety measures.
However, advocates for child health argue that social media companies have consistently fallen short in their efforts to safeguard minors.
Source: Al Jazeera