Meta boss Mark Zuckerberg forced to apologise to families
During a heated hearing in the US Senate, Mark Zuckerberg, along with other social media executives, faced criticism for not doing enough to protect children from exploitation on their platforms. The hearing, held by the Senate Judiciary Committee, addressed concerns about children being exposed to sexual predators, bullying, unrealistic beauty standards, and harmful content related to eating disorders. Families of victims held up pictures of their children, who had died after being harmed online.
Senator Lindsey Graham accused Zuckerberg and the other company representatives of having “blood on their hands,” referring to their platforms as products that harm and kill people. In response, Zuckerberg apologised to the families and pledged to continue investing in efforts to prevent such suffering.
Meta-owned Instagram came in for particular criticism over a feature that warns users an image may be abusive but still allows them to view it. Zuckerberg defended the feature, arguing that directing users to support resources is more effective than blocking the content outright. He also reiterated that the company had abandoned plans to build a version of the app for children.
Meta has committed to hiding harmful content from users under 18 and instead pointing those struggling with self-harm or eating disorders towards resources from mental health charities.
This was not Zuckerberg's first appearance before a Senate committee: he testified in 2018 over privacy concerns related to Cambridge Analytica. It was the second appearance for TikTok chief executive Shou Zi Chew and the first for X's Linda Yaccarino.
During the hearing, Yaccarino emphasised that X does not cater to children, and she expressed support for the STOP CSAM Act, a bill focused on tackling child sexual exploitation and providing restitution for victims.
Chew was questioned about the app's impact on children's mental health and defended TikTok's efforts to protect teenagers, including banning users under 13 and allocating $2 billion for trust and safety measures.
Discord chief executive Jason Citron highlighted the safety tools already implemented on the platform and pointed to collaborations with NGOs and law enforcement agencies to protect children.
Prior to the hearing, Snap chief executive Evan Spiegel said the company would support a bill that would hold apps and social media platforms legally accountable for recommending harmful material to children.