Big Tech Fails to Protect Children Online

The internet is a lawless country for children and teens.

Rules are not enforced, adults avert their eyes, and people aren’t who they seem to be. Algorithms designed to maximize user engagement have helped build pedophile networks, turn individuals into armies of cyberbullies, and bombard young users with addictive content that damages their mental health. These are just some of the truths about child safety problems that confront tech industry employees, journalists, mental health professionals, law enforcement agencies and, most of all, the survivors themselves.

Which is why the tech industry is trying to hide them.

In January 2024, Meta CEO Mark Zuckerberg appeared at a U.S. Senate hearing on online child sexual exploitation. In the back of the room stood parents holding photos of their children who had died after online sexual exploitation and cyberbullying. At one point Mr. Zuckerberg turned to them and said, “No one should go through the things that your families have suffered, and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer.”

His words could not have been more hollow.

In 2022, nearly 32 million cases of online child sexual abuse material (CSAM) were reported. Over 27 million (85 percent) stemmed from Meta platforms, mostly Messenger and Instagram. Yet just two months before Mr. Zuckerberg reassured those parents that others would not suffer the same things, Meta began rolling out end-to-end encryption on Messenger, and it will soon encrypt Instagram, making both platforms more dangerous than ever.

Child safety organizations and law enforcement agencies have strongly condemned this move, warning that 70 to 90 percent of Meta’s CSAM reports will become virtually invisible once encrypted. Meta is enabling tens of millions of child sexual abuse images and videos to go undetected, putting children beyond the reach of help and their predators beyond the law.

At least Meta reported its CSAM. Alphabet (Google/YouTube), the world’s second largest social media company, made 2 million reports, accounting for just 7 percent of the total. Apple made a paltry 234 reports; to its credit, Apple did develop a way to scan for known CSAM in the cloud, but it quickly withdrew the tool after privacy advocates complained. Privacy is critical for protecting human rights, which are under new assaults everywhere. But the answer lies in developing a stronger and broader legal framework, not in settling for business practices that treat children as collateral damage.

The more Big Tech tries to evade responsibility for allowing its technology to be weaponized against children, the more the world pushes back.

Forty-one states are suing Meta over a “scheme to exploit young users for profit.” New York City and school districts in eight states are suing Meta and Alphabet for “fueling the nationwide youth mental health crisis.” California and Texas have passed online age verification laws. The proposed Kids Online Safety Act enjoys public and bipartisan Congressional support and would require companies to prevent or mitigate risks to children, including suicide, eating disorders and substance abuse. The new European Digital Services Act will make identifying, reporting and removing CSAM mandatory. The U.K. Online Safety Act aims to keep internet users, particularly children, safe from fraudulent and harmful content.

The investor red flags of regulatory, reputational and legal risk keep getting bigger for the tech industry. They say that “what gets measured gets managed.” But what happens when an industry refuses to measure or manage?

Shareholders continue to press these companies on child safety. A shareholder resolution at Apple (since withdrawn) focused on CSAM. Others still pending at Meta and Alphabet address child safety more broadly, including sexual exploitation, cyberbullying, mental health impacts and data privacy. All ask for data showing whether company policies and tools are actually reducing harm to children. It’s time Big Tech stopped pretending that “out of sight, out of mind” is a viable child safety policy.


Michael Passoff
CEO, Proxy Impact