The Wrong Solution to a Complex Problem

The EARN IT Act was reintroduced in Congress last Monday, with the promise that it would end Internet platforms’ “blanket immunity” for the “tens of millions of photos and videos” of child sexual abuse that they allow to circulate online. With the bill already scheduled for a committee hearing, it’s on track to be passed quickly. And why shouldn’t it be, if its sponsors’ claims about it are true? Perhaps because they’re not true.

Section 230, the 1996 federal law that shields tech companies from liability for user-uploaded content, doesn’t apply to child abuse content. Tech companies are already federally liable for such content unless, upon discovering it, they report it to NCMEC, the national child abuse reporting clearinghouse. We don’t know how much unique content is being reported, because NCMEC is not subject to freedom of information laws. But according to Meta, which analyzed its own reports to NCMEC over two months of 2020, about 9 out of every 10 were duplicates of previously reported content, and a mere six videos accounted for more than half of its reports.

Without knowing the true extent of the problem, it’s difficult to dictate solutions. But the EARN IT Act does so anyway. A “Myth vs Fact” document released by its sponsors name-checks a Microsoft technology called PhotoDNA as a gold standard that tech companies should be using to detect known child abuse content on their platforms. While major social media companies do use PhotoDNA, Microsoft has refused to license it to many smaller Internet platforms. It’s also an aging technology that hackers recently reverse-engineered, which could enable an attacker to “poison” innocuous images, potentially triggering false reports of child abuse against an innocent person.
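
PhotoDNA’s algorithm is proprietary, but the family of techniques it belongs to, perceptual hashing, is well understood. The minimal sketch below uses dHash, a far simpler perceptual hash than PhotoDNA, purely to illustrate how hash-based matching of known images works; the function names, parameters, and thresholds here are illustrative assumptions, not anything PhotoDNA actually uses.

```python
# Illustrative only: a simple "difference hash" (dHash), not PhotoDNA,
# whose algorithm is proprietary and considerably more robust.
from PIL import Image


def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit perceptual hash of an image file."""
    # Shrink to a tiny grayscale grid so the hash survives resizing,
    # re-encoding, and other minor alterations.
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    # Each bit records whether a pixel is brighter than its right-hand
    # neighbor, capturing the image's gradient structure.
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")


# A platform would compare each upload's hash against a database of
# hashes of known abuse images, flagging anything within a small
# Hamming distance of a database entry.
```

The design goal, matching images despite small alterations, is also the weakness that the reverse-engineering work exposed: if an attacker can compute the hash, they can search for perturbations that make an innocuous image collide with a flagged one.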

For all its faults, PhotoDNA is the devil we know. The EARN IT Act could also see platforms required to adopt invasive and experimental AI-based tools to detect previously unknown child abuse content, as well as attempts at child grooming. We already have an idea of what these tools look like, because Europe recently went through a process of evaluating them. One of these tools, which aimed to detect grooming in text conversations, was ultimately declared incompatible with European privacy law, because its use amounted to warrantless interception of private communications. Yet as recently as last month, European Commissioner Ylva Johansson continued to push Silicon Valley to adopt these tools. In December, Google’s automated scanning tools were reported to have flagged an established artist’s artwork as child abuse content. What other innocent content might these experimental tools flag for the attention of investigators? Selfies sent between teens? Fantasy stories exchanged by adult lovers? We simply don’t know.
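
Part of the answer is a matter of arithmetic. The sketch below uses entirely hypothetical numbers (no vendor or study has published accuracy figures of this kind for these tools) to show why a classifier with even a tiny error rate, applied to billions of private messages, would flood investigators with false reports:

```python
# Back-of-the-envelope sketch with purely hypothetical numbers; nothing
# here reflects the measured performance of any real scanning tool.
daily_messages = 10_000_000_000  # hypothetical messages scanned per day
prevalence = 1e-6                # hypothetical share that is truly abusive
recall = 0.9                     # hypothetical: 90% of abuse is caught
false_positive_rate = 0.001      # hypothetical: 0.1% of innocent messages flagged

true_hits = daily_messages * prevalence * recall
false_alarms = daily_messages * (1 - prevalence) * false_positive_rate

print(f"correctly flagged per day: {true_hits:,.0f}")     # 9,000
print(f"innocent messages flagged: {false_alarms:,.0f}")  # ~10,000,000

# Because abusive content is rare relative to the volume scanned, false
# alarms outnumber real detections by roughly a thousand to one, even
# with optimistic error rates. This is the base-rate problem.
```

Whatever the real numbers turn out to be, flagging at this scale feeds directly into the reporting backlog discussed below.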

All of this aside, the problem that the EARN IT Act addresses—insufficient reporting of abuse content by tech companies—isn’t a problem at all, but quite the reverse. NCMEC itself acknowledges that of the reports that it receives now, “many are never investigated by law enforcement due to a lack of resources.” Incentivizing Internet companies to send even more reports will only make this problem worse. While the EARN IT Act would do nothing to increase law enforcement’s capacity to cut through this backlog of child abuse crimes, an alternative bill, the Invest in Child Safety Act, sponsored by Section 230 author Senator Ron Wyden, would. The $5 billion that Wyden’s bill allocates would go not only towards the investigation and prosecution of such crimes, but also towards preventing child sexual abuse in the first place—an endeavor that is both chronically underfunded and widely misunderstood.

Child sexual abuse is an intolerable crime. But it isn’t only children who are harmed by it. Another insidious evil of this crime is the way our emotional response to it clouds our judgment about how best to combat it. The EARN IT Act puts forward a snake oil solution to a complex problem. Tech companies don’t hold the keys to ending child sexual abuse, and incentivizing them to increase their surveillance of our private communications won’t make kids safer. Much more can be achieved by addressing the backlog of existing child abuse cases and, even more importantly, by investing in the prevention of child sexual abuse to begin with.


