There is a lot of misinformation being spread by the technology industry regarding the EARN IT Act, reintroduced on January 31, 2022, by Senators Richard Blumenthal and Lindsey Graham along with a bipartisan group of other Senators.
In the face of an explosion of child sexual abuse material (CSAM, aka child pornography) being distributed and consumed online, the EARN IT Act holds technology companies accountable if they aid and abet its distribution or consumption.
The National Center for Missing & Exploited Children (NCMEC) is a strong supporter.
Big Tech opposes the EARN IT Act because it wants to preserve the blanket legal immunity U.S. courts have said digital platforms currently have. The problem is, Big Tech has not earned the right to immunity.
As many in Silicon Valley attempt to protect massive profits at the expense of the safety and well-being of children, here’s what you need to know about the EARN IT Act and the solution(s) it proposes to address the problem of online child sexual abuse.
In the past ten years, the circulation of child sexual abuse material (CSAM) online has grown astronomically. According to a months-long New York Times (NYT) investigation:
In 2008, over 600,000 images and videos of CSAM were reported to NCMEC, a volume already being called an epidemic.
In 2019, 70 million CSAM images and videos were reported, an increase in criminal behavior the NYT described as "almost unfathomable."
CSAM has so overwhelmed law enforcement that the FBI now prioritizes only material depicting infants and toddlers, all but ignoring the sexual abuse of "older" children.
In 2020, NCMEC received 21.7 million reports of CSAM (a single report can contain many images and videos), the highest number of reports it had ever received in a year.
The platforms on which CSAM circulates (Instagram, Twitter, and others) are not required by law to proactively search for it. Additionally, Section 230 of the Communications Decency Act (CDA 230) gives near-blanket legal immunity to technology companies: because they are not considered "publishers" under the law, they cannot be held liable for facilitating CSAM.
CDA 230 was passed in 1996. It’s time for a change.
With NO INCENTIVE to report or combat CSAM, why would technology companies ever proactively step up when inaction is more profitable?
This is why the technology industry must be held accountable. This is why the EARN IT Act is important.
The EARN IT Act empowers victims of CSAM by giving them a path to justice and the possibility of restoring the privacy lost as imagery of their rape, assault, and sexual abuse is endlessly disseminated online.
Under this new legislation, survivors and state attorneys general will be able to sue technology companies that facilitate CSAM under federal civil law as well as state civil and criminal law.
The EARN IT Act also changes the term "child pornography" in federal statute to "child sexual abuse material" in order to convey the fact that each image or video documents a crime scene.
In addition, the EARN IT Act creates a new National Commission on Online Child Sexual Exploitation Prevention. The commission will establish best business practices and make recommendations to inform policy, the judiciary, and the law enforcement community about protecting children in the ever-changing digital environment.
With the goals of preventing sex trafficking, grooming, and predatory behavior, the commission will explore options for protecting children online, such as family-friendly filtering systems.
That’s why groups such as the Electronic Frontier Foundation and companies like Google oppose it: the EARN IT Act has the teeth to push technology companies to proactively clean up their own platforms, detect CSAM, and eliminate it.
Please call and write your state’s members of the U.S. Senate and House of Representatives. Ask them to CO-SPONSOR and VOTE FOR this bill.
It only takes a few minutes, and hearing directly from constituents makes a big difference in convincing elected officials to stand up to Big Tech lobbyists and protect our children.