By ALBERT RONG
COVID-19 has sent society into disarray as people adjust their daily lives to a new reality. Under this cloak of chaos, Senators Lindsey Graham (R-SC) and California’s very own Dianne Feinstein (D-CA), among many others, have cosponsored the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019 (EARN IT Act), with the stated purpose to “prevent, reduce, and respond to the online sexual exploitation of children”. In practice, however, the legislation would destroy end-to-end encryption, a system that ensures only the communicating users can read messages, not the government or the platform the messages are sent through. This would be accomplished through encryption backdoors, allowing the government to decrypt encrypted messages in order to investigate and prevent child exploitation. Such a backdoor would most likely take the form of a key escrow, in which the government generates and distributes encryption keys to companies while holding onto the decryption (master) keys. Yet the proposed legislation never mentions encryption, so how would the EARN IT Act break it?
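The difference between end-to-end encryption and key escrow can be sketched in a few lines of code. The example below is a toy model, not a real cryptosystem: XOR with a SHA-256-derived keystream stands in for a real cipher such as AES, and the "escrow copy" simply models a third party holding the same key. Nothing here comes from the bill's text; it only illustrates why escrow breaks the "only the endpoints can read it" guarantee.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustration only; a real system would use a vetted cipher (e.g. AES-GCM)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(x ^ y for x, y in zip(data, stream))

# End-to-end encryption: only the two endpoints hold the session key.
session_key = secrets.token_bytes(32)
ciphertext = keystream_xor(session_key, b"hello")

# Under key escrow, a third party (here, the government) also holds a
# copy of the key, so it can decrypt without either endpoint's consent.
escrow_copy = session_key
assert keystream_xor(escrow_copy, ciphertext) == b"hello"
```

The point of the sketch: the escrowed key is mathematically identical to the users' key, so whoever holds the escrow database can read every message it covers.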
According to Section 230(c)(1) of the Communications Decency Act (CDA), “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”. This protects companies from liability when users of their services upload illegal content. Without these protections, companies would be forced to overcensor content on their platforms to prevent any illegal content from surfacing, or risk going bankrupt from a multitude of civil lawsuits. The EARN IT Act requires companies to follow “voluntary principles” to maintain this immunity from lawsuits, essentially forcing them to comply. In an interview with CNET, Lindsey Barrett, a staff attorney at Georgetown Law’s Institute for Public Representation Communications and Technology Clinic, said: “When you’re talking about a bill that is structured for the attorney general to give his opinion and have decisive influence over what the best practices are, it does not take a rocket scientist to concur that this is designed to target encryption”.
Attorney General William Barr’s history with end-to-end encryption has been contentious. At an event last October, Barr made it clear that dealing with the problems strong encryption creates for law enforcement was one of the Justice Department’s “highest priorities.” He reaffirmed this position after the Naval Air Station Pensacola shooting last December, when he pressured Apple to unlock the gunman’s phone. In response, Apple provided iCloud data related to the gunman’s account but upheld its policy on encryption, stating: “law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users’ data”. With Barr in control of the commission established by the EARN IT Act, he would almost certainly use the position to circumvent encryption measures established by companies like Apple.
The FBI’s main concern with end-to-end encryption is the “going dark” phenomenon, in which progress on criminal cases is hindered by a lack of access to real-time communications and data. The crux of the issue, according to FBI Director Christopher A. Wray, was the claim that “Being unable to access nearly 7,800 devices in a single year is a major public safety issue”. However, these numbers are greatly exaggerated. The FBI later revealed that an error in its testing methodology had severely inflated the count of locked-out devices, with internal estimates putting the true figure at around 1,200. According to Greg Nojeim, Director of the Freedom, Security and Technology Project at the Center for Democracy & Technology, “The report is a clear reminder that policymakers should take the FBI’s claims of going dark with a big grain of salt. This is the third time in three months that disclosures have undermined the FBI’s claims that it needs a mandated backdoor to encryption in cell phones and other devices.”
Now, why is this all a bad idea?
Point 1: Constitutional Rights
The EARN IT Act fails to stand up to strict scrutiny and violates rights established in the First and Fourth Amendments.
Strict Scrutiny & the First Amendment:
Strict scrutiny is a form of judicial review that courts use to determine the constitutionality of certain laws. Strict scrutiny is often used by courts when a plaintiff sues the government for discrimination. To pass strict scrutiny, the legislature must have passed the law to further a “compelling governmental interest,” and must have narrowly tailored the law to achieve that interest.
The EARN IT Act satisfies the first condition, furthering a “compelling government interest” in reducing the online sexual exploitation of children. It fails the second condition, however, as the law is not “narrowly tailored... to achieve that interest,” as a coalition of civil society groups noted in a letter to Lindsey Graham:
The recommended best practices must include measures meant to address the problem of “child sexual exploitation.” This term will likely be interpreted in an overly broad manner that would lead to best practices that incentivize impermissible censorship of protected speech alongside efforts to restrict CSAM. This would present service providers of all sizes with a “choice” to either follow government-issued best practices or face liability—thereby violating the First Amendment’s protections for free expression.
The Fourth Amendment:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Currently, platforms are free to filter all content posted on their platforms, as Fourth Amendment protections do not extend to private companies, only to the government and its agents. The EARN IT Act would extend the federal government’s control over companies and transform them into “agents” of the government, extending those constitutional protections to their users. Companies would then be unable to fulfill their obligation to “Prevent[ ] searches of child sexual abuse material from surfacing,” listed in the voluntary principles, as under their newfound classification sitewide content filtering would constitute the “unreasonable searches” barred by the Fourth Amendment.
Point 2: Government Responsibility
The government’s history regarding information security has been plagued by overreach, negligence, and abuse.
In the Office of the Director of National Intelligence’s 2019 Statistical Transparency Report, it was reported that the NSA had collected 434,238,543 call records to investigate just 11 targets. The bulk collection of Americans’ call records was outlawed by the 2015 USA Freedom Act, yet the NSA clearly continued this illegal practice. According to the New York Times, “The N.S.A. blamed the incident on an unidentified telecommunications provider, saying that agency technicians had noticed a problem with the data that the company was sending, and stopped accepting the information in order to fix it.” Patrick Toomey, a staff attorney with the ACLU, criticized the NSA’s bulk collection, saying:
This surveillance program is beyond redemption and a privacy and civil liberties disaster… The N.S.A.’s collection of Americans’ call records is too sweeping, the compliance problems too many, and evidence of the program’s value all but nonexistent. There is no justification for leaving this surveillance power in the N.S.A.’s hands.
Oversight regarding collected data has also been called into question by Edward Snowden, an American whistleblower who caught national attention when he leaked classified information regarding the bulk data collection capabilities of the NSA. In an interview with The Guardian back in 2014, he recollected his experiences at the NSA:
You’ve got young enlisted guys 18-22 years old. They’ve suddenly been thrust into a position of extraordinary responsibility where they now have access to all your private records. During the course of their work, they stumble across something that is completely unrelated to their work in any sort of necessary sense, for example, an intimate nude photo of someone in a sexually compromising situation. But they’re extremely attractive. So what do they do? They turn around in their chair and show a coworker who says, ‘Hey that’s great. Send that to Bill down the way.’ Then Bill sends it to George, who sends it to Tom.
With this blatant disregard for privacy, Snowden’s experiences reveal the lack of oversight and accountability within the NSA. His claims are further corroborated by a classified internal NSA report detailing widespread failures to follow standard operating procedures.
The EARN IT Act will significantly expand the surveillance capabilities of government agencies which have shown themselves to be incapable of protecting and responsibly using citizens’ data.
Point 3: Backdoors
“You Can’t Have A Back Door That’s Only For The Good Guys”
-Tim Cook (Apple CEO)
The government and its law enforcement and intelligence agencies have long maintained a stance in support of “strong encryption” while paradoxically pushing for the widespread adoption of encryption backdoors. Contrary to the FBI’s position, there is no “balanced” approach to encryption: once a backdoor is established, it is a security vulnerability in the system.
A master key could easily be leaked internally. According to the Fiscal Year 2017 Annual Report on Security Clearance Determinations by the National Counterintelligence and Security Center, 1,309,793 government contractors, employees, and other workers were eligible for “top secret” security clearance in 2017, with 1,194,962 of them holding immediate access. In total, 4,030,625 were eligible for a security clearance at some level, constituting around 1.2% of the US population. With such a large number of people able to access highly confidential information, the risks to national security are massive. In a letter to NSA Director Adm. Michael Rogers in 2016, Senator Jon Tester (D-MT) expressed his discontent with the security clearance process in response to the arrest of Harold T. Martin III, a contractor who stole approximately 50 terabytes of data from the NSA, stating:
Just as Chelsea Manning, Edward Snowden, and Navy Yard shooter Aaron Alexis exposed the severe vulnerabilities in our security clearance system, it is important that we find out how best to shore up these gaps in order to preserve American security… I am concerned that the process of [Martin’s] vetting – or lack thereof – may have once again exposed the vulnerabilities of our security clearance process to insider threats and, absent serious reforms, will continue to haunt U.S. intelligence agencies for years to come.
Top-level security access is liberally given to many employees, with Edward Snowden and Chelsea Manning relying on this top-level access to leak confidential information to journalists. The creation of a master key could have catastrophic worldwide consequences if it was leaked by one person within the vast sea of carelessly vetted employees with top-level security clearance.
Master keys would also be vulnerable to external attacks. In 2016, the anonymous group “Shadow Brokers” released, as an easily accessible download, several confidential NSA-developed software exploits targeting Windows systems. One of them, EternalBlue, was later used in the WannaCry ransomware attack of 2017, which is estimated to have affected more than 200,000 computers across 150 countries and crippled the NHS in the UK by locking doctors out of their patient records. EternalBlue shows us that it is not a matter of whether backdoors will be exposed, but when.
Point 4: Efficacy
The voluntary principles stated within the EARN IT Act seek to achieve its stated purpose to “prevent, reduce, and respond to the online sexual exploitation of children”. Yet these principles could be counterproductive. As stated in Point 1, the EARN IT Act transforms companies into agents of the government, giving their users Fourth Amendment protections over data scanning and collection. This would make convicting child sexual abusers much more difficult, as evidence collected through platforms could be considered illegally obtained and excluded under the exclusionary rule derived from the Fourth Amendment.
Addressing Concerns on Platform Accountability
Although Section 230 protections shield companies from liability in most cases, there are exceptions: a company that facilitates and profits from federal crimes is not granted immunity. Passed in 2018, the “Allow States and Victims to Fight Online Sex Trafficking Act” (FOSTA) made it illegal for websites to knowingly support sex trafficking. Although FOSTA has several issues of its own, as described by the EFF (a topic for another time), the point stands: companies do not have free rein to host illegal content and can be charged if willful negligence is proven.
As stated earlier, companies are proactively moderating content on their platforms without laws compelling them to do so. Their support of end-to-end encryption is not incompatible with the fight against child exploitation. Most platforms have implemented their own content filtering systems which include scanning both encrypted and unencrypted content for signs of unlawful content.
Widely used across the tech industry, Microsoft’s PhotoDNA can be adapted to end-to-end encrypted systems by comparing digital signatures of user content against a database of hash values of known child exploitation images.
There are currently two techniques that can be used to implement PhotoDNA in end-to-end encrypted systems:
- Data could be scanned client-side before encryption, ensuring that only offending content is blocked while maintaining the privacy of end-to-end encryption for legal content.
- According to Hany Farid, a leading developer of PhotoDNA, “Certain types of encryption algorithms, known as partially or fully homomorphic, can perform image hashing on encrypted data. This means that images in encrypted messages can be checked against known harmful material without Facebook or anyone else being able to decrypt the image.”
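The first technique, client-side scanning, can be sketched as follows. This is an illustrative assumption-laden toy, not PhotoDNA itself: PhotoDNA computes a proprietary perceptual hash that also matches resized or lightly edited copies, while SHA-256 here only matches exact bytes, and the blocklist contents and `encrypt` callback are hypothetical placeholders.

```python
import hashlib
from typing import Callable, Optional

# Hypothetical blocklist of hashes of known abusive images. A real
# deployment would hold PhotoDNA perceptual hashes supplied by NCMEC;
# SHA-256 is a stand-in that only matches byte-identical files.
KNOWN_HASHES = {
    hashlib.sha256(b"<known illegal image bytes>").hexdigest(),
}

def scan_then_encrypt(image: bytes,
                      encrypt: Callable[[bytes], bytes]) -> Optional[bytes]:
    """Client-side scanning: hash the image *before* encryption and
    refuse to send it if it matches the blocklist. Legal content is
    encrypted end-to-end as usual, so its privacy is preserved."""
    if hashlib.sha256(image).hexdigest() in KNOWN_HASHES:
        return None  # blocked: matched known abusive content
    return encrypt(image)

# Placeholder "encryption" (byte reversal) just to show the flow:
sent = scan_then_encrypt(b"vacation photo", lambda m: m[::-1])
blocked = scan_then_encrypt(b"<known illegal image bytes>", lambda m: m[::-1])
assert sent is not None and blocked is None
```

The design point matches the bullet above: the match happens on the client before encryption, so the server never needs a key to the conversation, and only blocklisted content is affected.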
The database for child exploitation content is the bottleneck within this process as unfiled child exploitation images would pass through filters. To help alleviate this bottleneck, Google has developed an AI system that helps speed up the classification of child exploitation content by a reported 700%. Continued development of indexing technologies is vital in the fight against child exploitation.
Think of the children!
Politicians have made sweeping accusations directed towards companies using encrypted systems, with Senator Lindsey Graham delivering an ultimatum at a U.S. Senate Judiciary Committee hearing last December:
You’re going to find a way to do this or we’re going to do this for you. We’re not going to live in a world where a bunch of child abusers have a safe haven to practice their craft. Period. End of discussion.
Although content filtering is more effective in non-encrypted systems, the claim that end-to-end encryption creates a safe haven for abusers is patently untrue. Companies like WhatsApp have successfully implemented end-to-end encryption alongside PhotoDNA; working from user reports, WhatsApp banned approximately 250,000 accounts within a span of three months last year. Numerous other companies have implemented PhotoDNA in their encrypted systems with similar success.
A Proposed Solution
Reports of child sex abuse have skyrocketed while arrests and federal funding for the relevant divisions have stagnated in comparison. These statistics suggest that federal agencies lack the funding and resources to properly investigate reported crimes. As it stands, the EARN IT Act does not include any provisions assisting task forces around the country with either funding or resources. Increased federal funding for law enforcement would be a more effective solution than the voluntary principles found in the EARN IT Act.
It may be that by seizing all of the records of private activities, by watching everywhere we go, by watching everything we do, by monitoring every person we meet, by analyzing every word we say, by waiting and passing judgment over every association we make and every person we love, that we could uncover a terrorist plot or we could discover more criminals. But is that the kind of society we want to live in? That is the definition of a security state.
- Edward Snowden
The EARN IT Act is an underhanded attempt to stop end-to-end encryption by leveraging the sexual exploitation of children. The legislation provides no substantive support in the fight against child exploitation while greatly increasing the government’s potential surveillance capabilities. It attempts to circumvent constitutional protections, its cosponsors either grossly negligent of the implications or willfully malicious. The heavy-handed solutions it proposes are a threat to civil liberties and should be stopped at all costs.
Relevant Donation Links: