Opposition to the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2023 (“STOP CSAM Act of 2023”)

10 May 2023

An Open Letter to the Senate Judiciary Committee

Dear Chairman Durbin, Ranking Member Graham, and members of the Committee:

On behalf of the Internet Society,[1] we write to express our strong concern that the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2023 would cause significant damage to the operations and even the viability of the Internet, and would seriously undermine security and privacy online.

Child sexual abuse and the distribution of child sexual abuse materials are both horrific crimes, whether committed online or offline. Curbing the spread of child sexual abuse material online is therefore a vitally important goal, and we welcome efforts to hold bad actors to account. However, the STOP CSAM Act would greatly weaken the security and privacy of all Internet users, putting the most vulnerable users (including children) at greater risk of harm. It would also create massive barriers to growth and innovation in the United States tech sector and threaten to damage the foundations of the Internet.

We have a range of concerns with the current draft of the STOP CSAM Act of 2023—many of which have also been raised by other organizations and experts—including unintended consequences for free expression and limits on law enforcement’s effectiveness in stopping the spread of CSAM. In this letter, however, we focus primarily on one provision that directly threatens the Internet: the broad scope and sweeping potential civil liability in the proposed amendments to 18 U.S.C. § 2255. Those amendments—found in Section 6 of the STOP CSAM Act—present an existential threat to the Internet and broadly undermine security and privacy in the online ecosystem. We believe that Section 6 of the STOP CSAM Act of 2023 must be removed or dramatically reworked.

Civil Liability Amendments in Section 6 of the STOP CSAM Act of 2023

Without the STOP CSAM Act’s proposed amendments, 18 U.S.C. § 2255 already allows victims of child abuse offenses to sue the perpetrators of those crimes for civil damages. Six years ago, Congress significantly enhanced the statute in the Protecting Young Victims from Sexual Abuse and Safe Sport Authorization Act of 2017, which, among many other changes, extended the statute of limitations for these civil claims. As described by one source of information for victims, the 2017 reforms “further empower[ed] victims of child pornography to hold offenders responsible not only in criminal court, but in federal civil court as well.”[2]

The STOP CSAM Act amendments would do two primary things. First, the amendments specify that any company, non-profit, or individual that operates any significant aspect of the Internet ecosystem, or that distributes software used in Internet communications, can be subject to lawsuits brought by victims of child abuse. Second, those entities and individuals could be held liable for being “reckless” in actions that “facilitate” or “promote” communications over the Internet. The bill would therefore put at legal risk a vast array of individuals, non-profits, small companies, and larger companies that contribute to the development or operation of software or services on which the Internet depends—even though those defendants might have no knowledge of the underlying criminal activity.

The STOP CSAM Act amendments target only the Internet ecosystem for liability. By analogy, if the amendments similarly targeted the offline world, package delivery companies could be sued for having unknowingly delivered CSAM, as could the software distributors who provided the package delivery routing software.[3] Plaintiffs in these hypothetical offline lawsuits might well not succeed, but the lawsuits themselves can cause significant harm to the defendants—especially (as is very common in the Internet ecosystem) if the defendants are small businesses that cannot afford to defend even a single lawsuit.

More specifically, Section 6 of the STOP CSAM Act of 2023 would amend 18 U.S.C. § 2255 to create new civil liability for “interactive computer services” and “software distributors” by, in part, removing Section 230’s applicability to civil lawsuits under Section 2255. Under this legislation, victims could sue a broad set of Internet infrastructure providers and software distributors (many of which have no knowledge of content) for “child exploitation violations” and/or “conduct relating to child exploitation.” In turn, “conduct relating to child exploitation” is defined as “the intentional, knowing, or reckless promotion or facilitation” of CSAM or related offenses, including the solicitation of minors.[4]

Additionally, the proposed amendments to Section 2255 discuss the use of encryption, making it likely that plaintiffs would be able to argue that the neutral provision of encryption services was in fact evidence that providers were acting “recklessly.” This presents a significant threat to the normal, secure operation of the Internet.

The STOP CSAM Act of 2023 would introduce civil liability for a huge array of Internet providers, software developers, and infrastructure providers that are foundational to the Internet. Compounding this, plaintiffs would need to show only that these entities were “reckless” for them to be held civilly liable. As a result, most of the Internet could be construed as open to civil liability for CSAM transmitted over the Internet. Faced with the prospect of civil liability for child sexual abuse material, the following scenarios are very plausible:

  • A distributor of open-source web hosting software that is later used by a CSAM website could potentially be sued under this law. Such software is ubiquitous and widely distributed, and it is unclear whether open-source software used in the operation of the Internet could be distributed without risk of liability under the proposed amendments to Section 2255.
  • Anyone—including the major app stores—who distributes software that permits or supports encrypted communications could be sued under the proposed amendments. The STOP CSAM Act would make distributing software that protects person-to-person communications a significant legal risk. This would impair all Americans’ ability to protect their communications, including victims of abuse who rely on secure communications to seek help. It could also make the distribution of “virtual private network” software very risky, essentially depriving Americans of an often-recommended tool to guard against cyberattacks and identity theft.
  • An Internet service provider that has no knowledge of the content of its customers’ communications could be sued for failing to block CSAM passing over its network, even if that traffic was encrypted such that the provider could not know it contained CSAM. Although some major ISPs might be able to withstand the potential liability, a great deal of Internet access in the United States is provided by thousands of small “mom and pop” ISPs in rural communities, and many of these companies would be driven into bankruptcy by even a single lawsuit brought by a CSAM victim.
  • Numerous other companies that provide essential infrastructure services on the Internet could also be sued for providing services or allocating Internet resources that are subsequently used—without the service provider’s knowledge—to “facilitate” the spread of CSAM. While some of these entities might prevail after months or years of litigation, they would still face the costs and threat of repeated litigation.

Faced with the threat of civil liability, stakeholders across the Internet ecosystem may feel forced to take drastic actions to demonstrate that they are not reckless with respect to CSAM. For many stakeholders this will be impossible. Software distributors, for example, can do nothing to eliminate this new risk of liability: if the software they distribute is later used for the promotion or facilitation of CSAM or the solicitation of minors, the distributor could be liable. Other stakeholders would likely be forced to undermine the existing security and privacy of their services, or to implement altogether new and insecure surveillance methods, in an attempt to shield themselves from liability. Network operators might resort to blocking secure traffic and introducing content scanning mechanisms that would both invade the privacy of all Internet users and slow down Internet traffic overall. The pressure of potential civil liability would threaten to undo some of the key security improvements made on the Internet in the last decade.

We recognize that, over time, courts might rule that some of these potential liabilities are too far removed from the distribution of CSAM for the provider of the software or services to be liable. But there are two risks implicit in relying on the courts. First, with the liability shield that Section 230 historically provided removed, some of those providers will inevitably face a long and costly—possibly bankrupting—path to prevailing. Second, it is risky to enact very broad legislative language and hope that the courts will ultimately do the right thing.

Undermining Cybersecurity

Faced with potential civil liability for the broad category of “conduct relating to child exploitation” under Section 6 of the STOP CSAM Act, interactive computer service providers will likely be forced to undermine the security and privacy of their services and products to avoid being found reckless and civilly liable. As a result of the changes they would make to their systems to avoid liability, all Internet users, including children, would be less safe online. Far too many children are exploited and abused not by strangers, but by people who are close to them. These children need the protections of strong encryption and high security precisely so that they can get the help they need to escape their abuse.[5] This legislation, as drafted, would unfortunately and ironically increase the security and safety risks to these victims and to all children in the United States.

To safeguard themselves from potential civil liability, interactive computer service providers may attempt to implement techniques to scan and moderate all user content on their platforms or infrastructure. For some providers, introducing such scanning will be technically impossible without greatly undermining security and privacy.

The STOP CSAM Act would pose a severe threat to the use of encryption. Strong encryption is critical to ensuring Internet users are protected online, and end-to-end encryption offers the strongest level of protection. The consensus among cybersecurity experts is clear: forcing end-to-end encrypted communications services to scan their users’ messages is impossible without undermining the security and privacy of all their users.[6] Opening a backdoor for scanning also opens a backdoor for criminals to steal our most important information. Not only would our personal medical and banking information be less safe, but the personal information of society’s most vulnerable would also be at greater risk. For children online, this would make it easier for predators and criminal organizations to access children’s private messages, potentially helping them gather and share information to better engage in grooming. At the same time, encryption software will remain widely available—for free download—around the world. Sophisticated criminals are adaptable, including those who produce or distribute CSAM. They will use encrypted software and services, and will also move to secure services outside of the United States, putting them farther beyond the reach of U.S. law enforcement.

New Barriers for the US Technology Industry

Introducing broad civil liability for interactive computer services and computer software distribution services will create insurmountable barriers for the technology industry in the United States—including innovators and other small companies. It will not only undermine the ability of new US companies to grow or compete in overseas markets but will also limit investment in the US technology sector.

For smaller interactive computer services and computer software distribution services, merely litigating a civil lawsuit enabled by this legislation could be untenable. Small companies could be driven into bankruptcy by a single court case, through both legal and reputational costs. These concerns could dissuade entrepreneurs from launching technology start-ups in the United States in favor of other jurisdictions. Investors may also be less willing to back new ventures, which would be particularly vulnerable to litigation risks that could sink a young company.

For interactive computer services that attempt to protect themselves from civil liability by undermining the security and privacy of their services, the economic damage could also be severe. Weakened security can leave American businesses more exposed to costly cyberattacks and intellectual property theft. With less secure products, these businesses will also face challenges in foreign markets. When Australia passed a law undermining end-to-end encryption in 2018, the Australian digital industry lost an estimated AU$1 billion in current and forecast sales, and suffered further losses in foreign investment, because of decreased trust in its products.[7]

Conclusion

Beyond the issues discussed above concerning Section 2255, the bill raises other concerns, including:

  • Section 2260B of the proposal would create a criminal provision that raises concerns very similar to those discussed above regarding Section 2255. The criminal risks posed by Section 2260B are greater than those under Section 2255—with the qualification that federal prosecutors are likely to be far more discerning and careful in their charging decisions than private plaintiffs and their attorneys deciding whether to bring lawsuits under Section 2255.
  • As with other parts of the STOP CSAM Act, the new “report and remove” process raises significant free expression issues, and it also raises concerns—similar to those under Section 2255—about Internet infrastructure providers facing numerous legal challenges over content they do not control.

The STOP CSAM Act aims at a very positive goal—allowing victims of child abuse to seek civil damages from perpetrators—and, depending on how it is achieved, the Internet Society could support expanding civil liability to reach those who knowingly support the distribution of CSAM. As drafted, however, the STOP CSAM Act would have the unintended consequences of decreasing security and privacy for everyone, undermining the competitiveness of the US technology industry, and destroying key foundations of the Internet. We strongly encourage the Senate Judiciary Committee to, at a minimum, remove Section 6 from the STOP CSAM Act and address related concerns elsewhere in the bill.

Sincerely,

John B. Morris, Jr., Principal, U.S. Internet Policy & Advocacy, Internet Society


Footnotes:

[1] Founded in 1992, the Internet Society is a U.S. non-profit organization, headquartered in Reston, Virginia, and Geneva, Switzerland, that works for the worldwide coordination of, and collaboration on, Internet issues, standards, and applications. As a global non-governmental organization, the Internet Society believes that the Internet should be for everyone. It supports and promotes the development of the Internet as a global technical infrastructure, a resource to enrich people’s lives, and a force for good in society, with the overarching goal that the Internet be open, globally connected, secure, and trustworthy. The Internet Society’s staff is composed of technical experts in internetworking, cybersecurity, and network operations, among other fields, as well as policy experts in a broad range of Internet-related areas.

[2] Child Pornography Victims Deserve Justice, “Section 2255.”

[3] In this offline hypothetical, a victim might assert that a package delivery company should have refused to deliver any packages to a neighbor who was on the sex offender list.

[4] The very use of the terms “promote” and “facilitate” raises significant constitutional concerns. See Elizabeth Nolan Brown, “Appeals Court Panel Seems Skeptical That FOSTA Doesn’t Violate the First Amendment,” Reason, January 18, 2023.

[5] Child Rights International Network and defenddigitalme, “Privacy and Protection: A children’s rights approach to encryption,” 2023, p. 87.

[6] Abelson, Hal, et al., “Bugs in our Pockets: The Risks of Client-Side Scanning,” 2021.

[7] Internet Society, “New Study Finds Australia’s TOLA Law Poses Long-Term Risks to Australian Economy,” June 2021.