Privacy 23 October 2025

The Risks Far Outweigh the Goals

On 9 October 2025, Natalie Campbell testified at Canada’s Standing Senate Committee on Legal and Constitutional Affairs on Bill S-209. The session did not allow time for the whole speech, so the full transcript is provided below. Her testimony is also available to watch online.

Good morning.

I’d like to thank the Senate committee for the invitation to appear on Bill S-209.

My name is Natalie Campbell. I am a mom of two kids, and that is my first job. I am also the Senior Director of North American Government and Regulatory Affairs at the Internet Society.

The Internet Society appreciates the efforts of Senator Miville-Dechêne and this committee to focus on child safety online.

We are a global organization—made up of many parents and caregivers of children—all working to make sure the Internet is for everyone.

That includes making sure everyone can have safe experiences online, especially children.

We do this by supporting community-driven solutions to bring fast, affordable Internet to some of the hardest to connect places in the world.

We also help policymakers who are trying to address important safety concerns understand how laws could unintentionally undermine our shared goals for a safer Internet, and even what the Internet needs to exist in the first place.

I think we all agree children shouldn’t be exposed to pornographic content, especially unintentionally.

Age checks can help protect children from unwanted or inappropriate online experiences. But how these age checks happen matters because all types of age checks have trade-offs, and none are immune to the risk of privacy breaches.

We are seeing a wave of these proposals globally with a wide array of methods—from requiring users to upload government IDs, to using facial scanning technology, to relying on third-party data brokers.

While these measures stem from a valid desire to protect children, they often create new and significant risks to their privacy, security, and even their ability to access the Internet.

Many of these risks are considered in S-209. We appreciate the work to improve the bill since S-210. The new version is not perfect, but it is clear the sponsors have been working hard to address stakeholder concerns.

The Internet Society has several points of concern with S-209, but given limited time, I will focus on two that require significant consideration.

Age Verification

First, the age verification requirement. As a mom, I am obviously very concerned about my kids’ safety online and understand my personal responsibility in this matter.

I also understand why I can’t trust just any online service with their personal data. This tension places a burden on parents and adds complexity to age verification policy.

I don’t want my kids tracked online, and I don’t want bad actors accessing their personal information, whether through a data breach or because those actors can see they are children.

This risk is already playing out before our eyes as more details emerge about the Discord breach involving government IDs and personal information used for age checks.

My own kids are gamers. While I was relieved that I had never let them have a Discord account—and trust me, they tried—my heart breaks for the youth and families who are at risk of having their personal information released on the dark web.

Regardless of what age verification methods are used, and whether or not they are done by third-party services, none are immune to data breaches. The harm of having your sensitive information leaked online is irreparable, and it carries risks that will follow children their whole lives.

So, how do we keep our kids safe AND protect their privacy and security online, and EVERYONE’s? Moms like me want this to be easier and safer to achieve. We need safeguards to keep children safe and safeguards to protect our sensitive information.

S-209 has considered some important safeguards. It narrowed the scope of entities required to do age verification.

We appreciate the effort to avoid putting that duty on services that operate important functions of the Internet’s infrastructure and aren’t always aware of what content flows through their networks. This will reduce the number of third-party services people have to trust with their most sensitive data simply to use the Internet.

However, the Bill’s well-intended exclusions are still too vague to provide legal certainty. They could still capture many Internet services involved in the basic functioning of the Internet and private communications. The Bill also needs protections ensuring that no covered entity is required to bypass, weaken, or break crucial security tools like encryption to comply with the law.

Encryption is our most powerful shield to protect privacy and security online, especially in the absence of modern privacy law in Canada.

Content Blocking

Second, the content blocking enforcement mechanism. The bill acknowledges that court orders to Internet service providers may result in over-blocking legitimate content, yet it provides no safeguards against this.

More troubling, however, is that it overlooks the broader risk of blocking Internet access options in hundreds of rural, remote, and underserved communities across Canada.

The definition of Internet service provider is too broad. Left as is, courts could easily issue blocking orders to coffee shops, community centers, public libraries, schools, and universities that provide public Internet access.

It could also target the dozens of small non-profit community networks—from the Arctic community of Ulukhaktok to Winnipeg’s North End Connect—that bring affordable connectivity to underserved communities across Canada.

Faced with a court order, any one of them would likely be forced to shut down, as most lack the expertise, resources, and financial means for expensive content blocking mechanisms.

And this would have a disproportionate impact on Indigenous communities that are already underserved or cannot afford Internet service options.

Furthermore, content blocking is ineffective and introduces security risks that undermine the goals of this Bill. Long gone are the days of video rental stores that hide adult content behind curtains.

Content blocking might seem like an obvious solution, but like the flimsy curtain at the video store, blocking methods like DNS and IP blocking don’t remove the content from the Internet.

The material would still be accessible to anyone determined to find a workaround, whether through dodgy sites and apps or through the many websites that simply wouldn’t comply with this bill.

Close

We all want kids to be safe online, but unfortunately this Bill undermines our ability to reach that goal. While it may end up blocking some kids from adult websites, the risks far outweigh the goals.

I look forward to your questions.
