Internet's Future 12 February 2019

Future Thinking: Alissa Cooper on the Technical Impact of Internet Consolidation

In 2017, the Internet Society unveiled the 2017 Global Internet Report: Paths to Our Digital Future. The interactive report identifies the drivers affecting tomorrow’s Internet and their impact on Media & Society, Digital Divides, and Personal Rights & Freedoms. While preparing to launch the 2019 Global Internet Report, we interviewed Alissa Cooper to hear her perspective on the forces shaping the Internet’s future.

Alissa Cooper is a Fellow at Cisco Systems. She has been serving as the Chair of the Internet Engineering Task Force (IETF) since 2017. Previously, she served three years as an IETF Applications and Real-Time (ART) area director and three years on the Internet Architecture Board (IAB). She also served as the chair of the IANA Stewardship Coordination Group (ICG). At Cisco, Cooper was responsible for driving privacy and policy strategy within the company’s portfolio of real-time collaboration products before being appointed as IETF Chair. Prior to joining Cisco, Cooper served as the Chief Computer Scientist at the Center for Democracy and Technology, where she was a leading public interest advocate and technologist on issues related to privacy, net neutrality, and technical standards. Cooper holds a PhD from the Oxford Internet Institute and MS and BS degrees in computer science from Stanford University.

The responses below are Cooper’s personal views and do not represent the views of the IETF.

The Internet Society: This year we’re focusing our Global Internet Report on consolidation in the Internet economy. We’re specifically investigating consolidation trends in the access, services, and application layers of the Internet, as well as consolidation trends acting vertically across layers (e.g., companies gaining dominance in more than one of the Internet’s layers). Have you noticed a trend in this regard?

Alissa Cooper: Yes, although I think it would be useful to develop more quantitative measures to demonstrate the trend over time.

If yes, how does this trend impact the IETF?

Standards development always has a strong and multifaceted relationship to market dynamics. From a technical perspective, trends toward consolidation have caused engineers in the IETF to consider the implications of their technical designs. If we design standardized communication protocols in certain ways, such protocols may be more likely to support distributed or decentralized infrastructure or services. Conversely, there are design choices we can make that could reinforce consolidation or the offering of certain services from a smaller and smaller number of large companies. This question about whether the building blocks that we design in the IETF are reinforcing the consolidation trend has come into sharper focus in recent times.

The consolidation trend also has the potential to affect who participates in the IETF and how those in the industry view the value of standardization. Larger, more prosperous companies tend to have a greater ability to support standardization work, which is often paid for out of R&D or innovation budgets. As the mix of companies that provide Internet infrastructure and services changes, the composition of IETF participants tends to change as well. That mix can also affect corporate standardization strategy. More dominant players might standardize if they perceive that it reinforces their own technology – or they might choose not to, if they perceive that it is unnecessary given their dominant position.

Could you tell us more about Hypertext Transfer Protocol Version 3 (HTTP/3) as an example of how digital dominance, or scale, can drive standard development?

HTTP/3 is a work-in-progress aiming to define how HTTP traffic will be carried over a new transport protocol, QUIC, which is also in development. Historically, there have been two main transport protocols that have seen wide deployment on the open Internet: the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP). Many other transport protocols have been designed and standardized over the decades, but few have seen wide deployment. Some of the reasons for this include the existence of equipment in the middle of the network that filters transport protocols it does not recognize, and the difficulty of getting support for new transport protocols into the many different operating systems running on Internet-connected devices today.
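
To make the operating-system barrier concrete, consider a minimal, illustrative Python probe that asks the local OS for a socket using SCTP (RFC 4960), one of those standardized-but-rarely-deployed transports. On many stock systems the request fails:

```python
import socket

# SCTP is a standardized transport protocol that, like many others,
# never achieved wide deployment on the open Internet. One barrier:
# every operating system must ship support for it. This probe simply
# asks the local OS for an SCTP socket and reports what happens.
try:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM,
                         socket.IPPROTO_SCTP)
except AttributeError:
    print("This Python/OS build does not even define IPPROTO_SCTP.")
except OSError as exc:
    print(f"OS refused an SCTP socket: {exc}")
else:
    print("SCTP sockets are available on this host.")
    sock.close()
```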

QUIC is designed specifically to overcome these barriers. QUIC traffic, including the metadata that identifies the protocol, is always encrypted, so networking equipment cannot filter it based on the metadata or the traffic content. And QUIC is built on top of UDP, so it does not require operating system modifications in order to be deployed. Thus QUIC and HTTP/3 are instructive if we look at what it takes to get a protocol deployed at scale on today’s Internet, which is truly a heterogeneous network of networks.
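
Because QUIC is carried in ordinary UDP datagrams, deploying it requires no kernel changes: any userspace program can exchange QUIC traffic using the UDP sockets every mainstream operating system already ships. The sketch below is illustrative only – the payload is a placeholder rather than a valid encrypted QUIC Initial packet – but from the network’s perspective the traffic is just UDP on port 443:

```python
import socket

# QUIC rides on UDP, so deployment needs only userspace code. A real
# client would place a fully formed, encrypted QUIC Initial packet in
# the payload; the bytes below are a stand-in to show the framing.
# (RFC 9000 requires client Initial datagrams to be at least 1200 bytes.)
placeholder_quic_initial = b"\x00" * 1200

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(2.0)
sock.sendto(placeholder_quic_initial, ("example.com", 443))
try:
    data, addr = sock.recvfrom(65535)
    print(f"Received {len(data)} bytes from {addr}")
except socket.timeout:
    print("No reply (expected: the payload is not a valid QUIC packet).")
finally:
    sock.close()
```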

QUIC began as an experiment at Google before it was brought to the IETF for standardization, and Google has a large deployment of its own pre-IETF version of QUIC. I think the fact that a company with such a large footprint on the web – both from the browser/mobile device side and the server side – was interested in standardizing this definitely caused others to become more interested in participating in the effort. But this is certainly not a case where a single large company has dominated the standards process; in fact, we have seen quite the opposite, with participation from dozens of organizations and individuals as well as major improvements to the IETF version of the protocol that are incompatible with Google’s existing deployed version.

Do you consider this a positive or negative example of digital dominance? If successful, could it allow a dominant browser provider to gain significant market power (as argued here)?

If QUIC and HTTP/3 deliver on their design goals – improving performance and security – then my expectation would be that all browsers and web servers that choose to implement them will reap those benefits. I think in general wider use of encryption at the transport and application layers is a positive development because it helps to protect end users’ data in transit to the sites and services to which that data was destined anyway. It creates an impetus to re-think how certain network management and measurement functions that previously required access to unencrypted data can work. This may require some re-engineering, but my hope is that it will not detract from the overall positive impacts of transport protocol evolution.

An IETF working document (or Internet-Draft), Considerations on Internet Consolidation and the Internet Architecture, was recently published. Can you tell us more of what’s being investigated and proposed?

The Internet Architecture Board (IAB) provides long-range technical direction for the Internet’s development. This draft document arose out of conversations that the IAB has had concerning consolidation on the Internet. We have tried to tease out the technical and economic factors that may be contributing to consolidation. This includes looking at the underlying security and privacy properties of networks and services and the evolution of content delivery. The draft currently poses questions and does not provide many answers. The IAB is continuing to discuss how we can learn more about why expectations for decentralized protocol deployment are or are not coming to fruition in practice.

Much of the public debate seems to focus predominantly on the negative implications of consolidation. Can you think of any positive aspects of consolidation?

Sure. In some cases larger entities can have faster, broader positive impacts on end users. Today, if one or a small handful of the largest web properties, content delivery networks, or email service providers chooses to deploy a new security technology or implement a performance-enhancing feature, those improvements can benefit millions or billions of users in short order. Furthermore, the ability of larger entities to collect more data about what is happening on the network can help improve the quality of the services they provide, for example by enhancing their ability to identify denial-of-service attacks or spam.

Is competition law the only solution to consolidation problems? If “code is law,” how can the technical community help prevent the potentially negative consequences of consolidation trends?

There are people in the technical community who are trying to identify the relationships between technical design choices and consolidation, but many of the drivers for the consolidation we are witnessing are rooted in business and economics, not purely in technology. I have always found Lessig’s articulation of how technical, economic, social, and legal influences reinforce one another to be a more compelling framework for understanding the environment in which technology exists than the more simplistic “code is law” tagline. When it comes specifically to dominance and market power, competition law is likely the most powerful tool available, and one whose application could yield both immediate and longer-term effects beyond anything achievable merely by shifting the design of the technical building blocks we specify in the IETF.

Is there hope, from a technical perspective, in data portability as a way of addressing consolidation concerns?

The main barriers to data portability are not technical ones. We already have a multiplicity of standardized ways to port data of all kinds between services; what is missing is the incentive or regulatory requirement to do so.
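
As a rough illustration of how low the technical bar already is, the sketch below (with hypothetical field names) exports a user’s records to JSON, a standardized, service-neutral format that any other service could ingest:

```python
import json

# Hypothetical user records held by one service. Serializing them to a
# standardized, self-describing format (here JSON) is all that is
# technically required for another service to import them; the hard
# part is the incentive, not the engineering.
contacts = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "Alan Turing", "email": "alan@example.org"},
]

with open("portable_contacts.json", "w", encoding="utf-8") as f:
    json.dump({"version": 1, "contacts": contacts}, f, indent=2)

# Any other service can now round-trip the same data.
with open("portable_contacts.json", encoding="utf-8") as f:
    imported = json.load(f)
assert imported["contacts"] == contacts
```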

What are your fears for the future of the Internet?

My biggest fear is that, as the Internet gets more deeply ingrained into society, it will be increasingly blamed for society’s ills. I believe in tackling problems at their source. At times that means deploying a technological solution or regulating how technology is used, but at other times it means regulating behavior or inspiring behavioral change.

What are your hopes for the future of the Internet?

The Internet has a long history of serving as an open, global platform for communication and human connection. My hope is that even as market dynamics shift, technology evolves, and geopolitics change, the fundamental properties that have made the Internet the most successful communications medium in human history will remain solid and flourish.

We’re getting ready to launch the 2019 Global Internet Report. Read the concept note.

Image ©IETF LLC

Disclaimer: Viewpoints expressed in this post are those of the author and may or may not reflect official Internet Society positions.
