Internet's Future 7 December 2018

Future Thinking: Orla Lynskey on Data in the Age of Consolidation

Last year, the Internet Society unveiled the 2017 Global Internet Report: Paths to Our Digital Future. The interactive report identifies the drivers affecting tomorrow’s Internet and their impact on Media & Society, Digital Divides, and Personal Rights & Freedoms. We interviewed Orla Lynskey to hear her perspective on the forces shaping the Internet’s future.

Orla Lynskey is an associate professor of law at the London School of Economics and Political Science. Her primary area of research interest is European Union data protection law. Her monograph, The Foundations of EU Data Protection Law (Oxford University Press, 2015), explores the potential and limits of individual control over personal data, or “informational self-determination,” in the data protection framework. More recently, her work has focused on collective approaches to data protection rights and mechanisms to counterbalance asymmetries of power in the online environment. Lynskey is an editor of International Data Privacy Law and the Modern Law Review and is a member of the EU Commission’s multistakeholder expert group on GDPR. She holds an LLB from Trinity College, Dublin, an LLM from the College of Europe (Bruges) and a PhD from the University of Cambridge. Before entering academia, she worked as a competition lawyer in Brussels and as a teaching assistant at the College of Europe.

The Internet Society: You recently edited a symposium edition of International Data Privacy Law (IDPL) in which you argue that the interplay of law related to data protection, competition, and consumer protection is at a crucial crossroads. Why, and how does this play out in the Internet domain?

Orla Lynskey: These areas of law are at a crossroads in two senses. First, regulators increasingly recognise that these areas of law do overlap in some circumstances. A good example of this is the reference in the Microsoft/LinkedIn merger decision to data protection as a parameter on which firms compete, or the German Competition Authority’s investigation into the claim that Facebook is abusing its position of market power by making access to its service conditional on excessive data collection across third-party websites. However, we are also at a crossroads in a second sense: having recognised that these areas of law need to be applied in a holistic manner, we now need to consider, from a practical and procedural perspective, how this overlap can be managed.

You’ve written elsewhere that digital consolidation can have an effect on digital inequality by giving platforms not just market power, but also the “power of providence.” What do you mean by this and how does it impact marginalised communities in particular?

Providence is defined in various ways, including as a form of non-human influence that controls people’s lives. I argue in the paper that dominant digital platforms have a “power of providence,” as they are – like the eye of providence – all-seeing: they have the ability to link and analyse diverse datasets in a way that provides a comprehensive overview of the lives of individuals, rendering them transparent in the process. Furthermore, they can use this unique vantage point in order to influence individuals in ways that we might until now have viewed as dystopian, for instance through personalised political advertising. Finally, the Internet’s architecture and the terms used to describe its processes (for instance, “machine learning”) give the false impression that the way in which our data is used to influence us online and nudge us in particular directions is untethered from human input, or is “neutral.” In this sense, it is given a quasi-divine status.

I suggest that this power of providence can have the particularly pernicious effect of exacerbating existing societal inequalities. I argue in the paper that this ability to use data to influence people can be used to discriminate, to differentiate and also to create perceptions. For instance, I was able to draw on the work of other scholars to indicate that data mining facilitates differentiation on the basis of socioeconomic status, which is not something that discrimination law prohibits. This research suggests that the poor are subject to more surveillance with higher stakes and are particularly vulnerable to data mining processes as a result of the devices they use to connect to the Internet (notably mobile phones, which are less secure than other devices). While differentiation via data mining is not the sole purview of platforms with such power, their privileged position gives them superior data-mining capacity and means that existing information and power asymmetries are exacerbated.

Can competition law challenge the power of providence? What about data protection law? How can these work together to protect digital rights?

Competition law is the only body of law explicitly designed to constrain the exercise of private power, so it makes sense to consider whether it can be of assistance in challenging this power of providence. I believe that, at a minimum, competition law should not make matters worse by, for instance, facilitating data-driven mergers that further consolidate our data in the hands of a very limited number of private actors. However, in some circumstances competition law could also limit abusive behaviour – for instance, exploitative terms and conditions for data usage – by firms with market power.

That said, competition law has its own limits and should only ever be a part of the overall jigsaw puzzle, with data protection law playing a leading role in regulating how our personal data can be used. To date, EU data protection law has not been robustly enforced, but I am one of those who remain optimistic that with stronger enforcement this system could be really effective.

If data protection, consumer protection, and competition law are all important in challenging harmful digital dominance, how do the different regulatory agencies responsible for dealing with these respective issues work together without encroaching on each other’s domains? Is there a need for better multistakeholder collaboration in this regard?

It is this question – of the division of labour between regulatory authorities – that has yet to be really ironed out. Ideally, as the European Data Protection Supervisor has proposed, these agencies would collaborate with one another under the auspices of a “Digital Clearing House,” or something similar.

Germany recently announced plans to try to curb digital dominance using competition law. Have you noticed any trends when it comes to other competition authorities’ responses to tech dominance around the world, and particularly how they are defining relevant markets?

There is definitely a growing recognition of the power of technology companies amongst regulators and the wider public. This may be where competition law hits its limits, however: competition law provisions do not prevent a company from acquiring a position of market power; they simply make it unlawful for that company to abuse that position in a way that is exploitative or that would exclude equally efficient competitors from the market. Economic regulation could, for instance, force tech companies to ensure structural separation between various operations (e.g., between Facebook and WhatsApp). However, this would require legislative intervention.

The exception to this is in the context of mergers, where competition authorities get to look at the potential future impact of a transaction on the market. Here, I have argued in the past that data-driven mergers should be treated in an analogous way to media mergers and subject not only to an economic assessment but also to a broader non-competition assessment to gauge their impact on data protection and privacy. This is one of the ideas being considered in Germany and I think it is likely other competition authorities will introduce similar measures in due course.

What do you think of the idea that user data should be given digital property rights (i.e., that platforms should pay users for their data)?

Property rights in personal data are a terrible idea: they offer no real advantages compared to the current legal framework and risk exacerbating information and power asymmetries while undermining data protection as a fundamental right. Giving property rights in data would not strengthen our hand when it comes to negotiating with the tech giants; rather, it would simply mean that we would lose all rights over that data once we entered into contracts with these companies. I also worry that going down this route would make data protection a luxury enjoyed only by those who can afford not to have their data processed, and might even create a skewed incentive to reveal more data, or more sensitive data, in order to profit from it. This is incompatible with the EU Charter right to data protection. I discuss this issue in my book on the foundations of EU data protection law.

Is there hope in data portability as a way of countering data effects and addressing consolidation concerns?

Potentially. One explanation for the GDPR right to data portability is that it may empower consumers to switch service providers if they are unhappy with a service (for instance, to switch from Facebook to a mythical alternative if they are unsatisfied with the quality of the data protection offered). However, as I discuss in my research, the impact of this right on competition and innovation is ambiguous. It could, for instance, deter innovation by locking in the standards used by incumbent companies or by increasing costs for startups. This is all the more so because the right does not require interoperability. However, whether interoperability is desirable from a data protection perspective is equally contestable. Given these ambiguous effects, I would suggest that portability should be viewed through the lens of individual control over personal data rather than simply as a market tool.

What are your fears for the future of the Internet?

My main fear about the Internet is that a medium which promised so much for the advancement of rights – such as freedom of expression and of association – may end up having corrosive and divisive real-world effects. One of the advantages of the Internet was that it offered people the opportunity to connect with others who share similar niche interests (the Eric Cantona Appreciation Society, for example), but the personalisation of all content, including political content, may push this to an extreme. Needless to say, personalisation is not the only factor feeding into this concern.

What are your hopes for the future of the Internet?

I think the Internet at present is based on a data bubble that needs bursting. The primary example of this is the excessive data processing that online behavioural advertising entails. Even if we could argue that processing of personal data is the quid pro quo for access to online services and content that are free at the point of access, the amount of personal data processed in that exchange is clearly disproportionate. Regulators have not yet got to grips with this, but data protection law provides a potential ground on which to challenge this processing: when considering whether consent is freely given, utmost account needs to be taken of whether the service is made conditional on consent to unnecessary processing. I have not yet seen any empirical evidence that convinces me that online behavioural advertising is so much more effective than contextual advertising that it justifies this excessive incursion into our rights.

We’re getting ready to launch the next Global Internet Report! Read the concept note and learn how the Internet Economy might shape our future.

Disclaimer: Viewpoints expressed in this post are those of the author and may or may not reflect official Internet Society positions.
