They say “personal data is the new oil”… like that's a good thing.
Economy 16 October 2014

By Robin Wilton, Director, Internet Trust

But what has experience taught us about oil? Its use brings economic growth, productivity benefits and the personal convenience of global mobility, yes.

However:

  • Its extraction is neither ethically nor ecologically neutral.
  • It is toxic, and can also be flammable and explosive.
  • Its processing, storage and use generate risk, as well as harmful by-products that are changing our planet’s climate and threatening modern social norms.

Maybe they have a point, at that. The parallels with big data are stronger than they look.

The underlying principle, whether we’re talking about crude oil or silicon, is that technical innovation often raises economic and social questions to which technology does not have the answer.

Let me make this less abstract.

Patterns of location data (for instance, collected from a mobile handset or intelligent domestic objects) may begin to give early clues about the onset of Alzheimer’s. At first sight, this looks like one of those beneficial uses of big data which we should adopt without hesitation.

Before big data, one can imagine that concerns about early signs of dementia might surface through a conversation between the individual and his or her doctor, and would be confidential between the two of them.

But in the big data context, the overwhelming probability is that incipient Alzheimer’s would first become known to a third party; quite possibly a third party with no professional or medical relationship to the individual. After all, your device manufacturer is not your doctor. The question then is: what should be done with that information?

Should it be communicated to the individual’s primary healthcare provider? And if so, is that a state healthcare professional or a commercial one?

Or perhaps it is an insurance company? That may depend on which country we’re talking about, and in each case, the third party’s societal and economic incentives may be different.

Should it be communicated to the individual’s family or partner? Bearing in mind that Alzheimer’s can lead to a loss of mental capability, there may come a point at which the individual is no longer legally competent. At what point does the data mean we should *not* tell the individual, but should deal only with someone to whom authority has been delegated on their behalf?

I describe this example because delegated authorisation is already known to be a difficult problem for technology to solve. It is difficult even when all you want is a digital authorisation for your neighbour to collect a parcel from the post office on your behalf; and yet here we are talking about what to do for someone showing the first signs of dementia, not about collecting a parcel.

So, next time you hear someone proclaim that “personal data is the new oil” and the key to our future economic well-being, remind them that technology can raise societal and economic issues to which it, technology, does not have the answer. We need to bear that in mind as we rush excitedly towards the next alluring innovation in data mining.
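To see why even the parcel case is awkward, here is a toy sketch, in Python, of the bare minimum a machine-readable delegation would have to express. Every name and field in it is hypothetical; it is not a real standard, nor anything the Internet Society proposes.

```python
# Toy illustration only: hypothetical names and fields, not a real protocol.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class DelegationGrant:
    delegator: str         # the person granting authority
    delegate: str          # e.g. the neighbour collecting the parcel
    scope: str             # exactly what the delegate may do
    expires: datetime      # the grant should not live forever
    revoked: bool = False  # the delegator must be able to withdraw it


def is_authorised(grant: DelegationGrant, actor: str, action: str) -> bool:
    """Check one requested action against one grant."""
    return (
        not grant.revoked
        and actor == grant.delegate
        and action == grant.scope
        and datetime.now(timezone.utc) < grant.expires
    )


# Hypothetical usage: authorise a neighbour to collect one specific parcel.
grant = DelegationGrant(
    delegator="alice",
    delegate="bob-next-door",
    scope="collect-parcel:tracking-1234",
    expires=datetime.now(timezone.utc) + timedelta(days=7),
)
print(is_authorised(grant, "bob-next-door", "collect-parcel:tracking-1234"))  # True
```

Even this toy version is silent on who verifies the identities involved, where the grant is stored and who may read it, and, crucially for the Alzheimer’s scenario, what happens when the delegator is no longer legally competent to issue or revoke a grant at all.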

Above all, we need to be asking not just “can we do this?”, but “should we do this?”. Too often, the present approach to governance, risk and compliance (GRC) becomes an exercise in ticking boxes on a compliance checklist, with the goal of minimising the organisation’s liability rather than protecting the interests of the data subject.

The Internet Society would like to see a focus on ethical data-handling [1]. Organisations should ask themselves:

  • Does this use of data genuinely reflect the interests of the data subject as well as the interests of the organisation?
  • Is there transparency and accountability in its collection, sharing and use?
  • Would this use of data come as a surprise or a shock to the individual concerned?
  • When the organisation faces a choice about what to do with data, which option represents the greatest fairness, transparency and accountability?

The Internet is for everyone. Innovation should respect the interests of the individual.

For further reading:

Transcript of my remarks to the OECD’s Global Forum on the Knowledge Economy (Tokyo, October 2014)

[1] Ethical Data-Handling: Wilton and Runnegar, ISOC White Paper (Internet Society, 2014) (PDF)

Related academic/research papers:

Critical Questions for Big Data: boyd and Crawford (Information, Communication and Society, 2012)

Critiquing Big Data: Politics, Ethics, Epistemology: Crawford, Miltner and Gray (International Journal of Communication, 2014)

Cultural Ideology of Big Data: Jurgenson

Disclaimer: Viewpoints expressed in this post are those of the author and may or may not reflect official Internet Society positions.
