It's time we value our data - as people and society

Written by
David D'Souza
CIPD

Published
11 Jun 2018


In 2013 every conspiracy theorist found vindication in Edward Snowden's revelations about electronic surveillance. They were right and the rest of us were proved wrong.

Our communications were being intercepted and processed to produce a bank of information on a scale that was almost incomprehensible. Our thoughts and sentiments were accessed by people other than the intended recipients and technology made this possible.

So what was our shared reaction? Did we review all of our interactions with the world? Did we overthrow governments for the invasion of our privacy? No. Instead, we committed to a technology utilisation spree that is probably unrivalled in history. Electronic devices connect us all, creating a technological ecosystem where we share our locations, our moments with loved ones and what we like and dislike – whether those preferences are political or just which place we think does the best burger.

That's what we did in the five years after Snowden. We continued a binge of information sharing that suggested we had neither the awareness nor the willingness to confront the truth that our information has value beyond our circle of friends or communities. People judged access to those communities to be a fair exchange for their data.

It is possible that the cost of online social exclusion has now forced a position where not being one of the two billion people on Facebook, or the more than 500 million professionals on LinkedIn, feels unthinkable. It seems unlikely, however, that 2.7 billion people have needed to download Candy Crush.

We trade away our data for access and believe that the consequences of this are either remote or not problematic. We click to allow endless apps access to our contacts and cameras in a series of unthinking contributions that add value to organisations that are changing the fundamentals of our society – organisations that have reframed the words ‘friend’, ‘like’ and ‘community’ without us appreciating how far-reaching their influence might be.

The narrative being pushed now is that Cambridge Analytica will change things. Yet Snowden was supposed to have a similar impact and didn't. In the aftermath of Snowden we saw people expressing their concern about government surveillance on Facebook. In the aftermath of Cambridge Analytica we see the campaign to #DeleteFacebook being picked up on Snapchat and Twitter. We hear about the story because the algorithms on Google and other websites surface it. We simply find a less cognitively uncomfortable way to carry on doing something similar.

Societally we need to re-evaluate our relationship with our own personal information. Within organisations we are seeing more interest in using biometrics to support well-being, or sentiment analysis to gauge someone’s likelihood to leave – their ‘flight risk’. Consent from employees alone is not enough to justify this – organisations have an ethical duty to reflect and explore.

There is no part of our lives that seems safe from our desire, as a species, to quantify everything – and where there is data there is value and potential for misuse. It is becoming increasingly clear that, left to our own decisions, we will take the short-term advantages of sharing data and then object later to that data being used. The costs are hidden, but we need to be aware that there is clear intent to influence our decision-making and information access in ways we would never actively choose to contribute to.

The direction seems set – and yet where there is exploration there is opportunity for improvement. GDPR has the potential to be a landmark and visionary piece of legislation, bringing to the fore both the intent behind data consent and the algorithmic processing of data. It helps set boundaries – the extent of which will doubtless be explored through developing case law – and also prevents us being reduced to numbers without sufficient control and understanding. It should prompt conversations in organisations about data use that reflect on what we, as people, want and feel comfortable with.

We also know that social media has the power to mobilise, and there is opportunity there to create movements for change that coordinate individual action at a meaningful scale – helping us set the terms of our engagement with technology organisations on a more even footing. We need to educate the coming generations (and this one) on the trade-offs they make, to help us shape a world where technology is used for our purposes rather than having technology shape our thoughts to someone else’s agenda.

We need to do this together, quickly and visibly. Snowden was a wake-up call that we slept through, but if we sleep through the alarm from Cambridge Analytica then we are definitely going to be too late.
