Personalisation practices are everywhere – from the search results and marketing emails we receive, to the behaviourally targeted ads we encounter. Love it or loathe it, unless you live ‘off the grid’, personalisation is unavoidable. While relevant, personalised content may increase engagement (whether through participation with the content, increased purchase intention or direct action), it also carries a hidden psychological cost.
If you’ve ever felt that creepy sense of unease when receiving a hyper-specific, targeted advert from a brand you don’t know or have never bought from, you will have experienced this phenomenon.
A version of this happened in a very public way, when the Cambridge Analytica story broke. When millions of Facebook users realised they had been at the sharp end of a profound breach of trust, the motivation to regain their personal freedom (from being tracked and capitalised on without explicit consent) led many to reduce their participation on the platform, or abandon it altogether.
Once dismissed as the purview of the paranoid, this feeling that our every move is being watched, analysed and monetised no longer seems quite so far-fetched. If left unchecked, it can lead to devastating outcomes for even the most robust companies. While the appropriate collection and use of personal data can bring rich benefits from an HR perspective, such as providing feedback about performance, wellbeing and personal development, the ethical inflection point for deploying such practices rests upon a fulcrum of employer intent and employee consent.
Of course, the use of employee data per se is neither good nor bad – the ethical component emerges through the intent and the methodology we choose to apply. Imagine, for instance, that you turn up to work one morning to be issued with an employee ID lanyard. Nothing too unfamiliar so far. Now imagine that this badge is actively tracking your every movement and interaction, from where you are in the building and who you are with, to the length of your conversations and even your tone of voice and speech patterns. Chances are, you might think twice about showing up again the next day. Or imagine that biometric sensors were surreptitiously monitoring how long you were at your desk, as happened in 2016 at The Daily Telegraph (they were removed after just a day following employee outrage).
Even when done transparently, data tracking can give rise to unwanted behaviours. Employees may try to game the system, for example by avoiding harder jobs or taking harmful short cuts (between 2011 and 2015, employees at Wells Fargo set up more than a million fake customer accounts to meet weekly targets).
It can also lead to employees acting in ways that hinder innovation, making them less willing to help out colleagues or to discuss off-topic, but potentially fruitful, ideas. It can result in workers focusing on shorter-term, more measurable projects. And it may cause employees to turn to their own devices or emails instead of the official ones, increasing the risk of cyber-security breaches.
So how far is too far? It all comes down to trust. Cited as one of the most important factors in the development and maintenance of happy, well-functioning relationships, trust, once breached (or even when a seed of doubt has been planted), can be incredibly difficult to build back up. In the workplace, where the balance of power is far more one-sided than in personal or even consumer relationships, and where no company is immune to a hack, businesses need only make one small misstep to undermine their reputation and the trust and respect of their employees.
If we are to make the most of what these new technologies have to offer, we have to approach them with explicit ethical conditions for their use. We have to ask whether the tools we use are genuinely harnessing the full potential of employees, or are putting an unwanted yoke around their necks.
Nathalie Nahai is a web psychologist, speaker and author of Webs Of Influence: The Psychology Of Online Persuasion.