A misplaced obsession
The world of employee engagement is obsessed with external benchmarks.
Common questions include:
- “Is our engagement score above or below the global norm?”
- “Is our management index score better or worse than the sector benchmark?”
- “Are our communications items improving at or above the rate of regional standards?”
Companies spend a huge amount of time combing through comparison after comparison, assessing their own data against external reference points. But just how effective is this practice?
Having spent years working for several of the world’s biggest research firms in this field, I have become less and less convinced of the merits of external benchmarking. In fact, I am starting to feel that the whole engagement industry is now built almost entirely around it as a practice. Yet while the costs to clients are often huge, the benefits, in real terms, are minimal.
Get rid of your benchmarking comfort blanket
The main reason organisations want external engagement benchmarks is that they act as a comfort blanket. If a company is “only three points off average” or “a couple of notches above the norm”, then things feel warmer and better. That comfort can also breed complacency.
However, there are several reasons that senior executives should be much tougher with themselves about the merits of external benchmarking in employee engagement:
1. Staring at external benchmarks doesn’t make you better
• My sport has always been rowing (or “crew”). If there’s one thing every great coach tells you, it’s this: “keep your eyes on your own boat – looking across at how other crews are doing will not make you go faster”. Benchmarking your engagement survey is similar. You can stare forever at how your own scores compare to the “norms” but it doesn’t make you any better at engaging your own people. Knowing how you compare to a bunch of other firms on leadership doesn’t make you better at leading people; knowing what your ratings are on communications versus a basket of other organisations doesn’t make you any better at cascading information or creating dialogue.
• Companies need to focus on how they can improve their own situation and stop obsessing about how they compare to others.
2. Many of the norms used are not really comparable
• Lots of norm data is meaningless. Comparing your own engagement data to a national average is so broad that it becomes pointless. Benchmarking yourself against an “industry” average may mean comparing yourself to companies of a very different scale or complexity. So just how beneficial is the match?
• Firms need to be far more ruthless about what actually counts as a high-utility benchmark.
3. By definition, external benchmarks are blunt instruments
• Because engagement benchmark databases are built on generic questions, they only allow you to compare against “lowest common denominator” question items, relevant across all types of organisation. What most firms want to know is how they are doing on engagement questions pertinent to them and their current situation or challenges. What they get are very blunt norms based on very uniform questions which, in fact, tell them very little.
• Client organisations need to understand that survey firms build norm databases to sustain the very industry that then turns those clients into victims – slaves to the benchmarks. It becomes a vicious circle.
4. Many benchmarks are out-of-date
• To make norm data more relevant and focused on a particular client, many survey firms have to stretch so far back in time that the data itself becomes out of date. Companies need to get smarter at asking about the credentials of the benchmarks being used: how pertinent are they, and how fresh is the data? Are the comparisons even valid?
What is the benchmarking alternative?
If you stand back from the blinding glare of external benchmarks (which can lure you in but serve little purpose), there are a handful of other approaches you can take to engagement data which will get you greater insight, help you learn how to improve more quickly and aid you in accelerating your performance. For example:
Focus on internal benchmarking and the identification of engagement best practice amongst leaders and managers (including case study references)
o You can learn more from the best in your own organisation than by staring at cold, external benchmarking data. Using your engagement data, try to identify your best 20 or 30 leaders in the organisation. Who are your best engagers? Who creates the best dialogue? Who inspires others to achieve? Study these exemplars in depth. What is it that’s different? Is it their attitudes, their behaviours, their “make-up”? If you can capture, codify and then spread the learning from studying your own “best”, you will improve engagement much more quickly than by looking at external benchmarks year-on-year. (A rough sketch of how to surface these exemplars from your own data follows below.)
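As a rough illustration only: assuming a tidy survey extract with a manager identifier and an overall engagement index per respondent (all column names here are hypothetical), a few lines of pandas are enough to surface candidate exemplars worth studying.

```python
import pandas as pd

# Hypothetical survey extract: one row per respondent, with the manager
# they report to and their overall engagement index (e.g. the mean of
# the engagement items on a 1-5 scale).
responses = pd.DataFrame({
    "manager_id":       ["M01", "M01", "M02", "M02", "M03", "M03"],
    "engagement_index": [4.2, 4.5, 3.1, 3.4, 4.8, 4.6],
})

# Aggregate per manager: average engagement and number of respondents.
by_manager = (
    responses.groupby("manager_id")["engagement_index"]
    .agg(avg_engagement="mean", team_size="count")
    .reset_index()
)

# Only trust managers with enough respondents for a stable read (use a
# higher threshold in practice), then take the top N as candidate
# "best engagers" to study in depth.
MIN_TEAM_SIZE = 2
TOP_N = 20
exemplars = (
    by_manager[by_manager["team_size"] >= MIN_TEAM_SIZE]
    .nlargest(TOP_N, "avg_engagement")
)
print(exemplars)
```

The code itself is trivial; the point is the habit of mining your own data for internal best practice rather than poring over external comparisons.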
Focus on driver analytics: understanding what drives engagement in your own culture (rather than comparing your own absolute levels of engagement to other firms)
o Knowing where your own engagement scores sit versus those of other firms won’t tell you how to get better. Smart driver analytics on your own data will. Use key driver analysis to understand what will raise engagement in terms of specific actions around development, performance management or recognition. This helps leaders and managers pursue specific actions (based on your own firm-level or local data) which you know will have an immediate and positive impact on engagement. A minimal sketch of such an analysis follows below.
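One simple form of key driver analysis, sketched under the assumption of an item-level extract with illustrative column names: rank each survey item by its correlation with the overall engagement index. Regression or relative-weights approaches are more robust, but the basic idea is the same.

```python
import pandas as pd

# Hypothetical item-level scores per respondent (1-5 scale) plus an
# overall engagement index; column names are illustrative only.
df = pd.DataFrame({
    "development":      [3, 4, 2, 5, 4, 3, 5, 2],
    "performance_mgmt": [4, 4, 3, 5, 4, 2, 5, 3],
    "recognition":      [2, 5, 2, 4, 5, 3, 4, 1],
    "engagement_index": [3, 5, 2, 5, 4, 3, 5, 2],
})

drivers = ["development", "performance_mgmt", "recognition"]

# Simple driver analysis: rank items by their correlation with the
# engagement index to see which levers matter most in your own culture.
driver_strength = (
    df[drivers]
    .corrwith(df["engagement_index"])
    .sort_values(ascending=False)
)
print(driver_strength)
```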
Focus on segmentation of your own workforce: looking at the drivers of different employee groups to optimise engagement
o Take this a step further and you can get even faster traction. Look not just at what drives engagement across your employees as a whole but within particular sub-groups of people. For example, what will raise engagement most for engineers versus administrative workers versus call centre workers? Digging deeper into what motivates your own people at a micro level (as opposed to looking at high-level, macro, external benchmarks) is much more likely to get your people behaving in the right way to perform. The sketch below extends the driver analysis to individual segments.
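Extending the previous sketch with a hypothetical segment column, the same driver ranking can be run separately within each employee group so that actions can be tailored; the ordering of drivers often differs markedly between segments.

```python
import pandas as pd

# Hypothetical respondent-level data with a job-family segment column.
df = pd.DataFrame({
    "segment":          ["engineer", "engineer", "engineer", "engineer",
                         "call_centre", "call_centre", "call_centre", "call_centre"],
    "development":      [3, 5, 2, 4, 2, 3, 4, 1],
    "recognition":      [4, 4, 3, 5, 2, 5, 4, 1],
    "engagement_index": [3, 5, 2, 4, 2, 5, 4, 1],
})

drivers = ["development", "recognition"]

# Repeat the driver analysis within each segment.
for segment, grp in df.groupby("segment"):
    strength = (
        grp[drivers]
        .corrwith(grp["engagement_index"])
        .sort_values(ascending=False)
    )
    print(f"\nTop drivers for {segment}:")
    print(strength)
```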
So the next time your survey provider throws up hundreds of slides of repetitive, external benchmarking charts, push back and ask for something more useful. I’d be interested to know just how useful you really find the external benchmark data you are currently receiving.