What are we going to do with HR Banalytics?

We hear more and more about the virtues of HR Analytics. But how much of it is hype? And is a focus on numbers preventing us from focusing on people?

Hence the neologism, banalytics – data that appears at first sight to be useful, but in practice diverts attention from the real issue. Of course, there is a place for numerical data – for example, in identifying trends such as career progression for minority employees or differences in the level of psychological safety across departments.

Some examples of banalytics include:

  • Generic 360-degree feedback. Not only is the data unreliable (poor managers can receive higher scores than good ones), but the person being rated is under pressure to improve on things that may not be fully within their control and may not reflect the developmental priorities identified by other means.
  • Bench strength – this often looks much stronger than it really is once you strip out duplicates (people assigned to more than one succession option) and consider whether people would actually take the role. It is also very difficult to align the roles being benchmarked with dynamic changes in which roles matter most to the organisation.
  • Employee engagement surveys usually don’t provide actionable data. They do not tell the team leader what is causing their direct reports’ dissatisfaction, nor what to do about it. They also don’t distinguish very well between general dissatisfaction and dissatisfaction with specific behaviours of the team leader.

The point of analytic data is to inform policy and decision-making. The following are some of the questions that can be helpful in assessing the utility of analytics:

  1. Are the frequency of data collection and the rate of change in what is being measured aligned?
  2. Does the data lead directly to decision-making?
  3. If it does, what is the structural gap between where the decision is made and where it is implemented? (The bigger the gap, the greater the chances of unexpected consequences.)
  4. How closely does the measure reflect strategic business priorities?
  5. Who determines the utility and appropriateness of the measure? (For example, are neurodivergent employees involved in the design and implementation of measurements relating to neurodivergence in the workplace?)
  6. What is the balance and relationship between quantitative and qualitative data? (Over-reliance on either alone can distort perceptions.)
  7. Does the analytic enhance or supplant real conversations with people?
  8. To what extent are responses dependent on shifting context? Is this sufficiently taken into account?
  9. Are the constructs being measured sufficiently distinct?
  10. What’s the evidence base behind a diagnostic? (Many of the most popular instruments have very poor validity!)
  11. If the analytic is intended to be predictive, how accurate have its predictions been to date?
  12. Do key people believe the data, when they see it?

In short, it’s time for the honeymoon period of HR analytics to give way to more pragmatic, more evidence-based, more human approaches.

© David Clutterbuck, 2024
