We believe that you shouldn’t need to be an expert in people analytics or data science to get insights from your employee survey results. Surveys create data, but that data needs to be processed and organised into a more consumable format: information. Turning information into insight then requires analysis.
In this blog, our third in a series on running employee surveys, we cover this important step of turning your survey results into actionable insight.
Read our Guide to Running Employee Surveys:
Part 1 - Employee Survey Design
Part 2 - Launching Your Employee Survey
An employee survey will create large amounts of data, so it’s really important that your survey platform turns that data into meaningful and consumable information. For example, our standard dashboard starts with participation statistics, followed by what we call our ‘Index Analysis’. What this means is that we show your overall People Experience score (if appropriate) broken down into the specific themes (indices) that your survey covers, before breaking those down further by demographics (e.g. location, department, age, gender…). We also show the breakdown of scores by question and enable you to map index scores against others (a simple form of correlation).
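If you’re curious what that kind of index analysis looks like under the hood, here’s a minimal sketch in Python using pandas. It’s purely illustrative rather than how our platform works, and the column names (department, engagement, wellbeing) are hypothetical.

```python
# Illustrative sketch: index scores broken down by a demographic, plus a
# simple index-vs-index correlation. One row per respondent; 1-5 scales.
import pandas as pd

responses = pd.DataFrame({
    "department": ["Sales", "Sales", "Ops", "Ops", "HR", "HR"],
    "engagement": [4.0, 3.5, 2.5, 3.0, 4.5, 4.0],
    "wellbeing":  [3.5, 3.0, 2.0, 2.5, 4.0, 4.5],
})

# Overall index scores, then the same indices broken down by department.
overall = responses[["engagement", "wellbeing"]].mean()
by_department = responses.groupby("department")[["engagement", "wellbeing"]].mean()

# Mapping one index against another: a simple Pearson correlation.
corr = responses["engagement"].corr(responses["wellbeing"])

print(overall, by_department, f"engagement vs wellbeing: r = {corr:.2f}", sep="\n\n")
```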
The point is, even at the top level, that’s a lot of information, and that’s before you start applying filters and drilling down into further detail. It’s therefore critical that the data is presented using appropriate graphs and charts, taking data visualisation and user experience principles into account.
You may be lucky enough to have people analytics experts in your business who can craft beautiful charts from your data, but most businesses don’t and it’s therefore important that your employee survey platform has well-designed dashboards as standard.
When it comes to data visualisation, as with survey design, you must consider inclusion. For example, not everyone can read a RAG chart. In fact, approximately one in twenty of your people may struggle due to colour blindness.
Read about designing your dashboards for colour blindness
So you’ve designed your dashboards well and the results are easy to read. But how do you add the analysis that leads to insight (and in turn to effective action)?
While it’s often desirable to ask one or two open text questions in your employee survey, if you’re not careful with your design, you can end up with more qualitative data than you can realistically work with.
There are software tools that can help you with your qualitative analysis but, in our experience, they don’t completely remove the need for human eyes and effort. Quick and easy tools like word clouds can be appealing, but really don’t add any level of insight to your data.
Read about our thoughts on word clouds.
The major benefit of qualitative research is the depth of insight into an issue, as opposed to the broad coverage that a survey will provide. So it’s really important to be clear about which approach is best suited to the overall question that you’re asking. Many years ago, I supervised undergraduate students’ research projects and wouldn’t let them think about methods until they were absolutely clear on their research question.
You might want to go to the next level and examine statistical relationships or differences, but this does take specialist expertise and/or tools. Applying data science in this way can certainly give you greater insight into, for example, what’s driving your engagement score or whether differences between groups are significant.
Plenty of tools are available to help with statistical analysis, making data science more accessible, but it’s easy to go wrong. Every test has certain assumptions which, if not met, mean that its results are less valid. For example, a test may assume that your data follows a normal distribution (a bell curve).
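As an illustration of checking assumptions before trusting a result, here’s a hedged sketch in Python using scipy. The scores and groups are made up, and in practice you’d want larger samples and your own significance threshold; the point is simply that the choice of test depends on whether its assumptions hold.

```python
# Compare engagement scores between two hypothetical groups, testing the
# normality assumption first and falling back to a non-parametric test.
from scipy import stats

group_a = [3.8, 4.1, 3.5, 4.0, 3.9, 4.2, 3.7]   # e.g. one location's scores
group_b = [3.1, 3.4, 2.9, 3.3, 3.0, 3.6, 3.2]   # e.g. another location's scores

# Shapiro-Wilk tests the normality assumption behind the t-test.
normal_a = stats.shapiro(group_a).pvalue > 0.05
normal_b = stats.shapiro(group_b).pvalue > 0.05

if normal_a and normal_b:
    result = stats.ttest_ind(group_a, group_b)      # parametric test
else:
    result = stats.mannwhitneyu(group_a, group_b)   # non-parametric fallback

print(result)
```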
Another potential pitfall with interpreting statistics is assuming cause and effect where none is implied by the data. For example, employee engagement might be correlated with perceived quality of food in the canteen, but that doesn’t mean one is driving the other.
To be fair, the same interpretation issue exists with descriptive data, for example percentages and average scores. But presenting a calculated correlation can give the impression of a relationship that isn’t necessarily there.
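To illustrate, the short sketch below calculates a correlation between two entirely made-up sets of scores. Even a strong, “significant” correlation like this one tells you nothing about which way the influence runs, or whether something else is driving both.

```python
# Made-up engagement and canteen ratings: a high correlation coefficient
# here still isn't evidence of cause and effect in either direction.
from scipy import stats

engagement = [3.2, 3.8, 4.1, 2.9, 4.4, 3.6]
canteen    = [3.0, 3.9, 4.2, 2.7, 4.5, 3.4]

r, p_value = stats.pearsonr(engagement, canteen)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```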
That said, adding the right statistical analyses to your employee survey results can add a great deal of additional insight, if used appropriately.
Ultimately, whatever numerical insights you are given, there is also a skill to interpreting what they mean for your organisation. If you’ve chosen your questions carefully, based on your People Strategy, then their implications are more likely to be clear.
It’s really important to read between the lines of your survey results and consider their meaning in the context of your business. That’s one of the real benefits of facilitated insights workshops: drawing upon the experience and expertise of your colleagues to agree on the implications of the results (defining the problem to solve) before deciding on actions (solutions).
There are two major distractions when it comes to drawing business insight from your results. These are external benchmarking and focusing on the score. Benchmarking is a distraction because it means that you might focus on the things that don’t really matter in your unique business context, and it can lead to a focus on the score. Focusing on the score can lead to the wrong kinds of action and behaviour.
Read: Why we discourage benchmarking of employee surveys
Read: When the engagement score becomes the goal
When you pick a survey off the shelf, you’re more likely to be left wondering ‘so what?’ or end up focusing on pushing up the score rather than using it to create meaningful change. In other words, it’s really important, as it is in design, to understand why you are running your employee survey and what you expect to get from it.
In our next part of the guide to running employee surveys, we’ll cover generating action.