Employee survey design can have a massive impact on the quality of your data and, ultimately, people strategies. In this blog, we explore good practices, how to avoid ‘survey theatre’ and examples of theatre in employee surveys.
Like any other arena, employee survey design is subject to fads and fashions that take hold and often fly in the face of good practice. In some cases, these ‘innovations’ are driven by advances in technology and, in others, by an idea that catches people’s attention, or both.
Just because changes in technology allow us to do something, however, doesn’t mean we should.
Sometimes ideas that catch our attention are more about marketing than they are about improving something. A solution looking for a problem. But they’re often quite compelling, which is why we’d describe them as ‘survey theatre’.
Survey theatre is quite prevalent in the world of employee surveys. Everyone’s trying to be ‘disruptive’, for example. Surveys are dead, long live pulse (surveys). There’s nothing wrong with dramatic effect, per se, but it can be a distraction from what’s important in survey design or, worse, it can produce spurious results.
A couple of examples of survey theatre that we’ve written about in more detail are Sliders, which we’d class as ‘measurement theatre’, and eNPS, which we’d describe as ‘analytics theatre’.
Briefly, sliders involve the respondent sliding the scale to the score they want, and sometimes there’s a funky graphic that changes as you do so. Sliders can look cool, but they can create accessibility issues or lead to inaccurate or incomplete responses.
eNPS (Employee Net Promoter Score) has become popular in recent years, following the NPS trend from consumer surveys. We view eNPS as problematic for a number of reasons, but one thing’s for sure: it creates dramatic results…
In employee surveys, good practice is less dramatic
As Sliders and eNPS both change the way that responses are collected in employee surveys (e.g. engagement surveys, pulse and more or less anything badged as employee feedback), let’s look specifically at how responses are structured. Good question design is another important matter, which we’ll come back to.
You might not really think that much about response types when it comes to your employee survey, but they do make a difference, and what you choose depends largely on what you’re asking and of whom.

We use a range of ways to collect data.
We most commonly use Likert scales (after Rensis Likert, 1932) for gathering employee engagement data, because they are tried and tested ways of dealing with a range of potential responses (including no opinion). For this reason, we’re going to focus on using scales for the rest of the article.
However, rather than dictating, we work with our clients to help them understand which response types will work best for the questions they want to ask, so that they can best understand what’s going on in their business and what’s driving engagement.
The table below shows you a quick summary of some more common scales used in employee surveys and their relative pros and cons. There’s no magic when it comes to choosing the number of points for your scale, and it is a matter of ‘horses for courses’.
As a general principle, however, when talking about scales, the advantage of more points is one of nuance, allowing for more degrees of opinion (including no opinion at all). However, more points also force the participant to think harder about each response, and over a relatively large number of questions, the responses can be tainted by fatigue.
Using an even number of points on a scale clearly forces an opinion. Whether that’s a good thing or not, as with any of the choices that you have, is entirely dependent on what you’re asking.
We most typically use a five-point scale as, when it comes to measuring employee attitudes and opinions, it tends to provide more pros than cons.
| Scale | Comments | Benefits | Concerns |
| --- | --- | --- | --- |
| 2 point (binary response) | Used for binary questions where you simply want to know whether someone agrees or disagrees (e.g. Yes/No). | Removes any ambiguity when asking direct questions, e.g. "Do you like coffee?" | Does not allow any nuanced response from individuals (if required). |
| 3 point | The same as the 2-point scale, but with a middle position of neither agreeing nor disagreeing. | Allows a neutral response, for example, if you neither love nor hate coffee but choose not to take any form of caffeine… | Still lacks any real nuance. |
| 4 point | Typically a scale of strongly agree to strongly disagree with no neutral option. | Allows explicit descriptors at each point on the scale to assist attitude/opinion responses; forces a definite choice where this is desirable (more commonly used in, for example, personality typing). | Does not allow for genuinely neutral responses. |
| 5 point | Typically a scale of strongly agree to strongly disagree with a neutral position in the middle. | Allows for a neutral opinion; useful for statistical analysis and an industry standard; allows explicit descriptors at each point on the scale. | Slightly less nuanced than a 7-point scale. |
| 7 point | Typically a scale of strongly agree to strongly disagree, often anchored only with "Strongly Agree" and "Strongly Disagree" at points 1 and 7. | As per the 5-point scale; allows for a more varied and nuanced response, which may be useful for statistical analysis in smaller samples (not often an objective in employee surveys). | The longer the scale, the more likely people are to suffer from survey fatigue, which can lead to non-completion or ‘lazy’ choices. |
| 10 point | Typically a scale of strongly agree to strongly disagree, often anchored only with "Strongly Agree" and "Strongly Disagree" at points 1 and 10. | Gives a more varied response, allowing a better understanding of context when the subject matter is familiar. | Fatigue risk as above; people struggle to know the difference between a 7 and an 8, or an 8 and a 9, as these are simply numbers; should only really be used for small numbers of respondents (N<100); has been shown to give a lower median score than 5- and 7-point scales; data becomes noise. |
| Net Promoter style (11 points) | An 11-point scale grouped into detractors, passives and promoters. A calculation is then applied to give a "Net Promoter Score" based on the question "How likely are you to recommend us to your friends/family?" | Gives a single-number result, which can be appealing. | The calculation has flaws when used to measure employee opinion; changes over time are not tracked well; complex to explain to employees; does not allow for true statistical analysis; data becomes noise. |
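To make the Net Promoter calculation concrete, here is a minimal Python sketch of the standard formula (promoters score 9–10, passives 7–8, detractors 0–6). This illustrates the generic calculation only, not any particular vendor’s implementation, and the function name is ours:

```python
def nps(scores):
    """Net Promoter Score from a list of 0-10 ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    only in the denominator. NPS ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

Note how coarse the grouping is: a sample of all 6s scores −100, while a sample of all 7s scores 0, so a one-point shift in responses can swing the headline number by 100 points. That coarseness is one reason the single-number result can mislead when tracking employee opinion over time.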
When it comes to measuring attitudes or opinions about their experience at work, people are more likely to identify with a statement than a number.
Given the value that is placed on employee engagement data in the boardroom, and the controversial nature of the subject matter (for example, despite many years of attention, Gallup’s view is that 85% of employees are still not engaged), it’s really important to be confident in what you’re measuring and how.
Don’t get us started on what you’re measuring… For now, let’s stay focused on how.

We certainly believe that the way many employee survey companies capture responses has a significant impact on the quality and reliability of the results they produce. One of the most important ways of improving this is by anchoring to statements rather than numbers.
For example:
Here is a ten-point scale that is anchored to numbers.
| Strongly Disagree |  |  |  |  |  |  |  |  | Strongly Agree |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
While it’s intuitively appealing to ask people to score something out of 10, it’s not great in employee surveys. One reason for this is that some people will be more inclined to score more highly than others, even if they feel the same way about the issue. We’ve seen wildly differing NPS scores between different geographies, for example, with different attitudes to scoring.
So is anything over a 5 ‘positive’ on this scale?
To better understand how our employees feel about work, we need to take a more descriptive approach.
Consider this five-point scale, in contrast, which explicitly describes each point on the scale. (Granted, if you go beyond a five-point scale, you may need to make some points implicit, which is one reason why we rarely do.)
| Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree |
| --- | --- | --- | --- | --- |
| O | O | O | O | O |
A scale like this lets you ask a wide range of questions and allow different responses, but on the same scale, which is important in good survey design. It is critical that you choose the right response type for the question you’re asking. The table below shows some of these response types:
| Response Type | 1 | 2 | 3 | 4 | 5 |
| --- | --- | --- | --- | --- | --- |
| Frequency | Never | Rarely | Sometimes | Often | Always |
| Quality | Very poor | Poor | Fair | Good | Excellent |
| Agreement | Strongly disagree | Disagree | Neither agree nor disagree | Agree | Strongly agree |
| Awareness | Not at all aware | Slightly aware | Moderately aware | Very aware | Extremely aware |
| Importance | Not at all important | Slightly important | Moderately important | Very important | Extremely important |
| Familiarity | Not at all familiar | Slightly familiar | Moderately familiar | Very familiar | Extremely familiar |
| Satisfaction | Not at all satisfied | Slightly satisfied | Moderately satisfied | Very satisfied | Completely satisfied |
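For analysis, anchored responses like those above are typically coded 1 to 5 behind the scenes. As a minimal sketch of one common approach (the mapping mirrors the agreement scale; the helper name and the "percent favourable" summary are our own illustration, not a prescribed method):

```python
# Code the five agreement anchors as 1-5 for analysis.
AGREEMENT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def favourable_percent(responses):
    """Share of responses coded 4 or 5 (Agree / Strongly agree)."""
    coded = [AGREEMENT[r] for r in responses]
    return round(100 * sum(1 for c in coded if c >= 4) / len(coded))
```

For example, `favourable_percent(["Agree", "Strongly agree", "Disagree", "Neither agree nor disagree"])` returns 50: respondents choose a statement they identify with, and the numeric coding stays out of sight.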
We hope that we’ve demonstrated that the way you capture data when you design your employee survey is important. Of course, it’s also important to ask the right questions and write them well. On the other hand, it’s less important to create theatre!
At The People Experience Hub, we are always happy to connect with people and talk about your plans and challenges around employee feedback and surveys, or simply what you thought of our blog.
Why not get in touch with the team at hello@pxhub.io?