Surveys are a vital information-gathering tool. If you want to know what your customers and leads think and care about, you have to ask them. But the answers you get can vary, even from the same respondent, depending on how the survey is designed. That's why it's crucial to understand how survey design influences the respondent experience.
Here are three of the main ways survey design impacts respondent experience:
Type of Question
Survey questions generally fall into one of three categories: open-ended, closed, and rating scale questions. Which one you use will affect the kinds of answers you will get.
Open-Ended Questions vs Closed-Ended Questions
One of the biggest factors is whether you use open-ended or closed-ended questions. In an open-ended question, you pose the question and let the respondent come up with the answer completely on their own, whereas in a closed-ended question you provide a list of possible answers. Which format you choose can dramatically alter the answers you receive. For example, in a Pew Research Center experiment asking what issue mattered most in an upcoming election, 58% of respondents to the closed-ended version chose the economy from the list provided. But when no list was offered, only 35% of respondents volunteered the economy as the issue that mattered most.
Rating Scale Questions
These are a variant of closed-ended questions. You ask the respondent to rate their answer on a range of degrees. Choose from the three basic types of survey scales:
- Rating Scales: This is the most familiar scale, where the question asks the respondent to answer in the form of a rating. For example, “How happy are you with your experience with us today? On a scale of 1 to 10, with 1 being very unhappy and 10 being very happy.”
- Dichotomous: These scales offer two opposite answers, such as “Yes or No” or “True or False.”
- Semantic Differential: These scales combine elements of the two above. Two opposing answers are placed at the ends of a scale, such as Reliable………………Unreliable, and the respondent ticks the box closest to the word that matches their view. In this example, a respondent who feels strongly that the answer is reliable would tick the box nearest “Reliable,” while one with some doubts about reliability would tick a box further along the scale, closer to “Unreliable.”
Order of Questions
Another element of survey design that impacts your respondent’s experience and your results is the order of the questions. The Pew Research Center says, “The placement of a question can have a greater impact on the result than the particular choice of words used in the question.” The order you choose will vary with the content and goals of the survey, but some general approaches apply. Studies have shown that asking a specific question before a general one can produce a contrast effect, where the specific question colors how people answer the general one. So it’s a good idea to ask general questions first before getting down to specifics. Also, respondents tend to drop off as a survey goes on, so put your most important questions near the beginning.
Incentive or Reward
Offering some type of reward for survey participation can increase how many people take your survey. However, there are some things to watch out for. A reward affects your budget, and it can also skew your results a bit. Incentives generally raise customer feedback scores slightly, but in general they “do not change the reliability of data by respondents.” In other words, rewards nudge the data, but not by much. Incentives can be a great way to increase respondent participation without compromising the quality of the information you gather.
As you can see from just these three examples, your survey design greatly impacts both the respondent experience and the responses you collect. Our team at EZPIPER can help you navigate these design decisions as well as fulfill your other market research needs.