Do you need assistance developing a survey and planning for the collection of results? Institutional Effectiveness and Assessment (IE&A) can provide guidance and help. This page contains advice about survey length and question design, as well as questions to consider when planning a survey. For a more in-depth consideration of research methods topics such as sample size, improving response rates, and assessing question reliability and validity, click here. If you would like feedback on your survey as you’re developing your questions, please reach out to IE&A staff.
Questions to consider when planning a survey:
- How many people do you intend to survey? Keep in mind that many surveys are sent out each year to the St. Olaf community (especially to students) and that using a smaller sample, if possible, will help reduce survey fatigue. If you plan to survey more than 10% of a given population, please alert St. Olaf’s Survey Coordinating Team to your plans.
- Will this survey be conducted anonymously, or will you collect any names, emails, or other personally identifying information? If you are considering a non-anonymous survey, think about whether you truly need to collect identifying information to serve your research, assessment, or other data collection purposes and, if you do, how you will ensure survey responses remain confidential.
- Who will have access to the raw survey data (e.g., individual participant responses and identifying information, if applicable), and how will it be protected from unauthorized use?
- Do you plan to collect any sensitive information? This could include questions about participant attitudes, behaviors, or experiences that may cause respondents heightened discomfort, stress, or feelings of guilt or shame. It may also include questions about behaviors or activities that could impose legal, academic, or social risks if the information is exposed (e.g., questions about illegal activities, activities that violate institutional policies, or beliefs that could invite retaliation from others). If you do plan to ask these kinds of questions, you should consult the St. Olaf IRB website for the type of review needed.
- Are you planning to share data from this survey with an outside third party, including an outside organization that may be conducting the survey on your behalf? If so, you should consult the St. Olaf IRB website for the type of review needed.
- Will you be providing monetary incentives to encourage individuals to take your survey? If so, please be sure to consult St. Olaf’s policy.
Tips for survey length and mechanics:
- Keep the survey as short as possible while still fulfilling your data collection needs. This will help ensure that more individuals are willing to complete it.
- Typically, all questions should be optional so that respondents can skip any that they are not comfortable with or do not want to answer.
- In particular, identity-related demographic questions (such as asking respondents to share their gender, race/ethnicity, sexual orientation, etc.) should be optional and/or include a “prefer not to respond” option.
- If it is important to your survey analysis that participants answer the demographic questions, explain in your survey invitation email or survey introduction how you are going to use this information and how you will protect their privacy.
- You may need to require a response to certain questions, such as those that lead to different “branches” of the survey depending on how they are answered.
- Think carefully about the number of open-ended questions (those that allow respondents to type or write out a longer response) you plan to include on your survey.
- On the one hand, open-ended questions can provide more flexibility (i.e., you do not have to try to come up with an exhaustive list of response options) or greater detail (i.e., someone could clarify why they responded in a certain way to a prior question).
- However, these questions tend to get fewer responses than closed-ended questions (which provide a list of response options to choose from).
- They also take much longer to analyze; if you are asking respondents to take the time to provide more detailed responses, you should be willing to acknowledge their effort by committing to reading each comment.
Tips for question design:
- Limit your use of technical jargon or any language that may be unclear to your survey respondents (such as abbreviations or acronyms). Consider having a student – if you are planning to survey students – or another colleague review your survey for any confusing or unclear language.
- Reviewing your survey with a small group from your intended survey population can also be useful for understanding how a potential survey participant might interpret other language used in the survey, such as questions that ask whether they engage in an activity “regularly” or “often.” Individuals may interpret these terms in different ways, thus potentially impacting your understanding of the results.
- Perhaps you are asking students how often they engage in co-curricular activities outside of class. If your response options include words like “sometimes”, “often”, or “very often”, you might ask your survey test group to define their threshold for selecting a particular response. Do they consider “very often” to be 3+ hours a week? 7+? 10+? This might even lead you to change the response options to a range of hours instead!
- Avoid using leading or double-barreled questions.
- Leading questions are those such as “Do you agree that the workshop was effective?” or “How did your instructor explain concepts clearly in class?” that may lead participants towards a certain type of response (e.g., agreement) or assume they had a particular type of experience (e.g., clear instruction). Instead, phrasing questions along the lines of “To what extent do you agree . . .” will avoid implying that participants are expected to respond in a certain way.
- Double-barreled questions are those that ask about two different experiences or opinions within the same question, such as “To what extent were the following services delivered in a timely and useful manner?” If someone believed a particular service was useful but not timely, for example, they would have no way to indicate this in their response. Instead, it would be better to separate these into two different questions, or to provide a separate response scale for each aspect (e.g., one for timeliness and one for usefulness).
- For questions with a response scale, be sure to include balanced response options.
- For example, a scale using “Strongly Agree, Agree, Disagree” is less balanced than “Strongly Agree, Agree, Disagree, Strongly Disagree.” The former is likely to bias responses towards agreement.
- You might also include a “neutral” option, such as “Neither agree nor disagree” (to continue the example above). Consider whether this option makes sense for a particular question, or whether it might negatively impact your results by allowing individuals to avoid providing an opinion.
- Further examples of different response scales can be found here.
- For questions with a list of response options, do your best to be as comprehensive as possible while still keeping the number of items manageable for both the respondent and the survey analysis process.
- If your responses are a list of number ranges (e.g., ages), be sure you have not included any overlapping categories, such as: a) 15-20, b) 20-25, c) 25-30, d) 30-35, where 20, 25, and 30 appear twice.
- Sometimes, you may need to include an “other,” “not applicable,” or “I don’t know” option as well. This is another area where a survey test group would be helpful in determining the range of knowledge and experiences of your target survey population.
- For demographic questions, ideally you should only ask for the level of detail that you plan to use when analyzing the responses.
- For instance, if you are interested in high-level differences between responses from LGBTQ+ students and heterosexual students (or if you anticipate that the number of LGBTQ+ respondents will be too small to disaggregate further), there is no need to ask students to provide more detailed information beyond whether they identify as a member of the LGBTQ+ community.
- This not only helps ensure greater individual privacy for respondents, but also avoids asking for information you are not going to use.
- For a more in-depth discussion on designing survey demographic questions, including suggestions for question phrasing and response options, view this presentation from the 2021 Higher Education Assessment Conference.