Survey Says: Tips For Planning Your Next Survey

by Monica Joy Krol, Tk20 Product Manager

Recently I was invited to the Minnesota Tk20 User’s Group. Most of the representatives were leaders in their Teacher Education Departments or Colleges preparing for their new state reporting requirements. Among the requirements is a survey of graduates completing their first year of student teaching, as well as their respective employers or school administrators. While many direct assessments of teacher candidate knowledge, skills, and dispositions are designed and implemented, indirect assessments like surveys are not as widely used to measure competency and success.

There is much to consider when developing a survey, particularly a survey of graduates. Surveys are fun to create, especially as we think about the content and questions we can ask. However, successful implementation goes far beyond content and design. A well-implemented survey that yields good results also requires goals and planning for resources, collection, reporting, and follow-up.

Assets and Resources

Before you begin planning, consider the various resources you have on campus. Are any other departments administering a similar survey to the same group and/or with similar content? A good example is when programs survey their graduates. If possible, collaborate with the Institutional Research office or another office that may already administer a graduate follow-up survey. Perhaps you can work together and append some program-specific questions to a general survey.

Survey Content

As you consider your content, ask:

  • What do you want to know?
  • What do you need to know?
  • Are there related demographics or other information that might alter your analysis?
  • Are there specific strengths or weaknesses of your institution or program that you want to explore?

In addition, consider some ideas that will elicit follow-up or future partnership. For example, if you are surveying employers, you might want to include a question that solicits their interest in participating on a committee or advisory board. You may also want to ask the recipients for contact information or ask them if they would be willing to answer additional questions in a follow-up call or meeting. The contact information is beneficial for updating your student or participant information records.

Response Rates

Assessment is not limited to learning or experiences. Evidence of assessment of institutional practices and processes is gaining emphasis as accreditors and institutions focus on resource allocation.

One goal to consider is how to strategically increase response rates on your surveys. Here are some tips:

  • Identify the approximate number of survey recipients.
  • Identify the target response rate you need for analyzing the survey. This may be based on past response rates for the same or similar surveys, or on a realistic goal. Opinions on target response rates vary, especially in relation to the sample size or audience; for example, you may aim for a 25% response rate on a survey of recent graduates but a higher rate on a survey of current employees.
  • Consider subgroups of your recipient list and what their corresponding response rates should be. If you are surveying graduates, you may want to create response rate targets by program. Furthermore, you may want to raise the target and intensify collection methods for smaller programs. A response rate of 20% in a program of 10 students may not be sufficient; if that program is up for review in the near future, you may want to raise the target to 80%.

Example of Response Rate Planning

Program           Graduates   Target Response Rate   Respondent Goal
Art                      10                    80%                 8
Business                271                    25%                68
Education: B-2           50                    40%                20
Education: K-6          157                    25%                39
Education: 7-12         123                    25%                31
All Programs            611                    27%               166
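The arithmetic behind the table can be sketched in a few lines. This is a minimal example, not part of any Tk20 feature; the program names and target rates simply mirror the example above, and the overall figures follow from summing the per-program goals.

```python
# Sketch: compute respondent goals from head counts and target rates.
# Program names and rates mirror the example planning table.
programs = {
    "Art": (10, 0.80),
    "Business": (271, 0.25),
    "Education: B-2": (50, 0.40),
    "Education: K-6": (157, 0.25),
    "Education: 7-12": (123, 0.25),
}

# Respondent goal per program = graduates x target rate, rounded.
goals = {name: round(grads * rate) for name, (grads, rate) in programs.items()}
total_grads = sum(grads for grads, _ in programs.values())
total_goal = sum(goals.values())

for name, goal in goals.items():
    print(f"{name}: {goal} respondents")
print(f"All Programs: {total_goal} of {total_grads} "
      f"({total_goal / total_grads:.0%} overall)")
```

A spreadsheet works just as well; the point is that the overall target rate is a consequence of the per-program goals, not a number to set independently.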

Collection Methods

Given your response rate goals, consider your timeline and methods for reaching survey recipients. Will you contact recipients by phone, mail, email, in person, or a combination of methods? A combination of these collection methods can be successful.

Timing is everything. Consider the date that you must analyze your results by and work backwards to identify the dates you will send and remind recipients. Also, consider the timing of your recipients; e.g., are you able to reach them better during a holiday break or in the evening?

During the collection process, monitor your response rates. If you are not meeting your target in specific programs, prioritize your outreach to them. Perhaps you should make additional phone calls to graduates of the Art program.
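Mid-collection monitoring can be as simple as comparing running response rates against the targets. A small sketch, with entirely hypothetical response counts:

```python
# Sketch: flag programs that are behind their response-rate targets
# mid-collection. The response counts here are hypothetical.
targets = {"Art": 0.80, "Business": 0.25, "Education: B-2": 0.40}
recipients = {"Art": 10, "Business": 271, "Education: B-2": 50}
responses = {"Art": 3, "Business": 70, "Education: B-2": 12}

behind = []  # programs needing prioritized outreach
for program, target in targets.items():
    rate = responses[program] / recipients[program]
    if rate < target:
        behind.append(program)
        print(f"{program}: {rate:.0%} responded, target is {target:.0%} "
              f"-- prioritize follow-up")
```

Running a check like this weekly tells you where to spend your remaining phone calls and reminder emails.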

Incentives: To Offer or Not to Offer?

Will you offer incentives for completing the survey? “If you respond, you will be entered into a drawing for a $50 gas card.” There is a lot of debate about the use of incentives for increasing survey response rates. Ultimately your budget and timeline for collecting data will impact this decision. If you need an immediate response rate and do not have time to employ multiple collection methods, you may want to consider an incentive. If you have a longer collection period and employ persistent methods for reaching out, you may not need one.


Survey Design

Since you have already considered your survey content and the key things you need to know, focus on the design of your questions.

  • Question Types: What types of questions will elicit the most meaningful information?
  • Engagement Questions: Are they reading this? Consider including a reverse-worded Likert-scale question to check that recipients respond consistently to the same question written in reverse. Another strategy is to include an obvious instruction such as, “Please mark question 10 as NA.” If the answer is anything but NA, you know the recipient did not carefully consider each question.
  • Quantity of Questions: How many questions is too many?
  • Person-Friendly Language: Are your questions easy for the recipient to relate to? You may be asking graduates to state their agreement with meeting program outcomes; e.g., “The program prepared me to be able to …” vs. “I am able to ….”

Reporting and Analysis Methods

How will you report or analyze these data? This varies with the types of questions and responses you receive.

  • Format: If this survey is repeated annually, you may want to identify multiple ways to review these data, both for the current year and in comparison to other years. If you have open-ended questions, how will you categorize or code the responses so they support or supplement the questions that lend themselves to quantitative analysis?
  • Connections: Are there other assessments or surveys that you may want to consider in your analysis? What connections can you draw between responses to a survey and the findings from other assessments?
  • Community: Who will be involved in reviewing these data? How will you present and engage others in the findings? How can you engage them to draw conclusions?
  • Documentation: How will you document your findings? What can you celebrate? Are there any recommended action items?

Reflection and Follow-Up

Assess your process. Reflect on each step: what went well, and what could be improved the next time you implement a survey? The assessment and improvement of processes should be documented and can be used in your accreditation or grant reports.

  • Follow up with respondents, if applicable. If you asked them whether they would like to participate in a focus group or initiative, connect with them before or after the survey closes.
  • Update your systems. If you collected addresses or other contact information, make sure you update your system(s) where contact information is stored. This will be very useful for future efforts within your department or institution.
  • Lastly, keep it simple and enjoy the process. There will always be things to learn and improve. Plan to enhance key areas of the survey or process, and you will be amazed at what you learn about your respondents and what is successful for implementing a survey.

About the Author

Monica Joy Krol, Tk20 Product Manager

Monica Joy Krol joined the Tk20 team in May 2013 in a dual role as Assessment and Accreditation Consultant and Customer Solutions Advocate. Previously she served as the Director of Institutional Research at Corning Community College in The State University of New York system. In this position, she guided faculty and staff in the collection, review, and analysis of program and institutional data, and actively engaged in Strategic Planning and Middle States accreditation efforts. She began her journey in higher education and assessment as a graduate assistant during Nazareth College’s TEAC Self-Study, has served as an Assessment Coordinator at Dominican University’s School of Education, and has supported various institutions nationally in assessment and accreditation solutions. She earned her Bachelor’s from The State University of New York at Brockport, and her Master’s in Education from Nazareth College. A true data and assessment geek, she is also an avid runner, artist, puppeteer, and life-long learner.