Responding to a Perennial Data Challenge
by Jane Leibbrand, Communications Consultant, Education Policy & Practice Writer
Large & Small Preparation Providers Use Similar Strategies to Improve Alumni & Employer Surveys
Educator preparation accreditation has many data collection challenges, but one of the most longstanding is how to improve the very poor response rates to surveys that educator preparation providers (EPPs) ask employers and graduates to complete. EPPs send them out every year, but returns are marginal to abysmal. No doubt some of the reason lies with EPPs.
Until the recent focus on outcomes in education, educator preparation institutions and P-12 education existed in two completely separate spheres. When candidates graduated, the institutions saw their job as finished and did not concern themselves with what happened to the graduates. Likewise, P-12 saw higher education as “the ivory tower,” impenetrable and unyielding.
Until the 1990s, accountability mechanisms, including accreditation and licensing, were input-oriented. Accreditation was concerned with facilities and books in the library. When it came to content, the focus was on syllabi, the information being presented to candidates, not on whether they learned it or how effective they became as teachers. State licensing focused on the number of college credit hours. Preparation providers routinely surveyed candidates at the end of their preparation to gauge satisfaction, but of course those candidates had not yet become autonomous teachers in the field, a transition that often changes their perceptions significantly.
An Increasing Focus on Preparation Outcomes
During the past 20 years, that input paradigm has shifted to outcomes, with several influences contributing to the change. One of the first was NCATE’s inclusion of InTASC’s performance-oriented standards in its 1990s accreditation standards. NCATE continued the new emphasis with a heavy focus on assessment in its 2000 performance-based standards and beyond.
Foundations also invested heavily in education accountability systems and strategies in the 1990s and 2000s. Their support, worth hundreds of millions of dollars, helped the field turn toward examining which strategies facilitate effective teacher performance (e.g., the Gates Measures of Effective Teaching project). Another contributing factor in the move to outcomes is the revolution in technology. The spread of the internet beginning in the late 1990s brought new data collection technology that has helped move accreditation beyond a once-every-seven-years, compliance-driven system to one driven by data collection and analysis. It has enabled matching P-12 student performance to teacher performance, and teacher performance to preparation provider. It’s now all about outcomes. How P-12 students fare under CAEP grads really matters!
CAEP Standard 4 Requires New EPP Strategies
Today, the focus on outcomes has become full-fledged, and CAEP continues to lead with standards focused on program impact. Part of CAEP’s goal in approving the 2013 standards was to disrupt teacher preparation, albeit on a reasonable time frame rather than through the drastic, immediate change typical of disruptive innovation. CAEP’s new Standard 4 requires data gathering that the accrediting body and its forebears have not required previously: collecting data from alumni who are in the field. During the past decade, NCATE asked for data on graduates, but there were no consequences for not having it. No more. Institutions that had paid little attention to surveys have been pushed quickly out of their comfort zone. The standard also requires a minimum response rate on alumni and employer surveys, another first.
Baldrige Lends Impetus to Use of Survey Feedback
Another significant influence on the 2013 outcomes-oriented CAEP standards is the Baldrige Criteria for Performance Excellence, which emphasize feedback and a fact-based, knowledge-driven system for improving organizational performance. Education organizations have benefited from the increased use of feedback in changing organizational systems and performance. Surveys of alumni who have at least one year of teaching experience, as well as surveys of employers and candidates, can provide invaluable feedback to preparation providers, helping them make program changes that lead to better outcomes for graduates. With more data available from stakeholders, data, rather than anecdote or surmise, now increasingly drive program change. Surveys that provide stakeholder feedback have moved from the margins to a place of prominence in the CAEP standards.
Thus, 2013 marked a turning point as EPPs began examining their programs in light of the new CAEP standards. Some EPPs have had millions in foundation support, while others, often small institutions, have made significant changes with very modest financial resources. Bank Street College in New York exemplifies the latter, while Minnesota State University Mankato provides a picture of large-scale collaboration with other EPPs in a foundation-supported initiative. Both have changed survey practices.
Pre-2013 Survey Practices
At the Spring 2016 CAEP Conference, Bank Street College provided an overview of how it has moved survey administration from an ineffective status quo to an organized, systemic effort designed to meet CAEP’s new components 4.2-4.4 in Standard 4. Amy Kline, Assistant Dean of the Bank Street Graduate School of Education, explained the context for previous weaknesses in Bank Street’s approach to surveys and laid out the many reasons its response rates had been unacceptably low.
Bank Street’s philosophy, from its beginning in 1916, has been grounded in a progressive education model and as such, has focused on qualitative rather than quantitative inquiry methods. The surveys reflected this philosophy; they were overly long (up to 30 minutes to complete) and contained too many open-ended questions that resulted in lengthy narratives. Because Bank Street is a small college, it relied on a part-time consultant to send out the surveys, which led to inconsistent administration.
For those reasons and more, response rates were so low that the surveys weren’t deemed useful for improvement. Administrators viewed the task mainly as a compliance requirement for accreditation. Bank Street’s inattention to survey administration was typical of many colleges; NCATE and TEAC self-study reports showed similarly low response rates.
A Small College Responds to the Challenge
Foreseeing a tidal wave of policy change in educator preparation, the Dean of Bank Street’s Graduate School of Education created its Campus Wide Assessment Task Force in 2012 and still leads it. The task force, composed of a diverse group of stakeholders, developed a plan to overhaul the School’s approach to data and convened a working group to craft new entry, exit, alumni, and employer surveys. The task force then shared draft iterations of the surveys with all key internal stakeholders, such as student services, admissions, and financial aid. This step was at first seen as simply a courtesy, but the task force quickly discovered its importance.
“Many of the questions on the old surveys were duplicative of admissions questionnaires; deleting those types of questions helped to pare down the surveys,” says Kline. In the spirit of transparency, at the Fall 2016 CAEP Conference, Kline shared examples of double-barreled, redundant, and unnecessary questions on the old surveys and showed how they were revised or eliminated. The new surveys group questions into categories for coherence and readability (e.g., knowing the subject area taught, engaging student learning, etc.). In addition, they can be completed in no more than 10 minutes.
Increasing Response Rates
Bank Street now asks for permanent email and snail mail addresses on the online graduation application form which all candidates must complete to receive a diploma. That simple addition is a game changer in being able to contact alumni. Kline also notes that sending a “pre-email” from a Bank Street lead administrator, explaining that the grads will soon receive a survey, helps to ensure a better response rate by legitimizing the forthcoming email with the survey link. Since Bank Street is a small EPP, it manages to keep in touch with many of its graduates by inviting them to events—another way to help ensure that the school stays on top of address changes.
New Approach Leads to Program Inquiry; Tk20 Streamlines Processes
Bank Street went from a haphazard approach to survey administration to a technology-based assessment management system. Using a customized Tk20 assessment system, administrators can produce attractive summary reports of key findings from alumni and exit surveys, using charts and graphs that draw readers’ attention to the data. Faculty now analyze the data with program improvement in mind and are moving away from the previous compliance mentality. The entry survey, duplicative of admissions data, is now being re-imagined and may focus on dispositions and learning style.
Focus Groups as an Alternate Strategy for Employer Data
In terms of employer surveys, Kline notes that Bank Street’s universe of employers is very small, so responses have been too few to yield statistically meaningful results. She says that Bank Street is considering organizing focus groups to gather data from employers. An interesting supposition: once focus groups are in place, could the increased personal attention from those events lead to positive unintended consequences, such as higher response rates if Bank Street decides to survey employers again in the future? Could it also lead to increased collaboration with employers? Time will tell.
Minnesota State University Mankato Part of Large-Scale Common Metrics Initiative
Bank Street’s effort demonstrates that small colleges can respond effectively to CAEP’s new components 4.2-4.4 in Standard 4. At the other end of the spectrum is a large-scale, Bush Foundation-funded initiative involving 14 institutions. The 14 EPP grant recipients, working together as the Network for Excellence in Teaching (NExT), developed four survey instruments that all 14 EPPs administer: entry, exit, and Transition to Teaching surveys that capture candidates’ and graduates’ backgrounds, learning, and perceptions of program effectiveness from program entry through the first year of teaching, plus an employer survey. An immediate additional benefit is the ability to compare outcomes from one EPP to the next.
Similar Strategies to Improve Response Rates
Similar to Bank Street’s approach to improving response rates to surveys of alumni at the end of their first year of teaching, Daria Paul, Director of Assessment and Research in the College of Education at MNSU Mankato, notes, “We gather alternate and permanent email addresses (not university email addresses), snail mail addresses, and cell and landline phone numbers for all completers prior to graduation.” Both small and large institutions can use this simple fix to increase response rates. The Minnesota institutions also have the benefit of help from the state. The Minnesota Department of Education now collects information on teachers of record, including where they teach, and has agreed to send any EPP that requests it a list of that institution’s graduates teaching in Minnesota, along with the schools in which they teach.
In addition, Paul noted that all state licensing boards have information on where graduates are teaching. EPPs can also obtain a list of every school in the state and the administrators in those schools. Using that strategy requires work, but it yields results. Paul says, “As one of the larger programs in the NExT group, we have learned that we must work hard to make personal contacts with graduates to encourage them to complete the Transition to Teaching Survey.”
MNSU Mankato selected a customized Tk20 assessment system as did Bank Street. Paul says that MNSU Mankato plans to place all assessments across all teacher preparation programs and field experiences, as well as demographics, in the Tk20 assessment system. Paul also says that Mankato’s candidate exit survey has been transferred to Tk20, and its first distribution is underway this semester.
Paul shared a graph depicting response rates to the graduate alumni survey and P-12 supervisor survey from 2014-2016. “While we continually work to improve our rates using a variety of strategies, we are very pleased at how these rates have improved since our earlier administrations throughout the 2010-2013 period,” Paul commented. Mankato did not make comparisons with earlier years, because the distribution processes changed significantly from 2013 to 2014.
Paul notes, “We continue to work closely with our 13 partner institutions in NExT to explore methods leading to improved outcomes based on our survey data and other assessments. We have seen very strong results on our surveys in the areas of special education and 504 plans. We continue to see improvements in areas related to technology and field experiences. Sharing our successes and challenges with our NExT partners on a regular basis continues to enhance our collective capacity to evaluate effective instructional strategies and program practices.” With response rates up, surveys are providing new, valuable data to EPPs that are helping to spur program improvement.
The Tk20 Difference
The Bank Street Graduate School of Education viewed the survey task in light of the increasing drumbeat for accountability and realized technology could help to better manage survey administration and more.
Bank Street did due diligence in selecting a data partner to meet its unique needs. Tk20 provided onsite demos of the many functionalities of its data management system and participated in Q&As on how the system could meet Bank Street’s individual needs. Decision makers at Bank Street called several Tk20 institutions. One feature that helped Bank Street make its decision: Tk20 provides 24/7 client support so that assistance is a phone call or email away.
Once the decision was made, Tk20 held training sessions on system functionalities. Bank Street administrators also offer ongoing training sessions and guidance materials to assist faculty and students as they explore and use the many system functionalities. Kline appreciates Tk20’s advice to “start slow” with one functionality, so faculty become used to navigating the system for their specific purposes. Bank Street started using Tk20 for collecting and using data on individual course assessments—the most common faculty need. Kline notes that this way, “Tk20 gained the faculty’s trust.”
Now, Bank Street is fully engaged with Tk20 and uses the system for edTPA® portfolios, the document room, advisement, and administration. Faculty have access to customized analysis—from one student’s records to an aggregate of all of their students. Additionally, the Graduate School of Education has access to aggregated and disaggregated data on program data, “which has been a big improvement,” says Kline.
Tk20’s Many Functionalities Enhance Providers’ Capacity to Gather & Use Evidence
Tk20 integrates seamlessly with leading learning management systems (LMSs) such as Blackboard and Canvas, and with student information systems (SISs) such as Banner and Colleague. Its single sign-on feature makes logging in easy for students and faculty.
Kline said Bank Street has enjoyed many benefits from its Tk20 partnership and foresees a continuing strong relationship. Paul at MNSU Mankato echoes Kline’s feedback.
About the Author
Jane Leibbrand is a Tk20, Inc. communications consultant and education policy and practice writer. She served at NCATE/CAEP for 21 years, first as director and subsequently vice president for communications. In that role, working directly with two presidents of the organization, she helped organization leadership bring the accrediting body to the fore as a leader not only in standards development but also in education policy. Leibbrand’s experience also includes teaching high school English (Virginia) and freshman college English (University of Georgia). She has authored many journal and magazine articles as well as numerous op-eds and reports.
The edTPA trademarks are owned by The Board of Trustees of the Leland Stanford Junior University. Use of the edTPA trademarks is permitted only pursuant to the terms of a written license agreement.