The Mixed Methods Blog
What It Takes to Redesign Advising
In 2012, with support from the Bill & Melinda Gates Foundation, 19 institutions embarked on an initiative aimed at strengthening their advising practices by encouraging the use of degree planning, early alerts, and other advising technologies. By 2015, a total of 45 institutions had received grants from the foundation to participate in this initiative.
Since 2012, the Community College Research Center (CCRC) has studied the implementation and outcomes of advising reforms at institutions participating in this initiative, now called Integrated Planning and Advising for Student Success (iPASS). Our work has included longitudinal qualitative implementation studies at 17 iPASS institutions and descriptive analysis of student outcomes at 26 grantee colleges. In 2015, we also partnered with MDRC to conduct an experimental evaluation of enhancements to existing iPASS practices at three iPASS institutions.
We have learned a lot over the course of these studies about the potential of technology to support good advising practice, as well as the opportunities and challenges that come with redesigning advising. We know that technology alone is not the answer. But our data also tell us that advising technologies can help to transform advising structures and processes and orient them toward a model that allows for strategic, sustained, and personalized support of students throughout their tenure in college.
Most importantly, we see repeatedly in our data that this transformation requires constant cycles of improvement. It takes time. Institutions must work through several implementation questions, including how often students need to hear from advisors, how advisors should respond to data such as early alert flags suggesting that students are at risk, and how to engage students who may be struggling. Working through these questions and many more is a critical part of the process—and it doesn't happen overnight.
Because iPASS is intended to help institutions change system-wide, it is difficult to determine its impact on individual students. In a descriptive study, we looked at aggregate outcomes at multiple iPASS institutions. We observed little change in most key performance indicators over time, except for some positive trends in the number of credits attempted and earned within one year of enrollment at two-year colleges. Most recently, we conducted an experiment to examine the effect of enhancements to iPASS practices at three colleges. The early findings show no significant positive effects, although we will examine longer-term outcomes with more data next year. When interpreting these results, it is important to keep in mind that iPASS at most colleges is a work in progress; few, if any, have achieved iPASS in its idealized form. In short, it is too early to tell how advising redesign using technology affects student outcomes.
This fall, CCRC will publish two reports that further unpack lessons about implementing advising redesign. The first report explores the internal and external dynamics in and around an institution that influence how advisors and students experience iPASS reforms. The second report looks closely at the design and implementation of advising interventions at the three institutions that participated in the experimental study. Both reports will describe lessons that can inform the continued refinement of technology-based advising structures and processes at iPASS institutions and at other institutions working to improve student supports.
Meanwhile, all of the participating institutions continue to refine their advising practices in ways that are likely to lead to improvements in student outcomes over time. Qualitative data indicate that advisor practices are changing in ways that students appreciate, and technology is increasingly helping advisors better understand and support their students.