As the world of Higher Education increasingly responds to the demands of the labor market as well as federal and state governing agencies, new and improved approaches to assessment and accreditation are moving to the forefront. Determining how to analyze and assess online learning, General Education models, proficiency-based learning, and embedded skills all requires new ways of thinking about how to define and measure “success.” Many of these approaches were on display at the recent Middle States Commission on Higher Education (MSCHE) Annual Conference, and they reflect the increasingly dynamic nature of higher education as a microcosm of broader society.
This dynamic environment was reflected in the issues emphasized at this year’s MSCHE conference, as attendees and presenters focused on new and better ways of collecting, analyzing, and interpreting data to buttress assessment and accreditation. The result can be seen in the move toward more analytic, and away from purely descriptive, self-study reports.
MSCHE is following suit (and leading the way!) in responding to changes and challenges in the higher ed landscape by working to reduce the burden on member institutions and by incorporating and supporting a data-driven approach to accreditation. Using available IPEDS, College Scorecard, and other data, MSCHE intends to promote a longitudinal approach. This longitudinal point of view is embodied in the forthcoming Annual Institutional Updates (AIUs), which incorporate traditional metrics such as graduation rates and loan default rates, as well as data points that often go unused, such as debt burden and tuition dependency.
In our previous existence as the Analytics Division at University of Maryland University College, we took a similarly data-driven approach to supporting accreditation. HelioCampus applauds the efforts of MSCHE to continue to improve access to and use of data for accreditation. While MSCHE is developing a portal to review and monitor the new AIUs, HelioCampus is supporting these efforts by developing our free-of-charge IPEDS Explorer, and is in the process of adding MSCHE-specific KPIs. In addition to providing access to an institution’s own AIU data, IPEDS Explorer also supports comparison with similar institutions, providing additional context for observed trends. We hope that IPEDS Explorer offers an efficient means to identify useful “optional metrics” that institutions may not otherwise have the capacity to analyze (such as Education-and-Related Expenditures, or credit ratio, the proportion of attempted credits successfully completed).
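To make the credit ratio metric concrete, the short sketch below shows the underlying arithmetic. This is an illustrative example only: the field names and enrollment figures are invented for the sake of the example and do not reflect any actual IPEDS Explorer schema or institutional data.

```python
# Illustrative sketch of the "credit ratio" metric: the proportion of
# attempted credits that were successfully completed. All names and
# numbers here are hypothetical.

def credit_ratio(credits_completed, credits_attempted):
    """Return the share of attempted credits that were completed."""
    if credits_attempted == 0:
        return 0.0
    return credits_completed / credits_attempted

# Hypothetical term-level records for a single institution
terms = [
    {"term": "Fall",   "attempted": 120_000, "completed": 102_000},
    {"term": "Spring", "attempted": 110_000, "completed": 95_700},
]

total_attempted = sum(t["attempted"] for t in terms)
total_completed = sum(t["completed"] for t in terms)

print(f"Credit ratio: {credit_ratio(total_completed, total_attempted):.1%}")
```

Aggregating across terms (or years) before dividing, as above, keeps the metric comparable across institutions of different sizes, which is what makes it useful for the kind of peer comparison described here.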
These sorts of overviews are certainly a first step that institutions can take toward enhancing the data-centric nature of accreditation. IPEDS data provide a useful high-level perspective on institutional performance, but because they are so high-level (and often siloed, especially between academic and financial data), it can be difficult to use them to understand and effect program-level review and improvement. As always, answering one question usually leads to many more, and more interesting, questions! The “top-down” path that MSCHE is taking should encourage a complementary “bottom-up” approach, as individual academic units begin to see how they can define the right questions to ask and then devise methods to answer them. Systematically developing a process to store, access, analyze, and share data is a long-term solution to the challenges of accreditation and the resources it consumes. In turn, institutions can be even more responsive to the needs of their students and other stakeholders. This responsiveness will be particularly important for the emerging non-traditional paths that many students undertake (but which may not result in formal certification), which are now being incorporated into models such as “guided pathways.” As these models become more ingrained in the culture of higher education, developing and maintaining good data structures and reporting systems will become more and more important.