The Blender Problem: CBE and Technology

Eduventures’ two most recent reports on competency-based education (CBE) are Deconstructing CBE: Portraits of Institutional Practice, published last month, and last year’s Deconstructing CBE: An Assessment of Institutional Activity, Goals, and Challenges in Higher Education. One finding is that institutions are deploying a wide variety of CBE-related models and practices.

From a technology point of view, we wondered about challenges in ensuring that an institutional ecosystem is prepared to support a CBE program, particularly given the variety of models and practices. The excellent Technical Interoperability Pilot (TIP)—a partnership between the Competency-Based Education Network (C-BEN) and the IMS Global Learning Consortium—has already identified some of the friction points, such as non-standard terms, competency scores, and financial aid processing. We thought that we would dive more deeply into this issue, targeting three key questions:

  1. Does the variety of CBE models and practices exacerbate the challenge of aligning a technology ecosystem with a CBE program?
  2. If an institution decided to implement a CBE program, what would its technology ecosystem need to look like?
  3. Would it have to be different from one that supports a more traditional program and, if so, in what ways?

Through conversations with leaders at six institutions (Western Kentucky University, Lipscomb University, Brandman University, American Sentinel University, Western Governors University, and Wichita State University), we discovered some critical areas where technology is not fully supporting CBE. Interviewees described how CBE challenged existing practices, which, in turn, placed their ecosystems under pressure. As one interviewee nicely put it, “CBE puts everything into a blender, and then you have to make your technology respond.”

We are scheduled to publish a report based on these interviews later this year, but we thought we’d give you a glimpse of some initial findings:

  1. Student expectations place stress on technology architecture. In the age of online streaming, many students expect a video or piece of music to be available the moment they pay for it. Likewise, many students enrolling in online CBE courses expect to begin a course as soon as they complete registration, and CBE’s emphasis on convenience and personalization feeds that expectation. Some institutions told us that, however well integrated the underlying systems were, it could take up to a week after registration for a student to start a course, because enrollment data had to move from the registration system to whatever solution hosted the online course. A series of daily batch update cycles can really slow things down (a minimal sketch of the difference between batch and event-driven handoffs follows this list).
  2. Leveraging the same technology for both traditional programs and CBE is difficult. Many institutions are looking to launch CBE programs alongside traditional programs, offering students a choice. For these institutions, a fundamental question is whether a single solution, such as an LMS, can serve both programs. Many of our interviewees found that vendors design solutions for one kind of program or the other, but not both. Faced with this question, most interviewees decided against acquiring two solutions and instead had to work out how to configure other systems (a CRM, for example) to fill in the gaps.
  3. Technological support for regular and substantive interaction is inadequate. Institutions offering federal student aid must ensure that students and instructors have “regular and substantive interaction.” Although the precise meaning of “regular and substantive” is not altogether clear, these institutions at least want their technology to record interactions between students and instructors. Our interviews indicate that current solutions fall short here: they can easily track instruction and engagement with instructional materials, but they struggle to capture other interactions, such as tutoring and advising, which may take place offline. As a result, some of the institutions we interviewed had to deploy other processes, such as having faculty manually record their interactions with students (a minimal interaction-log sketch appears after this list).
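
To make the first point concrete, here is a minimal sketch, in Python, of why a nightly batch cycle delays course access while an event-driven handoff can provision it the moment registration completes. The class and function names (Lms, nightly_batch, on_registration_completed) are our own illustrative placeholders, not any vendor’s actual API.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Enrollment:
    student_id: str
    course_id: str
    registered_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


class Lms:
    """Stand-in LMS that simply records which enrollments are live."""

    def __init__(self):
        self.active_enrollments = []

    def provision(self, enrollment: Enrollment) -> None:
        # In a real integration this would grant access to the course shell.
        self.active_enrollments.append(enrollment)
        print(f"Access granted: {enrollment.student_id} -> {enrollment.course_id}")


def nightly_batch(pending: list, lms: Lms) -> None:
    """Batch model: registrations queue until the next scheduled run,
    so a student may wait a day or more for course access."""
    while pending:
        lms.provision(pending.pop(0))


def on_registration_completed(enrollment: Enrollment, lms: Lms) -> None:
    """Event model: provision access the moment registration completes."""
    lms.provision(enrollment)


if __name__ == "__main__":
    lms = Lms()
    # Event-driven path: access is live immediately after registration.
    on_registration_completed(Enrollment("S123", "CBE-101"), lms)
    # Batch path: this enrollment waits until the nightly job runs.
    queue = [Enrollment("S456", "CBE-102")]
    nightly_batch(queue, lms)  # in production this fires hours later

In practice the event-driven path would be triggered by a webhook or message from the registration system rather than a direct function call, but the contrast in timing is the point.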
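On the third point, where automated tracking falls short, a simple shared interaction log that faculty and advisors update by hand (or via a small script) is one way to close the gap. The sketch below shows what such a record might capture; the fields and categories are our assumptions, not a regulatory definition of “substantive” interaction.

import csv
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class InteractionRecord:
    # Fields are illustrative; an institution would define its own schema.
    student_id: str
    instructor_id: str
    course_id: str
    kind: str        # e.g. "advising", "tutoring", "assessment feedback"
    summary: str     # brief note on what was discussed
    occurred_at: str = ""

    def __post_init__(self):
        # Timestamp the record in UTC if the caller did not supply a time.
        if not self.occurred_at:
            self.occurred_at = datetime.now(timezone.utc).isoformat()


def append_to_log(record: InteractionRecord, path: str = "interaction_log.csv") -> None:
    """Append one interaction to a CSV file kept alongside LMS data."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(record)))
        if f.tell() == 0:  # new or empty file: write the header row first
            writer.writeheader()
        writer.writerow(asdict(record))


if __name__ == "__main__":
    append_to_log(InteractionRecord(
        student_id="S123", instructor_id="F007", course_id="CBE-101",
        kind="tutoring",
        summary="30-minute session reviewing feedback on competency 2 assessment"))
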

The US Department of Education Inspector General’s investigation into Western Governors University’s compliance with federal student aid requirements is a case in point. The investigation, initiated in 2015 and assumed to be ongoing, concerns whether WGU’s competency-based model provides “regular and substantive” interaction between faculty and students.

How do institutions identify and overcome these and other challenges? Our findings point to two first steps:

  • Accept the different nature of CBE. Many of our interviewees reported that the first step was to recognize the unique character of CBE. Understanding how CBE breaks from a traditional educational model—regarding instruction, student tracking, etc.—freed instructional teams to question how CBE might impact their technology.
  • Review the student lifecycle with a cross-functional team. Our interviewees advised that the best way to examine the impact of CBE in its entirety was to gather stakeholders from different departments (registration, enrollment, technology, and so on) and review the paths students might take as they progress through their CBE programs. These reviews often identify areas where technology comes up short and help clarify whether the institution needs to acquire new solutions or, at a minimum, develop processes to close the gap.

Please look out for our CBE Technology report, based on these interviews, in the coming weeks.
