Wake Up Call

Technology implementations related to teaching and learning increased during the pandemic, with a compound annual growth rate of 14% between 2019 and 2021. In fact, every segment in this category grew over this period, with Online Course Solutions (24%), Accessibility Solutions (22%), Productivity and Collaboration Solutions (20%), and Assessment Integrity Solutions (16%) showing the highest growth.
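For readers who want the arithmetic, here is how a compound annual growth rate is computed from first- and last-year totals, assuming that is how the 14% figure above was derived. A minimal sketch in Python, with purely illustrative counts rather than the underlying LISTedTech figures:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative only: ~1,000 implementations in 2019 growing to ~1,300 by 2021.
print(f"{cagr(1000, 1300, 2):.1%}")  # ≈ 14.0%
```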

Clearly, many institutions invested in technology to enable remote delivery—a fact that has received plenty of attention over the past few years. But is this the only significant technology trend to emerge from the pandemic? Our analysis indicates no. A deeper look at the implementation data also reveals a lesser-understood trend that may help us better predict what comes next in edtech.

Our Approach

To better understand technology implementation trends, we analyzed data from LISTedTech for about 2,700 institutions—public, private, two- and four-year, for-profit, and not-for-profit—to determine net-new implementations for technology segments within the teaching and learning ecosystem. We used a time series clustering approach, which organizes trend data points over time into groups based on their similarities (a brief, illustrative sketch of this step follows the list below). This approach has two goals:

  1. Identify implementation patterns: Gaining insight into whether groups of trends historically move upward, downward, or horizontally together over time.
  2. Predict and recommend next steps: Understanding these clusters can help predict the future, much as understanding past wind patterns helps forecast today's winds.
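To make the clustering step concrete, the sketch below shows one common way to group short yearly trend series: z-normalize each segment's net-new implementation counts so that the shape of the trend, rather than its volume, drives the grouping, and then apply k-means. The segment counts here are illustrative placeholders, and our actual pipeline and the LISTedTech data may differ.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative yearly net-new implementation counts, 2016-2021, per segment.
# These numbers are placeholders, not LISTedTech data.
segments = {
    "Learning Management Systems":     [290, 180, 150, 140, 160, 120],
    "Assessment Management Solutions": [240, 170, 140, 130, 150, 110],
    "Online Course Solutions":         [ 90, 100, 110, 120, 310, 180],
    "Productivity and Collaboration":  [ 80,  90, 100, 110, 280, 170],
    "Advising Solutions":              [120, 330, 160, 140, 170, 130],
}

names = list(segments)
X = np.array([segments[n] for n in names], dtype=float)

# Z-normalize each series so clusters reflect trend shape rather than market size.
X_norm = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Group the normalized series; k=3 mirrors the three clusters discussed below.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_norm)

for cluster in sorted(set(labels)):
    members = [n for n, label in zip(names, labels) if label == cluster]
    print(f"Cluster {cluster}: {members}")
```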

For example, Figure 1 looks at the overall implementation of technology related to teaching and learning from 2016 (pre-pandemic) to 2021 (“post”-pandemic).

Implementations Overall

Figure 1

In isolation, this aggregate view might lead us to believe that all technologies rose and fell similarly since 2016, with the most significant spike in implementations occurring in 2020 (27%), mid-pandemic. As a result, we may conclude that technology acquisition decisions have been pretty much in lockstep since then. Yet, further analysis of implementation trends suggests more to the story.

A Deeper View

Applying our time series clustering approach to implementation trend data shows that not all institutional technology acquisitions followed the above pattern. For example, Figure 2 shows the implementation patterns of a specific bundle of technologies that provide a teaching and learning ecosystem’s core functions, such as course delivery and assessment management. These segments include Assessment Management Solutions, Curriculum Management Solutions, e-Portfolio Solutions, and Learning Management Systems. We’ve described the technology strategy of these institutions as “Focusing on Infrastructure.”

Institutions included in Figure 2 are primarily public two-year (32%), public four-year (32%), and private not-for-profit four-year institutions (29%).

Focusing on Infrastructure

Figure 2

Note that institutions in Cluster 1 partly mirror the overall pattern in Figure 1, with a spike in implementations in 2020 (16% of all implementations). Yet, this cluster differs from the overall pattern, as the 2020 spike is not as high as the 2016 increase (16% of all implementations vs. 29%).

This indicates that, instead of focusing on acquiring solutions to enable course delivery, like Zoom, leaders at these institutions decided to strengthen their teaching and learning infrastructures during the pandemic. These institutions also bucked the overall trend of acquiring significantly more technology mid-pandemic; their 2020 spike in acquisitions was lower than their spikes in earlier years.

Figure 3 provides another example of the dangers of assuming that the overall implementation pattern holds for all types of technologies and institutions. We characterize the technology strategy of Cluster 2 institutions as “Supporting Student Progress.” This cluster’s implementations consist of bundles of solutions that help institutions support the learning journey: Advising Solutions, Course Evaluation Solutions, Course Registration and Degree Planning Solutions, Digital Courseware Solutions, Digital Credential Solutions, and Lecture Capture and Video Solutions.

Supporting Student Progress

Figure 3

Consisting of a high share of public four-year institutions (50%), this cluster shows a small spike in implementations during 2020 (19% of all implementations vs. 16% in Figure 2), consistent with the overall pattern. Yet, unlike the overall pattern, the most significant spike in implementations occurred well before 2020, in 2017 (39% of all implementations).

This indicates that, unlike both the overall trends and trends in the previous cluster, these schools concentrated more on implementing tools that helped them track and understand student progress. As we saw in the previous cluster, however, these leaders also acquired less technology during the pandemic than in some previous years.

Figure 4 shows the final cluster. This cluster, which we call “Managing Course Delivery,” consists of technologies that help institutions deliver learning in a remote environment, ensuring that students can access content and that faculty can assess their understanding of it. This cluster comprises the solutions with the highest compound annual growth over the pandemic: Accessibility Solutions, Assessment Integrity Solutions, Online Course Solutions, and Productivity and Collaboration Solutions.

Managing Course Delivery

Figure 4

Public two-year institutions make up the largest share of this cluster (31%), which closely mirrors the overall pattern, showing its most significant spike (35% of all implementations) in 2020.

Why do these clusters matter? First, these clusters give us more insight into the interrelationship between implementation trends. Rather than looking only at an individual segment’s upward or downward movement, we now see segments showing similar implementation patterns over time.

This approach also makes our predictions more efficient: the longitudinal correlation among products guides us to consider whether and how institutions group products together. As a result, when predicting a future increase in market share for one segment, such as the LMS, we might also anticipate a corresponding increase for another product in the same cluster, such as Assessment Management Solutions.
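As a sketch of how that reasoning could be operationalized, one simple check is how strongly two segments in the same cluster co-move year over year; a high correlation of their year-over-year changes suggests that a projected rise in one (say, the LMS) is likely to be accompanied by a rise in its cluster-mate (say, Assessment Management Solutions). The counts below are illustrative placeholders, not actual implementation data.

```python
import numpy as np

# Illustrative yearly net-new implementations for two segments in the same cluster.
lms        = np.array([290, 180, 150, 140, 160, 120], dtype=float)
assessment = np.array([240, 170, 140, 130, 150, 110], dtype=float)

# Correlate year-over-year changes (rather than raw levels) to measure co-movement.
co_movement = np.corrcoef(np.diff(lms), np.diff(assessment))[0, 1]
print(f"Year-over-year correlation: {co_movement:.2f}")
```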

The Bottom Line

This type of analysis reveals the interrelationships of technology implementation trends over time. But what does it predict about the future?

From our conversations with institutional leaders, we know that they increasingly view technologies as groups that support different problem domains, such as onboarding or recruitment. Many vendors we talk to are also closely considering how their products fit within such groups, whether they provide many components, like Anthology and Ellucian, or just one, like Othot or Discourse Analytics, that together can address problem domains.

Therefore, we predict that the higher education technology market will see significant tectonic shifts as clusters of segments begin to show conjoined increases and decreases in implementations over time. We also recommend that institutions and vendors place greater emphasis on, and demand for, interoperability so that products within a cluster work together.

 

James Wiley

Eduventures Senior Fellow at Encoura

 
