Understanding the Limitations of Descriptive Analytics

Rob Walz 4 minute read

Imagine this: your CEO comes to you and says, “The board is very concerned about identifying new business opportunities next quarter and scoping internal risks. What does our training data tell us?”

What is your internal reaction?

  • “I have some great thoughts. Give me two minutes to send you a report.”
  • “I have some great thoughts. Give me two days (or weeks) to get a report together.”
  • “I have a lot of thoughts but how am I going to make the data tell the story?”
  • “I have 99 problems and accessing the training data I want is 98 of them.”

Most training teams can’t answer this question quickly or confidently. Why not? Training data holds a wealth of information about the health of an enterprise, and training professionals know that same data reveals both risk and opportunity for the business.

The problem lies in limited learning analytics. Typically, the training function uses a single learning analytic to tell their story: descriptive analytics.

Descriptive analytics are the backbone of training reports and the foundation for a mature data-driven decision process. However, this category of analytics cannot interrogate data very well: showing results is easy, but adding context is difficult, and forecasting is nearly impossible.

Let’s explore what defines this category of analytics, how we use it to show training impact, and understand the limitations created by a single data framework.

Traits of Descriptive Analytics

Descriptive analytics, as the name suggests, describe what happened. How many learners attended an event? What percentage of a cohort completed required training? This data is very accessible and easy to share with stakeholders.

This data consists of highly visible points in time that are easily traced back to quantitative and qualitative metrics. You can quickly know what happened, when it happened, who was involved, and, to an extent, how it happened. Reports you’re running today, such as seat time, attendance, instructor performance, completion rates, achievements, and learner satisfaction, are composed of this descriptive data.
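The reports above amount to simple tallies over historical event records. As a minimal sketch, here is how descriptive metrics like attendance and completion rates might be computed; the record fields and data are hypothetical:

```python
# Hypothetical training records; field names are illustrative, not from any real LMS.
training_records = [
    {"learner": "A", "attended": True,  "completed": True,  "score": 88},
    {"learner": "B", "attended": True,  "completed": False, "score": None},
    {"learner": "C", "attended": False, "completed": False, "score": None},
    {"learner": "D", "attended": True,  "completed": True,  "score": 74},
]

# Descriptive analytics: count what happened, nothing more.
attended = sum(r["attended"] for r in training_records)
completed = sum(r["completed"] for r in training_records)
scores = [r["score"] for r in training_records if r["score"] is not None]

attendance_rate = attended / len(training_records)   # fraction who showed up
completion_rate = completed / len(training_records)  # fraction who finished
average_score = sum(scores) / len(scores)            # mean score of completers

print(f"Attendance: {attendance_rate:.0%}, "
      f"Completion: {completion_rate:.0%}, "
      f"Average score: {average_score:.1f}")
```

Note what the sketch can’t do: it reports *that* two learners failed to complete, but nothing in the data explains *why*, which is exactly the limitation discussed below.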

Using descriptive analytics, you create and leverage a historical outlook on training performance, which forms the basis for further data inferences downstream. The problem is: enterprise training gets stuck going downstream.


Limitations of Descriptive Reporting

Training teams rely too heavily on descriptive analytics, which hamstrings their ability to answer why something happened (or didn’t happen). Answering what happens next is almost impossible with descriptive analytics alone.

No context within your data. The nuance and layers that explain why a particular course is effective, or why one event exceeded expectations while another did not, are missing. Descriptive analytics provide the basis for answering why, but the heavy lifting falls on your team. Descriptive analytics are tied to activity metrics, not the KPIs that measure training’s performance.

Weak relationships with other data. Descriptive data often lacks a solid relationship with other data points. Training teams have to infer these relationships, which can lead to incorrect conclusions and difficult decision-making. For example, you may know a particular cohort of learners has performed poorly in a specific course, but you only have test scores to consult, not the other data points behind the poor performance. The poor results may be due to scheduling inefficiencies or technical issues.

Data is frozen in time. Descriptive data alone can only ever show you what happened in the past. Training teams have been stuck in this backward-looking view for so long that it is a shock to realize how sophisticated predictive analysis has become for other teams, especially sales and marketing. This is the limitation teams hit hardest, and most often.

Moving L&D Reporting Beyond Activity Metrics

Maturing training data, reporting, and forecasting isn’t as difficult as it sounds. Here’s what to keep in mind:

  • There is a defined maturity model, trailblazed by others, you can use to map your team’s own transition.
  • A tech investment is likely, because enterprise training software was never designed to go beyond descriptive analytics. Building complex spreadsheets is an enormous time sink, and they aren’t the best solution anyway.

The best tech won’t help you unless you’ve embraced a mindset shift away from mere reporting and toward business intelligence.
