Learning Analytics

Diagnostic Analytics Mean Better Training Reports

Rob Walz · 4 minute read

Training reports typically rely on descriptive analytics to tell a story, but because this data can only answer the question “what happened?” the reports lack meaningful context.

Though descriptive analytics form the foundation of data-driven decision making, they alone can’t reveal a complete picture of your program’s impact. To understand why changes are happening within your program, we turn to more advanced data analytics such as diagnostic analytics.

What Are Diagnostic Analytics?

Diagnostic analytics interrogate data to answer the question, “why did this happen?”

Diagnostic analytics aren’t a replacement for descriptive analytics; rather, they are the contextual data that reveal cause and effect. In fact, diagnostic analytics rely heavily on descriptive data to reveal correlations between training outcomes and organizational goals. These analytics answer questions such as:

  • Why is one cohort outperforming a similar cohort in the same course material?
  • Why did learner satisfaction drop when we changed the event schedule?
  • What is causing a spike in employee turnover?

These are advanced analytics that help training leaders identify causes and correlations that have an impact on larger organizational goals. Modeling diagnostic analytics and working with this sort of data can be messy, lead to inaccurate conclusions, and cause spirited debates. For these reasons, many training teams struggle to deploy advanced analytics in their day-to-day operations and decide to return to basic reporting.
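
Surfacing these correlations doesn’t require heavy tooling if your team already keeps descriptive data in spreadsheets or simple exports. Here is a minimal sketch in Python with pandas, assuming hypothetical per-team fields (completion_rate from an LMS export, turnover_rate from an HR report); the names and numbers are stand-ins, not a prescribed method:

    import pandas as pd

    # Hypothetical per-team data joined from an LMS export and an HR report.
    # Column names and values are stand-ins; swap in your own fields.
    teams = pd.DataFrame({
        "team":            ["Sales", "Support", "Ops", "Finance", "IT"],
        "completion_rate": [0.92, 0.61, 0.78, 0.85, 0.55],
        "turnover_rate":   [0.08, 0.21, 0.14, 0.10, 0.25],
    })

    # A correlation is a diagnostic clue, not proof of cause and effect.
    print(teams["completion_rate"].corr(teams["turnover_rate"]))

A strong negative number here only tells you the two measures move together; you would still need to rule out other variables before claiming that training drives retention.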

Here are a few tips for leveling up your own reporting by layering in diagnostic analytics.

Identify Variables & Rates of Change

Identifying variables and rates of change is a good place to start. List anything that could change between two or more groups you are analyzing. Rates of change are another variable that can be very revealing and should be noted. Even if you don’t have the data in hand, knowing which elements could be variables will help you create a diagnostic analytics plan.

For example: what are the variables between similar groups, courses, or events? Why did the same course that performed well with one group fail with another? Did anything change between two learner cohorts? Did the training location change, or the delivery format? Did the same instructor lead both sessions?
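
A low-effort way to keep track of these variables is to lay the two cohorts side by side and flag what actually differs. The sketch below is a rough illustration, assuming made-up fields (instructor, location, format, avg_score); the point is the comparison, not the specific columns:

    import pandas as pd

    # Hypothetical descriptive data for two cohorts that took the same course.
    cohorts = pd.DataFrame({
        "cohort":     ["A", "B"],
        "instructor": ["Kim", "Kim"],
        "location":   ["On-site", "Remote"],
        "format":     ["Half-day", "Two evenings"],
        "avg_score":  [88, 71],
    }).set_index("cohort")

    # Flag every variable that differs between the two groups; these are the
    # candidates for answering "why did this happen?"
    differs = cohorts.loc["A"] != cohorts.loc["B"]
    print(cohorts.T[differs])

Anything flagged becomes a candidate to investigate; in this made-up case, location and format changed along with the score.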

Think of Training KPIs as Questions

You will always ask the question: why did this happen? But asking why is entirely too broad to be useful. Instead, we need to narrow our focus and ask precise questions.

A good way to get started with this is to consider your existing KPIs, OKRs, or other success metrics. You are going to be evaluated on how well you meet these goals, so they form a natural foundation for diagnosis.

Rewrite each of your KPIs as a question. Keep the questions short and direct to eliminate extraneous data. For example, if your goal is to increase average attendance, reframe the KPI as a question: why does attendance change? Take that example a bit further and ask: why is average attendance decreasing over time?

Pinpointed questioning keeps the report on track and helps training leaders draw an accurate connection between training outcomes and business objectives.
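
Once a KPI has been reframed this way (“why is average attendance decreasing over time?”), a first diagnostic pass can be a simple breakdown by one variable. The sketch below is illustrative only, with made-up month, format, and attendance fields; the idea is to see whether the decline is uniform or concentrated in one segment:

    import pandas as pd

    # Hypothetical event-level attendance records; swap in your own export.
    events = pd.DataFrame({
        "month":    ["2024-01", "2024-01", "2024-02", "2024-02", "2024-03", "2024-03"],
        "format":   ["In person", "Virtual", "In person", "Virtual", "In person", "Virtual"],
        "attended": [0.91, 0.88, 0.90, 0.74, 0.92, 0.63],  # share of invitees who attended
    })

    # The overall trend answers "what happened?"; the per-format breakdown
    # starts to answer "why?" by showing where the decline is concentrated.
    print(events.groupby("month")["attended"].mean())
    print(events.pivot(index="month", columns="format", values="attended"))

In this fabricated example, the drop sits entirely in virtual sessions, which narrows the next question considerably.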

Take Small Steps to Solve Complex Problems

As you begin to ask and answer these pinpointed queries, you will reveal more questions. It is easy to fall down a rabbit hole until you are trying to answer very large, very complicated questions that are disconnected from previous answers. To top it off, these are usually questions you didn’t have until you started turning over data and seeing what is hiding underneath. This is exhausting and can lead to burnout with advanced analytics.

Counter this by building a ladder, or progression, of questions from simple to complex. A progression may look like this:

  • Is a key onboarding course worth keeping, or should it be removed?
  • To answer that, you need to know whether the course is adding value.
  • To answer that, you can ask what percentage of learners are applying the concepts to their daily work.
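
That last question is concrete enough to answer with a quick count. As a rough illustration (the survey field below is invented), the share of learners who report applying the material might be computed like this:

    import pandas as pd

    # Hypothetical post-course survey responses; "applies_weekly" is a made-up field.
    survey = pd.DataFrame({
        "learner":        ["a", "b", "c", "d", "e"],
        "applies_weekly": [True, True, False, True, False],
    })

    application_rate = survey["applies_weekly"].mean()  # share answering yes
    print(f"{application_rate:.0%} of respondents apply the concepts weekly")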

The answers to these questions may spark new lines of questioning, and when they do, those questions are often more complex in nature. As more questions are asked, the training analysis comes into focus, and each iterative pass adds depth to the story behind the data.

You can learn more about how to improve your training reports with advanced analytics in our Learning Analytics Blueprint guide.
