
The One Step You Need to Add to Measure the Impact of Training


Published on: January 2018

Written by: Fredrik Schuller, Kevin Bronk

Originally published by TrainingIndustry.com

A New Mindset for Realizing ROI

Nearly every training professional says, “I care deeply about results.” However, the results of a training program are really tough (not to mention time-consuming and expensive) to measure, and we often settle for standard methods of determining a program’s effectiveness. Glowing smile sheets are the status quo, handed to trainers as participants leave the room after a week of intense learning. Or we run a Level 2 analysis grounded in a pre- and post-learning delta survey and eagerly ask participants, “How much do you think you’ve learned?”

We used to believe that best-in-class organizations went the extra mile, not only surveying participants but also evaluating them with engagement surveys or 360-degree assessments that compare participants to control groups. But if we actually care deeply about the results of training, these evaluation efforts – while valuable – are not that helpful for authentically measuring and, more importantly, realizing ROI.

The Missing Step for Measuring ROI

As program designers and facilitators, we need a new mindset for thinking about ROI, and we can look to the software as a service (SaaS) industry for inspiration. In the SaaS industry, organizations are driven by customer success, and this should be no different in the L&D world. Our participants are our customers, and we should apply a laser focus to enabling their success.

Picture a world where entire programs are designed around the actions participants will take or try after training. No more waxing philosophical for two days and then dedicating a tiny portion of the experience to action planning (in a notebook that may or may not be left behind, only to be recycled alongside all the frameworks and tools presented throughout the workshop). Instead, what if we intentionally designed every aspect of a learning experience around the actions we want participants to take? Whether the experience is in person or digital, brief or months long, the goal is to unleash participants to take action, apply their learning, and try new things back on the job.

But we can’t stop there. Focusing on customer success means going beyond intensive action planning. We need to ensure our participants are supported in making changes once they leave the workshop or log out of a module. Unfortunately, this support is easier said than done.

No More Settling

Twelve months ago, we embarked on an experiment to test this theory. We wanted to capture what participants said they would do or try after a program, identify whether or not they followed through on those actions, and track their outcomes. We developed technology that used email to track participants’ actions, removing annoying barriers like apps or login screens.

During programs, participants used their phones to input the experiments or actions they intended to take back on the job. These actions were captured in a database, and participants received follow-ups from the facilitators in the weeks and months after the program, providing encouragement and inquiring about progress and outcomes. At the beginning of this experiment, we observed about 20 percent of participants completing their actions or experiments and reporting success. Although low, this result was above average. In their book “High Impact Learning,” Dr. Robert Brinkerhoff and Anne Apking estimated that only 12 to 15 percent of participants successfully apply what they learn from a typical training engagement.
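For readers curious how such a lightweight loop might work under the hood, here is a minimal sketch in Python. Every name in it – the Action record, the follow-up schedule, the message template – is a hypothetical illustration built on the standard library, not the actual system we developed.

    # Hypothetical sketch of an email-based action-tracking loop (illustration only).
    from dataclasses import dataclass
    from datetime import date, timedelta
    from email.message import EmailMessage

    @dataclass
    class Action:
        participant_email: str
        description: str     # the experiment the participant committed to
        captured_on: date    # entered from the participant's phone during the program
        outcome: str = ""    # filled in as replies come back

    def follow_up_dates(captured_on, weeks=(2, 6, 12)):
        # Schedule nudges for the weeks and months after the program ends.
        return [captured_on + timedelta(weeks=w) for w in weeks]

    def build_follow_up(action):
        # One plain email per action: no app to install, no login screen.
        msg = EmailMessage()
        msg["To"] = action.participant_email
        msg["Subject"] = "How is your experiment going?"
        msg.set_content(
            f"You committed to: {action.description}\n"
            "Just reply to this email with your progress and outcomes so far."
        )
        return msg  # in production, this would be handed off to an SMTP server

    action = Action("participant@example.com",
                    "Hold a weekly one-on-one with each direct report",
                    date(2018, 1, 15))
    for due in follow_up_dates(action.captured_on):
        print(due, "->", build_follow_up(action)["Subject"])

The point of the sketch is the design choice described above: plain email keeps friction near zero, and the scheduled nudges do the tracking.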

After months of iteration across 25,000 participants, the average in our experiment increased to 40 percent. Some programs had as many as 80 to 90 percent of participants reporting success from new behaviors. Here are a few key takeaways from these remarkable results:

  • Proactive reminders and resources significantly increase the chance that a participant will try new behaviors, and the addition of targeted peer coaching aligned to actions produces even better results.
  • Some programs with sky-high ratings on smile sheets produce actions or outcomes that are misaligned with the program’s intent, and this discrepancy helps highlight missed opportunities in the program design.
  • Capturing success stories provides clients with unparalleled data and inspiration for what future participants could achieve after a similar learning experience.

This experiment fundamentally changed our approach to program design and the questions we ask executives when we begin building or updating programs. It’s still early, but the results are powerful. For example, we recently examined success stories from a series of programs with a major telecom provider and found that the actions of just four participants had driven a return of four times the L&D investment. This is just the tip of the iceberg.

In our quest to drive ROI from training, it’s easy to skip the step of considering what participants actually need to take away from a program. Customer centricity means we need to start with actions – how to turn learning into doing or trying new behaviors after a program. Then, and only then, can we truly realize meaningful, credible results.