Connecting the dots to measure sales training effectiveness

May 24, 2023

Six months ago, on a brisk January afternoon in Mumbai, we met three key people at one of our oldest clients. The first was the head of their India business, the second was the head of their HR function, and the third was a participant from one of our earlier workshops.


As we discussed their business and its future, one concern stood out, because it came up in all three meetings: they often could not determine (or agree on) how effective a training program had been, even months after it was completed.

The problem

On the rare occasions that one of them accepted a method to 'measure' effectiveness, the other two often disagreed with, or disregarded, the proposed measurement. For example, the HR head sometimes put her faith in measuring competency improvement, while the business head would accept only documented, on-the-job results.

We thought it was a remarkable coincidence that all three spoke about the same concern, until we found out how prevalent it was. After meeting a few more clients, we could confirm this was a major issue, and one worth solving.

Finding a solution

We worked with business heads, HR partners, participants and our team of advisors, and introduced a system of 'connected measurement' to address it. We have since been using this system to help clients evaluate the effectiveness of our programs (which are built around simulation-based workshops).

Connected measurement


The system measures two distinct aspects. The two are independent of each other, but each is fully connected from up-front agreement to final follow-up:

  1. Learning objectives, agreed upon in advance with HR and business leadership, are tracked at the end of our workshops to check how well participants have learnt, retained and become comfortable with these particular concepts, tools and frameworks.
  2. Action items, agreed upon with each participant (and approved by their respective reporting managers), are tracked months later to assess the program's actual on-the-job impact.

For the first part, i.e. learning objectives, the tools vary. Some clients want to track retention through an independent assessment, while others trust only their own employees, so self-assessment is the last word.

For action items, we have refined a system in which each participant decides on (and commits to) certain specific action items at the end of their workshop. These action items are shared with (and approved by) their individual reporting managers. We then go back three (sometimes even six) months after the workshop to assess whether they have been implemented, verifying every reported result with the reporting manager.
