Over the last few weeks, our CEO Philip Huthwaite has had dozens of conversations with L&D leaders. As we work on our exciting new product, we wanted to make sure we’re getting it right for the people who will actually be using it, so we’ve been asking all the important questions – in particular, how are L&D professionals measuring learning impact right now?
The answers were eye-opening. There is huge variation in how businesses across all sectors are measuring learning impact. But to be clear, this has nothing to do with the skill or intentions of the L&D teams themselves. We know what we should be measuring, but budget, time and tech constraints, as well as a lack of understanding from senior leaders, often hinder our progress. L&D is working really hard, and we think we have the solution to help them achieve their goals.
So, how are we really measuring learning impact in 2025, and what does that tell us about what we need in order to do it better? We spoke to both independent learning consultants and in-house L&D leaders to find out.
The fluffy side of ROI
When L&D leaders described how they're measuring ROI right now, two words jumped out: ‘fluffy’ and ‘woolly’:
“Clients are often satisfied with ‘fluffy’ or anecdotal outcomes, even without robust data.”
This includes things like post-learning surveys (asking how learners felt about the training), attendance at workshops, LMS logins and course completions – they may be easy to collect, but many of the leaders we spoke to said these metrics are just superficial proxies for real learning impact.
Of course, these metrics all have their place in L&D, but we can see why leaders find them ‘fluffy’. They tell us how many people attended a training session or completed a course, but not what they actually learned, or more importantly, what they did with their new knowledge.
The Kirkpatrick model

“I’ve yet to work in an organisation that effectively tries, let alone is capable of understanding, the genuine impact of training.”
The majority of learning leaders we spoke to mentioned the ever-popular Kirkpatrick model, which evaluates training at four levels: reaction, learning, behaviour and results.
However, we heard that many companies stop at Kirkpatrick level 1 (reaction surveys after the training), meaning they’re missing out on a huge amount of data relating to knowledge retention, real skills application and behaviour change.
One leader said that ‘companies don’t really care’ about learning ROI beyond compliance, leading to a lack of long-term measurement.
“I rarely see organisations truly measuring the impact of training. With the current methods, it's hard to quantify improvement or behavioural change.”
Assessing knowledge retention
The majority of organisations attempting to measure learning impact rely on traditional post-learning tests and assessments – for instance, a quiz at the end of an elearning module.
Beyond this, some businesses also build follow-up questionnaires into their learning programmes, allowing them to track knowledge retention and implementation at 3, 6 or 12 months after learning.
Some learning leaders we spoke to said they offer refreshers after the main training, but whether these refreshers are monitored to track knowledge retention is a very mixed bag. Sometimes the refresher training is delivered with no follow-up quiz or assessment, meaning L&D can’t measure how well the programme is working.
Pre- and post-assessments are also popular, allowing L&D to measure the difference in knowledge or skills before and after the training has taken place. These, combined with confidence assessments, give L&D some insight into how well learners understand the material and how confident they feel about applying it on the job.
Bringing it into the real world
Some organisations are going even further, and are integrating learning into the flow of work. Combining real-world tasks, mentorship, manager observations and post-training assignments gives a more holistic view of learners’ progress and behaviour change – but L&D leaders are conscious that it’s not always easy to go beyond anecdotal evidence.
One leader shared that A/B testing has historically been a reliable method for monitoring learning impact in the real world (e.g. splitting learners into groups and comparing business results). However, they acknowledged that this method can be limited, and requires a lot of setup.
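For anyone curious what that looks like in practice, here’s a minimal sketch of an A/B-style comparison – a hedged illustration only, with made-up numbers and group sizes rather than a recommended experimental design:

```python
# Minimal sketch: compare a business KPI between a trained group and a
# control group, A/B-testing style. All figures below are illustrative.
from scipy import stats

# Weekly sales (or any other business metric) per employee, collected
# after the training period ended.
trained_group = [42, 51, 38, 47, 55, 44, 49, 53]   # completed the training
control_group = [40, 39, 41, 36, 44, 38, 42, 37]   # haven't received it yet

# Welch's t-test: is the difference in averages likely to be real?
t_stat, p_value = stats.ttest_ind(trained_group, control_group, equal_var=False)

print(f"Trained mean: {sum(trained_group) / len(trained_group):.1f}")
print(f"Control mean: {sum(control_group) / len(control_group):.1f}")
print(f"p-value:      {p_value:.3f}  (below 0.05 suggests a genuine difference)")
```

Even a simple comparison like this needs careful setup – comparable groups, enough people in each, and a metric that isn’t swamped by seasonality or other external factors – which is exactly the overhead those leaders were pointing to.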
Syncing learning impact with business impact
Pretty much every L&D leader we spoke to would love to be in a position where they can measure the impact of all the learning they offer – not just the skills that are easy to track.
“There’s a slow but growing shift towards measuring actual workplace impact, accelerated by AI.”
Tying learning metrics to business KPIs, such as customer service case volume, is a more meaningful way to explore learning impact. Some L&D teams are already finding success using AI to track and analyse capabilities, while others are manually integrating learning data with business outcomes – for instance, using CRM data to measure improvements in sales over time.
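For teams doing that integration by hand, the process often looks something like the sketch below – a hedged illustration only, since every LMS and CRM exports data differently (the file and column names here are assumptions, not a real integration):

```python
# Minimal sketch: join LMS completion data with CRM sales data to compare
# revenue before and after training. File and column names are illustrative.
import pandas as pd

# Course completions exported from the LMS: one row per learner.
completions = pd.read_csv("lms_completions.csv", parse_dates=["completed_on"])
# Monthly revenue exported from the CRM: one row per employee per month.
sales = pd.read_csv("crm_monthly_sales.csv", parse_dates=["month"])

# Join the two sources on employee, then label each sales record as falling
# before or after that employee completed the course.
merged = sales.merge(completions, on="employee_id", how="inner")
merged["period"] = (merged["month"] >= merged["completed_on"]).map(
    {True: "after training", False: "before training"}
)

# Average monthly revenue per learner, before vs. after the training.
print(merged.groupby("period")["revenue"].mean().round(2))
```

A rough join like this won’t prove causation on its own, but it starts to answer the question senior leaders actually ask: did performance change after the training?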
Some businesses are also using Net Promoter Scores (NPS) to measure the impact of learning. While it can be tricky for L&D to take direct credit for improvements in NPS, the contribution of a learning initiative can often be inferred from the wider context – but of course, there is always an appetite to clarify the link between learning and business impact.
So, where does that leave us?
All in all, there’s a huge variety of methods and approaches for measuring learning impact in 2025.
Many of the learning leaders we spoke to acknowledged that attributing ROI to training is hard, as external factors often influence outcomes.
And while learning measurement in the majority of businesses remains relatively shallow (based on learner satisfaction surveys, happy sheets and rudimentary quizzes), that’s rarely an active decision from L&D. Learning teams are being held back by squeezed budgets, too much to do in too little time, limited stakeholder buy-in and outdated tech.
We don’t think that’s good enough. L&D needs tech that genuinely helps them achieve their ambitions – to measure ROI and demonstrate the immense value they’re adding to their businesses.
And we think we have the answer. Take a look at the very first sneak peek at our AI-powered skills management solution: