Summary:
Kevin M. Yates, also known as the L&D Detective™, has 30 years of experience working in training, learning, and talent development. In his current role, he investigates the extent to which training and learning contributes to workplace performance. In this episode, Kevin talks about how training is measured today; how that might change in the future; and how organizations can start measuring the impact training has on business and employee performance.
Chapters:
- Welcome, Kevin M. Yates!
- Today’s Topic: Measuring L&D’s Impact on Business Performance
[5:35 - 18:19] How is training measured today?
- All the different ways training can be measured
- How learning outcomes are affected by the tools and resources used
[18:20 - 32:06] What is the future of training measurement?
- It rhymes with “yay bye”… It’s AI!
- Using new tools to measure how learning and development contributes to workplace performance
[32:07 - 36:28] How can organizations begin measuring the impact training has on business and employee performance?
- Why you should intentionally plan for impact from the beginning
- How to approach ROI questions related to training programs or learning solutions
- Thanks for listening!
Quotes:
“We have tools that measure outcomes and then we have actual outcomes, and those two aren’t necessarily related.”
“For some [L&D] goals, you’re going to be very specific with how you’re trying to move performance . . . that’s where you have to have a hypothesis [about the ROI].”
Resources:
L&D Detective
Contact:
Kevin's LinkedIn
David's LinkedIn
Dwight's LinkedIn
Podcast Manager: Karissa Harris
Email us!
Production by Affogato Media
To schedule a meeting with us: https://salary.com/hrdlconsulting
For more HR Data Labs®, enjoy the HR Data Labs Brown Bag Lunch Hours every Friday at 2:00PM-2:30PM EST. Check it out here: https://hrdatalabs.com/brown-bag-lunch/
Powered by the WRKdefined Podcast Network.
[00:00:00] The world of business is more complex than ever. The world of human resources and compensation is also getting more complex. Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources. Listen as we explore the impact that compensation strategy, data, and people analytics can have on your organization.
[00:00:25] This podcast is sponsored by Salary.com, your source for data, technology, and consulting for compensation and beyond. Now, here are your hosts, David Turetsky and Dwight Brown.
[00:00:38] Hello and welcome to the HR Data Labs podcast. I'm your host, David Turetsky, alongside my co-host, friend, partner, and well, I guess...
[00:00:47] I don't know what else to say.
[00:00:48] I really want to see where you go with it.
[00:00:50] No, no, I'm going to end it there.
[00:00:52] Dwight Brown from Salary.com. Dwight, how are you?
[00:00:54] I'm good, David. How are you doing?
[00:00:56] I'm good. I'm good.
[00:00:57] Today, we're going to have a fascinating conversation.
[00:00:59] I think you and I both have asked the world to provide us with people to talk a little bit more about the ins and outs of the learning and development and training function.
[00:01:10] And I think we're going to have a pretty good time with that today because we have with us Kevin Yates.
[00:01:15] Kevin, how are you, sir?
[00:01:16] I'm doing great. I'm doing great.
[00:01:18] And I'm intrigued because you said that you asked the world for people who could talk about this.
[00:01:25] And so it sounds like the world recommended me.
[00:01:29] So I think one of the posts that Dwight and I had sent out into LinkedIn said, hey, I think we should be talking more about L&D and training.
[00:01:39] And it just so happens in this season, we've actually, this season and the last season, we have people that we've been talking to in that world.
[00:01:50] So karma provided us with opportunities to talk to brilliant people like yourself, Kevin.
[00:01:55] Wow. Well, thank you for that. I'm very humbled. And knowing that I was recommended by the world is a lot of pressure.
[00:02:04] No, no pressure.
[00:02:05] Let's go. Let's do it.
[00:02:07] So Kevin, why don't we start by giving us a little bit about your background?
[00:02:11] Yeah. So my background is leading up to about 30 years in training, learning and talent development.
[00:02:19] It's been a great journey, man. It's been a great run, particularly as I think about the progression of my career and the different hats that I've worn, the different organizations with whom I've worked and really working with some marquee brand businesses and organizations.
[00:02:34] And so my career in training, learning and talent development started as a trainer.
[00:02:40] Gosh, 30, like I said, about 30 years ago at a small community bank on the south side of Chicago, where I was doing stand up training day to day on bankware applications and customer service that ultimately led to a role in instructional design, which ultimately led to a role in curriculum development.
[00:02:58] And that led to multiple roles in like learning operations, learning administration, leadership development, learning solutions, learning technology.
[00:03:10] And ultimately, I landed like where I am right now with focusing on measurement, more specifically investigating the extent to which training and learning contributes to workplace performance.
[00:03:23] So that's, you know, the trajectory. And as I said, I've worked with some amazing organizations, most recently Meta, formerly known as Facebook, and McDonald's and Kemper Insurance and Kantar Media and Grant Thornton and Information Resources.
[00:03:39] So it's been a great journey. And all of the work that I've done and all the roles that I've served have really informed the work that I do today.
[00:03:47] It's been a really interesting journey. And we're going to take and try and listen to a lot of your experience in that.
[00:03:54] But first, before we do that, we need to hear from you. What's one fun thing that no one knows about Kevin Yates?
[00:04:01] One fun thing. Well, some people know and some people don't, but I love a glass of wine.
[00:04:09] And more specifically, what some people may not know about me is that I have a preference for white wines.
[00:04:15] So I love a good Riesling. I love a good Gewürztraminer.
[00:04:19] I've tried to transition to red, but it's just not working.
[00:04:23] So, so again, it's a secret to some, some know it, some don't, but I love a good glass of white wine, particularly in the summer months.
[00:04:31] All right. Well, next time I see you, I'll buy you one.
[00:04:33] Okay. I'm going to hold you to that.
[00:04:35] Okay. Well, don't worry about that. I'm happy to buy it for you.
[00:04:38] I'm going to have a Diet Coke, but you can have the glass of wine.
[00:04:40] Okay.
[00:04:41] But let me ask you a question. Is it the first glass that you love or is it the last glass?
[00:04:45] Ooh, now you're getting all up in my business as we like to say.
[00:04:50] Or maybe it's a really big glass.
[00:04:52] Yeah, right. Let's go with that.
[00:04:54] Let's go with that. Okay. Yeah. A really nice big glass.
[00:05:00] There we go. There you go.
[00:05:02] Well, you need to decant and you need to let it breathe.
[00:05:05] So, you know, the big glass probably serves a lot of purposes.
[00:05:07] Yeah. Let's go with that.
[00:05:10] All right. Cool.
[00:05:11] So today's topic is one that's near and dear to the hearts of the HR Data Labs podcast,
[00:05:15] probably from back from the beginning, which is talking about measuring training and learning's
[00:05:21] impact on human and business performance and trying to get into where it's been and where it's going.
[00:05:34] So, Kevin, we got to ask the question, how is training measured today?
[00:05:40] Man, where do I start?
[00:05:42] So when I think about where training is today in terms of the measurement journey,
[00:05:48] I think that we're at different points in the journey.
[00:05:52] And by we, I mean the training, learning and talent development community.
[00:05:55] And so there are organizations and teams who are measuring traditional things like how many people did we train?
[00:06:04] How many hours did we offer?
[00:06:06] How many courses are in our catalog?
[00:06:09] How much time are people spending in and with our training and learning solutions?
[00:06:15] So those are very traditional types of measures.
[00:06:18] And there are many organizations who are at that traditional point in the journey of measurement.
[00:06:24] And on the other end of the spectrum, in terms of where we are with measurement,
[00:06:29] is organizations who are doing some great work with focusing on measuring how training and learning
[00:06:36] is contributing to human and business performance.
[00:06:39] So to kind of take that back to your question, where are we on the measurement journey as a profession?
[00:06:45] I think that we're at many points along the spectrum, right?
[00:06:49] Different organizations, different teams are at different points.
[00:06:53] There are some, again, who are engaged in the very traditional types of measures.
[00:06:59] And then there are some who are chasing after the very advanced progressive types of measures.
[00:07:06] And those are the ones that give us the insight on how training and learning is measurably contributing to workplace performance.
[00:07:13] And that's where I like to spend most of my time.
[00:07:15] Before we get there, though, Kevin, because I definitely want to touch on that.
[00:07:19] When I think about learning, typically I think about it filling gaps in skills or filling a future need, and there's also compliance training, right?
[00:07:31] If you look at the different types of things, whether it's skill-based learning and contributing to removing gaps or where I've also seen, which is compliance training,
[00:07:41] where do you think the pendulum or the balance of training sits today?
[00:07:46] Is it more in one versus another?
[00:07:49] Is it across the board or does it depend on the company?
[00:07:51] I think it depends on the company.
[00:07:53] And I think it's across the board, right?
[00:07:55] Because you just gave some great examples for different ways in which we are trying to fulfill a purpose with training and learning, right?
[00:08:03] So there are some purposes for compliance and regulatory training.
[00:08:08] And then there are some training and learning solutions that are purposely designed to sustain or move performance.
[00:08:16] So in terms of where we are, I would say we're all over the place.
[00:08:20] I mean, I don't mean that in a negative way.
[00:08:22] But again, I think it depends on how you are measuring purpose fulfillment.
[00:08:27] Because if there is a purpose for regulatory and compliance training, then you're going to measure completions because you have to somehow demonstrate that 100% of your population has fulfilled requirements for some type of regulatory compliance training, right?
[00:08:43] Right.
[00:08:43] And then on the other end, there are training and learning solutions that are purposely designed to move or sustain performance.
[00:08:51] And so then we engage in measurement to show the extent to which training did what it was intended to do.
[00:08:58] So we're all over the spectrum.
[00:09:00] And again, I don't mean that in a bad way.
[00:09:02] But again, it just depends on purpose.
[00:09:04] And then outcomes are measured and determined by evaluating purpose fulfillment.
[00:09:12] Does that make sense?
[00:09:13] It totally makes sense.
[00:09:14] And as we go through today, what I would like to do is kind of revisit the differences and the purpose as we're measuring or as we're talking about measurement.
[00:09:26] Because for a lot of the people that are going to be listening, some of their investment or a lot of their investments get targeted based on the state of the economy, based on their industry, based on the maturity of the company.
[00:09:38] And I think a lot of them might be interested to understand if there are differences in measurement that they need to focus on, specifically because of those differences.
[00:09:48] That makes sense.
[00:09:49] So, Kevin, let's then go back to where I think you were headed, which was the focus on how measurement then is purposefully built for today around the outcomes.
[00:10:02] I think what you were talking about or where you were going was, was that companies settle based on what they have to do, right?
[00:10:09] And where they have to be in order for all those different things to be satisfied.
[00:10:14] Yeah.
[00:10:14] And it depends on the industry.
[00:10:17] It depends on the culture.
[00:10:20] And it depends on the goals, right?
[00:10:22] So, for example, there are some industries that are heavily regulated.
[00:10:27] And so, for those types of industries, you know, compliance and mandatory training is going to be a priority just by the nature of the business and what those organizations have to do to stay aligned with whatever those regulatory and compliance obligations are.
[00:10:44] Right.
[00:10:45] So, there's that.
[00:10:46] And then there's those organizations that aren't regulated by, you know, any type of industry compliance requirements.
[00:10:57] And in those organizations, they may be more focused on things like people development, leadership development, skills and capabilities, which is not to say that those industries that have a heavy regulatory and compliance aspect don't focus on those things.
[00:11:14] But what I'm trying to do is just give you the spectrum, right?
[00:11:18] Sort of a weighting on that spectrum.
[00:11:20] Yeah.
[00:11:21] Yeah.
[00:11:21] That makes sense.
[00:11:22] Kevin, does that mean that they're actually using different technologies or different methods of collecting that data?
[00:11:29] Or is it just, are they using more of the same common technologies?
[00:11:33] So, that means that they may be using the same technology, but that they are getting different things from the data ecosystem.
[00:11:45] Sure.
[00:11:45] Right.
[00:11:46] And here's what I mean by that.
[00:11:47] So, you can get from an LMS, and most organizations have an LMS, most, not all.
[00:11:53] You can get from an LMS, how many people did we train?
[00:11:58] How many completions do we have?
[00:12:00] How many hours of training did people complete?
[00:12:03] And there may be organizations who are at different points in the measurement journey who are collecting that, but they're all using an LMS to do it.
[00:12:14] Does that make sense?
[00:12:15] Because their question was, are there different technologies depending upon where you are in the spectrum?
[00:12:20] So, where you might see a similarity, again, of what people are measuring that is kind of like spectrum agnostic, if you will, is the types of data that you get from an LMS.
[00:12:32] Now, let's take it to the other end, where we want to use facts, evidence, and data that give insight into human and workplace performance.
[00:12:40] And if you really want to get sophisticated, then you might be using a data warehouse to do that, to extract data and analyze data.
[00:12:47] So, if you are a smaller organization, a smaller business, maybe you don't have a data warehouse yet, which means that the types of insights that you can get about training and learning's contributions to workplace performance might be a bit more difficult to come by.
[00:13:04] As compared to, say, some really large organizations who are further along the journey, who have data warehouses and who have easy access to business performance data and human performance data.
[00:13:15] You know, they might be using tools that those other organizations aren't using, right?
[00:13:20] So, it might be something like using Tableau or Power BI, right?
[00:13:25] So, that might be, those tools might be exclusive to those organizations that are further along in the journey compared to those who might not be as far along, who are pretty much just relying on, say, their LMS data and maybe even, I don't know, Google Sheets or Microsoft Excel.
[00:13:41] Yeah, a question that kind of popped to mind for me, it's a little bit of a tangential question, but you bring up a key point in terms of obviously some more sizable organizations are going to have more resources to invest in these various LMSs and measurement systems and everything.
[00:14:01] In your experience, do you typically see a big difference in learning outcomes based on the availability of resources to be able to put to these?
[00:14:15] What do you see in the measurement?
[00:14:17] Yeah, that's a great question.
[00:14:18] And I think that this is a good point to really kind of create a separation between church and state, if you will, right?
[00:14:24] Because on the one hand, we're talking about technology and tools that allows you to measure.
[00:14:28] And then on the other hand, we're talking about measurable outcomes.
[00:14:34] You know, again, we have tools that measure outcomes and then we have actual outcomes.
[00:14:39] And those two aren't necessarily related because outcomes are the result of training and learning solutions that produce either a shift or a change of performance or a way in which performance can be sustained.
[00:14:56] So the learning experience itself, the training program, the learning solution, whatever that is, hopefully the goal for the creation of a training program or a learning solution is to move performance or sustain performance.
[00:15:14] So there's that.
[00:15:15] And then you have the tools and the technology and the methods that measure the extent to which performance is moved or changed.
[00:15:26] So really, the only connection or relationship that those two have with each other is that you use one to measure what happened and then you use the other to make it happen.
[00:15:38] Right.
[00:15:38] Does that make sense?
[00:15:39] Yeah.
[00:15:40] And so do you see, do you see that organizations that don't have the resources, let's just stick with the measurement side of the fence on that.
[00:15:47] Do you see that organizations that don't have the resources for that are at a disadvantage?
[00:15:52] Or do you see them figure out different ways to more effectively measure the outcomes or measure their, measure their learning throughout the continuum?
[00:16:04] Yeah, that's a great question.
[00:16:06] And if I were to restate what I think you're saying just to confirm so that I answer correctly, I think you're asking about the extent to which not having access to tools and technology inhibits the ability.
[00:16:18] Right.
[00:16:18] To answer the question, what is training and learning's contribution to workplace performance?
[00:16:23] Is that kind of what you're saying?
[00:16:24] Yeah.
[00:16:25] Perfect.
[00:16:25] Yeah.
[00:16:26] I don't know that I'd use the word disadvantage so much as maybe not being able to go as far as you could go in the presence or with the use of tools and technology that helps you go as far as you can go.
[00:16:44] Right. So, for example, if you don't have a data warehouse and if you don't have the tools that help you extract data in ways that help you gain insight, you might just have to go at it another way.
[00:16:59] Right. Which means it might take longer to get at the answer than, say, an organization who has those tools and those technologies and those resources. So the questions can be answered, but it just might depend on how long it takes, because you might have access to not only tools and technologies, but expertise.
[00:17:23] Right. If you have access to that, and those tools, technologies, and expertise are helping you answer the question,
[00:17:31] what is the contribution of training and learning? You're able to do that a lot faster and maybe with a lot more confidence than, say, a smaller organization that does not have access to those tools, that technology, and that expertise.
[00:17:44] And so it might take those organizations that don't have access a little longer. So, you know, I could use the word disadvantage, but I would just say, you know, it just takes a little longer if you don't have access.
[00:17:56] I would call it brute force. You need to brute force, rather than finesse, the analysis, rather than having it actually come out as an output of the technologies you're using.
[00:18:06] Yeah.
[00:18:07] Yeah. Makes sense.
[00:18:10] Like what you hear so far? Make sure you never miss a show by clicking subscribe.
[00:18:14] This podcast is made possible by salary.com. Now back to the show.
[00:18:20] So, Kevin, let's go to the next question, which is the one I really can't wait to hear, which is how could training be measured?
[00:18:29] What is the future of this? Where are we going?
[00:18:32] So you're probably not going to be too surprised to hear what I'm about to say.
[00:18:36] Is it a two letter initial?
[00:18:38] How'd you guess?
[00:18:41] How did you guess?
[00:18:43] And the first letter starts with yay.
[00:18:46] And the second letter starts with bye.
[00:18:49] So, yeah.
[00:18:51] So we're talking about AI.
[00:18:52] We're talking about artificial intelligence.
[00:18:54] But you know what I'm going to do?
[00:18:55] I'm going to tone down the hype.
[00:18:57] Okay.
[00:18:58] Because I'm going to contextualize what I believe artificial intelligence can do and what it can't do.
[00:19:04] So for me, you know, particularly when you consider who I am known as in the industry and I am known as the L&D detective.
[00:19:12] Right.
[00:19:13] And you guys may be old enough to remember or maybe you're not.
[00:19:17] Maybe you're just 20 something.
[00:19:20] But maybe you guys remember.
[00:19:21] I love how you give us the benefit of the doubt.
[00:19:23] Well, I tried.
[00:19:24] I tried.
[00:19:25] My gray hairs betrayed that right away.
[00:19:27] That's right.
[00:19:28] So you guys might remember Sherlock Holmes.
[00:19:31] Right.
[00:19:32] Of course.
[00:19:32] And he had an assistant whose name was Watson, Dr. Watson.
[00:19:35] Right.
[00:19:36] So in my work as the L&D detective, I consider artificial intelligence to be my Watson.
[00:19:41] Meaning artificial intelligence assists me with conducting impact investigations.
[00:19:50] I don't believe that they can do it for me, but artificial intelligence can certainly help me work smarter, not harder and definitely faster.
[00:19:59] Right.
[00:20:00] Right.
[00:20:01] So in answer to your question, like, where do I see us headed or even like where are we kind of right now?
[00:20:08] I see where artificial intelligence can and is going to do a great job at supporting us and assisting us in measuring training and learning's contribution to workplace performance.
[00:20:22] Again, I don't believe that artificial intelligence can take my place as the L&D detective, just because there are some nuances for measuring training and learning's contribution that are uniquely human.
[00:20:36] Meaning there's just this human side of that work that a machine can't do for us.
[00:20:41] But what it can do and by it, I mean, AI, artificial intelligence.
[00:20:45] It can help me work a lot faster.
[00:20:47] It can actually take away some of the work that I don't like doing and it can do it for me and it can do it a lot faster than I ever could.
[00:20:56] And so I'm excited to continue to use AI as my Watson when I am conducting impact investigations.
[00:21:06] That's where I think we're headed.
[00:21:07] And I also believe and I hope that we're headed in the in the direction of really focusing on measuring training and learning's contribution to performance.
[00:21:21] I think that we know how to measure how many people we train, how many courses we offer, how many hours of training were completed.
[00:21:28] We know how to do that.
[00:21:29] I mean, that's just easy now.
[00:21:31] Right.
[00:21:32] But where we have not been as focused because it is not as easy is producing fact based evidence that shows how training and learning is measurably contributing to workplace performance.
[00:21:45] So I believe that artificial intelligence is helping us and will continue to help us do a much better job at answering that question.
[00:21:53] Yeah, Kevin, I think where I totally agree with you is that a lot of the things we're dealing with are facts, and a lot of the things we're dealing with within the context of measuring training outcomes enable us to look at facts that have a ton of data associated with them.
[00:22:15] I think you're right.
[00:22:16] I think AI can assist there.
[00:22:18] The part where I'm going with my question is going to be, you know, is this going to help us with the correlative versus causal?
[00:22:28] Because obviously, if you're trying to say, am I getting the ROI out of training?
[00:22:35] You're going to always have that question.
[00:22:37] Well, did it cause it or is it just absolutely just correlative to the things that naturally happen?
[00:22:43] The answer to your question is yes.
[00:22:46] Yeah.
[00:22:48] Well, here's what I mean by that, man.
[00:22:50] You know, I think that when we are investigating the extent to which training and learning measurably contributed to human and business performance, I think we're looking at causation and correlation because that's where the story is, right?
[00:23:03] Yeah.
[00:23:03] I don't see us focusing more on one and less on the other.
[00:23:10] I think that if you're going to tell a good, robust, fully inclusive story, we have to talk about correlation and causation.
[00:23:18] And artificial intelligence as an assistant to impact investigation helps us reveal correlation and causation.
[00:23:28] Because I believe getting the answers to both of those informs decisions about what we do with our training and learning solutions, what we should do, and what we need to stop doing.
[00:23:41] Yeah.
[00:23:42] And that's where I was saying we've got so many facts going into those models, right?
[00:23:46] We have so many facts that we can put in.
[00:23:48] Like, I'll go back to the thing that I brought up at the beginning about skills, right?
[00:23:54] You have skills on every job.
[00:23:55] And now we're going to be able to do, let's just say we do assessments on skills on people and we know that gap.
[00:24:02] Well, you know you took a training and we assess that person again.
[00:24:06] Do they still have that gap?
[00:24:08] So we can, you know, we could obviously do the math ourselves, but not at scale.
[00:24:13] So I love where you're going with this, that the AI can help us do these things at scale and not just on the microcosm of that person with those skills, you know, that we've enumerated by job.
[00:24:24] Now we've tested those people and we've now closed those gaps.
[00:24:28] Yeah, that's all great.
[00:24:29] But now you can actually prove out the ROI of the investment in not just the skills themselves, but the people, the training, and the measurement of it.
[00:24:38] You're proving the ROI because now you're telling the company, look, not only did we close the gaps, but now we're also seeing better performance from it.
[00:24:46] And here's how that got contributed to with all these different factors.
[00:24:50] Yeah, and I would add to that, and this is just really important message for me, that not only is training and learning a good solution for moving performance, but it's also a good solution for sustaining and maintaining performance, right?
[00:25:02] Quite often the conversation is how can training and learning change performance?
[00:25:06] And sometimes that is what is needed.
[00:25:09] But there are also times where we just need to keep the train on the track, so to speak.
[00:25:13] And so training and learning can then become a good, viable solution to maintain and sustain performance where it needs to be maintained and sustained versus where it needs to be changed.
[00:25:24] And I think then it also lends to the AI models being able to then provide guidance to the practitioners on, hey, person A has great skill sets.
[00:25:37] If we sent them to training, they could be a succession candidate for this set of jobs based on what we've seen as success, not only in being able to close those skill gaps, but knowing we have this training there.
[00:25:50] And then, you know, for the outcome of succession, being able to give that person a path, a career path, which is obviously necessary, and being able to use it outside of just the world of training, but being able to provide that to that HR person slash the employee so they know where they can head.
[00:26:06] So there's a lot of really cool outcomes you can get from this if you've got that data and if you're measuring it correctly.
[00:26:13] Right?
[00:26:14] I mean, am I off base on that?
[00:26:15] No, you're on track.
[00:26:17] And I would also add to this conversation the idea that when it comes to performance, training and learning is not the only thing that influences business performance and human performance.
[00:26:29] Right.
[00:26:29] And I think that that's where we have to be very careful with our storytelling.
[00:26:33] We don't want to position training and learning as being like the savior or being, you know, like the magic wand or that training, learning and talent development teams and organizations have, you know, pixie dust and, you know, ways in which we don't.
[00:26:50] No, we don't.
[00:26:51] We don't.
[00:26:52] And quite often the idea or the perception is that, you know, we need to fix people and training will fix them.
[00:27:00] So let's do some training.
[00:27:02] Right.
[00:27:02] Right.
[00:27:02] What we really have to consider as it relates to performance is all that there is that contributes to the performance ecosystem.
[00:27:12] And so when I talk about that, guys, what I'm talking about is all that there is that has the potential and power to influence human performance.
[00:27:23] That includes training and learning, but it is not limited to training and learning.
[00:27:26] And so what are some of the other things that contribute to human performance?
[00:27:32] That's a great question.
[00:27:33] I'm glad you asked, because some of the things that contribute to human performance include manager coaching.
[00:27:42] Right.
[00:27:42] Yeah.
[00:27:42] It includes compensation.
[00:27:45] We all, well, I shouldn't say we all, but most people want to get paid.
[00:27:49] Right.
[00:27:50] And more.
[00:27:51] And more.
[00:27:51] Right.
[00:27:52] Rewards and recognition.
[00:27:54] Tools and technology.
[00:27:56] Performance support.
[00:27:58] Culture.
[00:27:59] Natural ability.
[00:28:00] Those are some of the things, including training and learning, that influence people's performance.
[00:28:08] So we have to be thinking about that.
[00:28:10] And then when we think about all that there is that contributes to business performance, because ultimately, when we talk about training and learning contributing to workplace performance, we're talking about human and business performance.
[00:28:21] But let's think about that.
[00:28:22] Right.
[00:28:23] So the training team can contribute to business performance, but so does, for example, the marketing team or the products and innovation team or the sales team.
[00:28:36] Right.
[00:28:36] So the essence of what I'm saying, guys, is that as we think about measuring the impact of training and learning, we need to be thinking about all that there is that contributes to performance.
[00:28:47] Because at the end of the day, I believe that training and learning fulfills the highest purpose with measurable contribution to performance.
[00:28:55] And I use the word contribution intentionally and purposefully because we need to be thinking about all that there is that contributes to performance, not just training and learning.
[00:29:06] And part of what you guys are talking about is it's that nuance factor.
[00:29:10] So you're taking the facts and you're either applying AI or you're applying human thinking and probably optimally both of those together in concert to understand the nuance behind just the hard facts.
[00:29:26] Yes. And I am 110% in agreement with you also on the fact that I think our reflex reaction anytime there's a decrement in performance or some issue that we see is, oh, we've got to train.
[00:29:39] We've got to train. We've got to do training.
[00:29:41] It drives me nuts.
[00:29:44] But it's that nuance piece of things in the measurement process with the training and development that I think is really kind of the heart of where you start to see that effectiveness.
[00:29:59] But you've got to be able to understand that nuance to be able to get to how do we train?
[00:30:05] How do we measure the effectiveness of the training?
[00:30:07] And, you know, it's kind of a continual circle.
[00:30:09] Yeah.
[00:30:10] Yeah. And, you know, guys, for me, in my L&D detective technique, there are nine questions that I ask business partners and stakeholders.
[00:30:18] And the answers to those questions give really good insight into when training and learning is part of the solution and when it is not.
[00:30:28] Right. Those nine questions also help determine if training and learning is the answer and the solution or part of it.
[00:30:36] The answers to those nine questions also proactively determine what you're going to measure.
[00:30:41] What I continue to see, and it is so disappointing, is training and learning is designed, it's built, it's launched.
[00:30:52] And then there is consumption and utilization of it or participation in it.
[00:30:56] And then the follow up question is, well, what's the impact?
[00:31:00] So quite often in my career, I have been brought in at the end, where the training, again, has already been designed, launched, consumed, utilized, and participated in.
[00:31:10] And then I'm asked to measure the impact.
[00:31:13] And my follow up question is, well, what was the intended impact?
[00:31:17] And the answer to that question is, well, we don't know.
[00:31:19] We just want you to find the impact.
[00:31:22] And, you know.
[00:31:23] Measure something, Kevin.
[00:31:24] Measure something.
[00:31:25] Exactly.
[00:31:25] Measure something.
[00:31:26] Give us some data.
[00:31:27] We don't care what it says.
[00:31:28] Yeah.
[00:31:28] Yeah.
[00:31:28] But I think that if we are strategic and deliberate and work through some of that at the front end,
[00:31:36] measuring training and learning's contribution is going to be much easier to do on the back end.
[00:31:41] Hey, are you listening to this and thinking to yourself, man, I wish I could talk to David about this?
[00:31:46] Well, you're in luck.
[00:31:47] We have a special offer for listeners of the HR Data Labs podcast.
[00:31:51] A free half hour call with me about any of the topics we cover on the podcast or whatever is on your mind.
[00:31:58] Go to salary.com forward slash HRDL consulting to schedule your free 30 minute call today.
[00:32:07] So I don't want to lose sight of the fact that one of the things that we want to have as a goal from this discussion is also to talk about how do we get there?
[00:32:15] Because you're talking about a lot of great things, Kevin.
[00:32:18] But one of the things I think that listeners will think about as they're contemplating, you know, measuring outcomes from learning and going into the learning with the mindset of what's our goal?
[00:32:29] What are we trying to accomplish?
[00:32:30] That's a really good learning. But how do I get there?
[00:32:34] What are the other things that you would suggest people do in terms of getting started on this journey of measuring training and being able to align with business and employee performance?
[00:32:45] Well, I'm going to be intentionally repetitive to answer that question, because the question is, what is it that people need to be doing to get to a point where they can measure the impact of training and learning?
[00:32:55] And what I just said a few moments ago was that we need to be proactive in our planning and our thinking.
[00:33:01] So, again, there are nine questions that I have.
[00:33:04] And, you know, no shame here, but I'll just put out this plug: in the L&D detective kit, which is on my website,
[00:33:13] I identify what those nine questions are.
[00:33:15] So, in the L&D detective kit, there is a methodology that I illustrate for how to measure the impact of training and learning, and how to be proactive with doing that so that you don't get in that situation where you've designed and launched, and people are consuming and using your learning and training solution,
[00:33:37] and then you ask, well, what was the impact?
[00:33:39] Well, what I do in the L&D detective kit is show how to ask nine questions and use the answers to those nine questions to not only design training that will purposefully contribute to workplace performance, but also how you're going to measure it.
[00:33:56] Great.
[00:33:57] So that's my recommendation.
[00:34:00] You know, my recommendation is to intentionally and purposefully plan for impact in the beginning so that it's easier to measure in the end.
[00:34:08] Sure.
[00:34:08] And I show how to do that in the L&D detective kit.
[00:34:13] It's on my website at KevinMYates.com.
[00:34:15] And we're definitely going to have that link available in the show notes.
[00:34:19] And, Kevin, that's brilliant.
[00:34:21] I love where you're going with this.
[00:34:23] Is there a need?
[00:34:24] I mean, as a good data scientist and also as a good econometrician,
[00:34:27] is there a need at the beginning to have either, I don't want to call it an ROI, but a hypothesis about your ROI and about the goal, a hypothesis?
[00:34:38] It's not just if we do this training, we're going to do better.
[00:34:41] That's kind of, that's really cheating.
[00:34:44] But is there a hypothesis that you have to come up with that says that performance will increase by X percent to give that ROI to the business leaders?
[00:34:53] And actually, it goes back to your point of informing the direction of where you want the training to go.
[00:34:58] But is there really, do you need to get that specific and that sophisticated or can you be much more obvious about it?
[00:35:05] I think it depends on the goal, right?
[00:35:08] Because for some goals, you're going to be very specific with how you're trying to move performance.
[00:35:13] And so it's going to be reducing X by five percent, increasing Y by three points, as an example, right?
[00:35:24] So that's where you have to have a hypothesis that says training and learning, or rather the goal of training and learning in combination with other influences and contributors will be that we reduce errors by 3 percent.
[00:35:40] Right.
[00:35:40] That's where you're going to get very specific, right?
[00:35:43] There may be other types of situations that may be less specific, but where the goal is still clear, right?
[00:35:53] So it really depends on what the goal is.
[00:35:57] It depends on who the other key players are in terms of achieving that goal.
[00:36:05] And it also depends on all the variables that influence the movement or stability of business performance metrics.
[00:36:14] So there is no one answer, but hopefully I just kind of gave you context for how you want to be thinking about it.
[00:36:28] I can't think of a better way to end the program because I think you just dropped the microphone on being able to solve ROI for people who are kind of...
[00:36:38] And I've done lots of investment in ROI analyses on training and development programs in the past.
[00:36:47] And then one of the things I've mistakenly tried to do is the kitchen sink.
[00:36:52] And you can't solve for kitchen sink.
[00:36:54] You've got to be able to solve for individualized goals, maybe even business goals, or look at the overall organization's business goal. But you have to find a goal, make it something that's potentially addressable, and then go for it, right?
[00:37:09] Yeah, you're so right.
[00:37:10] And what you just said brings to mind those times where I have been asked to measure the impact of L&D.
[00:37:18] And I'm like, well, what does that mean exactly?
[00:37:20] Right.
[00:37:21] We want to measure the impact of L&D.
[00:37:24] I'm not quite sure how to do that, right?
[00:37:26] Right.
[00:37:27] I do believe there are ways in which to measure how specific training programs and learning solutions have contributed to human and business performance.
[00:37:38] I know how to do that, right?
[00:37:40] But to say that we want to measure the impact of L&D, that's big.
[00:37:45] That's like boiling the ocean.
[00:37:46] So I focus more on specific training programs, specific learning solutions that have been designed to produce specific outcomes.
[00:37:58] And that is where I focus in terms of what I measure, versus measuring the air-quote "impact of L&D."
[00:38:05] That's, I haven't seen it done yet.
[00:38:08] Maybe it can be done.
[00:38:09] I don't know, but I've not seen that.
[00:38:11] Well, let's ask our friend AI and see if it can.
[00:38:15] You know what?
[00:38:15] I'm going to try that as soon as we end our discussion today.
[00:38:19] I'm going to go to ChatGPT and see what it says, man.
[00:38:24] Well, who knows?
[00:38:25] I mean, AI may have figured it out.
[00:38:27] Who knows?
[00:38:29] May have.
[00:38:30] But to your point, I think even that's a little bit beyond where the models are today.
[00:38:34] But what we might want to do is we'll come have another episode maybe next year and see if AI did solve that problem yet.
[00:38:42] I'll meet you back here next year.
[00:38:43] I'm all for it.
[00:38:44] Awesome.
[00:38:45] All right, cool.
[00:38:46] Kevin, thank you so much.
[00:38:48] Thank you for having me.
[00:38:49] Thank you, guys.
[00:38:50] Thank you, guys.
[00:38:50] Great to be here.
[00:38:51] Thank you.
[00:38:52] And thank you, Dwight.
[00:38:53] Thank you.
[00:38:54] Hope you both have a wonderful rest of your day.
[00:38:57] And thank everybody for listening.
[00:38:59] Take care and stay safe.
[00:39:01] That was the HR Data Labs podcast.
[00:39:04] If you like the episode, please subscribe.
[00:39:07] And if you know anyone that might like to hear it, please send it their way.
[00:39:11] Thank you for joining us this week and stay tuned for our next episode.
[00:39:14] Stay safe.