Chris Taylor, Founder and CEO of Actionable, joins us this episode to discuss why—and more importantly, how—organizations might invest in measuring behavioral change resulting from training and development initiatives. 


[0:00] Introduction

  • Welcome, Chris!
  • Today’s Topic: How to Measure Behavioral Changes with Learning Interventions

[4:08] What behavioral change should we expect from talent development programs?

  • Establishing a baseline prior to measuring behavioral change
  • The pertinence of behavioral change across all training programs

[12:25] How can organizations measure engagement?

  • Starting with the organization’s strategic priorities
  • Lessons learned from a case study in high employee turnover

[23:51] How can organizations ensure that behavioral change lasts?

  • Helping participants discover their “why” in learning
  • The value of post-session facilitator-participant follow-up

[33:06] Closing

  • Thanks for listening!


Quick Quote

“If we assume [an organization’s] strategic priorities involve achieving something that we’ve never achieved before, then it’s going to require some new . . . processes and/or competencies.”

Resources:

Actionable

Contact:
Chris' LinkedIn
David's LinkedIn
Dwight's LinkedIn
Podcast Manager: Karissa Harris
Email us!

Production by Affogato Media

To schedule a meeting with us: https://salary.com/hrdlconsulting

For more HR Data Labs®, enjoy the HR Data Labs Brown Bag Lunch Hours every Friday at 2:00PM-2:30PM EST. Check it out here: https://hrdatalabs.com/brown-bag-lunch/

Powered by the WRKdefined Podcast Network. 

[00:00:00] The world of business is more complex than ever. The world of human resources and compensation is also getting more complex. Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources. Listen as we explore the impact that compensation strategy, data, and people analytics can have on your organization.

[00:00:25] This podcast is sponsored by salary.com, your source for data, technology, and consulting for compensation and beyond. Now, here are your hosts, David Turetsky and Dwight Brown.

David Turetsky: Hello and welcome to the HR Data Labs podcast. I'm your host, David Turetsky. And as always, we try and find brilliant people inside and outside the world of HR to bring you the latest on what's actually happening. Today, we have with us Chris Taylor from Actionable. Chris, how are you?

Chris Taylor: I'm fabulous, David. Better for being here. How are you?

[00:00:55] David Turetsky: I'm good. I would never say fabulous because, and I'm not trying to put this on you, but I actually don't even know the concept of fabulous in terms of how I am anymore. I mean, I've been to the Rangers winning the Stanley Cup. That was fabulous. I've been there when my kids were born. That was fabulous. No offense, dude, but this doesn't rise to that level yet. But we'll see. Maybe at the end of the podcast, I will be.

[00:01:24] Chris Taylor: Maybe. Something to aspire to.

David Turetsky: Yeah, yeah. Well, it's good to have aspirations, as they say. Chris, tell us a little bit about yourself and about Actionable.

Chris Taylor: For sure. So I'm fabulous because I'm just back from Lisbon, Portugal, where I did very little but eat incredible pastries. So I'm still basking in the glow of Portuguese baked goods. When I'm not in Portugal, I run a company called Actionable. We help shine a light on the impact of corporate training programs. We support learners.

[00:01:51] Chris Taylor: And putting ideas into practice, and bringing the reporting data back to the stakeholders so they can actually see which needles have been moved by that program. We're aspiring towards that holy grail: ROI of corporate training.

David Turetsky: Which is what every CFO asks for every time we ask them for another dollar.

Chris Taylor: Exactly right. That's it.

David Turetsky: That's really good. Well, it's really an important thing, so we should be talking a lot about that measurement today. But first, before we do, what's one fun thing that no one knows about Chris Taylor?

[00:02:21] Chris Taylor: I'd forgotten about this, and then my wife reminded me: I am a direct descendant of Sir Francis Drake, who was a famous pirate.

David Turetsky: Wow. Take that.

Chris Taylor: Exactly. It goes against my Canadian sensibilities, but I'll take it.

David Turetsky: Well, that's actually pretty amazing.

Chris Taylor: My mom was really into genealogy when I was a kid and discovered, going back far enough, that he was a great, great, great, great, great, great, great, great uncle.

[00:02:50] David Turetsky: So is there a good bead on where he kept the treasure, or is that all gone?

Chris Taylor: No treasure, but you know what? Until about two generations ago, we actually had a piece of the Golden Hind. Is it the Golden Hind? Is that his? Or is that Magellan's? Anyway, I think the Golden Hind, the ship that he was on. We had a piece of that.

David Turetsky: Oh, wow.

Chris Taylor: I know, right? And then someone lost it in a basement flood, as happens.

David Turetsky: Wow. Yeah, that kind of sucks.

[00:03:17] Chris Taylor: I don't know how you transition out of that, David, but that's the story.

David Turetsky: Well, we'll do our best. It's actually kind of funny that pirates have been such a big thing in the movies and on TV that that's a badge of honor. I am honored to be in your presence. So I'm already more fantastic than I was at the beginning.

Chris Taylor: Perfect. We're just inching north. That's great.

David Turetsky: Hey, listen, an inch at a time will get us there. It's all progress, Chris. It's all progress. But now let's talk about our topic, which is near and dear to a lot of people who listen to this podcast.

[00:03:47] David Turetsky: Which is not just trying to figure out how to make learning work, but actually figuring out how to measure behavior changes and how to create what you call learning interventions.

[00:04:00] David Turetsky: So, Chris, you focus on supporting and measuring behavior change in talent management and talent development programs. What is behavior change, and why focus on it?

[00:04:17] Chris Taylor: Behavior change for me is the first possible moment that we can actually start to measure something as far as efficacy is concerned. I think as an industry, we've gotten real good at the smile sheets, the eval forms on what happened in the room. And I think that the best we can capture on there is the intention that a participant might feel towards putting ideas into practice. That's all well and good, but it's not actually going to create impact until people start doing something with it.

[00:04:47] Chris Taylor: So I've been really obsessed for the last 16 years with that first step. Once people leave the room, literally as they're going home from the session or after they've clicked out of Zoom or Teams, then what? What's that first piece of movement that breaks the inertia of the status quo? I'm fascinated by it. I think it's really interesting, and it works in the favor of the participant and the organization.

David Turetsky: So how do you judge a baseline? Because for measurement, we always want to know: what's the beginning, and how did it move?

[00:05:16] David Turetsky: Is it the efficacy of the skill, or whatever was being trained, prior to? Is it an assessment that needs to happen at the beginning, or is it something that you have to observe?

Chris Taylor: Okay, I'm so glad we're going here, because I get that this is the sort of foundation of the industry, right? We need a baseline. We need to show change compared to something else, ideally external measurements, right?

[00:05:44] Chris Taylor: Behavior change, if we go down to its true nuclear core, is a deeply internal, deeply personal thing, right? In many cases, we're shifting mindset, or even just awareness, before we're actually shifting action. And so, without taking away from the value of external validation, what we focus on in behavior change specifically is the individual's self-assessment of where they're starting from and then how they're progressing.

[00:06:12] Chris Taylor: We're not trying to get to an absolute measurement on this. What we're trying to do is just bring to the surface something that was historically invisible. So the way we go about this, in a really simple fashion, is that at the end of the session, participants commit to a behavior change: a habit they want to establish, something they want to do differently moving forward. And then they rate themselves one to 10: how do you feel you're doing with this currently?

[00:06:35] Chris Taylor: The number doesn't matter. What matters is that it creates a mental placeholder for them to say, I'm at a four and I want to improve. Then, as they self-reflect over the coming days, weeks, and months, we'll typically collect about 8.2 data points per person on average over the following month, to be able to say the individual believes they are making this sort of progress. Is that empirical by itself? Of course not, right? And on an individual basis, the numbers don't actually mean anything.

[00:07:06] Chris Taylor: But directionally, being able to see the self-reported trend allows us to shine a light and say, okay, a month later, two months later, if we start looking for observable change, asking the people around them to review their progress in that specific area, can we see a correlation between the two? And in our experience, the overwhelming answer is yes, we absolutely can. When the individual feels they've improved in an area, a month or two later those around them will be able to see it as well.
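The trend Chris describes, repeated 1-to-10 self-ratings reduced to a direction of travel, can be sketched in a few lines. This is a hypothetical illustration, not Actionable's actual tooling; the function name and check-in data are invented:

```python
from statistics import mean

def self_rating_trend(ratings):
    """Least-squares slope of a participant's 1-10 self-ratings across
    successive check-ins. Positive = self-perceived improvement.
    As Chris notes, the absolute numbers don't mean much on their own;
    only the direction of the trend does."""
    n = len(ratings)
    xs = list(range(n))
    x_bar, y_bar = mean(xs), mean(ratings)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ratings))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical participant: roughly 8 check-ins in the month after a session
checkins = [4, 4, 5, 5, 6, 6, 7, 7]
print(f"slope per check-in: {self_rating_trend(checkins):.2f}")
```

A positive slope for a participant is the "directional" signal that can then be cross-checked against what peers observe a month or two later.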

[00:07:35] Chris Taylor: Four to six months later, they'll start to see the ramifications on externally measured KPIs.

David Turetsky: I guess the question I'd go to is: there's a ton of training programs, tons, that all try to deliver different types of learning. Some of it might be required, some of it might be licensure, some of it might be health and safety, some of it might be, you know, job duties. It also could be culture change.

[00:08:00] David Turetsky: You know, the company's going through massive change, they do change management on something, and that's also in there. Is there a sweet spot where this is more useful than elsewhere, or is this applicable across the board for learning?

Chris Taylor: Yeah, so I'm going to take that through two lenses. One is the content, and the second is the audience.

[00:08:22] Chris Taylor: From a content standpoint: anywhere that new information is being communicated to individuals, where there would be benefit to the individual and the organization in people making regular changes to their typical daily practice. If we can get it down to a daily practice, then yes, there's absolutely value in pursuing this. From an audience standpoint, the same thing applies, but there needs to be desire and intent.

[00:08:51] Chris Taylor: I've yet to find a piece of tech on the planet that will take someone who is obstinately refusing to change and suddenly, magically, they're now taking action, right?

David Turetsky: Sure.

Chris Taylor: The role of the facilitator, the intervention itself, is, in my view, to transfer the information and to do it in a way that helps the learner find a "so what" for them, so they want to change.

David Turetsky: Right.

[00:09:13] Chris Taylor: Assuming those two things are true, then yeah. Last year, there were about 3,000 programs that we were involved in globally, everything from health and safety to culture change to soft skill development. I think where it doesn't work super well is where the goal is not behavioral change. So if we're doing a technical training that we might need to utilize in case of emergency once a year, then no, we're not trying to change any behavior.

[00:09:38] Chris Taylor: But, sorry, I'm rambling a bit here, David, but think about diversity, equity, and inclusion, which was obviously a big focus several years ago. Where it didn't work, and I mean the training, not the movement, although there could be comments on that, was where we were focused on exception-based training. We were, as an industry, focused on: when you see this bad thing, or when you experience this bad thing, do this.

[00:10:03] Chris Taylor: Where we did find DEI programs worked was where they shifted that focus from exception-based activity to something I can notice, or plan for, or be deliberate about on a daily basis. Then it absolutely worked. So that's the whole trick: how do we get it to a place of daily practice?

David Turetsky: When you bring up the DEI world, the thing that really sticks out for me in terms of training is when you have to train for inclusion.

[00:10:32] And you have to make sure that we're dealing with inclusive language, which is a very big behavior change. It means stop using these words and in many circumstances, stop calling people or things these types of pronouns. And we can get into the politicalization or the politicization, whatever it's called, of those things. You know, I can't say it. But that's not the point.

[00:10:56] The point is, when you're in a work situation, show respect and be respectful, and this is how to do it. That's the caveat I would take away and say: that's definitely behavior change, and it could show a measurable difference in how people feel, in what their engagement scores are.

Chris Taylor: Yep, that's exactly it. And I'm glad you brought up engagement scores, because that's probably the softest of the empirical data.

[00:11:25] David Turetsky: As long as you're not getting my COVID, then you're going to be okay.

Chris Taylor: Thank goodness for virtual interviews. Yeah, but employee engagement scores, that sense of psychological safety: do I trust my team? Do I trust my leader? That's a great example where there may be specific scores in that 76-question juggernaut that the organization really wants to move the needle on. If we can back out of that, extrapolate from it, what are the behaviors that, if shifted, should move the needle on those specific questions?

[00:11:53] Chris Taylor: And then we target training around new ideas and create the space for people to explore why engaging in that behavior change would make sense. Time and again, and I have lost count of how many culture change programs we've been involved in where that's been the goal: we want to move the needle on this employee engagement score metric.

Like what you hear so far? Make sure you never miss a show by clicking subscribe. This podcast is made possible by Salary.com.

[00:12:22] Now, back to the show.

David Turetsky: And for everybody in HR, when we're asked how do I improve productivity, how do I improve engagement, one of the first things we do is ask the second question, which is: how do I really measure it? Because those engagement surveys, no offense, but they suck. I mean, to your point, it's not just the 76 questions.

[00:12:47] It's that a lot of times people think they're being tricked into trying to answer the same damn question five different ways to be able to do what? I mean, give me a five-question survey of do I like my company? Do I like my manager? Do I like the situation I'm in? Do I like how I'm paid? You know, am I going to leave tomorrow? There you go. There are five questions you can ask me to tell me whether I'm engaged or not. But still, we're trying to measure something differently, right?

[00:13:16] David Turetsky: I mean, isn't that kind of where all this is starting from? It's trying to do some pseudoscience or psych bullshit, pardon my French, on people.

Chris Taylor: Well, I think that's exactly it, David. You know, I watch the conversations on LinkedIn, which is my professional social nesting ground. And it's fascinating, because there's 80%, I'm making up numbers now, which is not great for a data guy.

[00:13:40] But 80%, 90%, the majority of the conversations just get so convoluted around how to measure this stuff. And it's, you know, and it's 18 layers removed from reality. Right. And yeah, I mean, this is a major, major challenge. So I think there's, you can approach this from one of two directions, but not in the messy middle. So direction one is top down, let's call it, where we start from the strategic priorities. And, you know, I've got a four circle visualization on this, but there's strategic priorities for the organization.

[00:14:10] Right. If we assume that those strategic priorities involve achieving something that we've never achieved before, then it's going to require some new stuff. Some of that might be process. Some of that is going to be competency at an individual level. Right. And then the organization can choose to buy those competencies, hire for it, or build those competencies, train and develop for it. Right. Like, again, I appreciate this is overly simplistic language, but that's kind of the goal. Right. Right.

[00:14:36] So if we can identify: what do we need people to be, competency-wise or culturally, in order to achieve that strategic outcome? What are the behaviors we need to make those competencies happen? What are the interventions we need to make those behaviors happen? We call it the impact value chain. If we string those together, and it can get complicated quickly, but it doesn't have to, then we can at least show the through line of why we're doing the trainings that we're doing.

[00:15:04] And we can say, intelligently, why the behaviors we're focused on are the ones we're focused on. And we can start to measure that way, because the strategic priorities have X number of metrics that they're already tracking, right? So those become our big lag measures. The lead measures become our behavior shifting. And then we organize the people data, so we say, yes, all these people went through training, but guess what?

[00:15:29] HR and L&D folks: most executives don't care about the cohorts or the modules or the sessions or the experiences that people were in. What they want to see is the data sliced by function, geography, seniority, all the stuff they measure everything else by. They want to see the leading indicator data: can we see which behaviors are shifting for which clusters within that? And then can we map that to any movement in the KPIs? Do you want me to make this real, or is this still too theoretical?

[00:15:59] David Turetsky: To me, this is perfect, because I live and breathe everything you just said. And to the listeners, all those things make sense, because that's what the leaders want. But when it comes down to HR actually executing against that, how do you do it?

Chris Taylor: Yeah. So here's an example. We worked with a casino just before the pandemic, and I keep using it even though it's five years old now, because it's such a clean example.

[00:16:25] Chris Taylor: The casino had very high voluntary turnover. The industry has high turnover, but they were like an order of magnitude beyond the industry norm. They'd figured it was costing them about seven million bucks a year in replacement fees on their frontline staff. So they were investing in a frontline leadership program, and the core metric they were looking for was retention. You're shaking your head, David, and listeners can't see it. Tell me.

[00:16:55] David Turetsky: Well, that's one way of going at it. But OK, because I'm an analytics guy: retention? Management and leadership training is great, but is that really the issue?

Chris Taylor: Yeah, no, absolutely. And this is where we can get tangled, right? Because of course it's not in isolation; there are so many external factors. What we wanted to see, though, was: does working on leadership and management actually move the needle on the retention side?

[00:17:23] Chris Taylor: So what they did was take the 176 frontline leaders going into this program. They were split, and this is why I love it, it's so clean: they were either food and beverage or gaming, and they were the AM shift, PM shift, or graveyard shift.

David Turetsky: Oh, that's perfect.

Chris Taylor: Right. So every manager neatly clusters into one of those six boxes. They'd done the impact value chain and said: what are the capabilities we want to develop? What are the behaviors related to that? What's the content that's going to drive those behaviors? Then they started running them through the program.

[00:17:50] Chris Taylor: Each month, not based on when the sessions took place, but each month, being able to say: cluster A, the AM food and bev group, disproportionately self-assessed progress in this particular behavioral area. Are we seeing any correlation on voluntary turnover? And of course, the answer is no, because it just happened. Month two, no. Month three, no.

[00:18:15] Chris Taylor: Month four is where they started to see, in their words, enough correlation that they could say: interestingly, we're seeing an improvement in retention in the food and bev AM shift. Let's look back to see what behaviors we were working on four months ago, and can we replicate that across the other groups? So they used it not as empirical causality, like this is what caused this, but to ask: is there a potential role that this is playing? Yes or no.
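The casino analysis, clustering leaders by shift and function and then looking for a lagged relationship between self-assessed progress and voluntary turnover, can be sketched like this. All cluster names and numbers here are invented for illustration; the real program tracked 176 leaders across six shift-by-function clusters:

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    return cov / (pstdev(xs) * pstdev(ys))

LAG_MONTHS = 3  # behavior shift in month m vs. turnover roughly in month m+3

# Hypothetical monthly averages of self-assessed progress per cluster
progress = {
    "food&bev-AM": [0.8, 0.9, 1.1, 1.2, 1.0, 1.1],
    "gaming-PM":   [0.3, 0.4, 0.2, 0.3, 0.5, 0.4],
}
# Hypothetical monthly voluntary exits per cluster
exits = {
    "food&bev-AM": [9, 8, 9, 6, 5, 4],
    "gaming-PM":   [7, 8, 7, 8, 7, 8],
}

for cluster in progress:
    p = progress[cluster][:-LAG_MONTHS]  # earlier months' progress
    e = exits[cluster][LAG_MONTHS:]      # later months' exits
    print(cluster, round(pearson(p, e), 2))
```

A strongly negative correlation for a cluster (more self-assessed progress, fewer exits months later) is the kind of directional signal Chris describes: not proof of causation, but enough to decide where to look closer and what to replicate.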

[00:18:44] Chris Taylor: And I think this is where the ROI of learning falls down: we're trying to get to a black and white, this equals this. When instead, if we can simply shine a light, not simply, but shine a light and say: here's some stuff that happened; as we moved through time, positive things happened, potentially as ramifications of that; is there enough correlation that we can double down on this?

[00:19:09] Chris Taylor: It makes our program smarter when we can at least look at what behaviors were shifting earlier.

David Turetsky: And for those of you who are reading my mind as Chris was talking: one of the things I was going to ask him was, for that group A, was it causation or correlation?

Chris Taylor: Exactly.

David Turetsky: There are lots of ramifications beyond just the training. Did they change how they hire? Did they change who they hire? Did they get a better job description? Did they improve pay at all?

[00:19:38] David Turetsky: You can't just say everything else held equal, because the world does not work like that. But the question I wanted to ask you: in all of what you were saying, did they measure for those things as well, or at least hold them as a control, to be able to make those determinations along the way?

Chris Taylor: I certainly hope so, David. I didn't see it. It's interesting, because we work mostly with consulting firms that work with their clients, so we're a couple of layers removed.

[00:20:05] Chris Taylor: And then there's the internal culture piece: we were working with the L&D team. Are they actually exposed to some of these other pieces? I don't know. The way I see it, to come back to our earlier point about progress and inches: can we incrementally increase the level of awareness and intelligence that's going into the design of learning programs?

David Turetsky: Sure.

[00:20:28] Well, but it's important to at least be able to have something because the CFO is going to ask, how do I know it's working? You know, give me something. And to your point, this gives them something, right? You've got to be able to say everything else held equal, at least for them, and be able to make some sort of hypothesis that I'm making some progress because I'm trying to change these skills, these behaviors.

[00:20:56] David Turetsky: And I'm seeing something, at least in these types of groups. So in the groups where we didn't see it, we're going to do something else. But I'm not going to change the fact that I invested, or will invest, in this for the groups it did work for, to at least see that those results carry through. To me, that just makes HR better, smarter, and a better partner to the CFO's office, because you give a damn whether it's actually working or not.

Chris Taylor: Yeah, absolutely.

[00:21:26] Chris Taylor: There are two pieces I want to build off there, if I can. The first, and I think it's so critical: when HR folks have this data for the first time, sometimes over-exuberance can lead to statements of causality, right?

David Turetsky: Yes.

Chris Taylor: And I think it's really important, for your own credibility and for the industry's credibility, that we're not stating causality. This is a correlation piece, as you were saying, David.

David Turetsky: Absolutely.

[00:21:49] The second thing that, sort of building off your point about now we have something, I mentioned there's sort of two ways to go about this. The ideal is to start at the strategic priority and work backwards. And there's a whole bunch of L&D folks that are asked to go out and find a leadership program. Why? Well, because we need a leadership program. Okay. So, because we have budget we need to spend. It's like, oh, okay. Yeah, that would be a great problem to have. Uh-huh.

[00:22:18] Even in that case, if for some reason you're stonewalled towards understanding how the program connects to the strategic priorities of the organization, being able to start from the content and show the behavior changes that it's driving invites the conversation from the leadership team. Because you're bringing data that's different than what you've shown in the past around completion rates and how much people like the sandwiches. We're now able to say self-reported and then ideally externally like third-party validated.

[00:22:46] Here are the behaviors that are shifting in the month following the delivery of this session. What I find interesting with that approach, again, if you can't get strategic alignment early: when you bring that data forward, people start asking questions that actually show what some of the strategic drivers in the organization are. And then they invite different new training that gets at similar behaviors, and people start going, wow, this shit works.

[00:23:15] Turns out. Yeah. Job secured for another year. That's good. Hey, are you listening to this and thinking to yourself, man, I wish I could talk to David about this? Well, you're in luck. We have a special offer for listeners of the HR Data Labs podcast. A free half hour call with me about any of the topics we cover on the podcast or whatever is on your mind.

[00:23:42] Go to salary.com forward slash HRDLConsulting to schedule your free 30-minute call today. This gets to our third question, which is how do you make sure that this stuff sticks? Because you can't just do training once and expect it to be forever. People change. Jobs change. The world changes. Requirements change. We hire new people. People leave.

[00:24:11] David Turetsky: So how do you make sure that these things continue? Do you have to just keep doing these trainings over and over again, to the same people or to different cohorts? How do you make it stick?

Chris Taylor: Yeah. So I think there's sort of the micro and the macro on this, because the first piece I look at is: how do we make sure that people leaving the room actually put anything they took from the room into practice right afterwards, right?

[00:24:33] Chris Taylor: And the second, from sort of a quote-unquote cultural level, whether that's a team, a department, or the organization: how do we make sure this becomes more of a new normal moving forward? One of the things I love is that Actionable sits at the center of all of these programs. It's a little bit of hubris to say "at the center," but we're involved in so many programs.

[00:25:01] We're sitting now on about 4 million data points around what works and what doesn't in helping people actually drive behavior change following a session. And that includes a number of the factors that exist after the session, but also what happens in the room itself. So one of the things that we stumbled upon last year was this 3 to 1 ratio. And again, I appreciate this sounds trite, but I also want to give some practical stuff that people can apply.

[00:25:28] What we have found is that when the content of the session is complemented by three times the amount of air space of contextualization, i.e. not new content, but giving people space to turn it over, figure out, do I give a shit about this? You'll fairly dramatically increase the likelihood that they'll actually follow through afterwards because you're helping them find the why that will break the inertia of status quo, right?

[00:25:54] David Turetsky: So does that mean use cases and practice? Like, I get taught something and then I do three activities right there? Is that what you mean?

Chris Taylor: Yeah, at least three times the amount of time. So if I've just taught 10 minutes of content, give them half an hour to do, you know, current state/future state consideration: how would this impact me? How does this impact us? How does this impact them? Those types of exercises.

[00:26:19] Chris Taylor: Most facilitators have their bag of tricks around exercises to contextualize, but that's what we're talking about. Even just journaling, right? As non-high-tech as that sounds: taking time to figure out, why would I care about this? How does this make my life better? Anyway, that's a key piece. And then post-session there's a bunch of things, and if anybody wants it, David, I can make sure you have our annual insights report, which is free.

[00:26:48] Chris Taylor: That's got all of this stuff in it.

David Turetsky: Yeah, it'd be lovely. If you can give me a link, I'll put it in the show notes.

Chris Taylor: For sure. Post-session, it's the social reinforcement. Can we have the team normalize the fact that we're going to be engaging in behavior change, and that it's going to be awkward for a bit? Can we have that conversation in the room, bridging out of the room? Can we bring a specific accountability partner in? We're talking about those self-reported changes between beginning and end.

David Turetsky: Yeah.

[00:27:18] Chris Taylor: I should have these numbers in front of me, but it's more than a 40% improvement between the first and second self-assessment if I have an accountability partner who's actively engaged in the process.

David Turetsky: That's not your boss?

Chris Taylor: That's not your boss. This is someone you chose to invite into your commitment, to actually see and support you on it.

David Turetsky: Sounds like a 12-step process.

Chris Taylor: It's probably more than that.

[00:27:42] David Turetsky: I'm being serious, because that person's there to support you throughout, making sure that you're living up to the commitments you've made. So I was not necessarily being tongue-in-cheek. I was being serious.

Chris Taylor: Yeah, that's really good. I'm Canadian, David. My default reaction is to laugh at everything. It's like, "my grandmother died." Oh, wait. No, I'm actually sorry to hear that. That's terrible.

David Turetsky: I'm a hockey player, so I appreciate that.

Chris Taylor: Perfect. Kindred spirits.

[00:28:08] The other piece that's been interesting, and this is a mental shift for a number of HR groups, is the facilitator continuing their relationship with the participants post-session is actually even more impactful than an accountability buddy that I chose, percentage-wise.

[00:28:25] What that looks like is the facilitator having visibility into what each participant is working towards and being able to provide commentary, whether it's just celebrating or encouraging their progress, whether it's probing to understand, digital coaching, or just reminding them of the thing that mattered to them, why they chose that commitment in the first place.

[00:28:44] But does that mean that that person needs to be internal to the organization, or does that mean there's an added expense to that, to have that person who may be an external resource continue the relationship? Yeah, so all things being equal, it would be an added expense. And I think there's that question to ask with the program to say, is the purpose of this training to deliver content, or is it to drive change? Right. If it's to drive change, maybe we should put some emphasis on what happens after people leave the room.

[00:29:13] So, having said that, it doesn't need to be much. We typically see it's a 10-15% premium over what the event costs, and usually that premium can actually just be dug out of the extra pastries or the VR experience that we added in or whatever. Hey, leave my donuts alone! Well, I'll liken it to this. You know, I need to get my car painted. Am I going to get acrylic or enamel paint? Right? Because the acrylic will wash off with the first rain. But, hey, listen, I got my car painted.

[00:29:43] But if you really want it to, sorry, this is where I was going with that. I like that. If you really want it to stick and to look good and to actually change. But no, seriously, you're right. This is not even just about ROI anymore. It's about, are you just trying to check a box of saying, yeah, we did the training? Or are you really trying to make sure that there's stickiness in this behavior change? And if there's that much change in the effectiveness of the training, gosh, you know, I'd write that check every day.

[00:30:13] Well, I mean, this is the thing that blows my mind, right? You're spending, including the actual hourly rate of the participants in the room... the cost of that session is quite high, to then do nothing to support it afterwards. And pushing content at people is not supporting the learning, right? And that's where we get, well, we have a sustainment strategy. We're just going to blast them with 12 emails. Like, well, nobody needs more emails. All right, let's send them a freaking book. Right. Yeah, that'll get read. 100%. Not, not 100%.

[00:30:42] Let's just test the measurement on that one. Hey, did anybody open the package? What package? The thing we sent you last week. Oh, I was supposed to read that? Yeah. And "supposed to," too, right? This is the whole thing. So if we can shift the conversation in the room, allow more breathing room, less content, more context, we increase the likelihood of people wanting to change. Then if we support them through a couple people around them, checking in with them and seeing how they're doing, and there's efficiency tools for this too, right? Like the very, very not-so-subtle plug for Actionable is that part of what we can do

[00:31:12] is make it really efficient for facilitators to do it. We're not the only ones. There's other stuff. But just if you think about that primary question of, is the purpose the experience, or the impact of the experience, and invite everyone to explicitly state their response to it. One of two things happens. Either we start to think about how to solve for impact, or we go, yeah, no, the point of this was the experience. So now we can just ignore everything afterwards. And, you know, I used to, David, think like, well, that's ridiculous. What a waste of money.

[00:31:41] I'm, I'm coming around. There's a time and a place where just being in the space with their colleagues is the point, and that's okay. Right. Most of the time it's around impact. And for everybody who's in HR, who's tried to prove to their bosses that not only are these things, these trainings, these behavior changes important, sometimes they're required by law. Yeah. Um, you know, like sexual harassment training, but there are some things that we're trying to do.

[00:32:07] Like, um, right now it's really popular to do pay transparency training and have managers and employees understand what it all means. Why are we doing it? And how does it impact them? Those things are critical because if we don't train them and there's no behavior change, then we're going to get sued or get fined. And we're also going to lose our people. And so it really matters. So we're, we're actually on the front lines of this, Chris, and we're telling people you

[00:32:36] need to do really, you know, really comprehensive training for managers and employees, so they understand what all this stuff means. It can't be in one ear and out the other, because they're going to lose people, and it's going to be the wrong person, and it's going to lead to, you know, really bad things. So yeah. And expensive things. We're with you. Power to the hockey players. And the Canadians.

[00:33:06] Chris, thank you very much. This has been really cool. I actually would probably like to bring you back to talk a little bit more about measurement of programs, because one of the things that our listeners love is to actually hear actionable advice on how things can work better. And so I love discussing the ROI of programs. So we'll have you back again, if you don't mind. I'll bring some case studies. We can make it a lunch. And I'll bring Molson. Please don't do that. I mean, no offense to your sponsors. I'm kidding. No.

[00:33:34] And we're sponsored by Molson Breweries. No, we're not. We're not. We're sponsored by salary.com. Again, Chris, thank you very much. You're awesome, and those were phenomenal insights. I really, I learned a lot today. Thank you, David. I appreciate the conversation. My pleasure. Take care. And everybody stay safe. That was the HR Data Labs podcast. If you liked the episode, please subscribe. And if you know anyone that might like to hear it, please send it their way. Thank you for joining us this week, and stay tuned for our next episode.

[00:34:04] Stay safe.