Kevin M. Yates, also known as the L&D Detective™, has 30 years of experience working in training, learning, and talent development. In his current role, he investigates the extent to which training and learning contributes to workplace performance. In this episode, Kevin talks about how training is measured today, how that might change in the future, and how organizations can start measuring the impact training has on business and employee performance.
[0:00 - 5:34] Introduction
[5:35 - 18:19] How is training measured today?
[18:20 - 32:06] What is the future of training measurement?
[32:07 - 36:28] How can organizations begin measuring the impact training has on business and employee performance?
[36:29 - 39:29] Closing
Connect with Kevin:
Connect with David:
Connect with Dwight:
Podcast Manager, Karissa Harris:
Production by Affogato Media
Resources:
Announcer: 0:01
The world of business is more complex than ever. The world of human resources and compensation is also getting more complex. Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources. Listen as we explore the impact that compensation strategy, data and people analytics can have on your organization. This podcast is sponsored by Salary.com, your source for data, technology and consulting for compensation and beyond. Now here are your hosts, David Turetsky and Dwight Brown.
David Turetsky: 0:38
Hello and welcome to the HR Data Labs podcast. I'm your host, David Turetsky, alongside my co-host, friend, partner and, well, I guess
Dwight Brown: 0:47
I really want to see where you go with this!
David Turetsky: 0:50
No, no, I'm gonna end it there. Dwight Brown from Salary.com, Dwight, how are you?
Dwight Brown: 0:54
I'm good. David, how you doing?
David Turetsky: 0:55
I'm good. I'm good. Today we're gonna have a fascinating conversation. I think you and I both have asked the world to provide us with people to talk a little bit more about the ins and outs of the learning and development and training function, and I think we're going to have a pretty good time with that today, because we have with us Kevin Yates. Kevin, how are you, sir?
Kevin M. Yates: 1:17
I'm doing great! I'm doing great, and I'm intrigued, because you said that you asked the world for, you know, people who could talk about this. And so it sounds like the world recommended me?
David Turetsky: 1:29
So one of the posts that Dwight and I sent out on LinkedIn said, hey, I think we should be talking more about L and D and training. And it just so happens that this season and the last season, we have people from that world that we've been talking to. So karma provided us with opportunities to talk to brilliant people like yourself, Kevin.
Kevin M. Yates: 1:56
Wow. Well, thank you for that. I'm very humbled, and knowing that I was recommended by the world is a lot of pressure. Let's go!
David Turetsky: 2:05
No pressure! So Kevin, let's start: give us a little bit about your background.
Kevin M. Yates: 2:11
Yeah, so my background is leading up to about 30 years in training, learning and talent development. It's been a great journey, man, it's been a great run, particularly as I think about the progression of my career, the different hats that I've worn, and the different organizations I've worked in, really working with some marquee-brand businesses and organizations. My career in training, learning and talent development started as a trainer, gosh, like, about 30 years ago, at a small community bank on the south side of Chicago, where I was doing stand-up training day to day on banking applications and customer service. That ultimately led to a role in instructional design, which led to a role in curriculum development, and that led to multiple roles in learning operations, learning administration, leadership development, learning solutions and learning technology. And ultimately I landed where I am right now, focusing on measurement, more specifically investigating the extent to which training and learning contributes to workplace performance. So that's the trajectory. And as I said, I've worked with some amazing organizations, most recently Meta, formerly known as Facebook, as well as McDonald's, Kemper Insurance, Kantar Media, Grant Thornton and Information Resources. It's been a great journey, and all of the work that I've done and all the roles that I've served in have really informed the work that I do today.
David Turetsky: 3:48
It's been a really interesting journey, and we're going to try to tap into a lot of your experience in this conversation. But first, before we do that, we need to hear from you: what's one fun thing that no one knows about Kevin Yates?
Kevin M. Yates: 4:01
One fun thing. Well, some people know and some people don't, but I love a glass of wine. And more specifically, what some people may not know about me is that I have a preference for white wines. So I love a good Riesling. I love a good Gewurztraminer. I've tried to transition to red, but it's just not working. So, so again, it's a secret to some. Some know it, some don't, but I love a good glass of white wine, particularly in the summer months.
David Turetsky: 4:31
All right. Well, next time I see you, I'll buy you one.
Kevin M. Yates: 4:34
Okay, I'm gonna hold you to that.
David Turetsky: 4:35
Okay, well, don't worry about that. I'm happy to buy it for you. I'm gonna have a Diet Coke, but you can have the glass of wine.
Kevin M. Yates: 4:40
Okay!
David Turetsky: 4:41
But let me ask you a question, is it the first glass that you love, or is it the last glass?
Kevin M. Yates: 4:45
Ooh, now you're getting all up in my business, as we like to say.
Dwight Brown: 4:50
Or maybe it's a really big glass!
David Turetsky: 4:52
Yeah, right.
Kevin M. Yates: 4:55
Let's go with that. Okay, yeah, a really nice, big glass. Yes, there we go.
David Turetsky: 5:02
Well, you need to decant and you need to let it breathe. So, you know, a big glass probably serves a lot of purposes.
Kevin M. Yates: 5:07
Yeah, let's go with that.
David Turetsky: 5:10
All right, cool. So today's topic is one that's near and dear to the hearts of the HR Data Labs podcast, probably from back at the beginning, which is measuring training and learning's impact on human and business performance, and trying to get into where it's been and where it's going. So Kevin, we've got to ask the question: how is training measured today?
Kevin M. Yates: 5:40
Man, where do I start? When I think about where training is today in terms of the measurement journey, I think that we're at different points in the journey, and by we, I mean the training, learning and talent development community. There are organizations and teams who are measuring traditional things like: how many people did we train? How many hours did we offer? How many courses are in our catalog? How much time are people spending in and with our training and learning solutions? Those are very traditional types of measures, and there are many organizations who are at that traditional point in the journey of measurement. On the other end of the spectrum are organizations who are doing some great work measuring how training and learning is contributing to human and business performance. So to take that back to your question, where are we on the measurement journey as a profession? I think that we're at many points along the spectrum, right? Different organizations, different teams are at different points. There are some, again, who are engaged in the very traditional types of measures, and then there are some who are chasing the very advanced, progressive types of measures, the ones that give us insight into how training and learning is measurably contributing to workplace performance. And that's where I like to spend most of my time.
David Turetsky: 7:16
Before we get there though, Kevin, because I definitely want to touch on that: when I think about learning, typically I think about it filling gaps in skills, or filling a future need, and there's also compliance training, right? Looking at those different types, whether it's skill-based learning that contributes to removing gaps, or what I've also seen, which is compliance training: where do you think the pendulum, or the balance, of training sits today? Is it more in one versus another? Is it across the board, or does it depend on the company?
Kevin M. Yates: 7:51
I think it depends on the company, and I think it's across the board, right? Because you just gave some great examples of different ways in which we try to fulfill a purpose with training and learning. So there are some purposes for compliance and regulatory training,
David Turetsky: 7:55
Right.
Kevin M. Yates: 8:07
and then on the other end, there are training and learning solutions that are purposefully designed to sustain or move performance. So in terms of where we are, I would say we're all over the place, and I don't mean that in a negative way, but again, I think it depends on how you are measuring purpose fulfilled. If there is a purpose for regulatory and compliance training, then you're going to measure completions, because you have to demonstrate that 100% of your population has fulfilled the requirements for that regulatory compliance training, right? And for training and learning solutions that are purposely designed to move or sustain performance, we engage in measurement to show the extent to which the training did what it was intended to do. So we're all over the spectrum, and again, I don't mean that in a bad way; it just depends on purpose, and then outcomes are measured and determined by evaluating purpose fulfilled. Does that make sense?
David Turetsky: 9:13
It totally makes sense. And as we go through today, what I would like to do is kind of revisit the differences in the purpose as we're measuring, or as we're talking about measurement, because for a lot of the people that are going to be listening, some of their investment, or a lot of their investments, get targeted based on the state of the economy, based on their industry, based on the maturity of the company. And I think a lot of them might be interested to understand if there are differences in measurement that they need to focus on specifically because of those differences.
Kevin M. Yates: 9:48
That makes sense.
David Turetsky: 9:49
So Kevin, let's go back to where I think you were headed, which was the focus on how measurement is purposefully built today around the outcomes. I think where you were going was that companies settle based on what they have to do, right, and where they have to be in order for all those different things to be satisfied.
Kevin M. Yates: 10:13
Yeah, and it depends on the industry, it depends on the culture, and it depends on the goals, right? So, for example, there are some industries that are heavily regulated, and for those types of industries, compliance and mandatory training is going to be a priority, just by the nature of the business and what those organizations have to do to stay aligned with whatever those regulatory and compliance obligations are. So there's that. And then there are organizations that aren't regulated by any type of industry compliance requirements, and they may be more focused on things like people development, leadership development, skills and capabilities. Which is not to say that the industries that have a heavy regulatory and compliance aspect don't focus on those things, but what I'm trying to do is just give you the spectrum.
David Turetsky: 11:17
Right.
Dwight Brown: 11:18
Sort of a weighting on that spectrum.
Kevin M. Yates: 11:20
Yeah.
Dwight Brown: 11:20
Yeah, that makes sense.
David Turetsky: 11:21
Kevin, does that mean that they're actually using different technologies or different methods of collecting that data? Or are they using more of the same common technologies?
Kevin M. Yates: 11:22
So that means that they may be using the same technology, but that they are getting different things from the data ecosystem.
David Turetsky: 11:45
Sure, right.
Kevin M. Yates: 11:46
And here's what I mean by that. Most organizations have an LMS; most, not all. You can get from an LMS: how many people did we train? How many completions do we have? How many hours of training did people complete? And there may be organizations who are at different points in the measurement journey who are collecting that, but they're all using an LMS to do it. Does that make sense? Because the question was, are there different technologies depending upon where you are on the spectrum? So where you might see a similarity, something that is spectrum-agnostic, if you will, is in the types of data that you get from an LMS. Now let's take it to the other end, where we want to use facts, evidence and data that give insight into human and workplace performance. If you really want to get sophisticated, then you might be using a data warehouse to extract and analyze data. If you are a smaller organization, a smaller business, maybe you don't have a data warehouse yet, which means the types of insights you can get about training and learning's contribution to workplace performance might be a bit more difficult to produce, compared to, say, some really large organizations who are further along in the journey, who have data warehouses and easy access to business performance data and human performance data. They might be using tools that those other organizations aren't using, something like Tableau or Power BI. Those tools might be exclusive to the organizations that are further along in the journey, compared to those who might not be as far along, who are pretty much just relying on their LMS data, and maybe even, I don't know, Google Sheets or Microsoft Excel.
Dwight Brown: 12:15
Yeah. A question that kind of popped to mind for me, and it's a little bit of a tangential question, but you bring up a key point: obviously, more sizable organizations are going to have more resources to invest in these various LMSs and measurement systems. In your experience, do you typically see a big difference in learning outcomes based on the availability of resources to put toward these? What do you see in the measurement of this?
Kevin M. Yates: 14:17
Yeah, that's a great question. And I think this is a good point to create a separation between church and state, if you will. Because on the one hand we're talking about technology and tools that allow you to measure, and on the other hand we're talking about measurable outcomes. Again, we have tools that measure outcomes, and then we have actual outcomes, and those two aren't necessarily related, because outcomes are the result of training and learning solutions that produce either a shift or a change in performance, or a way in which performance can be sustained. So the learning experience itself, the training program, the learning solution, whatever that is: hopefully the goal for its creation is to move performance or sustain performance. So there's that, and then you have the tools and the technology and the methods that measure the extent to which performance is moved or changed. So really, the only connection or relationship those two have with each other is that you use one to measure what happened, and you use the other to make it happen.
David Turetsky: 15:38
Right.
Kevin M. Yates: 15:38
Does that make sense? Yeah.
Dwight Brown: 15:40
And so do you see that organizations that don't have the resources, let's just stick with the measurement side of the fence on that, do you see that organizations that don't have the resources for that are at a disadvantage? Or do you see them figure out different ways to effectively measure the outcomes, or measure their learning throughout the continuum?
Kevin M. Yates: 16:05
Yeah, that's a great question. And if I were to restate what I think you're saying, just to confirm so that I answer correctly, I think you're asking about the extent to which not having access to tools and technology inhibits the ability
Dwight Brown: 16:18
Right.
Kevin M. Yates: 16:19
To answer the question, what is training and learning's contribution to workplace performance? Is that kind of what you're saying?
Dwight Brown: 16:25
Perfect.
Kevin M. Yates: 16:25
Yeah, I don't know that I'd use the word disadvantage so much as maybe not being able to go as far as you could go with the use of tools and technology that help you go as far as you can go, right? So,
Dwight Brown: 16:45
Sure.
Kevin M. Yates: 16:46
for example, if you don't have a data warehouse, and if you don't have the tools that help you extract data in ways that help you gain insight, you might just have to go at it another way, which might take longer to get at the answer than, say, an organization that has those tools and those technologies and those resources. So the questions can be answered; it just might depend on how long it takes. Because if you have access to not only tools and technologies but expertise, and those tools, technologies and expertise are helping you answer the question, what is the contribution of training and learning, then you're able to do that a lot faster, and maybe with a lot more confidence, than, say, a smaller organization that does not have access to those tools, that technology and that expertise. So it might take the organizations that don't have access a little longer. I could use the word disadvantage, but I would just say it takes a little longer if you don't have access.
David Turetsky: 17:56
I would call it brute force. You need to brute-force the analysis rather than finesse it, rather than having it actually come out as an output of the technologies you're using.
Kevin M. Yates: 18:06
Yeah.
Dwight Brown: 18:07
Yeah. Makes sense.
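To make the two ends of that spectrum concrete, here is a minimal sketch of what each looks like in practice: first the traditional LMS counts, then a simple join of training records to performance data, the kind of thing a data warehouse makes easy at scale. The file names and column names are hypothetical illustrations, not a real LMS or warehouse schema.

```python
import pandas as pd

# Traditional measures, straight out of a (hypothetical) LMS export:
# columns assumed to be employee_id, course_id, hours, completed_at.
lms = pd.read_csv("lms_completions.csv")
print("People trained:", lms["employee_id"].nunique())
print("Completions:", len(lms))
print("Total training hours:", lms["hours"].sum())

# More advanced measures: join training records to business performance
# data (hypothetical columns: employee_id, metric_before, metric_after).
perf = pd.read_csv("quarterly_performance.csv")
perf["trained"] = perf["employee_id"].isin(set(lms["employee_id"]))
perf["delta"] = perf["metric_after"] - perf["metric_before"]

# Compare average metric movement for trained vs. untrained employees.
# Note this shows correlation, not causation; see the discussion later
# in the episode.
print(perf.groupby("trained")["delta"].agg(["mean", "count"]))
```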
Announcer: 18:10
Like what you hear so far? Make sure you never miss a show by clicking subscribe. This podcast is made possible by Salary.com. Now back to the show.
David Turetsky: 18:21
So, Kevin, let's go to the next question, which is the one I really can't wait to hear: how could training be measured? What is the future of this? Where are we going?
Kevin M. Yates: 18:32
So you're probably not going to be too surprised to hear what I'm about to say.
David Turetsky: 18:37
Is it a two letter initial?
Kevin M. Yates: 18:38
How'd you guess? How did you guess? And the first letter starts with A, and the second letter starts with I. So yeah, we're talking about AI, we're talking about artificial intelligence. But you know what I'm going to do? I'm going to tone down the hype.
Dwight Brown: 18:58
I love how you gave us the benefit of the doubt.
David Turetsky: 18:58
Okay.
Kevin M. Yates: 18:58
Because I'm going to contextualize what I believe artificial intelligence can do and what it can't do. So for me, particularly when you consider who I am known as in the industry, and I am known as the L and D Detective, right? You guys might remember Sherlock Holmes. And you guys may be old enough to remember, or maybe you're not, maybe you're just 20-something, but maybe you guys remember.
Dwight Brown: 19:11
That's right. Gray hairs betrayed us!
Kevin M. Yates: 19:13
And he had an assistant whose name was Watson, right?
David Turetsky: 19:17
Of course! Dr. Watson, right?
Kevin M. Yates:
So in my work as the L and D Detective, I consider artificial intelligence to be my Watson, meaning artificial intelligence assists me with conducting impact investigations. I don't believe that it can do them for me, but artificial intelligence can certainly help me work smarter, not harder, and definitely faster. So in answer to your question, where do I see us headed, or even where are we right now: I see where artificial intelligence can and is going to do a great job of supporting us and assisting us in measuring training and learning's contribution to workplace performance. Again, I don't believe that artificial intelligence can take my place as the L and D Detective, just because there are some nuances to measuring training and learning's contribution that are uniquely human. Meaning, there's just this human side of that work that a machine can't do for us. But what it can do, and by it I mean AI, artificial intelligence, is help me work a lot faster. It can actually take away some of the work that I don't like doing, it can do it for me, and it can do it a lot faster than I ever could. And so I'm excited to continue to use AI as my Watson when I am conducting impact investigations. That's where I think we're headed. And I also believe, and I hope, that we're headed in the direction of really focusing on measuring training and learning's contribution to performance. I think that we know how to measure how many people we trained, how many courses we offered, how many hours of training were completed. We know how to do that; that's just easy now, right? But where we have not been as focused, because it is not as easy, is producing fact-based evidence that shows how training and learning is measurably contributing to workplace performance. So I believe that artificial intelligence is helping us, and will continue to help us, do a much better job of answering that question.
David Turetsky:
Yeah, Kevin, where I totally agree with you is that a lot of the things we're dealing with are facts, and a lot of what we're dealing with in the context of measuring training outcomes enables us to look at facts that have a ton of data associated with them. I think you're right; I think AI can assist there. Where I'm going with my question is: is this going to help us with the correlative versus the causal? Because obviously, if you're trying to say, am I getting the ROI out of training, you're always going to have that question: did the training cause it, or is it just correlated with the things that naturally happen?
Kevin M. Yates: 22:43
The answer to your question is yes, and here's what I mean by that, man. When we are investigating the extent to which training and learning measurably contributed to human and business performance, I think we're looking at causation and correlation, because that's where the story is, right? I don't see us focusing more on one and less on the other. I think that if you're going to tell a good, robust, fully inclusive story, we have to talk about correlation and causation. And artificial intelligence, as an assistant to impact investigation, helps us reveal correlation and causation, because I believe getting the answers to both of those informs decisions about what we do with our training and learning solutions, what we should do, and what we need to stop doing.
David Turetsky: 23:40
Yeah, and that's what I was saying: we've got so many facts going into those models, so many facts that we can put in. I'll go back to the thing I brought up at the beginning about skills, right? You have skills on every job. Now let's say we do assessments of skills on people, and we know the gap. Well, you took a training, and we assess that person again: do they still have that gap? We could obviously do the math ourselves, but not at scale. So I love where you're going with this, that AI can help us do these things at scale, and not just in the microcosm of one person with the skills we've enumerated by job, tested people on, and closed the gaps against. That's all great, but now you can actually prove out the ROI of the investment, not just in the skills themselves, but in the people, the training, and the measurement of it. You're proving the ROI, because now you're telling the company: look, not only did we close the gaps, but we're also seeing better performance from it, and here's how all these different factors contributed to that.
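As a rough sketch of the pre-and-post workflow described here (assess the gap, train, reassess, and check closure at scale), something like the following could work. The file name and column names are hypothetical; a real pipeline would pull from an assessment system rather than a CSV.

```python
import pandas as pd

# Hypothetical assessment data: one row per employee per skill, with the
# required level for the job and assessed levels before and after training.
df = pd.read_csv("skill_assessments.csv")

# A gap exists when the assessed level falls short of the required level.
df["gap_before"] = (df["required_level"] - df["pre_level"]).clip(lower=0)
df["gap_after"] = (df["required_level"] - df["post_level"]).clip(lower=0)
df["gap_closed"] = (df["gap_before"] > 0) & (df["gap_after"] == 0)

# Closure rate per skill, across everyone who had a gap going in.
had_gap = df[df["gap_before"] > 0]
print(had_gap.groupby("skill")["gap_closed"].mean().sort_values(ascending=False))
```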
Kevin M. Yates: 24:50
Yeah, and I would add to that, and this is a really important message for me, that not only is training and learning a good solution for moving performance, it's also a good solution for sustaining and maintaining performance, right? Quite often the conversation is, how can training and learning change performance? And sometimes that is what is needed. But there are also times where we just need to keep the train on the track, so to speak. And so training and learning can become a good, viable solution to maintain and sustain performance where it needs to be maintained and sustained, versus where it needs to be changed.
David Turetsky: 25:25
And I think it also lends itself to the AI models being able to provide guidance to practitioners: hey, Person A has a great skill set; if we sent them to training, they could be a succession candidate for this set of jobs, based on what we've seen as success. Not only closing those skill gaps, but knowing we have this training there, and then, for the outcome of succession, being able to give that person a career path, which is obviously necessary, and being able to use it outside of just the world of training, providing it to the HR person and the employee so they know where they can head. So there are a lot of really cool outcomes you can get from this if you've got that data and you're measuring it correctly, right? I mean, am I off base on that?
Kevin M. Yates: 26:15
No, you're on track. And I would also add to this conversation the idea that when it comes to performance, training and learning is not the only thing that influences business performance and human performance, and I think that that's where we have to be very careful with our storytelling. We don't want to position training and learning as being like the Savior or being, you know, like the magic wand, or that training, learning and talent development teams and organizations have, you know, pixie dust, and you know,
David Turetsky: 26:49
They don't?!
Kevin M. Yates: 26:50
No, we don't, we don't. And quite often the idea, or the perception, is that we need to fix people, and training will fix them, so let's do some training, right? But what we really have to consider, as it relates to performance, is all that there is that contributes to the performance ecosystem. And when I talk about that, guys, what I'm talking about is all that there is that has the potential and power to influence human performance. That includes training and learning, but it is not limited to training and learning. So what are some of the other things that contribute to human performance? That's a great question, I'm glad you asked, because some of the things that contribute to human performance include manager coaching, right?
David Turetsky: 27:42
Yeah.
Kevin M. Yates: 27:43
It includes compensation. We all, well, I shouldn't say we all, but most people want to get paid!
David Turetsky: 27:50
And more!
Kevin M. Yates: 27:51
And more, right? Rewards and recognition, tools and technology, performance support, culture, natural ability: those are some of the things, including training and learning, that influence people's performance. So we have to be thinking about that. And then let's think about all that there is that contributes to business performance, because ultimately, when we talk about training and learning contributing to workplace performance, we're talking about human and business performance. The training team can contribute to business performance, but so does, for example, the marketing team, or the products and innovation team, or the sales team, right? So the essence of what I'm saying, guys, is that as we think about measuring the impact of training and learning, we need to be thinking about all that there is that contributes to performance. Because at the end of the day, I believe that training and learning fulfills its highest purpose with measurable contribution to performance. And I use the word contribution intentionally and purposefully, because we need to be thinking about everything that contributes to performance, not just training and learning.
Dwight Brown: 29:06
And part of what you guys are talking about is that nuance factor. You're taking the facts and you're applying AI, or you're applying human thinking, and probably, optimally, both of those together in concert, to understand the nuance behind the hard facts. And I am 110% in agreement with you on the fact that our reflex reaction anytime there's a decrement in performance, or some issue that we see, is: oh, we've got to train, we've got to do training. And it drives me nuts. But it's that nuanced piece of things, in the measurement process, with the training and development, that I think is really the heart of where you start to see that effectiveness. You've got to be able to understand that nuance to get to: how do we train? How do we measure the effectiveness of the training? And it's kind of a continual circle.
Kevin M. Yates:
Yeah, yeah. And, you know, guys, for me, in my L and D Detective technique, there are nine questions that I ask business partners and stakeholders, and the answers to those questions give really good insight into when training and learning is part of the solution and when training and learning is not.
David Turetsky: 30:28
Right.
Kevin M. Yates: 30:28
Those nine questions also help determine if training and learning is the answer and the solution, or part of it. The answers to those nine questions also proactively determine what you're going to measure. What I continue to see, and it is so disappointing, is that training and learning is designed, it's built, it's launched, and then there is consumption and utilization of it, or participation in it, and then the follow-up question is, well, what's the impact? So quite often in my career, I have been brought in at the end, where the training has already been designed, launched, consumed, utilized and participated in, and then I'm asked to measure the impact. And my follow-up question is, well, what was the intended impact? And the answer to that question is, well, we don't know. We just want you to find the impact.
David Turetsky: 31:23
Measure something, Kevin!
Kevin M. Yates: 31:24
Measure something!
Dwight Brown: 31:25
Yeah exactly. Give us some data! I don't care what it says.
Kevin M. Yates: 31:28
But yeah, yeah. I think that if we are strategic and deliberate and work through some of that at the front end, measuring training and learning's contribution is going to be much easier to do on the back end.
David Turetsky: 31:41
Hey, are you listening to this and thinking to yourself, man, I wish I could talk to David about this? Well, you're in luck. We have a special offer for listeners of the HR Data Labs podcast: a free half-hour call with me about any of the topics we cover on the podcast, or whatever is on your mind. Go to salary.com/hrdlconsulting to schedule your free 30-minute call today. So I don't want to lose sight of the fact that one of the goals of this discussion is also to talk about how we get there. You're talking about a lot of great things, Kevin, but one of the things listeners will think about as they're contemplating measuring outcomes from learning is going into the learning with the mindset of: what's our goal? What are we trying to accomplish? That's a really good start, but how do I get there? What are the other things you would suggest people do to get started on this journey of measuring training and being able to align it with business and employee performance?
Kevin M. Yates: 32:44
Well, I'm going to be intentionally repetitive to answer that question, because the question is, what do people need to be doing to get to a point where they can measure the impact of training and learning? And what I said a few moments ago was that we need to be proactive in our planning and our thinking. So again, there are nine questions that I have, and, you know, no shame here, I'll just put out this plug: in the L and D Detective Kit, which is on my website, I identify what those nine questions are. In the L and D Detective Kit there is a methodology that I illustrate for how to measure the impact of training and learning, and how to be proactive about doing that, so that you don't get in that situation where you've designed and launched your training and learning solution, people are consuming and using it, and then you ask, well, what was the impact? What I do in the L and D Detective Kit is show how to ask nine questions and use the answers to those nine questions to not only design training that will purposefully contribute to workplace performance, but also determine how you're going to measure it.
David Turetsky: 33:56
Right.
Kevin M. Yates: 33:57
So that's my recommendation. You know, my recommendation is to intentionally and purposefully plan for impact in the beginning so that it's easier to measure in the end.
David Turetsky: 34:08
Sure.
Kevin M. Yates: 34:08
And I show how to do that in the L and D detective kit that's on my website at KevinMYates.com!
David Turetsky: 34:14
We're definitely going to have that link available in the show notes. And Kevin, that's brilliant; I love where you're going with this. Is there a need, and I ask as a good data scientist and also as a good econometrician, to have at the beginning, I don't want to call it an ROI, but a hypothesis about your ROI and about the goal? It's not just, if we do this training, we're going to do better; that's really cheating. Is there a hypothesis you have to come up with that says performance will increase by X percent, to give that ROI to the business leaders? It actually goes back to your point of informing the direction of where you want the training to go. But do you really need to get that specific and that sophisticated, or can you be much more obvious about it?
Kevin M. Yates: 35:05
I think it depends on the goal, right? Because for some goals, you're going to be very specific about how you're trying to move performance, and so it's going to be reducing X by 5%, or increasing Y by three points, as an example.
David Turetsky: 35:24
Right.
Kevin M. Yates: 35:24
So that's where you have to have a hypothesis that says the goal of training and learning, in combination with other influences and contributors, will be that we reduce errors by 3%.
David Turetsky: 35:40
Right.
Kevin M. Yates: 35:40
That's where you're going to get very specific, right? There may be other types of situations that are less specific but where the goal is still clear, right? So it really depends on what the goal is. It depends on who the other key players are in terms of achieving that goal, and it also depends on what all the variables are that influence the movement or stability of the business performance metrics. So there is no one answer, but hopefully I've just given you some context for how you want to be thinking about it.
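As a toy illustration of checking a specific, pre-stated target like the "reduce errors by 3%" example above (read here as a relative reduction), the arithmetic is simple; the numbers below are invented.

```python
# Invented numbers for illustration only.
baseline_error_rate = 0.080  # error rate measured before the program
post_error_rate = 0.076      # error rate measured after the program
target_reduction = 0.03      # the 3% relative reduction stated up front

actual_reduction = (baseline_error_rate - post_error_rate) / baseline_error_rate
print(f"Actual relative reduction: {actual_reduction:.1%} (target: {target_reduction:.0%})")
print("Target met" if actual_reduction >= target_reduction else "Target not met")
```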
David Turetsky: 36:29
I can't think of a better way to end the program, because I think you just dropped the microphone on being able to solve ROI. I've done lots of ROI analyses on investments in training and development programs in the past, and one of the things I've mistakenly tried to do is the kitchen sink. You can't solve for the kitchen sink. You've got to solve for individualized goals, maybe even business goals, or look at the overall organization's business goal, but find a goal, make it something that's addressable, and then go for it, right?
Kevin M. Yates: 37:09
Yeah, you're so right. And what you just said brings to mind those times where I have been asked to measure the impact of L and D, and I'm like, Well, what does that mean exactly?
David Turetsky: 37:20
Right.
Kevin M. Yates: 37:21
We want to measure the impact of L and D. I'm not quite sure how to do that, right?
David Turetsky: 37:27
Right.
Kevin M. Yates: 37:28
I do believe there are ways to measure how specific training programs and learning solutions have contributed to human and business performance. I know how to do that, right? But to say that we want to measure the impact of L and D, that's big. That's like boiling the ocean. So I focus more on specific training programs, specific learning solutions that have been designed to produce specific outcomes. That is where I focus in terms of what I measure, versus measuring the air-quote impact of L and D. I haven't seen that done yet. Maybe it can be done, I don't know, but I've not seen it.
David Turetsky: 38:12
Well, let's ask our friend AI and see if it can.
Kevin M. Yates: 38:15
You know what? I'm gonna try that. As soon as we end our discussion today, I'm gonna go to ChatGPT.
David Turetsky: 38:20
Kevin, it sounds like you're trying to boil the ocean.
Kevin M. Yates: 38:24
Well, who knows. I mean, AI may have figured it out. Who knows.
David Turetsky: 38:29
It may have. But to your point, I think even that's a little bit beyond where the models are today. What we might want to do is come back and have another episode maybe next year, and see if AI has solved that problem yet!
Kevin M. Yates: 38:42
I'll meet you back here next year. I'm all for it.
David Turetsky: 38:44
Awesome. All right, cool. Kevin, thank you so much.
Kevin M. Yates: 38:48
Thank you for having me. Thank you, guys. Great to be here.
David Turetsky: 38:51
Thank you. And thank you, Dwight.
Dwight Brown: 38:53
Thank you. Hope you both have a wonderful rest of your day!
David Turetsky: 38:57
And thank you, everybody, for listening. Take care and stay safe.
Announcer: 39:01
That was the HR Data Labs podcast. If you liked the episode, please subscribe. And if you know anyone that might like to hear it, please send it their way. Thank you for joining us this week, and stay tuned for our next episode. Stay safe.
In this show we cover topics on Analytics, HR Processes, and Rewards with a focus on getting answers that organizations need by demystifying People Analytics.