Leading People

How You Can Prove Your Work Makes a Difference

Gerry Murray Season 4 Episode 81


In this episode…

How do you prove the value of your people programs when budgets are tight and leaders demand evidence? 

This week's guest is Dr. Alaina Szlachta, author of the book Measurement and Evaluation on a Shoestring.

In this illuminating conversation, Alaina shares practical approaches to measurement and evaluation that don't require a PhD in statistics or massive resources.

Drawing from her experience across corporate, non-profit and academic sectors, Alaina reveals why data literacy is the foundation of effective evaluation. She explains how her journey from "data-phobic" to "data enthusiast" shaped her practical approach to demonstrating impact with limited resources.

The game-changing "Impact Hypothesis" framework Alaina shares creates a logical chain connecting learning programs to business outcomes, without requiring complex statistical analysis. Rather than attempting to prove perfect causation (which is nearly impossible in complex organisations), she demonstrates how to identify meaningful correlations and trends that reveal program effectiveness.

Most provocatively, Alaina addresses the paradox many organisations face: leaders want ROI data from learning initiatives but fail to provide the resources needed to gather it. 

Her "build, borrow, buy" strategy offers practical ways to leverage existing organisational assets to overcome this challenge.

For HR professionals tired of fighting for budget without evidence, and business leaders questioning the value of people investments, this conversation offers a refreshing middle path. 

Alaina's approach makes measurement accessible to everyone, transforming L&D from a cost centre to a strategic driver of organisational success.

Curious?

🎧 Listen now

Connect with Alaina on LinkedIn and mention this podcast to receive a special gift. 

Visit Alaina's website

Buy Measurement and Evaluation on a Shoestring

Follow

Leading People on LinkedIn

Leading People on Facebook

Connect with Gerry

Website

LinkedIn

Wide Circle

Speaker 1:

Welcome to Leading People with me, Gerry Murray. This is the podcast for leaders and HR decision makers who want to bring out the best in themselves and others. Every other week, I sit down with leading authors, researchers and practitioners for deep-dive conversations about the strategies, insights and tools that drive personal and organizational success. And in between, I bring you One Simple Thing: short episodes that deliver practical insights and tips for immediate use. Whether you're here for useful tools or thought-provoking ideas, Leading People is your guide to better leadership.

Speaker 1:

If you work in HR, learning and development, or run any kind of people-focused program, have you ever been asked, what's the ROI on this? Or, worse, had your budget cut because there wasn't enough evidence? This week, I talked to Dr Alaina Szlachta, author of Measurement and Evaluation on a Shoestring. She shares a practical approach to measuring impact without needing a PhD in statistics or a massive budget. We talk about using the right kind of data to show value, why storytelling is critical and how L&D can become a strategic enabler of growth. If you've ever struggled to prove that your work matters, this episode is for you, so let's dive right in and hear what Alaina has to say. Alaina Szlachta, welcome to Leading People.

Speaker 2:

I am so excited to be here with you, Gerry.

Speaker 1:

Well, thanks for joining me. I believe you're coming in from Austin, Texas, isn't that?

Speaker 2:

Right. Austin, Texas, where it is balmy and humid today.

Speaker 1:

Wow, there'll be people in the world wishing they were balmy and humid.

Speaker 2:

Yes.

Speaker 2:

Come to Texas anytime, you'll get all the heat that you could ever want.

Speaker 1:

Right. So I attended a fantastic webinar workshop you did a couple of weeks ago actually, for the Transfer Effectiveness Institute that's based here in mainland Europe. And then, of course, I got to know a little bit about your work and you published a new book last year and we're going to get to that shortly. But first to kick things off so my listeners can get to know you better, how did you come to focus your work on measurement and evaluation, and what people, places or events stand out in your journey, or were there some epiphany moments, particularly that led you to write this book?

Speaker 2:

All wonderful questions. The easy answer, Gerry, is this: I happened into it, not unlike many people who fall into corporate training. I happened to be very lucky early on in my career in the companies, people and situations I worked with. I was a competitive public speaker, I was a former athlete, and all of those dynamics required really good data to improve performance. And, Gerry, you and I talked briefly about your experience in music and how, when you teach something like music, or when you're an athlete, or when you're competitive and you're performing, you need really good feedback to be able to improve and feel good and confident about what you're putting out into the world. So I grew up, literally as a child, in this competitive, performative space, and feedback is data; taking feedback and adopting it and applying it in your life, those were just things that were intuitive to me. And then, as I matured and moved on into my career, I worked for a company that was very data-driven in my first job out of college, and then I went and got my master's and PhD, where I was working with data and statistics and publishing, and wanting also to teach and be a great professor, and the feedback that I would get in all my evaluations I took to heart and would apply to improve my work. And then I worked in a grant-funded nonprofit dynamic, and so our grants were very data-driven and results-oriented, and we had to leverage data not only in our programming but be pretty adept at using data to tell the story, to make sure our funders knew that we were doing what we said we were going to do with our grants. So I spent literally probably 15, 20 years in that world.

Speaker 2:

If you think back to me being a young performer, that was the world I operated in. And then all of a sudden, Gerry, I decided I want to go work in the corporate world. Before no one will take me seriously in the corporate world because I spent too much time in the nonprofit sector or higher ed, let me go take my opportunity in corporate. And what I learned is that in corporate training, people aren't using data. They don't have feedback loops. Data enablement's not a thing. There aren't any logic models or hypotheses built into our programs. And I thought, what is going on here? This just isn't aligned with all the years leading up to my experience in corporate, where data was so important and a pivotal part of our work. And so that was really what launched me into where I'm at today: how can I help the corporate world get better at working with data? Because it literally is the one asset that makes our work really meaningful. So that's what I do today.

Speaker 1:

Right. And for anyone out there who's going, oh no, no, data, data, data: I think what's going to happen over the course of this conversation is that you're going to present it in such a way that it's accessible even to the average person who isn't so mathematical, maybe. So we're talking about really quality information that helps you improve or helps you determine whether things are working, et cetera. So now, the book, if I'm not mistaken, is called Measurement and Evaluation on a Shoestring, right, and it's part of a series from ATD, which stands for the Association for Talent Development.

Speaker 2:

Yeah, they're one of the oldest, largest publishers in the talent development sector. I didn't know this until I started working with them.

Speaker 1:

Yeah, and they have a series for professionals who have limited time and budget, which is the Shoestring series. So what we want to get to now is why this book? And why now? And who's the book for, and what gap did you set out to fill when writing it?

Speaker 2:

So this book literally landed in my lap. I couldn't be more honored to be invited to write this book. So ATD was looking for an author for this shoestring series. I had just done a presentation at ATD Core 4. This was back in 2021. I had done some of my own original research during COVID. Every one of us has a COVID project that we did.

Speaker 2:

And for me, being a former researcher and a data nerd, I wanted to do a study to understand why people are struggling with measurement in the corporate world. Because, remember, I worked in the corporate world for a couple of years and then I left to start my own business, and it was around that time that I did this study. While I was presenting the results of the study at a conference, one of the editors from the ATD team was in the room, reached out to me and said, hey, would you be interested in pitching to write this book? So how could I say no to that? And that was really what landed me hyper-focused on measurement and evaluation. And, Gerry, what's really interesting is, as I wrote the book, and yes, we're talking about measurement and evaluation and different models and strategies, I tell all kinds of stories to make it more accessible, because data, measurement, evaluation, just like you said, Gerry, these things are intimidating to people, and my goal in the book is to make them less intimidating and more accessible, and to help people realize that you've got a lot of the tools and thinking that you need already to be able to do this work well.

Speaker 2:

But as I wrote the book and as I've been out in the world speaking, doing podcasts and talking to people, I realized that measurement and evaluation is the outcome of being data literate. So it's not that we need more information about measurement and evaluation. We don't need any more models. There are so many incredible models and theories, and there's such great research out there to help us become more strategic and to apply other people's strategies to our own measurement approaches. But what makes all of that easier is data literacy.

Speaker 2:

People hate data, and in my book I talk about how I was also that way. I hated data, I hated mathematics, and statistics didn't really come naturally to me as I was coming up. And then I realized that data lives everywhere. Data could be anything. Data is information, it's feedback, it's what time it is, it's what the temperature is. Learning from all of that feedback that's all around us and applying it in a meaningful way, it's as simple as that. And so my goal in the book is to help people understand how to better work with data, and then I tell all kinds of stories of how I've done that in my career with limited resources, so that people can walk away feeling like, I can do this. And that's really the goal.

Speaker 1:

On Leading People, the goal is to bring you cutting-edge thought leadership from many of the leading thinkers and practitioners in leadership today. Each guest shares their insights, wisdom and practical advice so we can all get better at bringing out the best in ourselves and others. Please subscribe wherever you get your podcasts and share a link with friends, family and colleagues, and stay informed by joining our Leading People LinkedIn community of HR leaders and talent professionals. I think there are probably a lot of people out there, including myself.

Speaker 1:

You know, it's not about, like, if you take the example you started off with, that Austin is humid today. Well, I mean, you don't need to be able to calculate the temperature to know that 95 degrees Fahrenheit, or maybe 35 degrees Celsius, is pretty hot, right? And if you get a percentage for the humidity of 60, 70 percent, you know. So you don't need to be able to calculate it to use it, basically. And this is something I discovered when I did my MBA. I was surrounded by all these super bright engineers, and they were like super bright with maths and stuff, because their whole studies were based on these things. And it was only after that, working in the corporate world, where I ended up in finance and strategy, that I realized that actually there are lots of great people out there who can compute stuff, and increasingly today you're going to have more and more advanced tools that can do it.

Speaker 1:

It's understanding what decisions you can make based on that. Can you turn the data into useful information that will help you make a decision, or tell you something you didn't know, or confirm something you thought you knew? And maybe that's actually not a bad way to segue into a powerful tool you talk about in the book called the impact hypothesis. Can you walk us through what this is and how it helps learning and development teams move from vague goals to clear, measurable outcomes?

Speaker 2:

So I mentioned earlier in our conversation that I spent a lot of time in higher education and research and then in the nonprofit sector, where we worked with grant-funded and donor-backed programming, and in those sectors something called a logic model is very common. You cannot put a grant in front of a prospective funder without some kind of logic behind what you're doing. Similarly, there's logic behind performance. If you think about sports, there are plays that say: this is how we're going to go about trying to achieve this goal, or navigate where we're at in the game; this is how I can become faster as a runner. There's a playbook, a strategy, a logic behind what you're doing, and that's the impact hypothesis for L&D. And it's not just applicable to learning and development. In fact, I use the same framework in the nonprofit sector with executive directors and with leadership development companies who ask, how do we create some kind of logic behind what we're doing? And it's just that: it's logic. It's tying a few key data points together to tell a story and to ultimately be able to test what kind of impact you're having, so it's a really simplified version of a logic model. Anybody could Google a logic model. You'll see some pretty complex things: inputs, outputs, outcomes, results. It seems a little overwhelming. We really don't need all of those data points, though they could be useful, especially if you want to calculate ROI; then you need a little bit more detail with the inputs, outputs, et cetera. But in the end it's a hypothesis that we want to be able to test, and it links some core evidence together. So we have a program on the one end, and that program is doing something for our organizations, our communities, our people, and we want to tell a story about what our learning program contributed to the organization.
And the core pieces of that are two questions I love: what becomes possible? Or, so what? It's that simple. So we have a program, an HR initiative, we're doing it, and when people engage in the initiative, what becomes possible for them? Then we have an answer to that question. So let's say we have a learning program, let's call it a leadership initiative, and you ask yourself the question: well, what becomes possible for leaders when they participate in this program? Well, we want them to be better communicators. Okay, great. So then, if they're better communicators, what then becomes possible? Well, maybe we'd have less conflict between team leaders and the team members. We hope that when people have better communication skills they can better navigate conflicts. And then you ask the question again: well, so then what becomes possible when there's less conflict on the teams in your organization? Oh well, they'll become more productive.

Speaker 2:

Now we've got this really nice impact hypothesis, we've got a chain of evidence and now we can go test and prove it. People participate in this leadership program to become better communicators, so that teams can have less conflict, so that the teams can be more productive. Then we ask ourselves well, what data do we need to test this chain? How do we test that this domino effect actually took place? And we figure out well, we need to have some kind of completion rate, obviously. If people aren't participating in the program, well then, none of the other things can become possible. So we do need to track completion rates. Okay, great, that one's pretty easy. Well then, what becomes possible when people complete the program? Well, they are better communicators.

Speaker 2:

Okay, well, what are some of the indicators that would tell us that people are better communicators? There's probably some problem within the organization or on the leadership team. Maybe people aren't clear in their expectations, so maybe increasing clarity of communication is an outcome. So we do have to drill down a little bit on, well, what do we mean by better communicator, because that can look an infinite number of ways, so we do have to drill down on that. But once you get clear on how do we know that somebody is a better communicator because they participated in the program? That's a data point.

Speaker 2:

Well, then we take that one step further. Well, how do we know that teams have less conflict? Maybe we can look at some of the meeting notes. Now everybody is using an AI tool to sit in on Zoom meetings or real in-person meetings, and we can actually do a sentiment analysis. When you look at those meeting notes, what kind of sentiment is happening among the team? We could run a sentiment analysis on a recurring team meeting to see whether it becomes more optimistic, or even just neutral. Maybe it just goes from negative to neutral, right? But that sentiment analysis is a great indicator that maybe conflict is going down. And then productivity is an easy one. Obviously, are people meeting deadlines? Are people having as many calls as they need to have with prospective client leads? So whatever productivity measures already exist in your organization, pick any one of them that's relevant to the team and the leaders overseeing that team. Now we've got a chain of evidence and some data to test. Did these things unfold in the way that we imagined?
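The chain Alaina walks through can be sketched as a simple data structure: each link in the impact hypothesis pairs a "what becomes possible" claim with the indicator that would test it. This is a hypothetical sketch, not from the book; all field names and indicators are illustrative.

```python
# A minimal sketch of an impact hypothesis as a chain of testable links.
# Each link pairs a claim with the data point that would serve as
# evidence for it. All names here are invented for illustration.

impact_hypothesis = [
    {"claim": "people complete the leadership program",
     "indicator": "completion rate from the LMS"},
    {"claim": "participants communicate more clearly",
     "indicator": "clarity ratings from pulse surveys"},
    {"claim": "teams experience less conflict",
     "indicator": "sentiment trend in recurring meeting notes"},
    {"claim": "teams become more productive",
     "indicator": "on-time delivery rate in the project tracker"},
]

def narrate(chain):
    """Render the chain as the 'so that' story stakeholders can follow."""
    return ", so that ".join(link["claim"] for link in chain) + "."

print(narrate(impact_hypothesis))
```

Writing the chain down this way makes the gap obvious when a link has a claim but no indicator, which is exactly the corner-cutting Alaina warns about later.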

Speaker 1:

You've actually just reminded me of something that came up in the webinar I attended with you: this idea that perhaps you need to gather some baseline data, just to know what your current situation is like, what your present state of things is. Because that helps you know if you've shifted the dial. Now, it doesn't necessarily imply cause and effect, but it can help you say, well, we started here and we're starting to see something different, which is actually an improvement on what we had before. What's your experience of that, working with your clients?

Speaker 2:

Yeah, your point is so important, and I hate to say this, Gerry, but I forget this because I've worked in data-rich environments throughout my career. Being a data nerd, I know intuitively that we need a baseline, and oftentimes the baseline data is already there and we just have to pull it out in conversation. So, going back to the same example: productivity is down, teams have high conflict and leadership teams aren't communicating well. Well, why did we pick those things? We go back to the questions I asked in the beginning. Okay, well, we're doing this leadership program. When people go through it, what becomes possible on the other side? Better communication. Well, why did you answer that way? You could have said any number of things. You could have said that we want people to be able to delegate better, or we want people to have better performance evaluation conversations. You could answer the question in an infinite number of ways: what becomes possible when people participate in your leadership development initiative? Your answer to the question probably comes from some baseline evidence that people are poor communicators. So I would ask you, Gerry, if you were the one that said people aren't communicating well, we want them to improve their communication.

Speaker 2:

I'd say, jerry, what leads you to believe that our leadership doesn't communicate very well? Well, I've heard some anecdotal evidence from people. They submit anonymous feedback to our HR team that this particular leadership group isn't clear, they're not giving clear directives to their team. The team feels lost, they don't know what to do. Okay, that's baseline data, that's baseline evidence. We could go validate that and do some informal surveys, or we could do some focus groups interviews or put out an anonymous survey that asks for a little bit more data on this. But baseline data already exists. It is in the intuitive gut check feeling, it's in anecdotal conversations, it's in our key performance indicators. So take that equation all the way out to productivity. I would ask you the same question, jerry.

Speaker 2:

Well, what becomes possible all the way at the end of this chain of evidence is that the teams are more productive. Well, what leads you to believe that productivity is a challenge right now? Well, because people aren't hitting their deadlines. Okay, what data are you looking at to know that they're not hitting their deadlines? Well, we use Trello to manage our projects and I'm seeing this one team is just not hitting their deadlines consistently. Okay, great. So if we're successful, I should look at that same Trello management system and be able to see that people are hitting their deadlines if they participate in this program and they become better communicators and there's less conflict.

Speaker 2:

But that's the thing, Gerry. We can't just look at people participating in the training and then look at productivity. I know we want to be efficient, we want to use our resources wisely, but this is one of the challenges that I see everywhere in leadership initiatives, within some of the biggest and best leadership development companies. They're doing this, and it's awful. They say people take our programs and there's more psychological safety in organizations as a result.

Speaker 2:

I'm like, BS. There's not enough data that connects the dots between people participating in your programs and psychological safety. We've got a few more things that we have to show are correlated, which is: what are people doing differently to influence psychological safety? And did you focus on those things? Did you teach people? Did you give them practice opportunities to be better communicators, or to be more inclusive, or whatever it is? So we need all of the evidence in that impact hypothesis. Do not ever cut corners by trying just to say people completed training and productivity is up. That's not enough data, it's not sufficient, and any smart stakeholder is going to question you and say, well, how do I know that it was the training that contributed to the change in productivity? That's where those data points in between come in. Well, people are better communicators. How do you know? Because we gathered some baseline data, and now we can see that communication changed. Also, conflict changed.

Speaker 1:

Yeah, right, so that's a really important connection that we don't want to skip. And from my experience working in finance and financial planning in my early career, one thing that finance people will look at when they look at data outcomes, or the patterns in data, is they want to know what's driving the data. So you need some level of granularity. That's right, because many things could be contributing to psychological safety. That's right. And you want to make sure you're targeting the things that have the most impact, right, and not spending your money on something that makes a 1% difference when you could be spending your money on something that makes a 20% difference. That's right. So if you don't have some data points along the way, you're probably going to get misguided information at best, right?

Speaker 1:

So you talk about measurement not having to be perfect; it just has to be useful, and I'm sure that's quite liberating for many listeners. What sort of mindset shifts do L&D professionals need to make to start small and measure meaningfully? You're listening to Leading People with me, Gerry Murray. My guest this week is Dr Alaina Szlachta, an expert in practical evaluation and author of Measurement and Evaluation on a Shoestring. Coming up, we explore how to connect data to decision making, why engagement scores are just a starting point and how leaders can build a culture of learning that drives results. Now back to our conversation.

Speaker 2:

So I had a really awesome light bulb conversation with a good friend of mine, and it was about causation versus correlation. She had believed this her entire career (she's got about the same number of years in L&D as I do), and we were just having an informal conversation over a beer. I said, Liz, why is it that you believe we have to show that learning caused this thing, this change, this outcome? And she said, well, if it didn't cause it, then it wasn't effective. And I said, no, think about any kind of initiative that's out in the world. It's never going to be the sole cause of the outcome.

Speaker 2:

Maybe in a randomized controlled trial with a drug study, where somebody takes a prescription pharmaceutical and they get some kind of outcome, and we have a control group and it's a very controlled environment, then, yes, we can talk about causation. But in the real world we are never talking about causation. It is just fundamentally inaccurate. And so what we're looking for is simply trend lines. Are things trending up or down? Whatever you're trying to accomplish, we just want to see trend lines changing, because there are so many other confounding influences in working with people. How does someone feel that day? Did their mom die? Did their dog die? Is their child going from puberty into becoming a teenager and they're just really difficult to deal with, and it's hard for them to show up at work and be as productive as normal? There are so many other things that contribute to how people perform that we can never prove causation, and we shouldn't try. What we can do is use that chain of evidence and see whether the trend lines are changing.

Speaker 2:

Going back to the example I mentioned earlier, people participate in a leadership program. It's designed to make them better communicators and better deal with conflict on teams. If we can see that communication has improved, that conflict has gone down and productivity has gone up, and we can see all of those trend lines have changed as people engage in a program, that's a pretty good indicator that our program was effective. Yes, we can drill down into more data. We could calculate the ROI if we really wanted to, but we don't need that to be able to know that the initiative was a good investment of time and money.
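Alaina's point about trend lines rather than causation can be made concrete in a few lines of Python: fit a simple least-squares slope to each metric across the weeks spanning a program and check whether each one moved in the hypothesised direction. The weekly numbers below are invented purely for illustration.

```python
# Illustrative only: invented weekly metrics around a program launch.
# We check the direction of each trend line, not proof of causation.

def slope(values):
    """Least-squares slope of a series against its index (simple trend)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Weekly scores spanning the program (weeks 1-4 before, 5-8 after).
clarity  = [3.1, 3.0, 3.2, 3.1, 3.4, 3.6, 3.7, 3.9]     # survey, 1-5 scale
conflict = [14, 15, 13, 14, 12, 10, 9, 8]                # flagged incidents
on_time  = [0.71, 0.70, 0.72, 0.71, 0.75, 0.78, 0.80, 0.82]

trends = {
    "clarity up": slope(clarity) > 0,
    "conflict down": slope(conflict) < 0,
    "on-time up": slope(on_time) > 0,
}
print(trends)
```

If all three flags come back true, the chain is trending the way the impact hypothesis predicted; that is the "pretty good indicator" Alaina describes, without any claim of causation.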

Speaker 1:

Yeah, there's probably this tendency in society to misinterpret or misunderstand that a lot of things are systemic, and people are looking for very black and white, linear answers. I mean, if I put an egg into a saucepan of boiling water, after several minutes it will boil, so I have a certain cause and effect that's easy to observe. In organizations, with the complexity of people and the dynamics and everything else, you're going to be hard pressed to find the cause-and-effect relationship in the same time and space. It could be that there are cause-and-effect relationships, but they might not be easy to see. So, as you say, the pattern is going to be incredibly important there. Now let's talk about the role of leaders, because many of our listeners here are leaders themselves, and some may not be sitting in L&D but rely on it, perhaps, to train their people and develop and grow their workforce. Why should they care about measurement, and what can they do to foster a culture where learning and evaluation are seen as strategic rather than some sort of optional thing you do?

Speaker 2:

So there's this really interesting paradox that I think would be helpful for leaders to just reflect on for a moment. Data from a variety of sources, going back for decades, says that leaders want to know some kind of return on investment. It doesn't have to be financial, but they want to know that their investments in developing people are giving them some kind of returns, and they wish that the learning function would be better at that. We've seen data that says 90 percent and above of leaders want their learning function and their HR functions to be better at telling the impact story, the outcomes of their programs, versus just outputs. And this is an important distinction, I think, for the leaders listening to this. This is going to be intuitive for you. But if you're in the learning function or the HR function, remember that an outcome is the thing that becomes possible because of an output. So if somebody picks up the phone and dials 10 prospective clients and has 10 sales conversations, that's an output. The outcome of that could ultimately be conversion rates and more sales and more profitability. But the outputs are just the activities that people are doing. People complete programs, they do workshops, they have conversations; those are all outputs. We really want to focus on the outcomes of all those activities, because that's what helps us to know what kind of return on investment we're getting. And so, back to the leaders component. Leaders want to know the outcomes of all their investments and all the outputs that are happening inside of the organization. Well, what's coming about?

Speaker 2:

Here's the paradox.

Speaker 2:

Well, our HR and our learning professionals.

Speaker 2:

They also care about the returns of all the activities that they're doing, but they say we aren't able to do those calculations or tell those stories or even to be confident that we're working with the right data, because we're not given the resources to do so.

Speaker 2:

So that's the paradox, Gerry: our leaders want some kind of return on investment. They want to know the outcomes that are coming about because of our programming. Our learning and HR people want that too, but they say the reason we're not focused on those things is we don't have enough people power, we don't have enough time, we don't have enough technology or resources to do this work well. So, leaders, I would say: if you want more outcome data, then we have to think about how we can provide the right resources to make sure that all of our departments have those kinds of data-driven capabilities. Because it's being data-driven that gives us those outcomes and the returns on what we're investing in, and helping people be data-driven and data literate is the thing that's going to make that great change.

Speaker 1:

Yeah, and that probably segues nicely into another theme in your book, where you propose a build, borrow or buy strategy for anybody working with limited resources. Maybe you can unpack that a little bit, because it probably relates to the L&D person going to the leadership team asking for things, getting pushback on what they want, and then having to rethink. Either they give up, or they rethink: how could I maybe do this even though I'm not given everything I want? So please talk a little bit about that.

Speaker 2:

I think that's an important conversation. Going back to the paradox I mentioned about our learning leaders and the people in the learning function: one important perspective, as you're talking with stakeholders and asking for resources, whether that's time or money or tools, is that our leaders want to know whether these investments are worthy of the organization's resources or not. So we have to be prepared to make that case: yes, this is a good investment of the organization's resources, and here's why. And so, to your question about the build, borrow, buy framework, one of the things I talk about, and I have an exercise in the book, is: do we build it internally or do we buy it? So here's a really good, simple example.

Speaker 2:

So a lot of people in the learning and HR function, they're not great at creating reports and they're not great at data analysis. That's a unique skill set. But think about our vendors. So many organizations are going to have a customer relationship management system, an administration management system, an HR system, an LMS, right? There are all these vendors that sell tools that collect data, and they can do reporting, and they have the capability of doing analysis. So one of the best things we could do, and this is a great way to use the organization's resources wisely, is to use the impact hypothesis and say: we want to be able to test this chain of evidence, completion rates, to increased communication, to reductions in conflict on the team, to increases in productivity. That's the impact hypothesis. Take that impact hypothesis to your vendor and say: hey, what data could we use in our learning management system or other data systems? Sometimes even Power BI is an incredible tool to lean on, and you could go to whoever in your organization is overseeing business intelligence, bring this impact hypothesis and say: we're going to need data from a few different places in the organization. What can I do to make this chain of evidence possible, and how do I create a report that helps me track and monitor whether these changes are coming about? Bring the impact hypothesis, here's what I want to do, here's the data I'd like to use, and go talk to your vendors, talk to the business intelligence people in your organization, who might even just be your CEO, and say: hey, how can I test this hypothesis? That uses the organization's resources wisely, without you having to go take expensive courses and learn how to do data analysis all on your own. Leverage the assets the organization already has: the people, the vendors, the knowledge.
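As a rough illustration, the chain-of-evidence idea can be sketched in a few lines of code. All column names and figures below are invented for illustration, not taken from any particular LMS or BI tool; the point is simply that each link in the impact hypothesis is checked for correlation, not proven as causation:

```python
import numpy as np

# Hypothetical team-level figures, imagined as pulled from an LMS
# (training completion rates) and an HR system (conflict incidents,
# productivity index). Each position is one team; all values invented.
completion = np.array([0.20, 0.35, 0.50, 0.65, 0.80, 0.95])
conflicts = np.array([9, 8, 6, 5, 3, 2])
productivity = np.array([71, 74, 78, 83, 88, 93])

# Test two links in the chain: higher completion should track lower
# conflict and higher productivity. Pearson's r measures the trend.
r_conflict = np.corrcoef(completion, conflicts)[0, 1]
r_productivity = np.corrcoef(completion, productivity)[0, 1]

print(f"completion vs conflicts:    r = {r_conflict:.2f}")
print(f"completion vs productivity: r = {r_productivity:.2f}")
```

A strongly negative r for conflicts and a strongly positive r for productivity would support, not prove, the hypothesis; a flat or reversed trend tells you which link in the chain to revisit.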

Speaker 2:

You might have a colleague in marketing who's a super data nerd like me, and you could ask them: hey, how would I go about making a report? How are you doing reports in marketing, and how can I learn from you? So the idea is, once we know what we're trying to accomplish, we can lean on the knowledge, resources and support that are already in the organization. But it's a lot harder to do that in reverse. I can't go to my marketing colleague and say I want a report without knowing what's going to be in that report. My marketing colleague is not going to be able to tell me my impact hypothesis; I have to come up with that on my own. But once I've got it, it becomes so much easier to ask for resources, or to ask how I can accomplish this with the resources the organization already has. That's how we can use the build, borrow, buy perspective to get things done with what we have.

Speaker 1:

So that's even a tip for people on our side of the fence who provide services to organizations. In my company, Wide Circle, we offer clients data that they can use, and in some cases we can deliver it. I'm a big fan of that because I come from the world of business outcomes and strategies, and of asking: did we actually deliver what was wanted? Not, like you say, the output per se, but did we produce an outcome for the organization? You're talking about the difference between outputs and outcomes, but too often the conversations are about inputs: how many hours will it take, how many people can go on the training, how many books or course materials will I get, rather than working your way back from the outcome. Because maybe a course is not what you need to achieve what you wanted, and if you're not starting from the outcome, you'll never get there.

Speaker 1:

Now, there's one last thing. So, basically, you've covered the data: you don't need to be massively data literate to be able to get data and use it to make informed decisions. What about getting stakeholder buy-in, especially when, in some organizations, measurement and evaluation isn't a top priority?

Speaker 2:

Yeah, I would say that in the organizations where measurement and evaluation is not a top priority, it's probably because the organizational leaders themselves aren't incredibly data-driven, and I experienced this in one of the organizations I worked for. I have a very data-driven mindset. I believe in the power of data, and I want to know how effective my programs are. How do I improve them? How do I use the resources we have wisely? We only have so many hours to train people. Maybe I should cut this and invest more in that. Well, how do I know how to navigate those investments without data? So I think it's these kinds of questions that we can bring to our stakeholders. We can say things like: hey, we've had this onboarding program. It was one of the programs I was accountable for many years ago. I was the one who trained all the new employees to do their jobs, and I wanted to know: how can I improve? How do I know that our program is leading to the efficiency and effectiveness of employees when they get on the job? I needed some data on that. I had test scores showing how people's knowledge had changed, but that only told a fraction of the story.

Speaker 2:

I wanted to know how people were performing against the expectations of performance once they left training and got on the floor, and what the things were that they struggled with the most.

Speaker 2:

I needed managers to be bought into working with me to give me that data. So it's about approaching your work from a sense of curiosity: I want to be better.

Speaker 2:

I want to use the organization's resources wisely, and training and onboarding and coaching are expensive things, whether you're building them internally or leveraging an external resource. So we want to be asking those questions. Especially today, when economies are uncertain and God knows what the future looks like, the best thing we can do is use our resources wisely. So leverage that perspective and say to leaders: I want to make sure we're using our resources wisely, but I need better data, and I need managers to get on board and give me the feedback they see when people go into their jobs in the first 30 days. I need to know where people struggle, because then I can change my training to address some of those struggles, so that they're struggling less and being more effective on the job. That kind of perspective, and that healthy sense of curiosity, works really well for getting people on board.

Speaker 1:

Okay, so there's so much great wisdom and advice there. So, coming to the end, Alaina: if we could synthesize everything, what are a few key insights, or the big idea, you'd like my listeners to take away from this conversation?

Speaker 2:

Whatever it is you're trying to accomplish, getting stakeholder buy-in, getting your managers to give you feedback and data that helps you understand how employees are performing after they leave training, it's so much easier when you know what you're doing and why. You can organize it in the impact hypothesis, you could use my tool, you could use a logic model, you could use the five whys framework: why are we doing this, why does it matter? So long as you have something you can present to a stakeholder, a manager, even a learner.

Speaker 2:

I love using the impact hypothesis at the beginning of my learning programs, so that every single participant knows why we're doing this: you're doing this program so that this becomes possible, and then this becomes possible, and we want you to be part of this journey, giving us feedback and helping us make this impact possible. So, yeah: know what you're doing and why, use some kind of organizing tool to structure your thinking, and bring that kind of clarity. We're doing this so that this becomes possible, and this becomes possible. String it out, show the details, and when you do, you'll get more buy-in, resources and support than you could imagine.

Speaker 1:

One of the reasons why I do this is that I always learn something, and now you've given me a very interesting idea to explore: how to take your impact hypothesis framework and maybe use it with participants at the beginning of a particular training. So that's got me thinking. And finally, I'm sure there are lots of listeners out there thinking: how can people get in touch with you? They might want to reach out to find out more. And do you have anything special to offer them?

Speaker 2:

In fact I do, Gerry. I appreciate the question. The easiest way to get in touch with me is on LinkedIn. I am Dr Alaina Szlachta, just like you see here; you can find me by putting that name into the search tool. I am giving away a free chapter of my book. Actually, since we talked a lot about both data literacy and the impact hypothesis, I will give away those chapters, plural. So if you say to me on LinkedIn, "I listened to the Leading People podcast, loved your conversation, can I please have two free chapters of your book?", I will happily share them with you right on LinkedIn. That's how you can stay in touch with me and get that free offer.

Speaker 1:

Well, that's fantastic. I think I'll apply for it myself. Okay, as always, thanks, Alaina, for sharing your insights, tips and wisdom with me and my listeners here today.

Speaker 2:

You are very welcome. Thanks for having me.

Speaker 1:

Coming up on Leading People.

Speaker 3:

Lots of people go into management without any kind of training. Lots of people are selected for positions as managers on the basis of being good at something else. I'm sure that, you know, a lot of your conversations reflect that truth. So there's a whole universe of people who are in positions of authority and don't quite know what to do, or are being managed by people who plainly don't know what to do, and that felt like a very big opportunity.

Speaker 1:

My next guest is Andrew Palmer, senior editor at The Economist, Bartleby columnist and host of the Boss Class podcast. In a fast-paced conversation we talk jazz, power, delegation and what Andrew has learned from interviewing some of the world's top managers. It's a witty and insightful episode that you won't want to miss. And remember, before our next full episode there's another One Simple Thing episode waiting for you: a quick and actionable tip to help you lead and live better. Keep an eye out for it wherever you listen to this podcast. Until next time.
