Nonprofit Hub Radio

Proving Your Impact: How Nonprofits Can Use Data to Tell a Better Story

NonProfit Hub Season 6 Episode 41

Send us a text

In this episode of the Nonprofit Hub Radio Podcast, host Meghan Speer sits down with Matthew Courtney, principal consultant at Courtney Consulting, to explore how nonprofits can use program evaluation to strengthen funding applications, prove impact, and improve programs. Drawing from his own experience as a former nonprofit founder, Matthew shares hard-earned lessons about the importance of collecting and analyzing data—beyond just counting participants—to tell a compelling story of real change. He offers practical tips for starting from scratch, making data collection automatic, and creating a culture of continuous improvement that funders trust. Whether you’re a small or startup nonprofit or a seasoned leader rethinking your evaluation strategy, this conversation will help you see data not as a burden, but as a powerful tool for sustainability and growth.

Support the show

Get free nonprofit professional development resources, connections to cause work peers, and more at https://nonprofithub.org

SPEAKER_01:

Marketing Support Network is proud to serve the nonprofit community by offering full-service contact center, fulfillment, digital marketing, and fundraising services. Your vision is our mission, and we can't wait to partner with you. Visit marketingsupportnetwork.com for more information.

SPEAKER_02:

Welcome back to the Nonprofit Hub Radio Podcast. I'm your host, Meghan Speer, joined today by Matthew Courtney, who's the principal consultant at Courtney Consulting and also a Nonprofit Hub coaching member. So we're excited to have him here today. Matthew, welcome in.

SPEAKER_00:

Hey Meghan, thanks for having me today.

SPEAKER_02:

Yeah, my pleasure. So we're gonna dig into a topic, and I feel like I've been on a run of these lately, of things I've never talked about on the podcast. So I'm excited to dig into that idea, especially for some of our program folks, around some of those evaluations. But before we do that, take a moment and introduce yourself. Tell us who you are and a little bit about the journey that got you to this conversation today.

SPEAKER_00:

Yeah, well, it's great to be here with you today. I'm Matthew Courtney, and my work specializes in nonprofit strategic planning and program evaluation. I actually started my career out as a music teacher, and that sort of public service piece I think has always been in there. And when I was 25, my school closed. And instead of trying to find another music teaching job, I decided to jump into the deep end and become a nonprofit founder. I started a nonprofit called the Bluegrass Center for Teacher Quality, and we were focused on teacher professional development and thinking about how we can empower teachers to empower each other. At 25? At 25, wow, yeah.

SPEAKER_02:

Okay, good for you.

SPEAKER_00:

The best advice.

SPEAKER_02:

I'm just thinking about myself at 25. There's no way that I would have been ready for such a thing.

SPEAKER_00:

You know, it was truly a wild adventure. The best schooling that I could ever have asked for. And I really think that it launched me years ahead of where I would have been had I gone a traditional, you know, climb-the-ladder route. Because I was literally right in the deep end, having to learn to do all of the things, as I'm sure many of your listeners also find themselves to be at any age, because the deep end is the deep end, it always is. Yeah, so I did that for four years until unfortunately we lost all of our funding kind of all at once, and we had to close our doors. That was really painful, but also very formative, because I learned that I had made some mistakes as the leader around strategic planning and program evaluation, and I didn't have what I needed to show funders that my work could be sustained and should be sustained. You can't pay the bills with promises, and so when promised money didn't come, we were kind of up a creek.

SPEAKER_02:

Yeah. Oof, that is, I mean, that's a hard road at any point. Yeah. But that early on in your career, man, I'm sure that has definitely shaped the work that you have been doing. And even I would say, probably the compassion that you have for people as you're doing that work.

SPEAKER_00:

Yeah, for sure. Because it's a lot of pressure. It's a lot of stress. And I had staff and volunteers and a board who were looking at me. And, you know, we had a very difficult board meeting, and we had to say, this money's just not coming. The funder has changed priorities. We had really been counting on one funder, and that funder changed priorities. And like I said, promises don't pay the bills, checks do. So pro tip: don't get excited until that check is in your hand.

SPEAKER_02:

Yes, absolutely. Okay, so I think it's a very timely conversation, because I know there are so many folks who, this year especially, have had grants that they have always had that are no longer there, and they're having to kind of reprove themselves as they apply for new ones. And you cannot prove yourself unless you can prove that the work you're doing is actually working. For sure. Right. And so I think so often we get stuck in this idea that, oh, I'm working for a nonprofit and it's really nice and we're doing good work, and so that's all that should matter.

SPEAKER_00:

Yeah.

SPEAKER_02:

And in an ideal world, I could see where that would be true. The unfortunate reality, though, is that we have to be able to prove that what we're doing is making an impact, or that the thing we're setting out to change is actually changing, or whatever the case may be. And I talk to so many organizations and nonprofits who don't even know how to measure what they're doing. They're just doing it because it's what they've always done. So let's start at the very beginning of this process. If somebody is in that boat, maybe we're applying for new funding, funding has been taken away, we're rethinking, and we have to start being able to prove what we've done. What's kind of the first step? Because that feels like an overwhelming process if you've never had to do it.

SPEAKER_00:

It can be a very overwhelming process. And in this environment, it is more important now than ever that we are bringing an evaluation mindset to every day.

SPEAKER_02:

Yes.

SPEAKER_00:

Because what a lot of folks I'm working with right now are dealing with is that they're having to go back and find data, remember how they did things, and prove that it works today. Because there's just less money available, right? So the burden of proof is higher. A year ago, two years ago, you could bring five testimonials from participants and maybe some counts of how many people, you know, went through your food bank or your museum door, whatever it is, and that was sufficient. But now you really have to show that your food bank is doing more good than the other food bank, or your museum is serving the community in a better way than the other museum, because we're all applying for the same money and you've got to differentiate. So the number one thing I would say right now to directors, to program managers and consultants is to immediately start thinking about: what data are we collecting? What data could we be collecting? And how can we systematize that so that it's almost automatic? So, thinking about, for example, I'll use the museum example. I sat on the board of the Lexington Children's Museum in Lexington, Kentucky for seven years. That's one of my near and dear organizations. So for them, they might be thinking about what questions we're asking when we sell that admissions ticket, right? In the past, maybe we collected your address, maybe your zip code, so we could show, oh, well, we're pulling from all over the state or from other states. Those are important pieces of information. But maybe now we're also collecting demographic information. Maybe now we're taking those zip codes and comparing them to census data by zip code, and we can pull some poverty information in. So thinking about, at that point of sale, can I ask a different question? Or on my website form, if you buy your ticket there, can I ask some different, maybe optional, questions that give me a little more nuance and detail? So that when that museum says we had 5,000 people, we can say we had 5,000 people from six states, from 20 counties, from these household income bands, these educational background bands, etc. That is gonna show a funder: these people know what's up, and we're gonna trust them with our money more than this group that just said we sold 5,000 tickets.
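To make that zip-code idea a little more concrete, here is a minimal sketch in Python of what "comparing zip codes to census data" could look like in practice. The file names, column names, and poverty bands are hypothetical placeholders, not anything from the episode; any real ticketing or census export would need to be mapped onto them.

```python
# A minimal sketch, assuming a hypothetical ticketing export and a
# hypothetical table of poverty rates by ZIP code (e.g. from Census ACS data).
import pandas as pd

tickets = pd.read_csv("ticket_sales.csv", dtype={"zip": str})        # hypothetical export
poverty = pd.read_csv("acs_poverty_by_zip.csv", dtype={"zip": str})  # hypothetical census pull

# How far are visitors coming from?
print(f"{len(tickets)} admissions from {tickets['zip'].nunique()} distinct ZIP codes")

# Enrich each admission with the poverty rate for its ZIP, then summarize
# visitors by poverty band instead of reporting only a raw ticket count.
enriched = tickets.merge(poverty[["zip", "poverty_rate"]], on="zip", how="left")
bands = pd.cut(enriched["poverty_rate"], bins=[0, 10, 20, 30, 100],
               labels=["<10%", "10-20%", "20-30%", ">30%"])
print(enriched.groupby(bands, observed=True).size())
```

The point isn't the specific tool; it's that once the question is asked at the point of sale, turning "we sold 5,000 tickets" into "we served visitors across these communities and income bands" is only a few lines of analysis.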

SPEAKER_02:

That's so good. That's so good. Okay, so I think one of the things that has to be a part of this, as you're collecting that data, is one, what do we do with it once we have it? And two, there is an ongoing debate, obviously, about data protection.

SPEAKER_00:

Yeah. Right.

SPEAKER_02:

So when you're taking all that in, certainly we have to have a policy in place, and we can have a whole different discussion on a policy for how we deal with data. But I think sometimes nonprofits don't realize how much data they're already sitting on.

SPEAKER_00:

Yeah.

SPEAKER_02:

Because we don't know what to do with it once we have it, or how to think about it in such a way that it's more than just cells on a spreadsheet.

SPEAKER_00:

For sure.

SPEAKER_02:

Right. So if that's the case, and I'm thinking about a nonprofit here in Pittsburgh that I volunteer with that does programming for under-resourced communities on Pittsburgh's North Side, for students in first through twelfth grade. We have this wealth of information on them and their families and where they live and all of the things. I'm curious, then, about how we use all of the information we have to drive decisions. Right? If we know these are the 17 zip codes that our students come from, what do we do with that? How do we start using data as an evaluator in our models there?

SPEAKER_00:

Yeah, there's a couple of things you can do here. The first thing I do with a lot of clients is a data audit, where we say, what exactly are we collecting? Because, like you said, a lot of us don't know, and we find ourselves sitting on these mountains of data. Things that we might not consider to be data, you know, testimonials are data, and we should be collecting those and storing those. The little cards that people drop in as they walk out of your facility, right? That is data. So we do need to kind of take stock of what's there. And then I think a really great exercise is to do an exploratory analysis. Formally we call that exploratory data analysis. There are books about it and courses you could take. But really, what it means is we're just kind of playing in it, like a splash pad at the park. We're just looking to see what's in this data, what trends are jumping out at us. And then once we identify those, we can really home in on them and create actionable evaluative questions that we can then study. We can refine our data collection around those questions and really start to differentiate ourselves. And I think that's a great way to do it in this environment, because so many of us are doing kind of the same thing, right? If you're a food bank, you're competing with every other food bank in the country, and you're all kind of doing the same thing. Your food bank might have a different schedule, it might have a different volume or poverty threshold or what have you, right? But we're all providing food to hungry people. So how do you differentiate yourself? If the only goal you're measuring is how we're providing food to hungry people, so is everybody else. But if you're digging deep and you can find a little nuance, a little pattern that maybe you didn't know was there, then you can explore that outcome. That gives you something unique and competitive in this funding environment, and really in this sort of marketing environment too, where you have to prove every day that your organization matters.
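Since the guest mentions exploratory data analysis, here is a small sketch of what that "splash pad" pass might look like over a program dataset. The CSV and column names (visit_date, zip, program, satisfaction) are invented for illustration; the idea is just to look at the data from a few angles before committing to evaluative questions.

```python
# A quick exploratory pass over program data. File and column names are
# hypothetical; swap in whatever your registration or visit export contains.
import pandas as pd

visits = pd.read_csv("program_visits.csv", parse_dates=["visit_date"])

# What does the data even contain?
print(visits.dtypes)
print(visits.describe(include="all"))

# A few throwaway questions, just to see what trends jump out:
print(visits.groupby(visits["visit_date"].dt.month).size())   # any seasonality?
print(visits["zip"].value_counts().head(10))                  # where do people come from?
print(visits.pivot_table(index="program", values="satisfaction",
                         aggfunc=["mean", "count"]))          # which programs rate best?
```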

SPEAKER_02:

That's so good. So let's use that food bank example for a moment, right? If that's your piece, we want to get food into as many homes as possible. Great goal, by the way. Well done. Absolutely. Good job. So what data are we looking at beyond the obvious? Because we know how many cars come through the line, right? To pick up boxes. We know how many boxes we put into cars. I'm curious, then, about your thoughts on how that data proves programmatic effectiveness. So when we're evaluating all of that data to be able to say we're successful because, or this is the impact we're having because, what points are we looking for in the data to tell that story that says, yep, we accomplished the mission, versus we fell short of the mark?

SPEAKER_00:

Yeah. So many, oh gosh, so many things here. So let me think. One cool thing you could do that might differentiate you is: could you collect data about the nutritional profiles of your boxes? How are your boxes different, presumably better, than boxes that your competitors might have? Are your boxes nutritionally complete? Are your boxes nutritionally complete for a toddler versus an adolescent? That is a differentiator that we could pull out. And that's data that you probably have, because presumably you're inventorying food before it goes into boxes, and presumably you're inventorying what's in those boxes. So you should have a pretty good handle on what's going on there. If you don't, that's where you need to start, right? We need to understand we have 20,000 cans of corn, we're putting two cans in each box. We've got to start there and track that inventory. But if you're tracking that inventory, that could be a pretty easy thing for you to grasp. Also, consider your impact on the community as a whole. Are there other data sources that you can pull in to help tell your story? We call this a correlative analysis, where it's not directly causal. I can't say for sure that because I'm here, this thing happened. But we are here, and since we got here, there are fewer hungry bellies in the ER, for example.

SPEAKER_02:

Okay.

SPEAKER_00:

Or maybe the homeless shelter has fewer children showing up because we're able to fill this gap, and we can kind of make some rough connections. Those rough connections are rough, and you're definitely gonna have people who push back on them, but it's again a way to really think about, within the context of our community, what larger impact are we having?
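For the "rough connections" he describes, a correlative check can be as simple as lining up monthly output against a community indicator and measuring how they move together. The numbers below are made-up placeholders, and, as he says, a correlation is supporting evidence, not proof of cause.

```python
# A rough correlative sketch: not causal, just "since we got here, has a
# community indicator moved with our output?" All figures are invented.
import pandas as pd

monthly = pd.DataFrame({
    "boxes_distributed":      [800, 950, 900, 1100, 1200, 1150],
    "er_malnutrition_visits": [42, 40, 41, 35, 31, 33],   # hypothetical county figure
})

corr = monthly["boxes_distributed"].corr(monthly["er_malnutrition_visits"])
print(f"Correlation between boxes distributed and ER visits: {corr:.2f}")
# A strongly negative value supports (but does not prove) the story that
# more boxes in the community goes along with fewer hungry bellies in the ER.
```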

SPEAKER_02:

Yeah. So it's interesting. I just recorded a podcast with someone else earlier this week who was talking about the importance of tearing down silos within organizations. And this, man, highlights that so well, because the folks who have access to that data, or the folks who are hearing those testimonials, are likely the folks who are boots on the ground putting boxes into cars, right? To keep with the food bank analogy.

SPEAKER_00:

Yeah.

SPEAKER_02:

But if that story they get told while they're putting boxes into the car never makes it back to the marketing and fundraising team to help tell that story, or never makes it to the grant writer who's responsible for going to get more funds to keep this program going... oh man, this is such a good use case for why internal communication is so important.

SPEAKER_00:

Yes. And program evaluation is not any one person's job. It's not the director's job, it's not the program director's job, it's not the fundraiser's job. You should really have a team that is doing program evaluation work together. And when I work with nonprofits, I won't work with just the director or just the program director. I pull a team together for that exact reason, because we all have different lenses and viewpoints. I would also go a step further and say your program evaluation team should have a board member on it. I think we forget about those other skills that our board members have. When we recruit board members, a lot of times we're thinking, we need somebody who's money savvy, or maybe we need someone who's got some legal background or a communications background, to fill out our roster, right? But think too about whether there are people in our community who could serve in a volunteer capacity and provide that extra statistical knowledge, that extra program evaluation expertise. Maybe you've got a university and you can recruit somebody from that space. Bring that board member into your program evaluation conversations and let them bring their passion to the table. Because I always find that board member passion is a little hotter. It doesn't get as cynical as maybe that staff passion. I mean, that staff passion is always there, right? You have to have that passion to do nonprofit work, but it can get a little cynical as you're in it day in and day out. And so having that sort of optimistic board member passion at the table can really reframe a program evaluation conversation.

SPEAKER_01:

At Marketing Support Network, we are proud to support nonprofits by providing top-tier customer service solutions for your donors. Live US-based agents are able to assist your callers by phone, email, website live chat, or social media response 24 hours a day, seven days a week, 365 days a year. From taking donations to updating records, answering questions to placing orders, let Marketing Support Network help you take donor care to the next level. Visit marketingsupportnetwork.com for more information.

SPEAKER_02:

So I'm curious, in your professional opinion, when I say things like program evaluation, or your program evaluation team, is that something you see as: we should be evaluating every month, every quarter, every year we're doing an in-depth analysis? What's kind of the schedule or the recommendation for really deep diving into the evaluation process for your actual programs?

SPEAKER_00:

So that's a tricky one, because it really depends on the cycle of your work, and I would also say the cycle of your funding. So let's say that you are primarily grant funded. Then you're probably doing a program evaluation halfway through that grant and a deep dive at the end of that grant. Because you want to prove that that grant worked so that maybe they'll give you another one, right? Or so that you can launch off of that grant to another foundation and say, you know, the Annie E. Casey Foundation let us do this, we're gonna come to you, will you let us do this? So I think that's a cycle to think about there. If you're more philanthropically funded, like you're one of those small-donor organizations that's pulling, you know, $20 bills in all day, every day, and that's your meat, then I think you're really focused on the cycle of your work and how that looks. So for example, I hate to stay on the food bank, but now I've got food bank on the brain. Food banks are a little cyclical. They have big spikes in November and December, because there are special food-oriented events that happen in those months. And they tend to have kind of a drop-off in the middle of the summer, especially in rural areas. I live in Kentucky, so we've got a lot of rural areas. Our food banks do less in the summer because people are growing food in those rural areas in the summer, right? So there are some waves and some cycles. And so I think you kind of have to ride that wave and decide where the right point in your wave is, when it makes sense for you and your workflow to do that. And then I would also say programmatic. Anytime you have a special program, maybe you're launching a new program, or you've made a change to a program and you're rolling it out in a new way, you know, it used to be online and now you're doing it in person or something like that, that needs to be really carefully monitored, because then you can get into continuous improvement cycles where we're tweaking and changing and monitoring all the time. And I'll tell you, funders love to see that, because that really shows that you're taking the quality of delivery and the outcomes seriously. If every three or four months you're saying, okay, we've been sitting in a circle, but now we're gonna sit at tables and we're gonna see if that increases the community discussion, whatever it is, they want to see that you're making those adjustments. That shows that you're really taking it seriously and you're not just locked into that founder's mindset of: this is the way it's supposed to look, and we're not gonna deviate from this.

SPEAKER_02:

And it's so easy to get stuck in that.

SPEAKER_00:

Oh, then they've done that. Yeah, yeah.

SPEAKER_02:

Yeah. So, okay, on that note then, because you have sat in that seat. You have sat in that executive director's seat and know the pressure and struggle. If someone is listening today going, you know, that's a really interesting idea, we've never evaluated anything, or maybe it's just not been a part of their culture or, you know, whatever. I'm curious what advice you have, or where you would tell someone to start from the jump. And obviously, a fantastic answer is hire a consultant like Matthew, bring them in, and let them help you figure this out. But I'm thinking about an organization I was working with that runs classes, like continuing-ed classes, for low-income neighborhoods to get folks job ready. And they weren't doing any sort of evaluation at the end of the class, of what was helpful, or going back and checking in with those folks to say, hey, how's your job search going? Did this class help you? All of that takes time and prep, though, right? For sure. So if somebody is maybe in that overwhelmed founder or executive director seat, going, yeah, I know this is something we need to do, where do we even start?

SPEAKER_00:

Yeah. So if you're starting from zero, and I work with a lot of startup nonprofits that are in this same space, the best thing I think you can do is pick one outcome, one program that you're running, and start to systematically collect the data related to that program. So, for example, for your classes, that data is going to be registrations. We might look at our registration form. Is it robust enough? Are we collecting maybe some optional demographic questions and things like that? By the way, I always say demographic questions need to be optional, because that can really turn people off. So just throwing that one out there. But are you asking those optional questions in your registration form? Are you keeping hold of those registration forms? I mean, do you even know where they go once the class is over? Who's in charge of those? Also, program evaluation at the end: what did you think about this class? Follow-up interviews down the road. That's a great place for a consultant to come in, by the way. Follow-up interviews take a lot of time. And sometimes people will tell me things that they won't tell you, because they're afraid they'll hurt your feelings, but they don't care about hurting my feelings because I'm not there, right? So think about how we can systematically collect it, how we can make it automatic. If data collection is an extra thing you have to do, you're not gonna do it. And then periodically pull that data and see what's going on. Enter into a cycle of continuous improvement where you're responding to that data. So if they don't like the temperature in the room of your class, then we're gonna change the temperature in the room and see if that makes the evaluations get better, whatever it is. We're gonna tweak those things. Document, document, document along the way. Because if you can't prove to your funder that you're doing that, and you just say, well, you know, they didn't really like the temperature in October, so in November we turned the thermostat up, well, okay, but you don't have any proof that you did that. I have to take your word for it. So document all of those things. Also, that documentation helps people like me when we come in to help you, because I have a place to start, and it saves you time and money, because I don't have to look at you and go, you really have to do this for three more months and then call me; here's what I want you to do. I can pick up where you left off for that more academically rigorous program evaluation that you could potentially even send to an academic journal for publication to show that your stuff is working.
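One way to picture the "change the thermostat and see if the evaluations get better" cycle is a simple before/after comparison of survey scores around the date of a tweak, documented so a funder can see it. The CSV, column names, and change date below are hypothetical, and a plain difference in means with a basic significance check is only one of many ways to do this.

```python
# A minimal before/after sketch for a continuous-improvement tweak.
# Assumes a hypothetical CSV of class evaluations with a 1-5 "rating"
# column and the date each class ran.
import pandas as pd
from scipy import stats

surveys = pd.read_csv("class_evaluations.csv", parse_dates=["class_date"])
change_date = pd.Timestamp("2024-11-01")   # hypothetical date the room setup changed

before = surveys.loc[surveys["class_date"] < change_date, "rating"]
after  = surveys.loc[surveys["class_date"] >= change_date, "rating"]

t_stat, p_value = stats.ttest_ind(after, before, equal_var=False)  # Welch's t-test
print(f"Before: {before.mean():.2f} (n={len(before)})  After: {after.mean():.2f} (n={len(after)})")
print(f"p-value: {p_value:.3f}")
# Keeping this comparison alongside a note of what changed and when is the
# kind of documentation described above.
```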

SPEAKER_02:

That's so good. I'm curious, when you go in to work with folks, is there one consistent theme, like, if everyone would just do this one thing? Is there something we could point to as a warning sign, or one where you're like, man, I'm so tired of fixing this same error for everybody?

SPEAKER_00:

I'm gonna give you two.

SPEAKER_02:

Okay.

SPEAKER_00:

The first one is registration forms. I know I've mentioned that several times, but that's because it's always front of mind. A registration form has to be more than first name, last name, email address. So registration forms are a huge missed opportunity. Then, on the back end, counts of participants. And I did this too. When I was a nonprofit founder, we served over a thousand teachers a year. And I would go around and tell anybody and everybody that we served over a thousand teachers a year. And that sounds great, but guess what? So did every other education nonprofit in my state at that time, right? Because there are 60,000 teachers in my state, and they're all going somewhere for their learning. And so when I look at your website, if all it says is we had 20,000 kids, we did 50,000 meals, whatever it is, those are very impressive, important numbers, and you should obviously have them. But if that's the extent of your program evaluation, you look the same as everybody else. You are not competitive, and other people are gonna get that money that you want.

SPEAKER_02:

Yeah. Oh, that's so good. And because realistically, you know, we all have a vision and a mission that we're trying to accomplish.

SPEAKER_00:

Yeah.

SPEAKER_02:

Numbers only tell part of the story. Absolutely. Right. The number is good, but if you're not telling me a story of how that changed a thousand different teachers' lives or how it made education better, then we've missed the boat.

SPEAKER_00:

Yeah.

SPEAKER_02:

I I need the impact as part of it.

SPEAKER_00:

Exactly. And you know, in evaluation we call that mixed methods, where we've got some numbers and some stories, and those are the best, because you never know what kind of brain your funder is gonna have. You might have a funder who is very moved by emotional stories and pictures of little children on playgrounds and things like that. You might have a funder who is like, I have this much money to give, and I want to give it to the people who are gonna get it to the most people, and I want numbers. So you have to have both in every report and every evaluation.

SPEAKER_02:

That's so important. Matthew, this has been a fantastic conversation. And I think really full of a lot of good takeaways. If people are curious to like learn more about you and the work that you do or connect with you, what's the best way to go about doing that?

SPEAKER_00:

Yeah, the best way to find me is at nonprofiteval.com. Um, there you'll find all my contact information, all my socials, as well as the different kinds of evaluation processes that I can assist you with. So nonprofiteval.com. You can also find me on social media. Um, just search for me. I'm basically on all of them.

SPEAKER_02:

And just out of sheer curiosity, what type of evaluations are those?

SPEAKER_00:

So really I specialize in startup and smaller nonprofits, focusing on academically rigorous program evaluations. My goal is to help you evaluate your nonprofit the way that big, giant nonprofits are paying $150,000 to massive research consultants to do. I don't have all that overhead, so I can do it at a reasonable price that really helps those small and startup nonprofits prove that their stuff is working. So it's really all about that academic rigor. That's what I bring to the table that you might not have the skills to do on your own.

SPEAKER_02:

That's so good, and such a needed service. I'm sure a lot of folks will be checking that out. Yeah. As we move to close, there's a question that I've been asking everybody this season, with the understanding that it has been quite a year for a lot of folks in the nonprofit sector. What a season. So if you could give one piece of wisdom or encouragement or advice to those nonprofit leaders, especially those kind of startup to small organizations bootstrapping their way through right now, what would that be?

SPEAKER_00:

You know, I'm gonna go back to my 20s, and I'm gonna say: you're gonna be okay, and your work is gonna be okay. A lot of nonprofits are asking themselves if they can keep the doors open, and a lot of leaders are having to make hard staffing decisions and reductions, and that is hard. But I want to let you know, here I am, 15 years down the road from my grand adventure as a founder, and looking back, going, you know, that was good work. I'm proud of it. I am okay. And the work that I started there, even though we aren't doing it, other people are. And the projects that we had to drop, we very intentionally gave to other people who we thought were gonna outlive us. And so that work is still happening. I'm just not the one doing it. So: you're gonna be okay, your work is gonna be okay. Take a little of that existential pressure off of yourself, and know that your mission is important, other people think it's important, and it's going to be okay in the end.

SPEAKER_02:

That's such a good reminder. Because it's so easy to think that the entire weight of the world is on our shoulders, and if we don't do it, no one will, and then these communities won't get served. So it is gonna be okay. I like that one.

SPEAKER_00:

Yeah.

SPEAKER_02:

So good. Well, Matthew, thank you. This has been a fantastic conversation, and I'm sure very helpful for a lot of folks. So thank you so much for sharing your wisdom with us today.

SPEAKER_00:

Yeah, thanks, Meghan. I'm happy to be here anytime.

SPEAKER_02:

So again, this has been another episode of the Nonprofit Hub Radio Podcast with my guest Matthew Courtney, who's the principal consultant at Courtney Consulting. My name's Meghan Speer, and we'll see you next time.