April 30, 2025

How can you approach AI with an equity lens?

Are you curious about using AI without compromising your values? In this episode, Farra Trompeter, co-director, chats with Meena Das, founder of Namaste Data, to unpack findings from the AI Equity Project and offer practical advice for nonprofits navigating this evolving technology. 

Transcript

Farra Trompeter: Welcome to the Smart Communications Podcast. This is Farra Trompeter, co-director and worker-owner at Big Duck. In today’s conversation, we’re going to ask the question: How can you approach AI with an equity lens? And I’m delighted to be joined by a new favorite, Meena Das. Meena uses she/her pronouns and is the CEO of Namaste Data, a data and AI equity consulting agency helping nonprofits design ethical, human-centered AI and data practices. With 17 years of experience in the tech and nonprofit space, she has authored principles like Community-Centric Data and co-founded the AI Equity Project. She’s passionate about making complex tech accessible for social impact. Meena, welcome to the show.

Meena: Thank you so much for having me here, Farra. And I’m going to ignore the word new favorite. I’m gonna focus on the word favorite.

Farra Trompeter: That’s right. Just favorite, eternal favorite. New to my life, but clearly a favorite.

Meena: Love it. Okay. I’m going to take it, thank you so much for having me here and having this conversation with me.

Farra Trompeter: Well, I want to start with the AI Equity Project, which you co-founded with Michelle Flores Vryn, who’s been on the podcast twice, discussing audience prioritization and radical honesty with Marissa DeSalles. We’ll link to both of those conversations if people are interested. But it is great to have a connection there, and I know we have lots of other friends in common and folks who do work in the nonprofit space. So again, welcome, excited to know you. But let’s start with the big question, which is: What is AI equity, and what is the AI Equity Project? What led you and Michelle to actually conduct the research that is part of the project as well?

Meena: Great question to start with, and so glad it’s not the “Introduce yourself, Meena” question, I never know what to say on that one. The AI Equity Project, well, Michelle and I had been in conversations for almost a year before we started this project in 2024. We had been in conversations asking each other, where are we as a sector when we speak of the word AI? Most of the conversations, when they happen around AI, we see a bunch of reactions, which is overwhelm. “I don’t know how to use it. Where do we start?” The basic fundamental questions. And then the responses pretty much always came from the big tech companies saying, “Here are the four tech things you need to know,” and “Here are the cloud products you need to know,” and “Here are the subscriptions you need to get.” But there were a variety of conversations that were not happening in that spectrum, like, who should be held accountable when we are working with AI? Is it our legislature? Is it our board members? Is it us individually? Are we even having the right data practices and data values when we work with artificial intelligence? I mean, the conversation about AI has to be more than just the tech conversations or the tech stack or login credentials. And we couldn’t see any space like that. So, we wanted to figure out how to capture some data on that, how to capture some stories on that, how to talk to the sector in a way where we know where we are on this conversation holistically. So we established this three-year-for-now project. I mean, I started it, and then I said, “Michelle, you have to be part of it.” And then Michelle is amazing, and she said, “Yes, I will be.” And I never gave her enough choice to say no, and I’m so grateful for her.

Meena: But she and I designed a bunch of questions. It was pretty straightforward from there. We designed a bunch of questions, reached out to nonprofits, the sector responded very kindly to our request, and they came back with 700-plus nonprofit responses on what and where they are and what we need as a sector. Not all of it was surprising, and I know you and I are going to talk about it, but it did give us some very interesting, obvious, and messy outcomes that we found in the project. And we are going to repeat the same study this year and the next to be able to compare where we are heading. What is the story here? This is a technology that is going to stay with us for a very long time, and we need an approach which is not fear-based, which comes from our joy, which comes from our strengths, and is not constantly used as a weapon to threaten our existence. So that is the AI Equity Project.

Farra Trompeter: Yeah, that’s great. Well, actually, you know, I had folded another question in there that I realized I probably shouldn’t have combined. So I’m gonna pull it out, which is: just for folks who may be having a hard time understanding AI and equity together, what do we mean by AI equity? And then we’ll come back to the project and the report.

Meena: AI and equity. Well, first of all, I want to acknowledge that we do not use these two words together in the world as much as we should. We use the word AI generally, loosely, as much as we can in this time. And then we sometimes happen to use the words responsible AI, technical AI, almost as if that is another flavor of AI that you can purchase or buy. We have a tendency to not put the importance of these words together. So when I say AI equity, I really mean, first of all, using AI, purchasing it, selling it, doing any kind of work around AI with the lens of equity, inclusion, and justice. AI equity doesn’t pertain just to the data scientists who are designing algorithms. Nor does this word belong only to the CEOs and C-level executives who are responsible for signing off on purchases of software and products like these. So when I say AI equity, I am talking to almost everybody who is interested in using, designing, producing, selling, or purchasing something to do with artificial intelligence. And I am trying to tell them that any action we take around, in, or with AI has to happen with a lens of equity. It has to happen with a lens of inclusion. That is what I mean when I speak the words AI equity.

Farra Trompeter: Thank you. Well, like I said, we’re going to come back to the report and we’ll link to the report in the transcript of this conversation at bigduck.com/insights. I really hope folks download it and read it. But for now, since folks hopefully are just here in this conversation, I would love for you to share with everyone the three big summary findings from the report, so they can just wrap their heads a little bit more around what you found.

Meena: Absolutely. So, my first finding in the report was that we do not have enough funding for organizations to experiment in these conversations. At least that was the case in 2024, when we were looking at the study. And the cycle of the study, just for context, was May 2024 to October 2024. So that’s when we wrapped it up. We didn’t have enough funding in the sector for nonprofits to experiment and co-learn, and produce something together on data equity and AI. Data equity, for starters, was a word not all of us were familiar with, and I’m coming to that in one of the findings. So that was the first one: lack of funding. No surprises.

Meena: Number two was this data equity piece: we might not have common language around the word data equity, around the word AI equity, like even asking the question that you asked, “What is data equity? What is AI equity?” The reality is we are doing some good things with data at our own tables, individually, in our small teams, in our day-to-day. We are just not sharing that with each other enough to build that common vocabulary, that common document. And so that was the second finding of the research: we need common language. We don’t have that yet.

Meena: The third one was, we don’t look for partnerships beyond the tech companies. For any kind of partnership when it comes to AI, our natural tendency, and when I say our, I mean the sector, is to look to the tech companies, to tell them, “Help us. Guide us. What products do we need?” But there is a step of dreaming, of visioning, when it comes to this technology, this incredibly powerful technology, that doesn’t happen. We miss that step. The moment we think that our organization needs AI, we go straight to the tech companies. And that’s the third thing we need. Our recommendation is: we need collaborations between two nonprofits, or a nonprofit and a coalition, or all forms of different collaborations that go even beyond the tech companies. Let’s not restrict ourselves to the tech products.

Meena: And the fourth bonus learning that Michelle and I had through this project, and it’s true even now, as we are in the second year of the project and the fourth year of having ChatGPT in the world: a lot of the work that we are doing with AI in the sector is still individual task-based. Someone is writing a letter, someone is drafting an outreach plan, a plan for an event. Pretty much that’s it when we think of gen AI. Where I want us to get to is a place of moving from task-based AI to mission-based AI. And for that to happen, we need to have more conversations. We need to be talking to each other more, so we get to know, okay, it’s not enough just to write one outreach letter using gen AI. Now let’s talk to each other. What is the purpose of this letter, and how is it actually leading up to donors supporting us for the X campaign or something? Tying it––this conversation––to evaluation, to the why, is just as important. So that is the learning of the project: we are still in the task-based AI, not yet in the mission-centered outlook.

Farra Trompeter: Great. I appreciate that. And thank you for giving us the bonus.

Sponsored by RoundTable Technology

This episode of the Smart Communications Podcast is brought to you by RoundTable Technology – the nonprofit IT partner. Is your nonprofit cybersecurity forecast looking a little stormy? No worries – RoundTable Technology is here with the ultimate umbrella!

Their free three-part webinar series Weathering the Storm: Protecting Your Nonprofit and Yourself in Uncertain Times has you covered.

Their privacy experts’ real, practical steps will protect your team, your data, and your mission in the current US political climate. Think data governance, risk ownership, staff protection, and more! To explore this content, go to roundtabletechnology.com, look for the “Free stuff” section, and view the webinar library. Here’s to staying dry, secure, and prepared no matter what the clouds bring!

Farra Trompeter: You know, coming back to data equity, right? We started talking about AI equity, now we’re getting into data equity. What are some of the data equity practices you uncovered that nonprofits actually need to do more of and engage with?

Meena: I think the biggest one that kind of became my work was how we do data collection. Because that’s one of the primary areas where my work is most focused, and it comes up often. I’m gonna give some misconceptions, some things that we can be doing differently. One is: we need more data. That is not always true; we do have data. We need to understand why we are collecting and what we are collecting. Number two is that the data we collect, we collect from multiple different places, but we don’t build in transparency and accessibility around that data. So, the second misconception around data equity is about how we collect the data and why we collect the data.

Meena: And the third misconception around data equity is: we think data equity is a foreign concept, almost, and that to use, and build, and do something with data equity, we need to learn a lot. I actually got some notes when we started with the AI Equity Project asking, “Am I even eligible, Meena, to take this survey, because I don’t know what data equity is? Should I just remove myself from this ask? I want to support this project, but I don’t know if my story is good enough for this survey or not.” And that’s where I want to change the narrative. The reality is we are already doing things with data. Data equity means how we are handling the data, how we are collecting it, how we are storing it, how we are making decisions with data, with this equity and inclusion lens. It isn’t just about identity data. Data equity doesn’t mean just ethnic data, or racial identity data, or gender data. Data equity means any data, regardless of what it is about: how are we operating around it? Are there values with which we are operating around it? And that is the missing piece, Farra, when you’re asking what is missing in this conversation: that level of understanding that yes, you have a space in this conversation. I want to, you know, bring us back to that. And then the last one being that we don’t always need more data. We have enough data. Can we step away from that idea that to do anything with data equity and AI, we just need more and more? I want to take that out of our vocabulary.

Farra Trompeter: Yeah, definitely. Well, when we were preparing for this conversation, we were both reminiscing about how much technology has changed in our lifetimes. And you shared that regardless of whatever new form of tech is emerging, whether it’s AI or something else, our approach should be the same. And I would love you to talk a little more about that. What should our approach be to AI and emerging tech? You know, whether it was when we first started using smartphones, or, I remember, I got my first email address when I was in college, you know, obviously that was brand new. I actually recently had to find my VCR in my closet because I wanted to see something. I still have it, thank God. But you know, I remember when all I did was watch movies on VCRs, and now, like, who needs those VCR tapes? So let’s talk about that. Again, tech is constantly changing. How it shows up in our life is changing, but how we think about it often isn’t, when these new things come. And as you said, like the story about the person who got the invitation to your survey, who felt that imposter syndrome, that “I don’t know enough about this, I’m not the right person.” Let’s talk about that. What do you think our approach should be to AI and emerging tech?

Meena: I want to first normalize the reaction that we are having right now in this moment, which is fear, rejecting these ideas, feeling the imposter syndrome. I mean, I’m going to say as a tech person, these are exactly the same things I saw when floppy disks first came, and then CDs came, and then, you know, the USB drives came. Suddenly, we went from not being anywhere close to cloud storage to no longer having to use floppy disks at all, and how can you not lose your projects? There were these feelings and ideas and attitudes and approaches when these new things came. I am seeing nothing different when we are talking about AI. I have been having these conversations on AI for at least the last four years. I’ve been working in this industry for many years, but doing this so publicly has been my life for the last three years, I would say. And every single one of those conversations has had this kind of question: “I don’t want to use it. What would you suggest? Do I have to? Will I lose my job? Will this AI take over who I am?” These are some of the most basic questions. It’s almost like it is threatening us in an all-new way, but the threatening part is not necessarily new. We do know what fear is. We do know how fear works, and here we are looking at this technology, approaching it the same way.

Meena: But I do want to offer, number one, that this technology is incredibly powerful. This technology is going to stay with us for a very, very long time. We are no longer talking about floppy disks that we can insert, store something on, and pull out. We are talking about a technology that can learn on its own what mistakes it made, and it can teach other AI products, through itself, how those products need to be updated. So we are speaking of a pretty magic-like, incredible technology here. And yes, the approach that we are taking right now is fear-based, understandably so. But I want us all to feel encouraged and welcome to approach it differently. Look at it not in a way that says yes, it’s gonna solve everything, but from a point of view of curiosity. Yes, a knife is equally dangerous, right? We use it to cut our vegetables for salads, and it can also hurt and harm people. It doesn’t mean we are no longer going to use knives. Same goes for fire. Fire exists, and the way it is created, it can cause harm and hurt as well as keep us warm.

Meena: So there are things in the world that exist which have brought out the same feelings. The question truly is about our intentions. The question truly is about our values. And the question truly is, what are we trying to protect? When people ask me, “Is my job going to be replaceable?” I tell them, “Probably, yes.” Right now, AI can do a lot of those same things that I do in my business. Why it doesn’t threaten me, the way I see it, is this: what am I trying to protect? Am I trying to protect Meena’s job description, that only Meena can do it? Or am I trying to protect who Meena is? The caring, the compassion, the kindness that are important to me? How can I make sure that the things the AI produces have those same things, the kindness, the compassion, the humanity that I value?

Meena: So as you’re moving out of your fear, don’t think so much about what parts of you AI can do. AI can replace what you do. It can’t replace who you are. Distinguish between who you are and what you do. The better we know ourselves, the better we know our values, the better we understand where we are coming from, the more passionately we can protect the who part of us in this AI conversation. And that’s how I feel the approach can be different when we are talking about this technology.

Farra Trompeter: Yeah, I love the way you see it all. So, thank you for bringing that together. One of the things I appreciated, and there’s a lot I appreciated about the report from the AI Equity Project, was the opening letters from you and Michelle. In those, you articulate your hope for the future of AI and nonprofits. Many of these themes may come up again, and I’m just curious today, here we are, we’re recording this in March 2025: what is your hope right now, in this moment, for the future of AI and nonprofit organizations or the nonprofit sector?

Meena: It’s a lovely question, and probably I should start thinking about it, because a new report should be coming out in two months, so I am going to write on this. You know, this is the beautiful question that I always carry in my mind. What do I want out of this sector for this technology? Not to build a business out of it, but to understand where can I make the most impact? Where can my presence be the most help? And to answer your question, I want to say two things. One, 50 years from now, a question I want to always be there is: whatever we are doing with AI, is this work inclusive enough or not? There is that inclusion, that lens of equity, that is never complete. It’s an ongoing journey. We can never get to a point where we say, “Okay, we’ve done enough and we need to stop thinking about inclusion” or “That’s a check mark, done. We no longer need to talk about that. That’s a metric we have achieved and done and dusted.” We can’t be that community. We can’t be that group of humans or beings on this planet. So 50 years from now, my hope is we still ask that question every day: whatever we are doing with AI, is it inclusive enough, or not? Is it justice-oriented, equity-led, or not?

Meena: My second hope, kind of tangent to that piece, is that we move away, that we set up enough of a basic foundation so that we don’t circle back to the same things over and over again. So, one of the conversations I don’t want to circle back to is that a lot of our nonprofits are not yet ready to even have the right database systems, to know how to store the data points, before they even talk about AI and Einstein Analytics and other really nice big products out there. We need good database systems. Fifteen years from now, I don’t want someone to pick up this kind of research again and find out that we don’t have the right data systems. So, my second hope is we have the right foundations. We don’t circle back to the same things over and over again. What we do circle back to is the same question: is it inclusive enough or not? Are the right voices included or not? Is the power democratized enough or not? Is what we have done with this product of AI sharing equity and power, or not? Those kinds of questions, but not whether we have the right database systems, or where we are storing data, or whether we need more data, or whether we are still on Excel spreadsheets. So my hope is we have the right foundations. My hope is we are asking over and over again the important questions that we cannot miss. And my hope is that we do it collectively, together.

Farra Trompeter: Love it. Well, if you would like to engage more with topics related to data equity and AI, be sure to visit namastedata.org. You can also connect with Meena on LinkedIn, and I have to say, I really enjoy your posts on LinkedIn. You always reference good recommendations and tips, and thought-provoking questions. So, highly recommend you follow and connect with Meena on LinkedIn. So Meena, before we sign off, any other advice or thoughts you’d like to share? You just dropped a lot of knowledge, gave us a lot of things to think about, but anything else you want to say on this topic?

Meena: Probably just approach it with a lot of love, a lot of joy. I know it sounds so fundamental, but that’s the thing. We just need to go back to fundamentals. There is a lot of comfort if we go back to the basics. So if you’re approaching this technology, if you’re new in it or you have been dealing with this technology for a while now, make sure that you are bringing a lot of love, a lot of joy, and you are not approaching it from a place of competition or threat or fear, but just how can you do it in a way which allows you to be the best version of yourself for the communities around us.

Farra Trompeter: Amazing. Well, love, joy, and curiosity. We can all take that for our entire worldview. So, thank you so much Meena, and really appreciate you being here today.

Meena: Thank you so much for having me here, Farra.

Farra Trompeter: Alright everyone, be sure to go download that report and keep those minds open.

This podcast has been sponsored by RoundTable Technology.