AI Doesn't Know What Empathy Looks Like
Tina-Marie Gulley, former CEO of Ada Developers Academy, on why the people creating technology should reflect the people affected by it.
“AI only knows how we describe empathy, but it doesn’t know what empathy looks like in practice.”
Tina-Marie Gulley was the CEO of Ada Developers Academy, a tuition-free coding program for women and gender-expansive adults. In a moment where everyone is racing to adopt AI and equity programs are being quietly dismantled, she’s doing both at once: embedding AI into the curriculum while refusing to water down the mission.
What’s fascinating is how she thinks about the tension between technology and humanity. She pointed out that the children of tech billionaires often aren’t using AI in school the way the rest of us are pushing it on kids. That disconnect says something worth sitting with.
We also talked about what happens when funding dries up. Her take was sharp: impact reporting shouldn’t stop when the checks stop. If a funder gave you $10,000 ten years ago, they should still be hearing about what that investment made possible. That’s how you build relationships that outlast grant cycles.
Episode Highlights:
[00:01:00] From corporate marketing to nonprofit CEO: how a computer science competition revealed tech’s inclusion problem
[00:02:00] Why technology hasn’t fulfilled its promise of closing the digital divide
[00:04:00] Why nonprofits are slower to innovate and what Ada is doing differently
[00:05:30] Using vibe coding and hackathons to build real solutions for nonprofits
[00:07:00] Cutting through the AI hype cycle: grounding innovation in mission
[00:09:00] The hidden human cost behind AI, from data centers to the communities around them
[00:10:00] Why AI should never replace your therapist
[00:12:00] Working in community with AI and the risk of treating machines like people
[00:17:00] Investing in AI means investing in humanity, and in unplugging
[00:18:00] Why billionaires’ kids aren’t using AI in school the way everyone else’s are
[00:21:00] Reframing DEI in a hostile political climate without compromising on impact
[00:23:00] Navigating the funding shakeup and keeping funder relationships alive after the checks stop
Notable Quotes:
[00:02:20]: “People get really excited about innovative creations and creating the next unicorn, but they don’t really think about all the pieces that drive that. Making sure that they’re in the room at the start versus as an afterthought.” Tina-Marie Gulley
[00:07:55]: “People get so enamored with this new shiny thing that they forget how it can also further exaggerate inequalities.” Tina-Marie Gulley
[00:11:10]: “AI only knows how we describe empathy, but it doesn’t know what empathy looks like in practice.” Tina-Marie Gulley
[00:16:00]: “Are we gaining efficiency and more technology and more information? But what is the cost of that? Are we losing pieces of our deep humanity?” Eric Ressler
[00:17:00]: “A big part of investing in AI is investing in humanity. And that means investing in ways where people are able to unplug.” Tina-Marie Gulley
[00:25:15]: “A lot of times the impact reporting stops once the checks stop, and I don’t think that’s fair.” Tina-Marie Gulley
Resources & Links:
Ada Developers Academy — Tuition-free nonprofit coding academy for women and gender-expansive adults
AI2 Incubator — Seattle-based AI startup incubator; Ada partners with AI2 on the AI House initiative
AI House at Pier 70 — Public-private AI hub in Seattle launched in partnership with the City of Seattle, AI2 Incubator, and Ada Developers Academy
Tina-Marie Gulley on LinkedIn — Connect with Tina-Marie directly
Imagine Cup — Microsoft’s global student technology competition
P.S. — Struggling to align your message with your mission? We help social impact leaders like you build trust-building brands through authentic storytelling, thoughtful design, and digital strategy that works. Let’s talk about your goals »
Eric Ressler [00:00:45]: Tina-Marie, thank you so much for joining me today.
Tina-Marie Gulley [00:00:45]: Thank you so much for having me.
Eric Ressler [00:00:50]: So your career began in corporate marketing and I’m curious to hear how you got from there and maybe what pivotal experiences moved you from that world into being an executive as a nonprofit CEO.
Tina-Marie Gulley [00:01:05]: So actually one of my first jobs, I was doing marketing and we were managing Imagine Cup. And so that’s a big computer science competition. And that was one of the aha moments where I was like, “Hey, this is so cool, but it doesn’t seem like there’s a lot of opportunity for people that are diverse, or I just don’t see them in the room.” And so that was my first indication of, “Oh, tech is not really as inclusive as I thought it could be.” And so from there, I just started building my career, and as a volunteer I did a lot of work around equity and education for folks who are often not part of the conversation. I really am passionate about creating opportunity for people who may be underestimated.
Eric Ressler [00:02:00]: Technology has this promise of bringing us closer together, of closing these digital divides, of being a way to get past some of these issues of inequity and inequality. Why do you think it hasn’t fulfilled that promise?
Tina-Marie Gulley [00:02:15]: I think that people aren’t as intentional as they could be. I think people get really excited about innovative creations and creating the next unicorn, but they don’t really think about all the pieces that drive that. So not only your customer base, but also the people who are creating those innovations. So making sure that they’re in the room at the start versus as an afterthought. I think it’s really about creating opportunity for folks that maybe wouldn’t necessarily have that opportunity and really thinking about, okay, what kind of future do we want to build? Do we want to just continue building these systems that really do us all harm or do we really want it to be intentional from the start and really see it evolve from there? I think a big part of that is just really making sure that we as individuals, as we’re creating rooms, as we’re having a seat at the table, that we’re looking at who’s missing and who could really make a difference in evolving this idea, this innovation, this thing that we’re trying to move forward to adapt and problem solve for.
Eric Ressler [00:03:35]: You used the word innovation and coming from the tech world, that’s a term that’s used a lot to describe experimentation. Use the word unicorn, these new bold ideas, these white space ideas. How have you experienced the transition from the tech world into the nonprofit space? Another space that talks a lot about innovation, but I don’t think is really set up properly always to facilitate that kind of research, that experimentation, a willingness to be bold, a willingness to experiment and to fail. Has that been a bit of a culture shock for you or have you been able to bring that spirit of innovation through your career path?
Tina-Marie Gulley [00:04:10]: I think that Ada is one of those anomalies where we’re creating that intentionally in what we’re doing every day. I do see that oftentimes nonprofits are slower to innovate, slower to invest in things like AI, slower to improve their systems. And fortunately, that’s not our situation at Ada. We really think about how we can continue to be cutting edge and take some of those elements from businesses and bring them into our organization as a nonprofit and as an education technology provider or institution. And so I don’t see us having that issue, but as I talk to other nonprofits, I definitely see that there is a lag. What’s really exciting, though, is that we’re seeing a lot of forces joining together to bring innovation to organizations. We actually did that most recently. We had a hackathon where we were really trying to solve real-world issues for other nonprofits and community organizations.
And we did that through what’s now called vibe coding, which is a lot of prompting and makes building more accessible for people. And so we were really building solutions that we know nonprofits can use, take, and iterate on. And so I’m seeing more of that. I’m also seeing more opportunities for partnerships, solution innovation partnerships with different organizations. I know Salesforce is doing it. I know JPMorgan Chase and a host of other organizations are doing it, where they are re-imagining: hey, if you were to bring AI into your organization, what would it solve for? Would it help improve curriculum? What are the things that you can create? So I think there are lots of avenues where these forces are joining together. We don’t have to be so distant, and we can also learn from each other. A big portion of working in nonprofits, I think, is around the impact on people in a way that businesses don’t necessarily think about.
Eric Ressler [00:06:40]: So AI is this new technology, this transformational opportunity, and it’s very controversial change, a lot of fear, a lot of uncertainty around it. And I think a lot of that is grounded in good, healthy skepticism and criticism. And at the same time, a lot of potential. And everyone’s trying to figure out how AI fits into their work, into their life. How are you thinking about how AI fits into your work, into your mission, into the values that drive your work? And what are the things that you’re excited about when it comes to AI and what are the things that you’re cautious about when it comes to AI?
Tina-Marie Gulley [00:07:20]: I think what can be challenging, especially as somebody who works in tech and who’s worked in tech for years is there is a huge hype cycle echo chamber that sometimes happens with these new innovations and things that come out. And so for us, it’s really being grounded in the work that we’re doing, really looking at it as AI is amazing, but it does not replace humans. It does not replace the connectivity. It’s really meant to be your copilot. And so oftentimes people get so enamored with this new shiny thing that they forget how it can also further exaggerate inequalities. And so for us, it’s really grounding ourselves in our mission and our purpose and how can AI help us to solve it. So whether it be internally from an operational standpoint, whether it be embedding it into our curriculum like we did about a year ago, whether it be through partnerships that we have with other orgs as well as the government to see how we can make it more accessible, I think those are the things that I’m really locked in and focused on. As we are creating these pathways for people to have AI, to use AI, to also think about the ethics around AI, also thinking about what goes into the background of AI.
There are people behind all of this, from the large language models, from tagging the data, from the people building the data centers, to the water running through those data centers, to the communities impacted by those data centers outside their door. And so we really try to be intentional about the way that we’re using AI, making sure that when we use it, we’re being constructive with it. We’re not using it just because we can; we use it because it’s helpful, and we also understand the ramifications of all of those things.
Eric Ressler [00:09:35]: Where do you think AI is most helpful when it comes to social impact work and even just in general as it comes to creating social change, doing your work, being a human being even, and where do you think it is not a good fit and where there’s some potential either downsides or even just places where AI is not constructively used or shouldn’t be considered as the first tool in the toolbox?
Tina-Marie Gulley [00:10:00]: Well, therapy is one. I know that’s a big one; I’m starting to see trends where people are using AI as their therapist, and that’s one of those areas where I say, “No, let’s not do that. Let’s go to our therapist.” It’s really meant to be a copilot. A lot of times, depending on the large language model that you’re using, AI is meant to please you. It doesn’t want to make you angry. It’s going to tell you what you want to hear. And so we have to take that into consideration. It might give us a perspective based off of the prompts that we put in or what it knows about us. And so we really have to take that into consideration as we’re using it. I really love the idea of co-authoring between humans and the machines, to really build a better perspective, to understand all of the nuances.
There are things that AI is just not going to know because it doesn’t have the lived experience of humans. It only knows how we describe empathy, but it doesn’t know what empathy looks like in practice. And so it’s really important for us to make sure that as we are guiding what the future of AI looks like, that we are not displacing people. It’s really meant to be used as an opportunity for you to focus on the areas that you really enjoy doing and all of those minute tasks that maybe aren’t so fun, it can help you solve for that. For me, I think AI is less about productivity and more about figuring out what are those things that people enjoy doing, what are those core competencies that AI can help you with, and what are the things that we need to continue to let people do?
I don’t want us to get into a situation where AI is replacing people. I don’t think that is great. People still have to have a livelihood, people have to pay bills. And I really want there to be a world where we can work in community with AI.
Eric Ressler [00:12:20]: It’s an interesting way of putting it. I don’t think I’ve ever heard anyone quite say it that way, working in community with AI. And it is an interesting way to think about it. And it starts to get almost very sci-fi very quickly around this idea of humanizing this technology in a way that I really don’t think any other technology has been humanized so quickly. At the time of this recording, ChatGPT-5 just rolled out, and there’s a whole subculture of the internet freaking out because they miss their old model, not because of the tool element of it, but almost as if they’d lost a dear friend who got replaced unexpectedly. This goes back to your point about maybe we shouldn’t be using AI for therapy or as a replacement for therapists, and the constructive and skillful use of AI versus a maladaptive use of AI. Does that cultural relationship worry you, even when you’re using terms like working in community with AI?
Tina-Marie Gulley [00:13:30]: It worries me all the time. I mean, we’re seeing a proliferation of people having relationships with their AI. They consider their AI their best friend or their romantic partner, and that definitely has me very concerned. I think that there is a way to invest and engage and make AI a part of your community in terms of really improving systems, improving experiences, creating more opportunities, better understanding outcomes. But we really have to think of it as a partnership ecosystem versus this is an actual community member. And we can definitely build together and create the next era of things, but it definitely should be led by human beings and not by machines.
Eric Ressler [00:14:30]: I have a bit of a complicated relationship with technology, and I’ll try and explain that briefly here. So I grew up with basically the popular adoption of the internet and have been pretty connected to technology my entire life, personally, professionally. I’m an artist, I’m a designer, I’m a creative, and the convergence of creativity and technology has essentially been my life’s work. And at the same time, and maybe part of this is because I’ve recently become a parent, I have two young kids. I do worry about how technology has so deeply become interwoven into humanity. And I think the internet was a major milestone in this. The smartphone was maybe even a more major milestone because now we are connected everywhere we go. We’re looking at VR and AR and now AI. And it seems like technology is getting closer and closer and more interwoven into our humanity.
I walk around with a smartwatch on and I track my daily steps and I know what my resting heart rate is. And sometimes it can be constructive because it might remind me, “Hey, dude, you got to get up and move a little bit. You can’t just be sitting at your desk all day.” And at the same time, I do worry sometimes, and when I think about my daughters growing up in this world, I worry sometimes about, are we gaining efficiency and are we gaining more and more technology and more and more information? But what is the cost of that? Are we losing pieces of our deep humanity in a way that is leading to some of the cultural and societal issues and the polarization and the loneliness epidemic and maladaptive behavior trends that we’re starting to see show up in data and in science? As someone who’s also very tied deeply to technology and social impact, I’m just curious how you think about that and how you think we might be able to more skillfully coexist with these technologies because it’s not going away. The genie’s out of the bottle at some level. So I think the question is, how do we best... Is it regulation? Is it social and cultural norms? Is it people speaking up and reconnecting with their humanity? Where do you see this going?
Tina-Marie Gulley [00:17:00]: I think a big part of investing in AI is investing in humanity. And that means investing in ways where people are able to unplug, where they’re able to be in areas where we do things without technology, without AI, to have that balance. I think we still have to have opportunities to spark amazing conversations that lead to action that allow us to really be able to use a lot of those soft skills that we’ve already developed as human beings. And so we definitely need to have... I don’t necessarily know if it’s regulation, but it could just be a framework of how to do it properly. So that means not only at work, but in your personal life, at schools. One thing that I find really interesting, I’ve read a couple of articles that were speaking to billionaires and the schools that their children go to, and a lot of times they are not using AI or technology in the ways that other schools are.
And it’s very interesting because these are children of people who are running technology companies. And so it’s really advantageous for us to learn from that, where there has to be balance, where part of what this AI and technology as a whole is supposed to do is democratize some things, but also realize that we have to be able to spark conversations and innovations in natural settings without technology as well. We have to use both sides of our brains to ensure that we’re able to do that. And we’re not living in a world where we are so reliant on technology that the next generations don’t know how to think critically, or they don’t know how to analyze, they just don’t know how to do basic things. And so we do have to have the balance. I don’t know if that’s policy. I don’t know if that’s some type of mandate, but I definitely know that that has to be part of this conversation.
Eric Ressler [00:19:30]: Hey friends, real quick before we continue today’s episode, I’m Eric Ressler, founder and creative director at Cosmic. Cosmic is a creative agency purpose built for nonprofits and mission-driven organizations. For the last 15 years, we’ve helped leaders like you nail your impact story and sharpen your strategy, but we’re not here to just leave you with a fancy slide deck and a pat on the back. We roll up our sleeves and help you bring your ideas to life through campaigns, creative, and digital experiences. Our work together helps you earn trust, connect deeply with your supporters, and grow your fundraising and your impact. If you value the thinking we share here and want it applied to your biggest challenges, let’s talk at designbycosmic.com. All right, back to today’s conversation.
I want to shift and talk a little bit about diversity, equity, inclusion, given that those are major drivers in your personal life, your lived experience, also the values that shape how you think about your organization. We’re unfortunately in a moment where a lot of those values, a lot of those efforts are being scrutinized heavily, at least in America. We’re in a moment where a lot of funding that was flowing pretty freely towards some of those initiatives in the corporate world, especially in the social impact world, that seems to be pulling way back, even just because organizations don’t want to attract any negative attention or extra scrutiny. What’s the experience been like for you this year navigating all of that as that’s been unraveling in real time? Or maybe you have a different experience and that’s not how you see it happening.
Tina-Marie Gulley [00:21:10]: I think more of it is I want to focus on reframing it. At the end of the day, we’re trying to create opportunity for everybody, but we really have to focus on what the challenges to those opportunities are. We have to call it what it is. I don’t care if you call it DEI or something else. I really want to scale responsibly. I don’t want to compromise the impact. I want to make sure that impact is there. I want to make sure that we are able to create those cultural safeguards that these communities need. And I want to be able to feel good about what I’m doing in terms of creating opportunities for folks who are untapped or considered non-traditional, and being able to partner with organizations that don’t want to retain these archaic systems that cause everybody to suffer.
For us, scaling really means quality over quantity. That means our students, once they graduate, they’re giving back to our program, not only in mentorship hours or volunteer hours, but also in dollars. We are still addressing structural barriers that are happening, but we’re reframing what that looks like and changing what equity and action also looks like.
Eric Ressler [00:22:45]: I want to pick back up on a thread that we started early in this conversation around innovation, and especially as it relates to fundraising and philanthropy. So you talk about power dynamics with regards to philanthropy and finding the right partners, the reciprocal partners who will fund this work. What’s your experience been like, especially recently with a major shakeup in the funding landscape and the incentive structures that are in place for some of the major donors and most importantly, some of the institutional donors. What’s that experience been like for you this year?
Tina-Marie Gulley [00:23:15]: I think more than anything, there’s a lot of uncertainty on both sides, and that’s definitely relatable. And I think more than anything, we want to make sure that as things get aligned or adjusted as different priorities happen, that we also think about other organizations. There are other organizations that are literally ensuring people are alive and they’re being prioritized and they should be. If there’s a nonprofit that is ensuring people have cancer care, people have a place to live, and to eat, they should be a priority. And I don’t fault any organization that wants to realign to those things. Those are very important. Those go back to the humanity of society, but also knowing that we all play a part. And so really making sure that if it is something where an organization is perhaps shifting in direction, to know how we can still be a part of maybe future conversations, that this relationship hasn’t ended.
And so for us, that’s what’s really important. We really want to make sure that whoever we partner with, they understand our mission, that we’re mission aligned to the focus of the organization, and that they know we’re here, we’re not going anywhere. And even if you’re readjusting the way that you’re distributing grants and funding for the next two or three years, we’re hoping that our mission was so strong, and the relationships that you created with us were so strong, that you still want to figure out how you can stay involved with what we’re doing. And so it’s really about building those relationships and also showing the level of impact. I think a lot of times the impact reporting stops once the checks stop, and I don’t think that’s fair. I feel like if somebody’s given to you, they should always have those updates of what the organization’s doing, what are some of the things that you’ve rolled out, because that impact has a ripple effect.
And it’s important for them to see this $10,000 grant that you gave us 10 years ago has allowed us to accomplish all of these things. So I think it’s really important to have those mechanisms where you’re still reporting back to funders way after maybe their relationship has formally ended.
Eric Ressler [00:26:00]: Before we wrap up, I want to ask you something a little bit more personal about how you show up in this work. And no one comes into this work and dedicates their life to doing social impact work because it’s easy. Being up at the top as a leader, not to get too hierarchical with it, but it could be lonely at times, it could be hard at times. What do you do when it gets hard for you? What keeps you going? What keeps you energized doing this work, especially when it gets hard? How do you balance all that? How do you prevent yourself from burning out, from losing hope and motivation for the future as it relates to technology and just in general, how you show up every day in doing this work in community?
Tina-Marie Gulley [00:26:45]: I think I have to ground myself in who I am and what I can control. And that’s a big thing that I do with my team as well. I think it’s also important to always just have a mindset of continuous learning. You might be an expert in all of these areas, but there’s so many more areas that you have to learn from or learn about. And I think the other thing is being able to unplug, rest, rejuvenate. I think that naturally, a lot of us didn’t even know what that looked like, like work-life balance, what is that? I don’t know her. And so being able to prioritize rest. I’m always a big proponent: when your body tells you to rest, you do it or it’s going to make you rest. And I think I’ve learned that so many times. If you are not filling your own cup, you can’t fill other people’s cups.
Eric Ressler [00:27:45]: That’s beautiful. Thank you. What are you personally most excited about right now as it relates to the work that you’re doing or even just in your own life? What are you looking forward to?
Tina-Marie Gulley [00:27:55]: My goodness. I think that I’m most looking forward to creating new partnerships. I think that we’ve started this year with some amazing partnerships with the city, with the state, with other nonprofits, with AI2 Incubator. And I’m really excited about where that is going. I think that a lot of times we’re just so focused on ourselves or our organizations that we don’t think about the totality of what we can do when we come together. And so that’s what I’m looking forward to: more community and private partnerships to ensure that people are able to be a part of society, able to be part of the workforce, able to contribute in lots of different ways, and that we find the value in the different ways that folks in our communities can contribute. So that’s what makes me really excited right now.
Eric Ressler [00:29:00]: Awesome. For listeners who want to learn more about your work, to connect with you, where can they go? Where do you want to plug?
Tina-Marie Gulley [00:29:10]: Yeah, please visit us. Our website is adadevelopersacademy.org, and I’m on LinkedIn. I am Tina-Marie Gulley, G-U-L-L-E-Y. That’s a great way to plug in with me. And definitely at an event. I try to go to a lot of tech, social impact events, as well as my team. We’re a small but mighty and approachable team, and we really believe in the best in people.
Eric Ressler [00:29:45]: Tina-Marie, this has been awesome. Thank you so much.
Tina-Marie Gulley [00:29:50]: Thank you.
Eric Ressler [00:29:55]: If you enjoyed today’s video, please be sure to hit like and subscribe or even leave us a comment. It really helps. Thank you. And thank you for all that you do for your cause and for being part of the movement to move humanity and the planet forward.