What does AI mean for the future of nonprofit fundraising?

By Mia Abrahams

Since the debut of ChatGPT in November 2022, the language chatbot has been an evolving topic of conversation in Slack channels and group texts. As the media report on rapid advances in language model technology, experts continue to raise concerns about ChatGPT playing an ever-larger role in our personal and professional lives—citing, for example, the chatbot’s habit of “hallucinating”: confidently presenting fabricated information as fact.

What does this all mean for nonprofit and political fundraisers? How can we harness this emergent tool for good, mitigate harmful impacts, and prepare for the ways in which it could upend our work?

What do we mean when we say AI?

In short, ChatGPT, Microsoft’s Bing, and Google’s Bard are all built on “large language models” (LLMs)—a technology trained on massive amounts of text from across the internet. Developed by the tech company OpenAI, ChatGPT, the most viral of the three, is capable of generating vast amounts of content, carrying on human-like conversations, and performing complex tasks.

These technologies are already being implemented rapidly in organizations, workplaces, and institutions—to various effects. In March, the New York Times reported that the Democratic Party has begun testing the use of artificial intelligence in some fundraising emails (and that some of those appeals had been performing better than those written entirely by human beings), while the nonprofit Khan Academy announced a partnership with ChatGPT to launch new AI-powered virtual tutors.

How could AI shape our work as fundraisers?

While many of the predictions for AI’s impact on fundraising can be pretty “doom and gloom,” we see one potential use case where AI could actually bring us closer to our donors.

Researchers Allison Fine and Beth Kanter remain hopeful that in the future of the industry’s partnership with emerging technologies, every nonprofit will receive “the gift of time freed up for staff to spend more time building strong relationships with donors and supporters.” While being mindful that there is still so much unfolding in the AI space, we are considering ways this technology might impact fundraising, our partners, and our day-to-day work life.

1. Focus on the creative that sets us apart.

As an experiment, I gave ChatGPT a prompt asking for ways it could assist with content strategy for nonprofit fundraising, and the suggestions were wide ranging: generate content tailored to specific donor segments, analyze engagement across channels and optimize accordingly, and create content based on organization mission and themes.
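To make the first of those suggestions concrete, here is a minimal sketch of how an organization might template segment-specific prompts before sending them to a chatbot. The segment names, template wording, and function names are illustrative assumptions, not an actual Blue State workflow.

```python
# Hypothetical sketch: composing donor-segment-specific prompts for a language model.
# Segment labels and wording are invented for illustration.

DONOR_SEGMENTS = {
    "first_time": "a first-time donor who gave once in the last month",
    "lapsed": "a past supporter who has not given in over a year",
    "monthly": "a recurring monthly donor",
}

def build_fundraising_prompt(segment: str, mission: str) -> str:
    """Compose a prompt asking a chatbot to draft an appeal for one donor segment."""
    if segment not in DONOR_SEGMENTS:
        raise ValueError(f"Unknown segment: {segment}")
    return (
        f"Draft a short fundraising email for {DONOR_SEGMENTS[segment]}. "
        f"Our organization's mission: {mission}. "
        "Keep the tone warm, and avoid making factual claims we haven't supplied."
    )

print(build_fundraising_prompt("lapsed", "expanding access to youth legal services"))
```

The point of a template like this is less about automation than consistency: every prompt carries the organization’s mission and guardrails, so the output starts from the same baseline regardless of who runs it.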

As we’ve seen in the last few months, language bots are really good at generating words—lots of words—and may soon be able to generate new content based on past campaign results. But when we at Blue State put it to the test to write a fundraising email in the voice of one of our clients, principal editor William Tomasko found that while AI could write cohesively, it failed to match the spirit of a human voice.

A recent (although not peer-reviewed) study provides early indications of how this might play out in the workplace: using ChatGPT makes everyday tasks quicker and easier to complete. If (or when) technologies like ChatGPT become another tool we use in our everyday work lives, we may be able to free up space to focus our energy on big-picture creative ideation, out-of-the-box ideas, and ensuring that the voice of the organization is captured in our work.

2. Think seriously about our personal relationships with donors.

Building strong relationships with donors is an essential piece of the fundraising puzzle. ChatGPT might be able to help with routine personalized donor communications, for example, providing donors with information or FAQs in a more convenient way than waiting on hold at a call center.

However, right now, ChatGPT is not a reliable source for truth or facts. There are also considerations around donor privacy. Using a public tool like ChatGPT to share donor or client data creates a risk—Samsung employees recently (accidentally) leaked sensitive confidential information using ChatGPT. As generative AI continues to become ubiquitous in our communications, will person-to-person relationships and communications become prioritized by our supporter groups?
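One practical mitigation for the privacy risk above is to strip obvious identifiers before any text leaves your systems. The sketch below masks email addresses and US-style phone numbers with regular expressions; it is an illustrative assumption, not a Blue State policy, and a real donor-privacy program would need far more than pattern matching (names, addresses, and context all leak information too).

```python
import re

# Illustrative sketch: mask obvious identifiers before text is pasted into a
# public tool. Patterns cover email addresses and US-style phone numbers only.

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def redact(text: str) -> str:
    """Replace email addresses and US-style phone numbers with placeholders."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text

note = "Donor Jane is reachable at jane@example.org or 555-867-5309."
print(redact(note))  # identifiers replaced before the note leaves our systems
```

Even a simple pass like this shifts the default from “paste raw donor notes into a chatbot” to “share only what the tool actually needs.”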

With increasing disillusionment in institutions like nonprofits (particularly among young people), the potential for LLMs to unleash a firehose of highly personalized automated marketing is a concern. Fundraisers may benefit over the long term from fostering micro-communities and establishing themselves as trusted sources—disclosing when they use AI-generated content in communications and acknowledging its potential for bias and factual inaccuracy.

3. Consider how ChatGPT reflects bias.

In a groundbreaking research paper that resulted in her being forced out of her role as co-lead of Google’s ethical AI team, co-author Timnit Gebru argues that “size doesn’t guarantee diversity.” Because it is a language model trained on enormous amounts of text from existing sources on the web (roughly 300 billion words), ChatGPT will inherently reflect the white supremacist, misogynistic, and ageist biases, among others, that are overrepresented in those sources.

As fundraisers working with groups led, organized, supported, and made up of diverse groups of people, it is critical that we acknowledge potential bias in technologies being employed by our organizations and in our day-to-day work. One place to begin tackling these biases in emerging technology is regulatory legislation. This week, OpenAI chief executive Sam Altman testified in a Senate hearing, imploring Congress to act quickly on regulation, as lawmakers acknowledged the growing need for accountability and oversight.

What’s clear is that we’re only at the start of the trajectory for AI—and progress is only accelerating. Nonprofits should closely track the evolution of this technology and how they can and should employ it in their work—along with considering the myriad ethical and technological issues it raises.

For our part, we’re interested in the ways it can elevate our work: by streamlining routine communications to bring us closer to our donors and supporters and freeing up our time to think boldly and creatively—all things that make our work, well, a little more human.

Mia Abrahams is an account director at Blue State, where she supports fundraising and strategy for international nonprofit clients. Prior to joining Blue State, Mia led communications and development for Youth Justice Network, a New York-based nonprofit.

The sustainable nonprofit

November 9, 2023