As an instructor of nonprofit fundraising — and a professional fundraiser myself — I teach students to build relationships with donors. We practice skills I call “icebreaking, inquiring, and inspiring.” This type of practical experience is essential to the fundraising craft. Thoughtful fundraisers don’t bore the donor to sleep, have meandering conversations, or try to strong-arm people out of their money.
Good written communication is critical to performing these skills effectively. So why are so many of my fellow fundraisers entrusting their proposals and solicitations to a machine? The past year has seen hundreds of articles, webinars, and blog posts designed to help fundraisers make the most of “large language models” such as ChatGPT.
Having vacuumed up much of the internet, ChatGPT makes it possible for almost anyone to produce plausible prose. Professional writers, however, are far less enamored of its skills. They consider ChatGPT a poor writer unless it’s fed large quantities of good writing. It repeats itself. It makes things up. It takes up space like your extroverted manspreading friend, spouting passively constructed, pompous noun clauses. It loves jargon. And when asked to critique its own writing, it will, unironically, note the deficiencies I just mentioned.
But the “artificial” half of “artificial intelligence” concerns me at least as much as the “intelligence” part. Much of what fundraisers do is deeply human and can’t be replicated by a chatbot. A fundraiser’s job is to pursue excellence in developing relationships, and people can only get excellent at something with consistent practice. Employing A.I.-written prose risks sacrificing human connection at the altar of efficiency.
However passionate my defense of human writing may be, it typically falls flat with my A.I.-forward colleagues. There’s a reason generative A.I. has become one of fundraising’s most hyped new tools: For fundraisers drowning in grant proposals and case statements, writing is not a craft but a nightmare. This might explain why fundraising ranks in the top 10 percent of occupations likely to be affected by the advent of generative A.I.
ChatGPT can also help level the playing field, allowing small organizations to get the job done without bearing the cost of copywriters. Given that the largest 100 charities bring in an eighth of all philanthropic dollars in the United States, it makes sense for smaller nonprofits to seek out less expensive methods to attract donors.
The truth is, when done right, ChatGPT can be an effective fundraising companion.
Cameron J. Hall, executive director of annual giving and lead generation at the University of South Carolina, told me that the institution had its most successful giving day ever with the help of ChatGPT. As an early adopter, Hall used the A.I. tool to create marketing content for the giving-day campaign, analyze a post-giving-day survey, and generate copy to recruit staff.
Hall’s team came up with especially creative prompts for the bot, including feeding it previous writing samples and instructions on not only what to write but whose style to mimic.
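For readers curious what that technique looks like in practice, here is a minimal sketch, assuming the OpenAI Python SDK (v1.x). The writing sample, prompts, and model name are illustrative placeholders of my own, not Hall’s actual materials.

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    # Hypothetical writing sample; in practice, paste real past campaign copy.
    sample = (
        "Last year, donors like you powered a record giving day. "
        "Every gift, large or small, moved us closer to our goal."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any chat-capable model works
        messages=[
            {
                "role": "system",
                "content": "You write annual-giving copy. Mimic the voice "
                           "and style of the sample the user provides.",
            },
            {
                "role": "user",
                "content": "Here is a sample of our past giving-day copy:\n\n"
                           f"{sample}\n\n"
                           "In the same style, draft a three-sentence email "
                           "inviting alumni to serve as giving-day ambassadors.",
            },
        ],
    )

    print(response.choices[0].message.content)

The pattern mirrors what Hall’s team did: supply a sample of real writing and say explicitly whose style to mimic, rather than asking the bot to invent a voice from scratch.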
Such successes underscore the need for a balanced approach to using generative A.I. — one that makes us better, not worse, humans (and fundraisers).
A.I. and Humanity
In my course, we do a unit on normative ethics, or the study of how humans ought to act. Fundraising, after all, is full of ethical quandaries, such as whether to name programs after donors with bad reputations or accept gifts with onerous stipulations. The answer is never obvious. Each case must be carefully weighed with the organization’s values in mind.
Similarly, the use of A.I. needs to be balanced in the context of obligations to staff, beneficiaries, and donors. Here are some initial questions to consider:
What do we owe employees whose skills may become obsolete or whose jobs may be replaced by A.I.? I sympathize with those who summon ChatGPT when they’re staring at a blank page. But let’s not forget that writing doesn’t just reflect our thoughts — it helps us generate new insights.
Nonprofits should ensure staff keep those creative skills strong by allowing them to write the documents that describe what the organization is and does — mission statements or white papers, for instance. They can then let generative A.I. do the repurposing of that content — as opposed to the inventing — turning the original, human-produced writing into, say, social-media copy.
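To keep that division of labor honest, the prompt itself can forbid invention. Here is a minimal sketch of one way to do that, again assuming the OpenAI Python SDK; the function name, model, and mission statement are hypothetical.

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    def repurpose(human_text: str, target_format: str) -> str:
        """Adapt human-written source text to a new format without inventing facts."""
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {
                    "role": "system",
                    "content": "Rewrite the user's text for the requested format. "
                               "Use only facts present in the source; add nothing new.",
                },
                {
                    "role": "user",
                    "content": f"Format: {target_format}\n\nSource text:\n{human_text}",
                },
            ],
        )
        return response.choices[0].message.content

    # Hypothetical mission statement, repurposed rather than invented.
    mission = "We provide free after-school tutoring to 400 students each year."
    print(repurpose(mission, "a two-sentence social-media post"))

The system message acts as a guardrail: the model adapts the human-produced writing instead of generating claims of its own.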
A.I. is unlikely to replace frontline fundraisers, whose personal interactions with donors are still the most effective way of getting them on board. But back-office jobs in marketing, copywriting, and analytics are another story.
Nonprofit technology expert Meena Das, founder of Namaste Data, told me that she expects tools such as ChatGPT to both augment and substitute for human capabilities in the workplace. For that reason, instead of laying off staff, organizations should consider devoting a portion of the A.I.-driven productivity boost to retraining them in areas such as face-to-face engagement.
What do we owe those who benefit from a nonprofit’s services? Authentic storytelling about the people an organization serves requires a unique human voice. A whole movement in ethical storytelling has even sprung up to push nonprofits to move away from one-dimensional portrayals of people and ensure the stories they tell don’t harm the subjects.
A.I. is poor at storytelling because it’s trained on tales that have already been told, many of which perpetuate stereotypes. For these reasons, A.I. should never be used to create stories about beneficiaries without their buy-in.
Moreover, fundraisers need to know that any information they provide ChatGPT could become part of the system’s training data and thus make its way into content provided to other users. This might occur, for example, if a nonprofit wants ChatGPT to write a grant proposal and feeds it data about people who have used the organization’s services. Giving A.I. personally identifiable information is tantamount to posting it online.
What do we owe donors? Blindly relying on generative A.I. to expand your organization’s outreach to donors is wrong-headed. Here’s a rule of thumb: If you can’t imagine adding the postscript “This email was written by generative A.I.,” then write the email yourself.
Vanderbilt University learned that lesson when it sent an A.I.-generated letter to the campus community following a mass shooting at Michigan State University. The letter understandably provoked a highly negative response. After all, communication about sensitive topics presumes a caring human behind the screen.
That doesn’t mean you should turn off auto-suggest and forgo all A.I.-fueled writing. Keep in mind, however, that the context of a particular communication is critical to deciding when to deploy it. A note to a single person, rather than a mass solicitation, should be written in your voice, primarily by you. When A.I. is used to, say, synthesize past letters into a new fundraising appeal, make sure humans always review and edit the bot-generated prose before it’s sent to other humans.
A couple of months ago, I attended a lecture on ChatGPT given by Philip Resnik, a professor of computational linguistics at the University of Maryland. His wide-ranging discussion included this simple but reassuring comment: “You can do things that these systems can’t.”
One of those things is exercising moral judgment about how and when we deploy A.I. As artificial intelligence gets smarter, balancing the efficiency of machines with human understanding and wisdom may be our greatest challenge.