Just two years ago, OpenAI’s ChatGPT made its debut, captivating the public’s imagination and kindling an A.I. frenzy that soon spread to the nonprofit world, inciting a mixture of curiosity and fear.
If organizations’ grasp of A.I. was in its infancy then, nonprofits today have reached a toddlerhood in terms of understanding, deploying, and safeguarding against A.I. tools.
In this technological maturation, the Patrick J. McGovern Foundation has had a front-row seat. One of the few leaders in the philanthropic A.I. movement, the tech-savvy foundation announced this week that it had doled out $73.5 million in A.I.-related grants to 144 organizations in 2024, including many dedicated to influencing global policy decisions around the technology’s use and governance.
As generative artificial intelligence matures — and the possibility of federal A.I. regulations dims — the foundation’s head, Vilas Dhar, says what happens next could define the next decade. Because President-elect Donald Trump has promised to repeal nascent A.I. regulations on the grounds that they could stifle innovation, Dhar notes that civil society organizations, including many of McGovern’s grantees, may become a last line of defense against the technology’s worst possible effects.
“The decisions that we make in the global governance of A.I. in the next three years will define the trajectory of the next 20 or 30 years,” said Dhar, in a call from Vatican City, where he’s been partaking in meetings about A.I. and “human futures.”
“Even if it feels uncomfortable, philanthropy and nonprofits need to step into conversations about governance, not just of the technology but of its social, economic, and political impacts,” he said.
The A.I. Nonprofit
The McGovern Foundation is one of the few grant makers dedicated exclusively to A.I. and data science, formed out of a $1.2 billion endowment from the late Patrick J. McGovern, the founder of the 1980s tech magazines Computerworld, Macworld, and PCWorld.
When A.I. roared into the mainstream two years ago, the foundation became one of the few philanthropic organizations prepared to fund and advise nonprofits interested in A.I. tools. This past year, the foundation funded — and in many cases, offered technical expertise to — dozens of organizations, big and small, applying A.I., for example, to fact-check information, reduce food waste, and predict the likelihood of intimate partner violence.
The Nature Conservancy received $750,000 for its use of A.I. to monitor ocean health, while the ACLU received $400,000 to deploy A.I. in its litigation work. And Climate Policy Radar received $750,000 to develop tools for synthesizing and making searchable large swaths of climate-related data.
“A.I. is here — it is out of the box — and it is the most transformative technology that has landed on our front doors and screens in a long time. Meanwhile, climate change is the most transformative crisis that’s happening right now,” said Michal Nachmany, CEO of Climate Policy Radar, who says that if used correctly, A.I.'s potential for fighting climate change could outweigh its chief drawback: the technology’s massive energy needs, often met by fossil fuels that are warming the planet.
Some grants also went to organizations that leverage data science or high-tech tools without relying on artificial intelligence.
“You’ve got to have the fundamentals down first,” said Connor Schoen, CEO of Breaktime, a McGovern grantee that combats youth homelessness while “bringing data into everything we do.” While Breaktime has experimented with A.I. for some one-off tasks, Schoen says that most of the work still requires a human touch.
An important part of the organization’s work is giving a sense of human connection and support to 18- to 24-year-olds, “who are inundated with technology and social media,” he said. That service cannot be easily replicated with A.I., Schoen added.
A.I. Regulation Uncertain
Though some of the McGovern Foundation’s grants this year did go to new nonprofit A.I. tools, many others went to devising regulations and best practices for fighting the technology’s abuses, excesses, and biases.
It’s part of an investment that Dhar says is meant to ensure that A.I. develops safely, with input from civil society, and is governed by more than just market forces.
That hasn’t been easy: Tech-company evangelists have enthusiastically talked up A.I.'s potential, while global and national regulatory frameworks have done little to slow the technology’s use or development.
“We’re already living in an A.I. economy — it’s not the future,” said Miriam Vogel, president of EqualAI, a nonprofit that received $250,000 from McGovern for its work helping businesses and other organizations mitigate A.I.'s harms and improve trust and transparency.
In October 2023, the Biden administration issued an executive order outlining permissible government and private A.I. use, which many experts called a positive step toward managing the technology’s risks.
However, President-elect Donald Trump — who received campaign and inauguration support from some of A.I.'s biggest evangelists — has promised to repeal that executive order. His administration is likely to thwart attempts at A.I. regulation on the grounds that it would stifle innovation.
Without federal oversight, nonprofits like EqualAI and watchdogs like Amnesty International, the Algorithmic Justice League, and the National Fair Housing Alliance — all of which received McGovern Foundation grants this year — would become a first line of defense against the technology’s excesses in the years to come.
“2025 is going to be a time of increasing vulnerability, of stress on civil society,” said Dhar, who noted that during previous technological transformations — like the advent of the internet — nonprofits rarely held influence or a seat at the table as regulations and priorities were settled upon.
“Unlike in the past, the door to be architects of a digital future is open,” he said. “It’s now our choice as philanthropies, as civil society organizations, whether we step through it or not.”