It should come as no surprise that the focus of this year’s buzzwords is power — who has it, who doesn’t, and how to get more of it. The subject of power was of course inescapable during the election, particularly given worries about the country’s potential slide into authoritarianism. Anxieties about power extend to the tech world, where A.I. and social media exert growing and seemingly uncontrollable influence on our lives.
All these concerns have implications for nonprofits, and they are reflected on this year’s list. Increasingly, the sector must contend with the power that entities ranging from chatbots to billionaires have over the day-to-day lives of the people nonprofits are trying to help. At the same time, some words, such as “dandelion” and “socialize,” prove that the nonprofit world isn’t immune to seeking power and influence itself.
Here are the words and phrases to watch in 2025, drawn from my report “Philanthropy and Digital Civil Society: Blueprint 2025,” which was released today.
Accelerationism/accelerationist. Effective accelerationism, or e/acc, describes a movement that promotes the most rapid, unregulated research and development of A.I. possible. The movement’s adherents argue that those who create the biggest, fastest, most powerful A.I. systems will be able to solve some of the world’s worst problems. They see themselves as an alternative to effective altruism, an evidence-based philanthropic approach whose followers tend to view A.I. as an existential threat.
A different type of accelerationism inspires white supremacists, who foment divisiveness and polarization to speed up the collapse of existing systems and incite civil war. The people promoting, and defending against, both versions of accelerationism organize online, in associations, and through nonprofits.
Addictive Intelligence. All A.I. chatbots are working for their maker, not the user. As such, some may incorporate “dark patterns,” or design tactics that manipulate people into certain actions and can get them hooked — similar to YouTube’s “play next” feature. As more people turn to chatbots for conversation and comfort, or to replace human relationships entirely, they need to be on the lookout for A.I. systems with deliberately addictive patterns. Nonprofits considering developing chatbots should err on the side of humanity and avoid causing technological harm to the people they serve.
Autonomous fundraisers. You knew this was coming. Autonomous fundraisers are A.I. avatars or chatbots that respond quickly, and presumably politely, to donors so they feel cared for. At least until they don’t. This is probably a terrible idea, but nonprofits everywhere will want to try it anyway. The big “what if” is whether making fundraising more efficient will also make it more obnoxious and transactional. Inevitably, the bots will negotiate a gift a donor doesn’t actually want to make — or even trick some people into thinking they’re human. Nonprofits should prepare for some spectacular backlash when a donor feels angry or duped.
Black box billionaires. Coined by Inside Philanthropy, this term refers to five ultra-high-net-worth donors who give primarily to donor-advised funds. For these billionaires, DAFs function as black boxes because they carry no disclosure or payout requirements. The article notes that 83 to 99 percent of these donors’ giving goes to DAFs. And as Chronicle of Philanthropy columnist Craig Kennedy has explained, DAF dollars sent to 501(c)(3) organizations can be funneled to 501(c)(4)s that engage in more political activity. For these donors, the control DAFs enable, and the anonymity they provide when giving to controversial causes, appear to be especially attractive features.
Dandelion. This term refers to a strategy to expand the reach of a program or idea so it spreads like dandelion fluff in the wind. To do so, people aim to ensure their ideas take root in a variety of environments by emphasizing adaptability and resourcefulness. It’s an apt metaphor amid the growing interest in protecting natural habitats, such as the movement to avoid lawn maintenance known as “no mow May.” But the tactics involved in a dandelion strategy can sometimes be vague, and other nonprofits would benefit if those using the metaphor were more specific about how it works.
Deep Doubt. The technology outlet Ars Technica uses this term to describe the current era, in which A.I. fakes take over text, audio, and video, making it nearly impossible to trust anything on the internet. Expanding on the term “deepfake,” deep doubt goes beyond one-off examples to encompass all digital media. And it captures both the use of faked images and the excuse they provide for dismissing real evidence as fabricated, a phenomenon known as the “liar’s dividend.” Electoral politics thrives and dies on such information. Civil society, for its part, both creates deep doubt and is a victim of it. Take vaccine misinformation: Major charities have spent millions pushing false science, forcing fact-based organizations to deploy their dollars to fight back.
Digital Twin. Corporations gather so many minute data points on our online habits that it’s possible to think of ourselves as having both a physical self and a virtual self created by analyzing personal data. Digital twins are currently used to predict shopping behavior or provide individualized health guidance. Eventually, though, A.I. companies say digital twins will be able to do everything from curing diseases to handling boring tasks, such as attending meetings so we don’t have to. To prepare for that future, civil society needs to consider how to protect the safety and freedom of digital twins, ensuring people can still maintain ownership and control over their data.
Slop. Just as spam has long filled email inboxes, A.I.-generated junk, or slop, now fills the web. While spam can be blocked at an inbox’s metaphorical door, slop can’t be stopped. It’s everywhere on the internet: an image of Jesus with shrimp for limbs on Facebook; dangerously inaccurate foraging books on Amazon that could lead people to eat poisonous mushrooms. The more junk on the web, the harder it is to find the accurate information that good actors in civil society are sharing. The challenge for any organization is how to be seen and heard amid all the slop.
Socialize. Where organizations once focused on making cogent, one-off arguments to persuade others, now they socialize an idea, building support for it over time by introducing information slowly and in digestible pieces. This might involve presenting the idea to people one-on-one, getting their feedback, and tweaking the approach where needed to ensure that, eventually, the entire organization is on board. People one level below the final decision makers often speak of “socializing the idea, the proposal, the strategy” among their bosses.
Weaponize. In the country’s Alice-through-the-looking-glass politics, to weaponize something is to turn a legitimate issue or concern into a tool for inflicting political damage. Recently, for example, Republican lawmakers accused misinformation researchers of censorship, even though the lawmakers themselves were the ones doing the censoring by issuing subpoenas.
At least one congressional subcommittee focuses on the weaponization of government power. Leonard Leo, the right-wing political activist with a billion-dollar-plus bank account, called on his grantees to “weaponize” their work two months before the 2024 U.S. presidential election. In the dark-money sandbox of politics and charity that Leo inhabits, real issues that matter to real people, from marriage equality to health-care access, are cultural divisions to be exploited. Expect this behavior and language to creep further into the charitable sector in the coming year.