Questions about the best and most ethical ways to use artificial intelligence have consumed much of society this year, including the nonprofit world. Media coverage has celebrated generative A.I. for encouraging increased efficiency and new ways of problem-solving — while also warning of its risks.
One thing is clear: We’re on the precipice of a new way of interacting with technology, and this year’s buzzwords reflect that. Four of the 11 focus on A.I., compared with just one last year.
Even as we contend with this brave new technological world, we’re still navigating entirely human problems from years past, including the opioid epidemic, the pandemic, and social inequity. For the nonprofit field, that means learning how to use new technology in ways that put people first, while not sacrificing progress. That tension is clear in this year’s list.
Here are the words to watch, drawn from my report, “Philanthropy and Digital Society: Blueprint 2024,” which published Friday.
Alignment. This A.I. safety research term describes the extent to which artificial intelligence systems achieve the goals established by the humans who created them while advancing human values and ethics. A misaligned system can be harmful to the communities nonprofits serve. But which values, which humans, and what kind of alignment is necessary to ensure most people benefit? The effective altruism community, for example, funds several public policy fellowships where participants develop expertise in A.I. governance. That community emphasizes alignment with the issues that matter most to its members, namely long-term existential risks such as human extinction, rather than immediate harms, such as gun violence.
Data Lake. This refers to a repository where users can store and secure large amounts of data, allowing for multiple analyses of a dataset by different organizations and analysts. Earlier this year, several organizations in the United States worked together to build a data lake of nonprofit tax information, known as the 990 Data Infrastructure Project. Its goal is to increase transparency and help nonprofits identify trends and make decisions.
Digital ID. While many people use passwords, usernames, and captchas to distinguish themselves from others when accessing a website, the advancement of A.I. means humans will also need to differentiate themselves from bots and other automated systems. Tech leaders, particularly those creating powerful, human-imitating A.I., argue that everyone needs a digital ID. For example, Sam Altman, the CEO of OpenAI, is pushing for globally recognized digital IDs based on human biometrics. Digital IDs are controversial for many reasons, including privacy and anonymity challenges and their potential to be used for surveillance. At the root of these issues are questions about the responsibilities governments have to their citizens, the ways private companies track us, and the need for nonprofits to stake out a middle ground between people and governments.
Digital Public Goods. The United Nations defines these as open-source software, data, A.I. systems, and content collections that adhere to privacy laws, don’t cause harm, and help achieve the U.N.’s 2030 Sustainable Development Goals. There is a registry of these tools — many of which are created with foundation dollars — and a global standard of requirements that must be met in order to qualify. A map of accessible public facilities in Kazakhstan is one example. Such tools represent governmental and civil-society alternatives to corporate control of the digital landscape and digital tools.
Donor Codes of Conduct. Six years after the start of the #MeToo movement, more organizations are demanding donors adhere to behavior standards through codes of conduct. Informed by growing calls for nonprofits to concentrate more on the needs of the communities they serve and less on donors, they’re designed to address problematic behaviors, such as sexual harassment and discrimination. They also help protect fundraisers and others who often have little power in these lopsided relationships.
Doom Loop. Although many people have returned to office work, hybrid schedules are more common, leaving many cities and rural towns with empty office spaces and stores. This has raised concerns about both the viability of commercial real estate and the vitality of downtown centers. Empty offices lead to empty public transit, which leads to fewer people on the street, which leads to unkempt downtowns and fears of rising crime, further driving away people. One crisis creates another, fueling the much feared, much hyped doom loop. For nonprofits and philanthropy, that might mean lower rents, increased community needs, reduced tax bases, and fewer public services. They may also be involved in efforts to help counteract the cycle through artistic or cultural events.
Green Hushing. The backlash against socially responsible investing is leading companies to downplay their sustainability goals to quiet the critics. In the U.S., the federal government has flip-flopped with the change in presidents from condemning investments based on environmental, social, and governance factors to supporting them, leading several states to organize in opposition. Green hushing is the inverse of greenwashing, in which companies tout environmental actions they aren’t taking. Longtime supporters of ESG investing, nonprofits and philanthropy are finding their advocacy efforts challenged and their investment options limited.
Insurance. Insurance costs are skyrocketing, due in part to climate change, damaging weather events, and inflation, placing financial pressure on nonprofits and affecting how much money goes to other organizational costs. Property insurers are redlining entire counties because of the cost of natural disasters. Meanwhile, cyber insurance, which provides protection from internet-based threats such as data breaches, is already expensive, with no signs of getting cheaper. In the years to come, insurance costs may cause nonprofit budgets to balloon, possibly forcing some to close.
Naloxone. The generic name for Narcan, a drug used to reverse opioid overdoses, naloxone may soon become a standard item on fundraising gala checklists. Widely available in stores and online, it is increasingly stocked in bars, ambulances, and schools, and carried by caregivers. Every nonprofit or foundation that hosts events should consider whether it needs naloxone on hand.
Safe, Responsible, Trustworthy, [fill in the blank] A.I. An endless number of descriptors aim to delineate what kind of A.I. is needed, including “aligned,” “trustworthy,” “human-centered,” and “responsible.” Few of these terms are clearly defined, making them difficult to understand or compare to one another. Given A.I.’s potential to transform society and affect every human, such descriptors and the organizations that propagate them require deeper public scrutiny and more input from civil society. Philanthropy, especially effective altruist proponents, has funded many of these organizations. A new, $200 million philanthropic effort to govern A.I. — from donors who aren’t in the effective-altruist camp — may allow for more diverse and inclusive perspectives on how artificial intelligence is used and give real meaning to those descriptors.
Salary Transparency. A robust push in the United States for nonprofits to list salaries in job postings aims to reduce pay discrepancies within organizations, make the hiring process more efficient, and attract top candidates. Laws in several states requiring salary ranges in job postings have helped encourage the movement, shifting the norm from opacity to an expectation that pay information should be shared.