Each day seems to bring fresh challenges in the search for reliable information. The 2024 election is already a major flashpoint, with politicians and far-right activists spouting lies about the economy, immigration, voting, and more. “Anti-woke” efforts and campaigns to slow climate action also rely on and benefit from disinformation.

Technology, particularly social media, exacerbates these global trends, which have real consequences. Facebook, for example, helped spread misinformation about the 2020 election results, which in turn helped incite the Jan. 6 insurrection at the Capitol. Despite this, many platforms have scaled back their efforts to fight mis- and disinformation.

A.I. brings further challenges, including fabricated research that influences people’s health-care decisions, deepfake videos, and invented news stories that seem all too real. The A.I.-generated fake photo of an explosion near the Pentagon this spring, which briefly spooked financial markets, is just a precursor to far more sophisticated tactics to come.

Fortunately, philanthropy is waking up to these threats. Grant makers are seeing how both misinformation — inaccurate information — and disinformation, which is deliberately shared, undermine the work of the nonprofits they support. Whether it’s increasing vaccine hesitancy, eroding faith in democracy, or disproportionately harming already marginalized communities, the spread of falsehoods and conspiracy theories is threatening to reverse progress toward the social good.

“It feels like the house is on fire,” noted one foundation staff member at the recent Skoll World Forum.

Global leaders share these concerns. The United Nations secretary-general recently wrote that disinformation “poses an existential risk to humanity and further endangers democratic institutions and fundamental human rights.” The Organization for Economic Cooperation and Development launched a resource hub for mis- and disinformation. Aid agencies are mobilizing, too: The United States Agency for International Development has multiple regional programs to combat mis- and disinformation in areas such as the Sahel region of Africa and the Caucasus.

Grant makers are also working toward solutions, as seen in the rapid growth of media and fact-checking organizations. There are currently 417 active fact-checking projects in more than 100 countries, according to the Duke Reporters’ Lab. Many receive philanthropic support as well as financial backing from big tech firms. The International Fact-Checking Network at the Poynter Institute, for example, receives funding from Craig Newmark Philanthropies as well as from tech companies such as Google and YouTube.

But philanthropy needs to go beyond simply reacting to mis- and disinformation on a case-by-case basis. Instead, grant makers should focus on creating a healthy information ecosystem that doesn’t amplify lies and falsehoods, limits their proliferation in the first place, and ensures a steady supply of quality information. To get there, philanthropists need to move beyond the piecemeal approach that characterizes much current research and programming and instead coordinate better, help enact policy change, and invest in more research to understand the scale of the problem.

A holistic approach of this kind would encourage donors to advocate for and fund all the elements necessary to restore a trusted, reliable information environment where high-quality, diverse information sources thrive. A common framework could help the philanthropic community understand how different types of funded programming impact each other and make it more difficult to spread lies.

“Information disorder is a whole-of-society problem that is foundational to the success — or failure — of our partners’ efforts in every other sphere,” Marla Blow, president and COO of the Skoll Foundation, said in an interview. “We need a systems approach to addressing it.”

Our model does just that. To start the conversation, Michael’s organization, the Transparency and Accountability Initiative, worked with Courtney to develop a framework that includes four steps grant makers can take to restore information integrity and fight misinformation.

Coordinate programming and funding across fields. This will help clarify how longstanding areas of donor support, including media literacy, access to information, and investigative journalism, align with newer investment areas, such as social media platform accountability and A.I. governance.

Luminate is the rare example of a grant maker that has explicitly employed this strategy. Its initiatives on social media platform accountability, strengthening public interest media, and combating mis- and disinformation fall under one umbrella and go beyond grantmaking, with staff leading advocacy efforts directly. Staff say such an approach allows them to coordinate complementary elements and have greater impact.

Similarly, grant makers should consider how the threat of mis- and disinformation affects longstanding programmatic priorities. The Skoll Foundation, for example, saw how false information undermined its key program areas of health, climate action, effective governance, and justice and equity. It now funds organizations like Digital Action, which aims to strengthen the information ecosystem overall, as well as those that address information risks within existing programs, such as the Climate Action Against Disinformation coalition.

Advocate for and invest in policy change. The United States lags behind other countries when it comes to regulating tech platforms and the way they spread information. Australia recently passed its News Media Bargaining Code, which requires tech platforms to compensate news publishers for the news they feature — helping to ensure media outlets have the resources to provide strong journalism. The European Union’s new Digital Services Act focuses on holding tech platforms accountable for addressing issues such as disinformation on their platforms. But the U.S. has done little beyond enshrining protection for major technology companies.

Philanthropy should fund advocacy efforts aimed at requiring tech companies to operate transparently while covering the costs of keeping information accurate, and addressing online safety issues such as harassment and extremism.

Donors should also back efforts focused on breaking up monopolistic practices that allow tech companies to get richer at the expense of reliable information access — for example, by favoring their own content in search engines or app stores.

Finally, more support is needed to promote the idea that our data and information systems are public goods that should be protected from propagandists and other nefarious actors who spread disinformation.

Support better research. Most studies focus on individual components of the information ecosystem, such as local news deserts or access to information, but not their cumulative effect on discourse and democracy.

What’s more, of 155 recently reviewed studies on misinformation, more than 80 percent were conducted exclusively in Global North countries. Efforts to combat disinformation in Asia, for example, have been hampered by what some see as the Anglocentric nature of existing studies. But the mis- and disinformation problem reaches every corner of the globe, and the research should reflect that.

Until we get a more comprehensive picture of what works, philanthropy risks investing resources in ineffective responses or ones that exacerbate problems. For example, media and information literacy efforts that help people discern authentic content are no match for generative A.I., which makes manipulated content increasingly difficult to detect. A better use for donor dollars is to fund efforts such as the Content Authenticity Initiative, which is developing tamper-evident technical standards to validate the authenticity of online content.

Grant makers should also recognize how research choices today influence future technologies. For example, a focus on A.I. ethics and governance without considering market power, data privacy, or regulations could replicate failures of the social media age that have allowed a few powerful platforms to shape the public sphere, undermine journalism, and perpetuate mis- and disinformation.

Work together. One funder working alone can’t solve today’s information crisis. Individual philanthropies can treat different symptoms but need a shared north star. A systems-based approach around a common framework will encourage grant makers to align their strategies, identify gaps that deserve more funding, and push philanthropy to strengthen the overall health and resilience of our information environment.

We must create the conditions for a future where accurate and diverse information is readily available, and individuals can make informed decisions that benefit themselves and society.

Completely preventing mis- and disinformation from infiltrating the 2024 presidential election, and so many other aspects of our daily lives, is impossible. But a coordinated and comprehensive effort can greatly reduce the harm. The world desperately needs a healthy information ecosystem. To get there, philanthropy must confront this crisis with the commitment and seriousness it deserves.