Two very different stories illustrate the impact of sophisticated decision-making tools on individuals and communities. In one, the Los Angeles Police Department publicly abandoned a program that used data to target violent offenders after residents in some neighborhoods were stopped by police as many as 30 times per week. In the other, New York City deployed data to root out landlords who discriminated against tenants using housing vouchers.
The second story shows the potential of automated data tools to promote social good — even as the first illustrates their potential for great harm.
Tools like these, typically described broadly as artificial intelligence or somewhat more narrowly as predictive analytics (which incorporates more human decision making into how data are collected), increasingly influence and automate decisions that affect people’s lives. These include which families are investigated by child protective services, where police deploy, whether loan officers extend credit, and which job applications a hiring manager receives.
How these tools are built, used, and governed will help shape the opportunities of everyday citizens, for good or ill.
Civil-rights advocates are right to worry about the harm such technology can do by hard-wiring bias into decision making. At the Annie E. Casey Foundation, where we fund and support data-focused efforts, we consulted with civil-rights groups, data scientists, government leaders, and family advocates to learn what it would take to weed out bias and inequities in automated decision-making tools. We recently produced a report on how to harness these tools’ potential to promote equity and social good.
Foundations and nonprofit organizations can play vital roles in ensuring equitable use of A.I. and other data technology. Here are four areas in which philanthropy can make a difference:
Support the development and use of transparent data tools. The public has a right to know how A.I. is being used to influence policy decisions, including whether those tools were independently validated and who is responsible for addressing concerns about how they work. Grant makers should avoid supporting private algorithms whose design and performance are shielded by trade-secrecy claims. Despite calls from advocates, some companies have declined to disclose details that would allow the public to assess the fairness of their tools.
Instead, philanthropy can support efforts to ensure public accountability, such as by funding civic watchdogs and the policy mechanisms that enforce it. Consider, for instance, a recent investigation from the Markup, a tech-focused nonprofit news organization that receives funding from the Ford Foundation. It found that universities were using software that included race as a factor when predicting student success, with the result that Black students were disproportionately steered away from math and science. After the Markup published its article, at least one school paused its use of the algorithms.
Upturn, a nonprofit that advocates for the use of technology to create equity and is a grantee of the Casey and Ford foundations, issued a report that found technology for assessing potential employees reinforces bias even when the tools are told to ignore race and other characteristics, in part because other data points can serve as proxies for them. A broad-based coalition of civil-rights and employment groups has used the report to advocate for embedding civil-rights principles in hiring technologies. Task forces in New York City, Pittsburgh, and Vermont have responded by developing policies for regulating the selection and use of decision-making tools.
Invest in data tools that promote overlooked talent. Grant makers can support groups that are using A.I. or predictive analytics tools to create opportunities for those in marginalized or overlooked communities.
For instance, Catalyte in Baltimore removes traditional criteria such as a college degree or previous work experience from its hiring process. Instead, it uses data science and A.I. to screen applicants for software-developer jobs solely on their aptitude. This approach has helped Catalyte discover a pool of workers from unconventional backgrounds — security guards, truck drivers, artists, military veterans — whom it trains and supports as they enter the technology work force. In 2020, Catalyte partnered with Baltimore Corps, a work-force development nonprofit, to channel local tech talent into city jobs. More recently, the company spun out a separate nonprofit, Retrain America, to connect people with new, stable work opportunities.
Support advocates’ efforts to require governments to release data that could illuminate patterns of bias. Community advocates have real-world expertise that can help predictive analytics tools improve government services rather than target individuals. Grant makers can help governments and advocates work together to incorporate that expertise into automated decision-making tools.
Our foundation worked with Actionable Intelligence for Social Policy, which helps state and local governments use data responsibly, to create a tool kit on incorporating equity into data systems. Specifically, the tool kit looks at ways to embed a focus on racial equity into every aspect of the data process with input from residents and advocates, starting in the initial planning stages and running through the dissemination of data analysis.
Support government efforts to create fair and effective data tools. Government agencies can develop new ways of approaching data — with the help of foundations, community groups, advocates, and researchers.
In the New York City tenant discrimination case, the Mayor’s Office of Data Analytics pinpointed areas with available housing, high-performing schools, and little crime but “suspiciously low use of housing vouchers,” reported Harvard University’s Ash Center for Democratic Governance and Innovation. The data were used to target 24 neighborhoods for increased investigation by the nonprofit Fair Housing Justice Center, and New York City ultimately lodged 120 income-discrimination complaints against landlords.
In Chicago, researchers worked with the Chicago Department of Public Health to build a model that predicts the risk of a child being poisoned by lead — using blood tests, home inspections, property-value assessments, and census data — so inspectors could address hazards. The city also began targeting neighborhoods and buildings identified as being at risk with public-service advertisements and outreach.
Foundations can support these types of data-driven investigations, which uncover problems; fund the development of approaches to respond to those problems; and help distribute the resulting information widely to advocates and policy makers.
At a time when the Covid-19 pandemic has exposed inequities in many societal structures and prompted a reimagining of more effective and equitable approaches, philanthropy must seize the chance to ensure data tools protect people from harm while also actively advancing social good. Philanthropy needs to lean into these challenges by funding organizations that can help communities and governments use emerging technologies to build a brighter future, and by serving as thought partners to them.