A refugee in Iraq shows a photo of members of his family who were reunited by the nonprofit Refunite.
The chaos and confusion of conflict often separate family members fleeing for safety. The nonprofit Refunite uses advanced technology to help loved ones reconnect, sometimes across continents and after years of separation.
Refugees register with the service by providing basic information — their name, age, birthplace, clan and subclan, and so forth — along with similar facts about the people they’re trying to find. Powerful algorithms search for possible matches among the more than 1.1 million individuals in the Refunite system. The analytics are further refined using the more than 2,000 searches that the refugees themselves do daily.
We're sorry. Something went wrong.
We are unable to fully display the content of this page.
The most likely cause of this is a content blocker on your computer or network.
Please allow access to our site, and then refresh this page.
You may then be asked to log in, create an account if you don't already have one,
or subscribe.
If you continue to experience issues, please contact us at 571-540-8070 or cophelp@philanthropy.com
REFUNITE
A refugee in Iraq shows a photo of members of his family who were reunited by the nonprofit Refunite.
The chaos and confusion of conflict often separate family members fleeing for safety. The nonprofit Refunite uses advanced technology to help loved ones reconnect, sometimes across continents and after years of separation.
Refugees register with the service by providing basic information — their name, age, birthplace, clan and subclan, and so forth — along with similar facts about the people they’re trying to find. Powerful algorithms search for possible matches among the more than 1.1 million individuals in the Refunite system. The analytics are further refined using the more than 2,000 searches that the refugees themselves do daily.
Data science has the power to transform nonprofit work. Leaders are thrilled by the opportunity to scour vast amounts of data for connections but unnerved by the prospect of data tainted by bias and unseen algorithms deciding who gets services. For our July cover package, the Chronicle examines how philanthropy is grappling with what it means when machines take on tasks that humans usually do.
The goal: find loved ones or those connected to them who might help in the hunt. Since Refunite introduced the first version of the system in 2010, it has helped more than 40,000 people reconnect.
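Refunite has not published the details of its matching algorithms. As a rough illustration of the idea, a matcher could score candidates by weighing the same fields refugees register; the field names, weights, and cutoff below are invented for this sketch, not Refunite's.

```python
# Hypothetical sketch of a record matcher along the lines Refunite describes.
# Field names, weights, and the 0.75 threshold are illustrative, not Refunite's.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Fuzzy string similarity between 0 and 1."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(query: dict, record: dict) -> float:
    """Weighted score combining the fields a refugee registers."""
    score = 0.55 * similarity(query["name"], record["name"])
    score += 0.20 * (1.0 if query["clan"] == record["clan"] else 0.0)
    score += 0.15 * similarity(query["birthplace"], record["birthplace"])
    # Ages reported years apart can drift, so allow a small tolerance.
    score += 0.10 * (1.0 if abs(query["age"] - record["age"]) <= 3 else 0.0)
    return score

def find_candidates(query: dict, registry: list[dict], threshold: float = 0.75):
    """Return registry entries that look like possible matches."""
    return [r for r in registry if match_score(query, r) >= threshold]

registry = [
    {"name": "Abdi Hassan", "age": 34, "birthplace": "Kismayo", "clan": "Darod"},
    {"name": "Abdi Hasan", "age": 36, "birthplace": "Kismaayo", "clan": "Darod"},
]
print(find_candidates({"name": "Abdi Hassan", "age": 33,
                       "birthplace": "Kismayo", "clan": "Darod"}, registry))
```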
One factor complicating the work: Cultures define family lineage differently. Refunite co-founder Christopher Mikkelsen confronted this problem when he asked a boy in a refugee camp if he knew where his mother was. “He asked me, ‘Well, what mother do you mean?’” Mikkelsen remembers. “And I went, ‘Uh-huh, this is going to be challenging.’”
Fortunately, artificial intelligence is well suited to learning and recognizing different family patterns. But the technology still struggles with simple tasks, such as distinguishing the image of a chicken from that of a car. Mikkelsen believes refugees in camps could offset this weakness by tagging photographs — “car” or “not car” — to help train algorithms. The work could also earn them badly needed cash: The group hopes to set up a system that pays refugees for such labeling.
“To an American, earning $4 a day just isn’t viable as a living,” Mikkelsen says. “But to the global poor, getting an access point to earning this is revolutionizing.”
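The group hasn't described how such a paid tagging system would work under the hood. One simple possibility, sketched below with an invented per-tag rate and a majority-vote rule, is to pay taggers per label and add an image to the training set only when most taggers agree on it.

```python
# Illustrative sketch of the kind of paid micro-tagging Refunite envisions.
# The per-tag rate and the majority-vote rule are assumptions, not the group's design.
from collections import Counter

RATE_PER_TAG = 0.02  # hypothetical payment in dollars per label

def consensus_label(tags: list[str]) -> str | None:
    """Accept a label only when a clear majority of taggers agree."""
    label, count = Counter(tags).most_common(1)[0]
    return label if count > len(tags) / 2 else None

def build_training_set(tag_batches: dict[str, list[str]]):
    """Turn raw tags into (image, label) pairs and tally taggers' earnings."""
    labeled, earnings = [], 0.0
    for image_id, tags in tag_batches.items():
        earnings += RATE_PER_TAG * len(tags)
        label = consensus_label(tags)
        if label is not None:
            labeled.append((image_id, label))
    return labeled, earnings

batches = {"img_001": ["car", "car", "not car"],
           "img_002": ["not car", "not car", "not car"]}
print(build_training_set(batches))
# -> ([('img_001', 'car'), ('img_002', 'not car')], 0.12)
```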
Another group, Wild Me, a nonprofit founded by scientists and technologists, has built an open-source software platform that combines artificial intelligence and image recognition to identify and track individual animals. Using the system, scientists can better estimate the number of endangered animals and follow them over large expanses without using invasive techniques.
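The core of such a system is re-identification: matching a new photo against a catalog of known individuals. The sketch below illustrates the idea with made-up feature vectors and a cosine-similarity cutoff; a real pipeline would extract those features from photos with a computer-vision model.

```python
# Minimal sketch of individual re-identification of the kind Wild Me's platform performs:
# compare a new photo's feature vector against a catalog of known animals.
# The vectors and threshold here are invented for illustration.
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

catalog = {  # known individuals -> feature vector describing their markings
    "whale_shark_A": [0.9, 0.1, 0.3],
    "whale_shark_B": [0.2, 0.8, 0.5],
}

def identify(photo_vector: list[float], threshold: float = 0.9):
    """Return the best-matching known individual, or None if the animal looks new."""
    best_id, best_sim = max(
        ((animal_id, cosine(photo_vector, vec)) for animal_id, vec in catalog.items()),
        key=lambda pair: pair[1],
    )
    return best_id if best_sim >= threshold else None

print(identify([0.88, 0.12, 0.31]))  # -> whale_shark_A
```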
Automating Repetitive Tasks
To fight sex trafficking, police officers often go undercover and interact with people trying to buy sex online. Sadly, demand is high, and there are never enough officers.
Enter Seattle Against Slavery. The nonprofit’s tech-savvy volunteers created chatbots designed to disrupt online sex trafficking. Built with input from trafficking survivors and law-enforcement agencies, the bots can carry on conversations with hundreds of people at once, stringing each one along in drawn-out exchanges and arranging rendezvous that never materialize. The group hopes to frustrate buyers so much that they give up their hunt for sex online.
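Seattle Against Slavery has not released its bots' scripts. Conceptually, though, a decoy needs only a canned conversational flow and the ability to run many chats at once; the sketch below, with an invented script and deterrence message, shows the concurrency piece.

```python
# Sketch of how a decoy chatbot could hold many conversations at once, assuming a
# simple scripted flow; Seattle Against Slavery's actual bots are more sophisticated.
import asyncio
import random

SCRIPT = ["Hey, who is this?", "Maybe later tonight?", "Can you meet at 9 instead?"]
DETERRENCE = ("This was an automated decoy. Buying sex fuels trafficking. "
              "Confidential help: <counseling resources here>")

async def decoy_conversation(buyer_id: int) -> None:
    """String one buyer along with delayed replies, then send a deterrence message."""
    for line in SCRIPT:
        await asyncio.sleep(random.uniform(0.1, 0.3))  # stand-in for minutes of delay
        print(f"[to buyer {buyer_id}] {line}")
    print(f"[to buyer {buyer_id}] {DETERRENCE}")

async def main() -> None:
    # A single machine can run hundreds of these conversations concurrently.
    await asyncio.gather(*(decoy_conversation(i) for i in range(5)))

asyncio.run(main())
```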
A coach with College Forward helps a freshman at the University of Texas at Austin troubleshoot a financial-aid issue.
“We’ve had hundreds of thousands of text messages back and forth where buyers were attempting to buy sex from trafficking victims and instead they were talking to a piece of software,” says Robert Beiser, the group’s executive director.
The bots send educational messages and counseling resources. “Research with buyers of trafficked sex indicates that they feel shameful about this practice, that it’s a habit that they’re in that they would like to break,” says Beiser.
Law-enforcement officers can also use the bots to identify sex offenders or people convicted of violent crimes who are trying to buy sex. Plus, the data gathered can help officers better understand sex trafficking in their jurisdictions.
Tailoring Services
A Philadelphia charity is using machine learning to adapt its services to clients’ needs.
Benefits Data Trust helps people enroll in government-assistance programs like food stamps and Medicaid. Since 2005, the group has helped more than 650,000 people access $7 billion in aid.
The nonprofit has data-sharing agreements with jurisdictions to access more than 40 lists of people who likely qualify for government benefits but do not receive them. The charity contacts those who might be eligible and encourages them to call the Benefits Data Trust for help applying.
To measure its impact, the organization in 2016 teamed up with researchers from the Abdul Latif Jameel Poverty Action Lab for a randomized, controlled trial with 30,000 people.
Test subjects were split into three equal groups. The first received the organization’s regular outreach, a letter with a phone number and an offer to help. Members of the second group received a letter indicating they might qualify for assistance and encouraging them to call the Pennsylvania Department of Human Services to apply. The final group, acting as a control, was not contacted by the organization.
The results: 18 percent of people who received the group’s typical letter enrolled in benefits, three times the rate of those who were not contacted at all. But there was another important finding: 11 percent of those who received the letter encouraging them to apply directly through the state also enrolled, suggesting that not everyone needs the same level of help.
Benefits Data Trust is now turning to data science to predict who might apply successfully on their own. It will use that information to shape its outreach efforts and aid call-center staff in helping clients.
Individuals who apply on their own stand to reap real benefits. For example, those who directly submit the documentation for food stamps will receive benefits faster than if they had applied through the charity. That saves the group time, allowing it to help more people.
The data-driven change in how Benefits Data Trust approaches potential applicants will be subtle. If the model predicts people will need help, the letter will encourage them to call the charity. If the model suggests the opposite, the letter will give them three options: apply for the benefits online, contact the local government-assistance office, or call Benefits Data Trust.
The goal isn’t to take away services, says Pauline Abernathy, the nonprofit’s chief strategy officer. “It’s just what one leads with.”
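Benefits Data Trust has not said publicly how its model will score applicants. A bare-bones version of the triage it describes might look like the sketch below, where the scoring features and the cutoff are purely hypothetical.

```python
# Hypothetical sketch of the outreach triage Benefits Data Trust describes.
# The 0.6 cutoff and the scoring features are illustrative assumptions.

def predicted_self_service_odds(person: dict) -> float:
    """Stand-in for a trained model's probability that someone applies successfully alone."""
    score = 0.5
    score += 0.2 if person.get("has_internet_access") else -0.2
    score += 0.1 if person.get("applied_to_other_programs_before") else -0.1
    return max(0.0, min(1.0, score))

def choose_letter(person: dict, cutoff: float = 0.6) -> str:
    """Pick between the two outreach letters the article describes."""
    if predicted_self_service_odds(person) < cutoff:
        return "Letter A: 'Call Benefits Data Trust and we will help you apply.'"
    return ("Letter B: 'Apply online, contact your local assistance office, "
            "or call Benefits Data Trust.'")

print(choose_letter({"has_internet_access": True,
                     "applied_to_other_programs_before": True}))   # -> Letter B
print(choose_letter({"has_internet_access": False,
                     "applied_to_other_programs_before": False}))  # -> Letter A
```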
Another group, College Forward, has amassed a wealth of data over 15 years of helping first-generation and low-income college students finish their degrees. The nonprofit is using artificial intelligence to alert coaches when one of their students might be headed for trouble, so they can intervene early and head off bigger problems.