Melding artificial intelligence and algorithms with health care and policy to combat human trafficking
An interdisciplinary team of Stanford researchers — from economists to statisticians to pediatricians — is collaborating to fight the global scourge of human trafficking.
It’s a titanic task.
On any given day, some 40 million people are trafficked into modern slavery, which includes labor trafficking, sex trafficking and forced marriage.
Forty million people — that’s a few million more than all the people in Canada.
The United Nations and myriad government agencies, NGOs and nonprofit foundations have worked tirelessly for decades to curb the practice of forced labor. They have had many successes — but they have not had the desperately needed large-scale microdata to fully understand trafficking markets and change the underlying structures that allow people to benefit from this criminal practice.
“Globally, policymakers are eager to find solutions to reduce trafficking, but efforts are impeded by the lack of data and almost complete lack of quantitative evidence on the causes of trafficking and the effectiveness of interventions,” said Grant Miller, a professor of medicine at Stanford Health Policy and a senior fellow at the Stanford Institute for Economic Policy Research (SIEPR).
“As a result, global anti-trafficking policy has not been guided by data-informed decisions,” he said.
The Stanford Human Trafficking Data Lab intends to conduct critical research through a collaboration among academics, health-care providers and frontline trafficking experts and prosecutors, drawing on promising innovations in modern data science. These include machine learning, statistics, and algorithms that map the corporate structures of those profiting from the exploitation of human beings.
The team is creating an open-source data repository that will serve as a global model for innovation in anti-trafficking research. They’re also developing novel research methods and engaging with policymakers and frontline actors to identify the most effective solutions to adapt on a global scale.
“One of the most ambitious tools from the machine learning world, for example, is the creation of what will be a decision-support tool that the labor sector and prosecutors can use,” said Miller, a principal investigator of the Lab. “They’re trying to use their judgment to detect and catch traffickers, and we think that we could potentially develop machine-learning algorithms with all of this data to improve their work.”
The lab, which he co-founded with other Stanford scholars, was recently awarded $900,000 by the Stanford King Center on Global Development and also has funding from the U.S. State Department, via the African Programming and Research Initiative to End Slavery at the University of Georgia.
“This is a huge and highly ambitious project,” said Miller, who is a faculty affiliate at the King Center as well as a senior fellow at the Freeman Spogli Institute for International Studies. “It’s an intensive project to build the technology and then try it out in the real world to see how it works and improve on it in ways that policymakers would want to use.”
Mixing data and social sciences
Pediatrician Vicki Ward, a clinical assistant professor of pediatrics at Stanford Children’s Hospital who focuses on pediatric global health, is providing her deep experience on risk analysis for forced labor in supply chains and is working on the development of effective interventions. She also chairs the Board of Directors of the anti-trafficking NGO Made in a Free World and has worked extensively on human trafficking.
“Trafficking is one of the most pernicious human rights and global health problems, not only that of sex trafficking, but the problem of forced labor in general — which is rampant and widespread,” said Ward. “Given the profound nature of the health impacts for victims, we need better solutions not only to intervene, but to prevent it from happening in the first place.”
Kimberly Babiarz, a social science research scholar at Stanford Health Policy and an expert on large-scale impact evaluation, econometrics and machine-learning analysis, is serving as the senior analyst on several of the research initiatives. One of those examines Brazil’s cash-transfer program, which ties cash support to conditions that beneficiaries must meet to keep receiving benefits.
She notes that policymakers commonly view poverty as a leading risk factor for human trafficking, but the relationship between cash transfers and trafficking risk has never been quantitatively assessed.
“Brazil has the world’s largest cash-transfer program with over 27 million households, and we’re looking at the risk of trafficking and child labor in beneficiary households,” Babiarz said.
Brazil sets example
Luis Fabiano de Assis, a data scientist and visiting scholar at the Center for Human Rights and International Justice at Stanford, is another key member of the team. He is a prosecutor at the Federal Labor Prosecution Office in his native Brazil, a trafficking hub in Latin America owing to its sizable economy. The country has strong open data policies and a government commitment to fighting human trafficking — so Brazil was a natural choice for the initial focus of the research.
“And there is a demonstrated demand among Brazil’s government agencies for data and evidence that would help curb trafficking,” De Assis said.
De Assis, a world-renowned expert on applying data to human rights, emphasized that the human faces of the victims of this global phenomenon should not be forgotten. Even after exiting a trafficking situation, victims suffer lifelong mental and physical health problems: trauma, depression, HIV/AIDS and other sexually transmitted diseases, cognitive impairment and memory loss.
“As a frontline agent and Brazilian prosecutor who works with the rescued, we take them straight to health institutions because they’re usually destroyed, mentally and physically, and face a lot of disease,” De Assis said. “We were astonished that the ones who die, die really young of diseases that the general population doesn’t die from.”
Respiratory diseases and lung cancer rates among victims of trafficking are much higher than in the general population in Brazil, he said.
These are not only issues of public health, but of human rights.
Jessie Brunner, the director of human trafficking research at the Center for Human Rights and International Justice, is the point person for connecting the lab with outside partners and keeping them on track with the U.N. Sustainable Development Goals related to human trafficking and forced labor.
“My role is to think about how we can connect our work to local, national and international anti-trafficking efforts — and keeping tabs on what funders are doing, while looking for key partners because we’re hoping to grow geographically,” Brunner said.
Stanford Medicine statistician and assistant professor of epidemiology Mike Baiocchi is working on a decision-support tool for the lab, which the team calls the “intuition engine.” Its architecture is a data-processing pipeline that transforms a constant flow of incoming trafficking clues from a variety of sources into predictions of trafficking risk.
“We’re trying to give prosecutors the kinds of information they need so they can tackle these complicated cases,” said Baiocchi.
“One of the aspects that’s important to this team is that we’re listening and learning from how the prosecutors do their jobs, and we’re enabling machines to figure out a way to comb these huge data sets and automate and prepare insights for the prosecutors, to help prosecutors get into the field faster,” said Baiocchi.
Specifically, he’s using machine learning — getting computers to learn on their own by feeding them data — to have them chew through data in the background, so it’s largely digested by the time prosecutors need it.
“Workers who have been on the front lines develop routines and processes they find helpful, and machine learning is getting sophisticated enough that a lot of complex processes can be automated,” Baiocchi said.
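The kind of pipeline Baiocchi describes — clues streaming in from many sources and digested in the background into risk predictions — could be sketched very roughly as follows. Everything here is invented for illustration: the source names, the entities, and especially the fixed per-source weights, which merely stand in for what would, in the lab’s real tool, be a learned model.

```python
from dataclasses import dataclass

@dataclass
class Clue:
    source: str   # e.g. "labor-inspection", "tip-line", "court-record"
    entity: str   # employer or recruiter the clue points at
    signal: str   # what was observed
    weight: float # how strongly this source predicts trafficking risk

# Invented per-source weights standing in for a learned model.
SOURCE_WEIGHTS = {"labor-inspection": 0.6, "tip-line": 0.3, "court-record": 0.9}

def ingest(raw_records):
    """Normalize raw records from heterogeneous sources into Clue objects."""
    clues = []
    for rec in raw_records:
        src = rec.get("source", "").strip().lower()
        if src in SOURCE_WEIGHTS and rec.get("entity"):
            clues.append(Clue(src, rec["entity"].strip(),
                              rec.get("signal", ""), SOURCE_WEIGHTS[src]))
    return clues

def risk_scores(clues):
    """Aggregate clue weights into a crude per-entity score in [0, 1)."""
    totals = {}
    for c in clues:
        totals[c.entity] = totals.get(c.entity, 0.0) + c.weight
    # Squash unbounded totals into [0, 1) so scores stay comparable.
    return {e: t / (1.0 + t) for e, t in totals.items()}

records = [
    {"source": "tip-line", "entity": "Farm A", "signal": "withheld wages"},
    {"source": "labor-inspection", "entity": "Farm A", "signal": "confiscated documents"},
    {"source": "court-record", "entity": "Mill B", "signal": "prior conviction"},
]
scores = risk_scores(ingest(records))
```

The point of the sketch is the shape, not the numbers: heterogeneous inputs are normalized once, then continuously aggregated so that a usable score is already waiting when a prosecutor asks about an entity.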
For example, Ying Jin, a PhD student working with Baiocchi, has been using cluster-detection algorithms to process the data. The idea, Baiocchi said, came from a conversation with a prosecutor, who told them that when they find one trafficker, they can usually pull a thread and unravel a network of others.
“So our team has started to automate this ‘pulling the thread’ process,” Baiocchi said. “Machines are so powerful because they work nonstop, they don’t get bored, and they are scalable. We can have these machine learning algorithms running in the background, getting the data in a usable format so when a prosecutor gets a tip, they can go to the ‘intuition engine’ and plug it in and find out how concerned they should be.”
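The “pulling the thread” idea maps naturally onto graph traversal: model known links between suspects (shared addresses, phone numbers, shell companies) as an undirected graph, then, starting from one confirmed trafficker, walk the graph to surface the whole connected cluster. The sketch below is purely illustrative — all names and links are invented.

```python
from collections import deque

# Hypothetical link data: which entities are connected to which,
# e.g. via shared phone numbers, addresses, or shell companies.
links = {
    "Recruiter X": ["Shell Co 1", "Broker Y"],
    "Shell Co 1": ["Recruiter X", "Farm A"],
    "Broker Y": ["Recruiter X"],
    "Farm A": ["Shell Co 1"],
    "Unrelated Z": [],
}

def pull_the_thread(start, graph):
    """Breadth-first traversal from one confirmed suspect to its cluster."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

cluster = pull_the_thread("Recruiter X", links)
```

A real system would work over millions of noisy records rather than a tidy dictionary, but the traversal is the part a machine can run continuously in the background, which is exactly the scalability Baiocchi describes.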