Brandon Anderson sees artificial intelligence as the key to holding police accountable for racial bias, and he believes he has created a tool to do it.
“Police departments crunch huge amounts of data today, but we still don’t know how often law enforcement officers have hurt, killed, or for that matter saved and comforted people in the line of duty,” Anderson told CNBC.
Now the Oklahoma City native is seeking to change that with the creation of Raheem.AI, a data tool and chatbot app that allows community members to report police conduct in real time in a secure and anonymous way.
On Tuesday, Anderson, 33, was chosen as a 2018 fellow by Echoing Green, which funds innovation in areas ranging from racial justice to environmental conservation, for his work on Raheem.
As a member of the 2018 cohort, which consists of 35 fellows, Anderson will receive a two-year seed grant, programmatic support, and access to a large network of alumni, including Michelle Obama.
Tiffany Thompson, a senior associate at Echoing Green and Anderson’s portfolio manager, said Anderson was chosen from a pool of over 2,800 applicants because of his proximity to his work.
“Those impacted should always be the ones leading this charge,” Thompson told The Crime Report. “That cannot be more true for Brandon, who does this work because of his personal experience” with police violence.
Anderson first became interested in technology as a means of saving lives when serving as a U.S. Army satellite engineer. Four years into Anderson’s service, he was called back home by a personal tragedy: after being charged with stealing a car, Anderson’s long-term partner had been beaten by police and had subsequently been hospitalized.
After his partner’s death, Anderson realized that behind the larger issue of police violence was a separate problem: the difficulty of reporting it.
“The process is intimidating,” Anderson said. “Most cities require you to do it in person and within business hours. That’s nearly impossible for most working Americans.”
The result: underreporting. Anderson says that 93 percent of incidents involving police brutality go uncounted, “[leaving] officers unaccountable for their behavior,” and failing to provide cities with the data they need to identify concerning trends.
Anderson believes technology such as Raheem is a means of amplifying the voices of community members who might otherwise stay quiet about their experiences with police. Working with police departments, he identified the types of data necessary to inform policies that would put an end to violence, and developed Raheem to collect such information.
Users can access Raheem through the Facebook Messenger app or through the company’s website. The chatbot asks simple questions about a police interaction, presenting opportunities to write in greater detail as well.
When asked about the origin of his project’s name, Anderson said, “Raheem means compassion in Arabic, which embodies the spirit of love that was the impetus of this journey for me.”
“And it’s a human name for a human problem – it’s not a technological problem, or a process problem, or a racial problem. Or it’s not only that. It’s a human problem, and I wanted to speak to that humanity.”
In addition to being an Echoing Green fellow, Anderson is also one of eight Black Male Achievement (BMA) fellows.
Thompson, who runs the BMA fellowship and formerly served as Engagement Associate for Barack Obama’s “My Brother’s Keeper” Initiative, spoke to the particular importance of Anderson’s work to black males within the current climate.
“Black men and boys are being murdered at the hands of the police, and that’s not something new, but it’s becoming more prevalent in today’s society,” she said.
Supporting Black Men and Boys
“There’s a transformation that Brandon and Raheem are building out in a way that can help support black men and boys and can truly shift some of the systems that have been impacting them in a negative way.”
Anderson expressed his appreciation of the support he receives from the other BMA fellows.
“Frankly, 30 percent of my work in the past has been explaining to people the disproportionate impact [of police violence] on black people,” he said. “My effort doesn’t need to go into that anymore. My cohort gets that.”
Anderson plans to publish quarterly reports using the data Raheem collects to show where police are working well and where communities feel targeted by violence. The bot will also deliver custom reports to precincts, cities, and campuses to help them identify areas where their forces can improve.
Still, Anderson was clear that solving police violence does not end with technology, stressing the importance of coalitions with community-based organizations in making a difference.
He outlined three major goals for the Raheem.AI project.
First, Anderson seeks to cut the percentage of people who don’t report police violence. “Right now, 93 percent of people don’t report,” he said. “We want to get that down to zero.”
Second, “we want to collect large volumes of data and we want to use this data to advance policy solutions at the local and state level.”
Finally, in the long term, Anderson hopes to arm community-based organizations with the necessary tools to engage in participatory budgeting.
“There are very limited spaces wherein police can solve crime,” Anderson said. “Homeless people need homes; they don’t need quality-of-life infractions.
“Young people need access to better education; they don’t need truancy charges when they can’t make it to school.”
“Communities have a good handle on what they need to provide safe spaces. I want us to offer them the tools they need to craft participatory budgets that address their needs rather than giving money to the police.”
Raheem has already run pilot projects in Berkeley and San Francisco. Anderson’s first official partnership, with Oakland, Calif., will begin in a few months.
Elena Schwartz is a TCR news intern. She welcomes comments from readers.