District Attorney George Gascon on Wednesday announced a new AI tool developed by Stanford University to help factor out race in prosecutions. (Kevin N. Hume/S.F. Examiner)

San Francisco DA announces AI tool from Stanford to avoid racial bias in prosecutions

Artificial intelligence will scan police reports to redact information about race

A new artificial intelligence tool from District Attorney George Gascon seeks to root out the potential for racial bias when prosecutors are deciding whether a person should be charged with a crime.

The bias-mitigation tool that Gascon announced Wednesday will obscure from prosecutors all information about the race of an individual included in the police reports they review when making an early charging decision in a case.

The prosecutor would record an initial charging decision, then make a final decision after reviewing an unredacted version of the police report as well as other evidence that could reveal a person’s race.

Gascon plans to track changes between the first and second charging decisions to see if additional evidence factored into the final decision, or if steps are needed to remove racial bias from the process.

“This technology will reduce the threat that implicit bias poses to the purity of decisions which have serious ramifications for the accused,” Gascon said in a statement. “That will help make our system of justice more fair and just.”

The Stanford Computational Policy Lab developed the new open-source bias mitigation tool for Gascon at no cost to the District Attorney’s Office. Gascon plans to implement the tool beginning July 1.

This is not the first time that Gascon has unveiled new computer technology to reform his office. In 2018, he partnered with the nonprofit Code For America to clear decades of marijuana convictions after California legalized cannabis.

The bias-mitigation tool will remove all hints of race from police reports including the names of suspects, officers and witnesses, according to his office.

The tool will also redact officer star numbers, hair and eye color, and neighborhood information that can suggest the race of a person involved. The tool will replace that information with labels such as “race” or “neighborhood.”
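The office has not published the tool’s implementation, but the redaction step described above can be sketched as a simple find-and-replace pass over the report text. The patterns, labels, and neighborhood names below are illustrative assumptions, not the Stanford lab’s actual rules:

```python
import re

# Hypothetical sketch only: the Stanford Computational Policy Lab has not
# released these rules. Each pattern maps race-suggestive text to a
# neutral placeholder label, as the article describes.
# Order matters: match "brown hair" before the bare color word "brown".
REDACTIONS = [
    (r"\b(?:brown|blue|green|hazel) eyes\b", "[EYE COLOR]"),
    (r"\b(?:black|brown|blond|red) hair\b", "[HAIR COLOR]"),
    (r"\b(?:black|white|hispanic|asian)\b", "[RACE]"),
    (r"\bMission District\b|\bBayview\b", "[NEIGHBORHOOD]"),  # example names
]

def redact(report: str) -> str:
    """Return the report with race-indicative terms replaced by labels."""
    for pattern, label in REDACTIONS:
        report = re.sub(pattern, label, report, flags=re.IGNORECASE)
    return report
```

A real system would also need to strip names and star numbers, which calls for named-entity recognition rather than fixed word lists; this sketch only shows the substitution idea.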

Before making a final decision, prosecutors will review the full police report and other evidence including body-worn camera footage that might show the race of the persons involved.

In the future, the District Attorney’s Office could use the tool to help reduce the jail population by having the program flag cases for prosecutors that are likely to be dismissed based on a number of factors including whether the allegations are gang-related, drug-related or involve a number of charges.

The algorithm could review cases within eight hours of an arrest and lead to charging decisions within two business days, resulting in an estimated 35 percent reduction in jail time for those who are not charged, according to the office.

“If there’s a statistically significant likelihood that a low-level offense is going to be dismissed, there is no reason to continue to infringe upon a person’s civil liberties purely due to resource constraints,” said Max Szabo, a spokesperson for the District Attorney’s Office.

mbarba@sfexaminer.com
