Even if Detroit City Council rejects a proposal to extend the city’s facial recognition contract this week, it won’t spell the end for the police department’s use of the controversial tool that has led to at least two wrongful arrests.
The city’s three-year contract with South Carolina-based DataWorks Plus expired in July, but city and police officials claim they can continue to use the technology because they purchased the software outright. Upgrades, maintenance, and licensing are up for renewal Tuesday, but the city says its Department of Innovation and Technology (DoIT) can cover some of those responsibilities should council reject the proposed two-year, $220,000 extension.
“We already own the license to operate the software (and) we already bought and own the software … it is a part of what DPD has access to,” Art Thompson, Director of Public Safety and Cyber Security, said in a hearing last week. “It’s no different than buying a cell phone ... for a few years you get upgrades and at some point, when those upgrades stop, you still own the hardware.”
The city’s position suggests ending its use of the technology could be more difficult than opponents initially thought. Council does not appear supportive of an outright ban, which would require a two-thirds majority to override a mayoral veto, and it has stalled on a more modest proposal that would give citizens greater say in what surveillance tools the city adopts.
Meanwhile, the Board of Police Commissioners is largely supportive of facial recognition, and on Thursday voted against a proposal to draft a resolution in opposition to the contract extension. It’s also unable to set policy without approval from the mayor’s office.
At least one councilmember is crying foul. Raquel Castañeda-Lopez, who was on the committee that gave initial approval for the technology in 2017 and today opposes it, says council was never told the administration intended to use it in perpetuity. And she says the claim that it can is disingenuous.
“That was not my understanding and I don’t think that was the understanding of any of my colleagues,” she added. “I get the argument of, ‘We bought this so now we own it,’ like if we bought a police car, but it’s a little bit different for a technology.
“Technologies (require) updates, so I think it’s dangerous to imply the equipment is separate from the updates because if they were to continue to operate it outdated — given the concerns of the inaccuracies and racial biases — that would just be ludicrous.”
The technology can only be upgraded by DataWorks.
Opponents of facial recognition — who include demonstrators who’ve gathered almost nightly following the police killing of George Floyd in Minneapolis — have urged council to oppose an extension, warning of racial bias. Studies have shown facial recognition systems disproportionately misidentify the faces of people of color, and Detroit is about 80 percent Black.
Those fears were realized this summer, when it emerged the police department had wrongly arrested two people whose images were produced by the software as potential matches for suspected criminals. In the first case, Robert Williams, a Farmington Hills man, was wrongly arrested after he came up as a possible match for a suspect in a watch theft at Shinola. The video of the suspect was grainy, and a detective allowed a worker who was not present the day of the theft to select Williams’ face from a lineup of images.
In another case, Michael Oliver, then 25, was wrongly pegged as a teen suspected of stealing an iPhone. Both cases were dropped and have resulted in a complaint and lawsuit against the department. Williams, in conjunction with the ACLU, is asking the department to end its use of the technology.
Had the department followed its own rules, facial recognition would not have been used in either case. Since 2017, Detroit police officials have touted safeguards in place to prevent false arrests, noting that oversight is key because 96 percent of potential matches produced by the system are incorrect. The technology, they said, would only be used in violent crimes; a technician and supervisor would both have to agree on a match before forwarding it to investigators; and a match could serve only as a lead — additional evidence would be needed to make an arrest.
Previously, the guidelines were standard operating procedure; they were put into a formal policy with input from the Board of Police Commissioners last September.
The department has blamed the wrongful arrests on shoddy detective work, rather than the technology itself, and says it’s held additional detective trainings in response. The Wayne County Prosecutor’s Office — which signed the arrest warrants — says it has also developed more stringent requirements for cases using facial recognition, including review during the warrant charging phase, before the preliminary examination, and again when the case is bound over to Circuit Court.
But opponents say the missteps suggest Detroit police may not be ready for the tool.
“No one argues that we don’t need tools to identify violent criminals,” said Castañeda-Lopez. “But something with lower risk, a lower percentage of error, that’s less inherently racist, even if it takes more time — those are the routes we should be pursuing. What we don’t (want) is a system that has been proven time and time again to discriminate against Detroiters.”
As of mid-June, department officials said they had attempted to use facial recognition in 700 cases. Matches were found in 246 of those and an arrest was made in 56.
Thompson, the director of Public Safety and Cyber Security for the city, says he does not expect the system will require upgrading for at least the next several months as officials continue to use it. It’s unclear whether the city’s IT team can update the software or what the consequences of failing to update it might be; a spokesperson for the Duggan administration and DataWorks Plus did not immediately respond to an inquiry.
It’s also unclear how the technology can continue to be used without a licensing extension. When asked, Garcia said, “The language referenced in your question is neither controlling nor operative, because the City already owns the software,” and declined to clarify. But licensing the software for its initial three-year term cost only $20,000, which falls under the threshold for council approval.
With limited options to put a stop to the technology at the local level, civil liberties advocates say they’re looking to federal and state government to intervene.
Bills introduced in the U.S. House and Senate would bar federal entities from using the technology and only allow state and local entities with moratoriums in place to receive federal grant funding.
A bill preventing law enforcement from using the technology except in emergencies passed the Michigan Senate last year.
San Francisco, Portland, and Boston have banned the technology.