Amazon selling facial recognition software to police, records reveal

ACLU releases documentation on Amazon Rekognition software, fueling fears of surveillance via police body cameras

In the aftermath of the uprising in Ferguson, Missouri, over the killing of Michael Brown, police departments and policy makers around the country hit upon a supposed panacea to racist policing and police brutality: body-worn cameras.

Many hailed the move as a victory for accountability. But among the few dissenters was Malkia Cyril, executive director of the Center for Media Justice and a leader in the Black Lives Matter network, who warned early and often that the cameras could become tools of surveillance against people of color because “body-worn cameras don’t watch the police, they watch the community being policed, people like me”.

The scope and scale of that surveillance became clearer Tuesday, when the American Civil Liberties Union of Northern California released a collection of public records detailing how Amazon has been marketing and selling facial recognition software, called Amazon Rekognition, to law enforcement agencies.

Amazon marketing materials promoted the idea of using Rekognition in conjunction with police body cameras in real time – exactly the outcome Cyril feared.

“That is a recipe for authoritarianism and disaster,” Cyril said. “Amazon shouldn’t be anywhere near it, and if we have anything to say about it, they will not be.”

The ACLU and about 40 other organizations also released a letter to the Amazon chief executive, Jeff Bezos, calling on the company to stop selling Rekognition and its “dangerous surveillance powers” to the government.

“Rekognition is a powerful surveillance system readily available to violate rights and target communities of color,” the groups wrote. “With Rekognition, Amazon delivers these dangerous surveillance powers directly to the government.”

Amazon did not immediately respond to a request for comment. A company spokeswoman told the Washington Post: “Amazon requires that customers comply with the law and be responsible when they use” its software products.

Though Amazon is best known to consumers for its e-commerce platform, the company also runs a giant cloud computing business. In November 2016, it launched Amazon Rekognition, an easy-to-use facial recognition service available to customers of Amazon Web Services. While some of the uses Amazon promoted for the product were merely voyeuristic – such as automatically detecting celebrities at the royal wedding – many of the use cases seemed specifically tailored for law enforcement.

The ACLU documents show how Amazon worked with the city of Orlando, Florida, and the sheriff’s department in Washington county, Oregon, to implement the technology.

At a conference in Seoul, Ranju Das, who is listed as the director of Rekognition on his LinkedIn profile, boasted about the capabilities of the panopticon created in the partnership with Orlando.

“They have cameras all over the city,” he said. “We analyze the video in real time [and] search against the collection of faces they have.”
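The face search Das describes maps onto Rekognition's public API: frames are matched against a pre-indexed face collection and scored by similarity. The sketch below shows roughly what such a call looks like through the AWS SDK (boto3); the collection ID, threshold, and image file are hypothetical, and actually running the search requires AWS credentials and a collection already populated via `index_faces`.

```python
def build_search_request(collection_id, image_bytes, threshold=80.0):
    """Assemble parameters for a Rekognition SearchFacesByImage call.

    Rekognition compares the largest face in the supplied image against
    every face previously indexed into the named collection.
    """
    return {
        "CollectionId": collection_id,
        "Image": {"Bytes": image_bytes},
        "FaceMatchThreshold": threshold,  # minimum similarity score (percent)
        "MaxFaces": 5,                    # return at most 5 candidate matches
    }

if __name__ == "__main__":
    import boto3  # AWS SDK; needs credentials and a real collection to run

    client = boto3.client("rekognition")
    with open("frame.jpg", "rb") as f:  # e.g. a single frame from a video feed
        params = build_search_request("hypothetical-city-faces", f.read())
    # response = client.search_faces_by_image(**params)
    # Each entry in response["FaceMatches"] carries a Similarity score and
    # the FaceId of the person indexed into the collection.
```

A real-time deployment like the one described would repeat this call frame by frame against a live camera feed, which is what makes the scale of the matching so different from a one-off photo lookup.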

The Orlando police department told NPR that its use of the technology was a “pilot program” and that it was following applicable laws.

But while the capabilities of the technology were impressive, the potential downfalls were catastrophic, warned Cyril.

“Technology is a tool; placing a tool in the context of extreme racism and brutality is simply going to produce more extreme racism and brutality,” Cyril said of police use of cameras. “When you add facial recognition into that context, and you add the supercomputing powers of Amazon, what you do is supercharge already existing discrimination to a level that is unprecedented.

“You not only increase the speed at which discrimination can take place, but you increase the scale at which discrimination can take place.”

The documents show that at least some public employees raised concerns about the technology. “The ACLU might consider this the government getting in bed with big data,” one Washington county employee wrote in an email.

In a blogpost detailing their findings, Matt Cagle and Nicole A Ozer of the ACLU responded: “That employee’s prediction was correct.”