
Amazon tool flags Congress members as suspects

Rekognition is being deployed by US law enforcement agencies, but its accuracy was tested by a civil rights group that ran photos of Congress members against a police database of arrest photos

An Amazon facial recognition tool has wrongly identified 28 members of the US Congress as police suspects.

The American Civil Liberties Union (ACLU) compared official photos of the politicians with a database of public arrest photos.

Amazon took issue with the findings, saying the system's confidence threshold should have been set at 95%, not the 80% used by the ACLU.

But the civil rights group said it highlighted the inadequacy of facial recognition technology.

“Our test reinforces that face surveillance is not safe for government use,” said Jacob Snow, ACLU’s technology and civil liberties lawyer.

“Face surveillance will be used to power discriminatory surveillance and policing that targets communities of colour, immigrants, and activists. Once unleashed, that damage can’t be undone.”

Unintended consequences

The tool – Rekognition – is touted by Amazon as being useful for a range of things, from detecting offensive content to identifying celebrities.

Amazon is also working with some US local law enforcement agencies to implement the system for identifying criminals.

The 80% confidence threshold used by the ACLU is the system's default setting, but a spokeswoman for Amazon Web Services told Reuters that, for identifying individuals, it recommended a threshold of 95% or higher.
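To illustrate how that threshold figures into the service, here is a minimal sketch of a face search using the AWS SDK for Python (boto3). The collection and image names are hypothetical placeholders, and this is not necessarily the exact workflow the ACLU followed.

# Minimal sketch: search a Rekognition face collection with an explicit
# match threshold. Collection ID and S3 object names are hypothetical.
import boto3

client = boto3.client("rekognition")

# Search a pre-indexed face collection (e.g. of arrest photos) for matches
# to a single probe photo stored in S3. FaceMatchThreshold is the minimum
# similarity score a candidate must reach to be returned; Rekognition's
# default is 80, while Amazon recommends 95 or higher when identifying
# individuals.
response = client.search_faces_by_image(
    CollectionId="example-arrest-photos",               # hypothetical collection
    Image={"S3Object": {"Bucket": "example-bucket",
                        "Name": "official-portrait.jpg"}},
    FaceMatchThreshold=95,   # raised from the default of 80
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    print(f"Matched face {match['Face']['FaceId']} "
          f"with similarity {match['Similarity']:.1f}%")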

According to ACLU, nearly 40% of the system’s false matches were for black Congress members, even though they make up only 20% of the legislature.

Among those wrongly identified was civil rights leader John Lewis, a member of the Congressional Black Caucus.

That group recently wrote to Amazon chief executive Jeff Bezos expressing concerns about the “profound negative unintended consequences” facial recognition systems could have for black people.

“Congress should press for a federal moratorium on the use of face surveillance until its harms, particularly to vulnerable communities, are fully considered,” said ACLU’s legislative counsel Neema Singh Guliani.

“The public deserves a full debate about how and if face surveillance should be used.”

In the UK, lawyers for civil liberties group Big Brother Watch have launched a legal challenge against the use of automatic facial recognition technology by police.
