
Law enforcement is using a facial recognition app with huge privacy issues

posted on January 20, 2020
by l33tdawg
Credit: Engadget

You may have good reason to worry that police use of facial recognition will erode your privacy -- many departments are already using software with serious privacy concerns. The New York Times has learned that over 600 law enforcement agencies in the US and Canada have signed up in the past year to use software from little-known startup Clearview AI that can match uploaded photos (even those taken at imperfect angles) against over three billion images reportedly scraped from the web, including Facebook and YouTube. While the software has apparently helped solve some cases, it also raises massive privacy concerns -- police could intimidate protesters, stalk people and otherwise abuse the system with few obstacles.

Part of the problem stems from a lack of oversight. There has been no real public input into the adoption of Clearview's software, and the company's ability to safeguard data hasn't been tested in practice. Clearview itself remained highly secretive until late 2019. The company is certainly capable of looking at search data if it wants -- police officers who helped test the software for the NYT's story got calls asking if they'd been talking to the media.

The software also appears to explicitly violate policies at Facebook and elsewhere against collecting users' images en masse. Facebook said it was looking into the situation and would "take appropriate action" if Clearview is breaking its rules.
