
New York’s CCTV cameras secretly used by IBM to develop its surveillance software

Facial recognition technology is no stranger to controversial headlines. By its very nature, it is a divisive technology, as many fear it could be put to malicious use, creating an Orwellian state. IBM’s latest use of New York CCTV is unlikely to reassure those wary of this invasive tech, considering it has been used to develop features such as searching for individuals based on age, gender, and skin tone, reports The Verge.

According to confidential corporate documents and interviews with many of the technologists involved in developing the software, The Intercept and the Investigative Fund have discovered that IBM began developing this object identification technology using secret access to NYPD camera footage. Dating back to as early as 2012, NYPD officials offered up images of thousands of unknowing New Yorkers. IBM aimed to develop new search features that would enable other police departments to search camera footage for images of people by hair color, facial hair, and skin tone.

This revelation is the latest in a long series of controversial ties between tech giants and the government. Amazon has been at the heart of one of these partnerships: in June, a letter addressed to Amazon CEO Jeff Bezos from 19 groups of shareholders expressed reservations over the company’s decision to provide Rekognition to law enforcement in Orlando, Florida, and the Washington County (Oregon) Sheriff’s Office. These shareholders were joined in protest by the American Civil Liberties Union, Amazon employees, academics, and more than 70 other groups.

In an attempt to clear its image, Amazon has countered the Rekognition facial ID backlash by citing positive use cases. However, this is unlikely to ease fears and concerns. After all, these groups are likely well aware of the technology’s positive uses but feel the potential damage from its abuse is far too great a risk to take.

When we consider the groundbreaking research taking place around facial recognition software, it is understandable that certain groups might be skeptical about how this technology could be put to use. Previous research has indicated that even private yet divisive traits such as sexuality can be inferred using facial recognition software. If this capability were to fall into the wrong hands, guided by a hateful agenda, it could be incredibly damaging for the LGBT community.

However, even if we are skeptical about its introduction, it looks like it will be coming anyway to schools, stadiums, and airports. We can only hope that those who have access to this information will use it responsibly, and that Amazon’s level of optimism isn’t unfounded, with plenty of positive use cases to come. Fingers crossed!
