Facial Recognition Software Prompts Privacy and Racism Concerns
There is no doubt that facial recognition software prompts privacy and racism concerns in cities and states, where some lawmakers are moving to limit its use even as law enforcement officials and landlords embrace it. The reason: facial recognition is different from other technologies.
It can identify someone from afar, without that person ever knowing, and it can do so on a massive scale. The truth is that any technology has the potential to be misused. Cases have already been reported of landlords in rent-stabilized buildings swapping out key fobs for a facial recognition system.
Multiple questions can be raised: What happens if a tenant doesn't comply? Would he be evicted? And as a young Black man, one tenant worried that his biometric data would end up in a police lineup without him ever being arrested. Most of the building's tenants are people of color. "There's a lot of scariness that comes with this," said the tenant, who along with other tenants is trying to legally block his management company from installing the technology.
"You feel like a guinea pig," he said. "A test subject for this technology."
Amid privacy concerns and recent research showing racial disparities in the accuracy of facial recognition technology, some city and state officials are proposing to limit its use.
The earliest forms of facial recognition technology originated in the 1990s, and local law enforcement began using it in 2009. Such software uses biometrics to read the geometry of faces found in a photograph or video and compares those images against a database of other facial images to find a match. It is used to verify personal identity.
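The matching step described above can be illustrated with a minimal sketch. This is not any vendor's actual implementation: it assumes the face geometry has already been reduced to a numeric embedding vector (real systems use hundreds of dimensions, not three), and the database, names, and threshold below are all hypothetical, chosen only to show how a probe image is compared against stored images to find a match.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_match(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """Return the identity whose stored embedding best matches the probe,
    or None if no similarity score clears the threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy database of pre-computed embeddings (hypothetical identities).
db = {
    "person_a": np.array([0.9, 0.1, 0.2]),
    "person_b": np.array([0.1, 0.9, 0.3]),
}

probe = np.array([0.88, 0.15, 0.22])  # embedding extracted from a new photo
print(find_match(probe, db))          # closest to person_a's stored embedding
```

The threshold is the crux of the accuracy debate: set it too low and the system produces false matches, which is where the racial disparities documented in the research come in, since error rates are not uniform across demographic groups.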
Agencies rely more on cameras and security personnel to manage safety issues in their communities. They also rely on information they get from residents, who often are the most informed about what’s happening on their floors, in their buildings and in their neighborhoods.