That’s cute. People are going to use it anyway. Private companies especially will just have the employees who work security sign an NDA.
Milli Vanilli, Bigger than Elvis
It is a tool to view things with, no more than the camera footage itself. It would be silly to sentence someone for a crime based on it, since it is fallible, but it would be equally silly not to use a tool at our disposal to help narrow down persons of interest. My point being, it would likely track down WAY more legitimate suspects than it points toward the wrong ones.
The best thing to do would be to have BOTH a human and a computer review the footage.
- - - Updated - - -
Of course it would not be used like, say, DNA evidence. It would be invaluable for helping to point to possible suspects, though. People seem worried that they will be imprisoned because a computer said they committed a crime. It is simply an investigative tool, not a source of evidence. The video footage is the evidence. The tool only helps to point out who the people in said video footage MIGHT be.
Felpooti - DH - Echo Isles
Hack - Warrior - Echo Isles
Pootie - Hunter - Echo Isles
As far as I know, these facial recognition systems can't account for the rate of false positives or false negatives. All they can do is train on more data to reduce the error rate.
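To see why the false positive rate matters so much, here's a minimal back-of-the-envelope sketch. Every number in it (population scanned, number of real suspects, the error rates) is a made-up assumption for illustration, not a figure from any real system:

```python
# Illustrative base-rate arithmetic: even a small false positive rate
# produces far more wrong matches than right ones at scale.

population = 1_000_000        # assumed: faces scanned
true_suspects = 100           # assumed: actual persons of interest in that crowd
false_positive_rate = 0.01    # assumed: 1% of innocents incorrectly matched
true_positive_rate = 0.99     # assumed: 99% of real suspects correctly matched

false_alarms = (population - true_suspects) * false_positive_rate
real_hits = true_suspects * true_positive_rate

# Of everyone the system flags, what fraction are actually suspects?
precision = real_hits / (real_hits + false_alarms)

print(f"False alarms: {false_alarms:.0f}")   # ~10,000 innocent people flagged
print(f"Real hits: {real_hits:.0f}")         # ~99 genuine suspects flagged
print(f"Chance a flagged person is a real suspect: {precision:.1%}")  # ~1%
```

Under these assumed numbers, a flagged match is wrong about 99% of the time, which is exactly why it can work as a lead-narrowing tool but not as evidence.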
The bias doesn't come in at the level of security cameras, algorithms, and IT workers. It would have to be a result of differences in society that exist independent of the facial recognition system. It also ignores the point: these systems are subject to bias, and are therefore not justifiable given how intrusive they are.