Many facial recognition algorithms are more likely to mix up black faces than white faces. Each chart represents a different algorithm tested by the National Institute of Standards and Technology; those with a solid red line uppermost incorrectly match black women's faces more often than other groups. The Department of Homeland Security has also found that darker skin challenges commercial facial recognition. In February, DHS staff published results from testing 11 commercial systems designed to check a person's identity, as at an airport security checkpoint. Test subjects had their skin pigment measured. The systems that were tested generally took longer to process people with darker skin and were less accurate at identifying them, although some vendors performed better than others. The agency's internal privacy watchdog has said DHS should publicly report the performance of its deployed facial recognition systems, like those in trials at airports, on different racial and ethnic groups. The government reports echo critical 2018 studies from ACLU and MIT researchers openly wary of the technology.