FAQ: General
This database of faces is diverse in gender and ethnicity and can help minimize bias in facial recognition (FR) models trained on it. Unlike other publicly available datasets, it provides equal representation across all ethnicities and genders.
Here are some related datasets:
- http://www.whdeng.cn/RFW/index.html (RFW)
- https://www.ibm.com/blogs/research/2019/01/diversity-in-faces/ (DiF)
- https://ieeexplore.ieee.org/document/8756625 (DemogPairs)
The four ethnicity subgroups are: Asian, Black, Indian, and White. These subgroups are used to show how bias differs between each combination, such as Black males, Black females, Asian males, and so forth. In the plots, the ethnicities are represented by their first letters: A = Asian, B = Black, I = Indian, W = White.
Two genders are represented, male and female, indicated by the letters M and F.
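The eight demographic subgroups follow from combining the ethnicity and gender initials above. As a minimal illustration (the exact label format used in the plots is an assumption here), the full set of subgroup labels can be enumerated like this:

```python
# Ethnicity initials (A, B, I, W) and gender initials (M, F)
# as described above; the combined two-letter label format
# (e.g. "AM" for Asian male) is assumed for illustration.
ethnicities = {"A": "Asian", "B": "Black", "I": "Indian", "W": "White"}
genders = {"M": "Male", "F": "Female"}

# Build all eight ethnicity-gender subgroup labels.
subgroups = [e + g for e in ethnicities for g in genders]
print(subgroups)
# ['AM', 'AF', 'BM', 'BF', 'IM', 'IF', 'WM', 'WF']
```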
The dashboard is a tool that allows users to import their own facial recognition dataset and have it assessed for bias. The dashboard displays this information in various plots, generates a data summary table, and lets users edit the resulting plots.
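The dashboard's actual implementation is not shown here; as a hedged sketch, a data summary table of the kind described might simply count faces per ethnicity-gender subgroup. The record format below is a hypothetical stand-in for an imported dataset:

```python
from collections import Counter

# Hypothetical imported records: one (ethnicity, gender) pair
# per face image in the user's dataset.
records = [
    ("Asian", "F"), ("Black", "M"), ("Asian", "F"),
    ("White", "M"), ("Indian", "F"), ("Black", "M"),
]

# Count faces per subgroup, as a summary table might report them.
counts = Counter(records)
for (ethnicity, gender), n in sorted(counts.items()):
    print(f"{ethnicity} {gender}: {n}")
```

An imbalanced table like this one (e.g. no Indian males) is exactly the kind of representation gap such a summary would surface.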
All source code is made available under a BSD 3-clause license. You can freely use and modify the code, without warranty, so long as you provide attribution to the authors. See LICENSE.md for the full license text.
Biometrics deals with human body measurements and calculations. In the context of facial recognition, it concerns the analysis of one's face. Other examples of biometrics in security include fingerprint scanning, retina scanning, and other measures that rely on unique features of the human body.
The manuscript text is not open source. The authors reserve the rights to the article content, which is currently submitted for publication in the 2020 IEEE Conference on AMFG.