Face Recognition Technologies: Designing Systems that Protect Privacy and Prevent Bias
The objective of face recognition technologies (FRTs) is to efficiently detect and recognize people captured on camera. Although these technologies have many practical security-related purposes, advocacy groups and individuals have expressed apprehensions about their use. The research reported here was intended to highlight for policymakers the high-level privacy and bias implications of FRT systems.
Saved in:

Main Author: | Yeung, Douglas
---|---
Format: | Book
Language: | English
Published: | Santa Monica, Calif. : RAND Corporation, [2020]
Subjects: | Human face recognition (Computer science); Privacy, Right of; Physical-appearance-based bias / Prevention
Online Access: | Full text
Summary: | The objective of face recognition technologies (FRTs) is to efficiently detect and recognize people captured on camera. Although these technologies have many practical security-related purposes, advocacy groups and individuals have expressed apprehensions about their use. The research reported here was intended to highlight for policymakers the high-level privacy and bias implications of FRT systems. In the report, the authors describe privacy as a person's ability to control information about themselves. Undesirable bias consists of the inaccurate representation of a group of people based on characteristics such as demographic attributes. Informed by a literature review, the authors propose a heuristic with two dimensions: consent status (with or without consent) and comparison type (one-to-one or some-to-many). This heuristic can help determine a proposed FRT's level of privacy and accuracy. The authors then use more in-depth case studies to identify "red flags" that could indicate privacy and bias concerns: complex FRTs with unexpected or secondary use of personal or identifying information; use cases in which the subject does not consent to image capture; lack of accessible redress when errors occur in image matching; the use of poor training data that can perpetuate human bias; and human interpretation of results that can introduce bias and require additional storage of full-face images or video. This report is based on an exploratory project and is not intended to comprehensively introduce privacy, bias, or FRTs. Future work in this area could include examinations of existing systems, reviews of their accuracy rates, and surveys of people's expectations of privacy in government use of FRTs.
Description: | XVIII, 69 pages; 23 cm
ISBN: | 9781977404558; 1977404553
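The two-dimensional heuristic described in the summary lends itself to a compact illustration. The following is a minimal sketch, not taken from the report: it assumes a simple encoding in which each proposed FRT use case is scored on the two dimensions (consent status, comparison type), and all names here (`FRTUseCase`, `privacy_red_flags`, the example cases) are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical encoding of the report's two-dimensional heuristic:
# consent status (with or without consent) and comparison type
# (one-to-one vs. some-to-many). Names are illustrative, not from
# the report.

@dataclass
class FRTUseCase:
    name: str
    subject_consents: bool  # consent-status dimension
    one_to_one: bool        # comparison type: True = one-to-one,
                            # False = some-to-many

def privacy_red_flags(case: FRTUseCase) -> list[str]:
    """Return heuristic 'red flags' suggested by the two dimensions."""
    flags = []
    if not case.subject_consents:
        # Capture without subject consent is one of the report's red flags.
        flags.append("image capture without subject consent")
    if not case.one_to_one:
        # Some-to-many matching searches a stored gallery, raising both
        # privacy exposure and the chance of biased false matches.
        flags.append("some-to-many comparison against a stored gallery")
    return flags

# Example: a consented one-to-one check vs. non-consented crowd scanning.
phone_unlock = FRTUseCase("device unlock", subject_consents=True, one_to_one=True)
surveillance = FRTUseCase("crowd scanning", subject_consents=False, one_to_one=False)

print(privacy_red_flags(phone_unlock))  # [] -> lowest-concern quadrant
print(privacy_red_flags(surveillance))  # both flags -> highest-concern quadrant
```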
Internal Format (MARC)
LEADER 00000nam a2200000 c 4500
001    BV046853943
003    DE-604
005    20210901
007    t
008    200814s2020 |||| 00||| eng d
020    |a 9781977404558 |9 978-1-9774-0455-8
020    |a 1977404553 |9 1-9774-0455-3
035    |a (OCoLC)1268178988
035    |a (DE-599)BVBBV046853943
040    |a DE-604 |b ger |e rda
041 0  |a eng
049    |a DE-2070s
088    |a RAND RR-4226-RC
100 1  |a Yeung, Douglas |e Verfasser |4 aut
245 10 |a Face Recognition Technologies |b Designing Systems that Protect Privacy and Prevent Bias |c Douglas Yeung, Rebecca Balebako, Carlos Ignacio Gutierrez, Michael Chaykowsky
264  1 |a Santa Monica, Calif. |b RAND Corporation |c [2020]
300    |a XVIII, 69 Seiten |c 23 cm
336    |b txt |2 rdacontent
337    |b n |2 rdamedia
338    |b nc |2 rdacarrier
520 3  |a The objective of face recognition technologies (FRTs) is to efficiently detect and recognize people captured on camera. Although these technologies have many practical security-related purposes, advocacy groups and individuals have expressed apprehensions about their use. The research reported here was intended to highlight for policymakers the high-level privacy and bias implications of FRT systems. In the report, the authors describe privacy as a person's ability to control information about themselves. Undesirable bias consists of the inaccurate representation of a group of people based on characteristics such as demographic attributes. Informed by a literature review, the authors propose a heuristic with two dimensions: consent status (with or without consent) and comparison type (one-to-one or some-to-many). This heuristic can help determine a proposed FRT's level of privacy and accuracy. The authors then use more in-depth case studies to identify "red flags" that could indicate privacy and bias concerns: complex FRTs with unexpected or secondary use of personal or identifying information; use cases in which the subject does not consent to image capture; lack of accessible redress when errors occur in image matching; the use of poor training data that can perpetuate human bias; and human interpretation of results that can introduce bias and require additional storage of full-face images or video. This report is based on an exploratory project and is not intended to comprehensively introduce privacy, bias, or FRTs. Future work in this area could include examinations of existing systems, reviews of their accuracy rates, and surveys of people's expectations of privacy in government use of FRTs.
653  0 |a Human face recognition (Computer science)
653  0 |a Privacy, Right of
653  0 |a Physical-appearance-based bias / Prevention
700 1  |a Balebako, Rebecca |e Sonstige |4 oth
700 1  |a Gutierrez, Carlos Ignacio |e Sonstige |4 oth
700 1  |a Chaykowsky, Michael |e Sonstige |4 oth
856 40 |u https://www.rand.org/content/dam/rand/pubs/research_reports/RR4200/RR4226/RAND_RR4226.pdf |x Verlag |z kostenfrei |3 Volltext
999    |a oai:aleph.bib-bvb.de:BVB01-032262716
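For readers who want to work with this record programmatically rather than reading the internal format above, the following is a minimal sketch using the pymarc library. It assumes the record has been exported from the catalog as binary MARC21 into a file named `BV046853943.mrc` (a hypothetical filename); the field tags used (245, 020, 856) are the ones shown in the record above.

```python
from pymarc import MARCReader

# Read the record exported from the catalog as binary MARC21.
# The filename is hypothetical; any MARC export of BV046853943 works.
with open("BV046853943.mrc", "rb") as fh:
    for record in MARCReader(fh):
        # 245: title statement (subfield a = title, b = subtitle)
        title = record["245"]["a"]
        subtitle = record["245"]["b"]
        # 020: ISBNs (repeatable field, subfield a holds the number)
        isbns = [field["a"] for field in record.get_fields("020")]
        # 856: electronic location (subfield u = URL of the free full text)
        url = record["856"]["u"]
        print(f"{title} - {subtitle}")
        print("ISBNs:", ", ".join(isbns))
        print("Full text:", url)
```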