Big Data’s Disparate Impact (Solon Barocas, Andrew D. Selbst)
Abstract
Advocates of algorithmic techniques like data mining argue that these techniques eliminate human biases from the decision-making process. But an algorithm is only as good as the data it works with. Data is frequently imperfect in ways that allow these algorithms to inherit the prejudices of prior decision makers. In other cases, data may simply reflect the widespread biases that persist in society at large. In still others, data mining can discover surprisingly useful regularities that are really just preexisting patterns of exclusion and inequality. Unthinking reliance on data mining can deny historically disadvantaged and vulnerable groups full participation in society. Worse still, because the resulting discrimination is almost always an unintentional emergent property of the algorithm's use rather than a conscious choice by its programmers, it can be unusually hard to identify the source of the problem or to explain it to a court. This Essay examines these concerns through the lens of American antidiscrimination law—more particularly, through Title VII's prohibition of discrimination in employment. In the absence of a demonstrable intent to discriminate, the best doctrinal hope for data mining's victims would seem to lie in disparate impact doctrine. Case law and the Equal Employment Opportunity Commission's Uniform Guidelines, though, hold that a practice can be justified as a business necessity when its outcomes are predictive of future employment outcomes, and data mining is specifically designed to find such statistical correlations. Unless there is a reasonably practical way to demonstrate that these discoveries are spurious, Title VII would appear to bless its use, even though the correlations it discovers will often reflect historic patterns of prejudice, others' discrimination against members of protected groups, or flaws in the underlying data. 
Addressing the sources of this unintentional discrimination and remedying the corresponding deficiencies in the law will be difficult technically, difficult legally, and difficult politically. There are a number of practical limits to what can be accomplished computationally. For example, when discrimination occurs because the data being mined is itself a result of past intentional discrimination, there is frequently no obvious method to adjust historical data to rid it of this taint. Corrective measures that alter the results of the data mining after it is complete would tread on legally and politically disputed terrain. These challenges for reform throw into stark relief the tension between the two major theories underlying antidiscrimination law: anticlassification and antisubordination. Finding a solution to big data's disparate impact will require more than best efforts to stamp out prejudice and bias; it will require a wholesale reexamination of the meanings of "discrimination" and "fairness."
By Solon Barocas and Andrew D. Selbst in the text Big Data’s Disparate Impact (2016). This scholarly journal article mentions ...
Terms: algorithm, big data, classification, disparate impact, pattern, statistics, theory
10 mentions
- The Datafied Society - Studying Culture Through Data (Mirko Tobias Schäfer, Karin van Es) (2017)
- If...Then - Algorithmic Power and Politics (Taina Bucher) (2018)
- Damit Maschinen den Menschen dienen - Lösungsansätze, um algorithmische Prozesse in den Dienst der Gesellschaft zu stellen (Julia Krüger, Konrad Lischka) (2018)
- (Un)berechenbar? - Algorithmen und Automatisierung in Staat und Gesellschaft (Resa Mohabbat Kar, Basanta Thapa, Peter Parycek) (2018)
- Diskriminierungsrisiken durch Verwendung von Algorithmen - Eine Studie, erstellt mit einer Zuwendung der Antidiskriminierungsstelle des Bundes. (Carsten Orwat) (2019)
- The Alignment Problem (Brian Christian) (2020)
- Wenn KI, dann feministisch - Impulse aus Wissenschaft und Aktivismus (netzforma* e.V) (2021)
- New Perspectives in Critical Data Studies - The Ambivalences of Data Power (Andreas Hepp, Juliane Jarke, Leif Kramp) (2022)
  - The Datafied Welfare State - A Perspective from the UK (Lina Dencik)
- Mensch und Maschine - Herausforderungen durch Künstliche Intelligenz (Deutscher Ethikrat) (2023)
- Fairness and Machine Learning - Limitations and Opportunities (Solon Barocas, Moritz Hardt, Arvind Narayanan) (2023)
Full text of this document
Big Data's Disparate Impact: article as full text (610 kByte)
Beat and this scholarly journal article
Beat added this scholarly journal article to the Biblionetz during his time at the Institut für Medien und Schule (IMS). Beat owns no physical copy, but a digital one, and a digital version is available on the internet (see above). Judging by the few entries in the Biblionetz, he does not appear to have actually read it. So far, only a few objects in the Biblionetz cite this work.