Disparate impact

Keywords: disparate impact, fairness

Disparate impact is a legal and fairness concept describing a situation where a model, algorithm, or policy disproportionately affects one demographic group compared to another, even if the system appears facially neutral — meaning it doesn't explicitly use protected attributes like race or gender.

Legal Origin

- Rooted in US employment discrimination law (Title VII of the Civil Rights Act of 1964, as interpreted in Griggs v. Duke Power Co., 1971).
- The four-fifths (80%) rule: If the selection rate for a protected group is less than 80% of the selection rate for the group with the highest rate, this is taken as evidence of disparate impact.
- Example: If 60% of male applicants are hired but only 40% of female applicants, the ratio is 40/60 = 67% < 80%, indicating potential disparate impact.
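The four-fifths check can be sketched in a few lines. This is an illustrative helper (the function name and inputs are our own), using the hiring numbers from the example above:

```python
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of the lower selection rate to the higher one."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Example from the text: 40 of 100 female applicants hired vs. 60 of 100 male.
ratio = adverse_impact_ratio(40, 100, 60, 100)
print(round(ratio, 2))  # 0.4 / 0.6 ≈ 0.67
print(ratio < 0.8)      # True -> evidence of potential disparate impact
```

A ratio below 0.8 does not prove discrimination on its own; in practice it triggers further statistical and legal scrutiny.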

Disparate Impact in AI/ML

- Proxy Variables: Even without explicit use of race or gender, models can learn to use correlated features (zip code, name, browsing history) as proxies that produce discriminatory outcomes.
- Training Data Bias: Models trained on historically biased data will learn and reproduce those biases.
- Feature Engineering: Seemingly neutral features can encode social inequalities.
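The proxy-variable mechanism can be shown with a minimal synthetic sketch (all data here is hypothetical). A rule that approves applicants based only on zip code, never seeing the protected attribute, still produces unequal approval rates when zip code correlates with group membership:

```python
# Hypothetical applicant records: (group, zip_code). Group membership is
# correlated with zip code, mirroring residential segregation.
applicants = [
    ("A", "10001"), ("A", "10001"), ("A", "10001"), ("A", "20002"),
    ("B", "20002"), ("B", "20002"), ("B", "20002"), ("B", "10001"),
]

APPROVED_ZIPS = {"10001"}  # the "facially neutral" rule uses only zip code

def approve(zip_code):
    return zip_code in APPROVED_ZIPS

def approval_rate(group):
    zips = [z for g, z in applicants if g == group]
    return sum(approve(z) for z in zips) / len(zips)

print(approval_rate("A"))  # 0.75
print(approval_rate("B"))  # 0.25 -> disparate outcomes without using `group`
```

The same effect appears with learned models: if a feature predicts group membership, the model can reconstruct the protected attribute implicitly.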

Examples in AI

- Credit Scoring: A model that denies loans more often to people from certain zip codes may disproportionately affect racial minorities due to historical residential segregation.
- Hiring Algorithms: Resume screening tools trained on historical hiring data may penalize female applicants in male-dominated industries.
- Facial Recognition: Higher error rates for darker-skinned individuals compared to lighter-skinned individuals.
- Healthcare: Clinical algorithms that use cost as a proxy for need can disadvantage groups with less access to healthcare.

Measuring Disparate Impact

- Adverse Impact Ratio: Selection rate of disadvantaged group / selection rate of advantaged group.
- Statistical Parity Difference: Difference in positive outcome rates between groups.
- Intersectional Analysis: Check for disparate impact across combinations of protected attributes.
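The metrics above can be computed directly from labeled outcomes. The sketch below (helper names and data are our own, for illustration) computes per-group positive-outcome rates and the statistical parity difference, with group keys built from two protected attributes to illustrate an intersectional check:

```python
from collections import defaultdict

def group_rates(records):
    """records: list of (group, outcome), where outcome 1 = positive."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        positives[group] += outcome
    return {g: positives[g] / totals[g] for g in totals}

def statistical_parity_difference(rates, group_a, group_b):
    """Difference in positive outcome rates between two groups."""
    return rates[group_a] - rates[group_b]

# Intersectional analysis: each group key combines two attributes.
records = [
    (("F", "young"), 1), (("F", "young"), 0),
    (("F", "old"), 0),   (("F", "old"), 0),
    (("M", "young"), 1), (("M", "young"), 1),
    (("M", "old"), 1),   (("M", "old"), 0),
]
rates = group_rates(records)
print(statistical_parity_difference(rates, ("M", "young"), ("F", "old")))  # 1.0
```

Note how the intersectional gap (1.0 between young men and older women) can exceed any gap visible when checking gender or age alone.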

Regulatory Landscape

Disparate impact analysis is increasingly required by AI regulations, including the EU AI Act, NYC Local Law 144 (automated employment decision tools), and EEOC guidelines.
