Famous Examples of Machine Learning Fairness
1. Unfairness in Google's online targeted ads (Carnegie Mellon University study).
An experiment by Carnegie Mellon University showed that significantly fewer women than men were shown online ads promising help getting jobs paying more than $200,000, raising questions about the fairness of targeting ads online.
Experiment setting: researchers used AdFisher to create 1,000 simulated users — half designated male, half female — and had them visit 100 top employment sites. When AdFisher then reviewed the ads that were shown to the simulated users, the site most strongly associated with the male profiles was a career coaching service for executive positions paying more than $200,000.
Experiment result: The male users were shown the high-paying job ads about 1,800 times, while the female users saw those ads only about 300 times.
Reference: Questioning the Fairness of Targeting Ads Online
Link: https://www.cmu.edu/news/stories/archives/2015/july/online-ads-research.html
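The reported gap (about 1,800 impressions for male profiles vs. about 300 for female profiles) can be checked for statistical significance with a standard two-proportion z-test. The sketch below is illustrative only: the denominators (total ad opportunities per group) are hypothetical, since the article reports only the impression counts.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Impression counts from the article (~1,800 vs ~300); the 20,000
# ad-opportunity denominators per group are hypothetical.
z, p = two_proportion_ztest(1800, 20000, 300, 20000)
```

Even under generous assumptions about the denominators, a difference this large is far outside what random assignment of ads would produce.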
2. Amazon same-day delivery service discrimination based on race.
ZIP codes encompassing primarily black neighborhoods are excluded from same-day service, while the eligible neighborhoods are dominated by white residents. (Amazon Prime members in metro areas across the U.S. can enjoy Same-Day Delivery on a broad selection of items. Same-Day Delivery: order in the morning, typically before noon, and get your items by 9 p.m.; afternoon or evening orders arrive the next day.)
Experiment setting: Bloomberg entered every U.S. ZIP code into the tool, and mapped the results on top of a complete U.S. ZIP code shape file, provided by ESRI, to produce a coverage map of Amazon's Prime same-day delivery areas. Coverage maps show Amazon data as of April 8, 2016. Population data were compiled using block group figures from the 2014 American Community Survey 5-Year estimates tables. Racial categories included the following subsets: white alone, black or African-American alone, Hispanic or Latino, Asian alone, and other races.
Experiment result: Check out the reference webpage for more details. The most striking gap in Amazon's same-day service is in Boston. "The centrally located neighborhood of Roxbury, with a population that's about 59 percent black and 15 percent white, is excluded. The residents of the ZIP codes that border Roxbury on all sides are eligible for the service." (See the coverage maps in the referenced article.)
Reference: Amazon Doesn’t consider the Race of Its Customers. Should it?
Link: https://www.bloomberg.com/graphics/2016-amazon-same-day/
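The core of Bloomberg's analysis is a join between ZIP-level service eligibility and ZIP-level racial composition. A minimal sketch, using hypothetical records (only the Roxbury figures come from the article; the other ZIP codes and percentages are made up for illustration):

```python
# Hypothetical ZIP-level records: (zip_code, pct_black, same_day_eligible).
# A real analysis would join Amazon's coverage data with ACS demographics.
zips = [
    ("02119", 0.59, False),  # Roxbury, excluded (figure from the article)
    ("02116", 0.05, True),   # hypothetical eligible neighbor
    ("02115", 0.10, True),   # hypothetical eligible neighbor
    ("02121", 0.70, False),  # hypothetical excluded ZIP
]

def avg_black_share(records, eligible):
    """Mean black population share among ZIPs with the given eligibility."""
    subset = [pct for _, pct, ok in records if ok == eligible]
    return sum(subset) / len(subset)

# Positive gap: excluded ZIPs have a higher average black population share.
gap = avg_black_share(zips, eligible=False) - avg_black_share(zips, eligible=True)
```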
3. Criminal risk prediction is biased against black defendants.
Experiment setting: A computer program (the COMPAS risk assessment tool) gives each defendant a score predicting the likelihood of committing a future crime. Borden, who is black, was rated a high risk. Prater, who is white, was rated a low risk.
Experiment result: Two years later, ProPublica tracked the two defendants. Borden had not been charged with any new crimes, while Prater was serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars' worth of electronics. The algorithm's predictions were exactly backwards.
Reference: Machine Bias: There's software used across the country to predict future criminals, and it's biased against blacks.
Link: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
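Beyond the two anecdotes, ProPublica's broader finding was about error rates: black defendants who did not reoffend were flagged as high risk far more often than white defendants who did not reoffend. A minimal sketch of that comparison on toy data (the records below are invented; the real analysis used thousands of Broward County COMPAS records):

```python
# Toy records: (predicted_high_risk, reoffended, race).
records = [
    (True,  False, "black"),   # false positive, like Borden
    (False, True,  "white"),   # false negative, like Prater
    (True,  True,  "black"),
    (False, False, "white"),
    (True,  False, "black"),
    (False, False, "black"),
]

def false_positive_rate(data, race):
    """Share of non-reoffenders of the given race flagged as high risk."""
    negatives = [pred for pred, actual, r in data
                 if r == race and not actual]
    return sum(negatives) / len(negatives)

fpr_black = false_positive_rate(records, "black")
fpr_white = false_positive_rate(records, "white")
```

A gap between these two rates is exactly the kind of disparity the ProPublica analysis reported.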
Other readings: There are also other posts, blogs, and papers discussing fairness definitions and approaches in more detail. Here are the links.
1. Approaching fairness in machine learning. Moritz Hardt.
Link: http://blog.mrtz.org/2016/09/06/approaching-fairness.html
2. Equality of Opportunity in Machine Learning. Moritz Hardt.
Link: https://ai.googleblog.com/2016/10/equality-of-opportunity-in-machine.html
3. How big data is unfair: Understanding unintended sources of unfairness in data driven decision making. Moritz Hardt.
Link: https://medium.com/@mrtz/how-big-data-is-unfair-9aa544d739de
4. Mirror Mirror: Reflections on Quantitative Fairness. Shira Mitchell and Jackie.
Link: https://shiraamitchell.github.io/fairness/#definitions-conditional-independence
5. Fairness Definitions Explained. Sahil Verma and Julia Rubin.
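Two of the most common definitions from the readings above, demographic parity and equalized odds, can be stated concretely as rate comparisons over binary predictions. A minimal sketch on made-up data (group labels and values are illustrative):

```python
# Toy binary predictions, true outcomes, and a protected group attribute.
y_pred = [1, 0, 1, 1, 0, 0, 1, 0]
y_true = [1, 0, 0, 1, 1, 0, 1, 0]
group  = ["a", "a", "a", "a", "b", "b", "b", "b"]

def positive_rate(pred, mask):
    """Share of positive predictions among the selected examples."""
    sel = [p for p, m in zip(pred, mask) if m]
    return sum(sel) / len(sel)

# Demographic parity: P(y_pred = 1 | group) should be equal across groups.
dp_a = positive_rate(y_pred, [g == "a" for g in group])
dp_b = positive_rate(y_pred, [g == "b" for g in group])

def true_positive_rate(pred, true, mask):
    """TPR restricted to one group: P(y_pred = 1 | y_true = 1, group)."""
    sel = [p for p, t, m in zip(pred, true, mask) if m and t == 1]
    return sum(sel) / len(sel)

# Equalized odds requires equal TPR (and, symmetrically, FPR) across groups.
tpr_a = true_positive_rate(y_pred, y_true, [g == "a" for g in group])
tpr_b = true_positive_rate(y_pred, y_true, [g == "b" for g in group])
```

On this toy data both definitions are violated: the groups receive positive predictions at different rates, and their true positive rates differ.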