
# Principle of discriminatory non-harm for fairness: key considerations for outcome fairness

reflection-discussion

"Once you and your project team have thoroughly considered the use case appropriateness as well as technical feasibility of the formal models of fairness most relevant for your system and have incorporated the model into your application, you should prepare a Fairness Position Statement (FPS) in which the fairness criteria being employed in the AI system is made explicit and explained in plain and non-technical language. This FPS should then be made publicly available for review by all affected stakeholders." (Leslie, 2019, p.20)

"Evaluation of A/IS must carefully assess potential biases in the systems’ performance that disadvantage specific social and demographic groups. The evaluation process should integrate members of potentially disadvantaged groups in efforts to diagnose and correct such biases." (IEEE, 2019, p.184)

## Further Resources

- J. Angwin, J. Larson, S. Mattu, and L. Kirchner, “Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks.” ProPublica, May 23, 2016.
- J. Griffiths, “New Zealand Passport Robot Thinks This Asian Man’s Eyes Are Closed.” CNN.com, December 9, 2016.
- L. D. Riek and D. Howard, “A Code of Ethics for the Human-Robot Interaction Profession.” Proceedings of We Robot, April 4, 2014.
- R. Tatman, “Google’s Speech Recognition Has a Gender Bias.” Making Noise and Hearing Things, July 12, 2016.

Overarching principles: Respect for persons; Beneficence