“Compliance with this assessment list is not evidence of legal compliance, nor is it intended as guidance to ensure compliance with applicable law. Given the application-specificity of AI systems, the assessment list will need to be tailored to the specific use case and context in which the system operates. In addition, this chapter offers a general recommendation on how to implement the assessment list for Trustworthy AI through a governance structure embracing both operational and management level.” (High-Level Expert Group on AI, 2019, p. 24)

“TRUSTWORTHY AI ASSESSMENT LIST (PILOT VERSION)

Diversity, non-discrimination and fairness

Stakeholder participation: Did you consider a mechanism to include the participation of different stakeholders in the AI system’s development and use? Did you pave the way for the introduction of the AI system in your organisation by informing and involving impacted workers and their representatives in advance?” (High-Level Expert Group on AI, 2019, p. 30)
## IEEE recommendation

"To ensure representation of stakeholders, organizations should enact a planned and controlled set of activities to account for the interests of the full range of stakeholders or practitioners who will be working alongside A/IS and incorporating their insights to build upon, rather than circumvent or ignore, the social and practical wisdom of involved practitioners and other stakeholders."
## Further Resources
T. L. Chen et al., “Robots for Humanity: Using Assistive Robotics to Empower People with Disabilities,” *IEEE Robotics and Automation Magazine*, vol. 20, no. 1, pp. 30–39, 2013.
R. Hartson and P. S. Pyla, *The UX Book: Process and Guidelines for Ensuring a Quality User Experience*. Waltham, MA: Elsevier, 2012.

(pp. 130–131)