Strategies
Monitor impact of affective AI in care on human-human relationships and implement safeguards
## Recommendations
As this technology develops, it is important to monitor research into the development of intimate relationships between A/IS and humans. Research should emphasize any technical and normative developments that reflect use of A/IS in positive and therapeutic ways while also creating appropriate safeguards to mitigate against uses that contribute to problematic individual or social relationships:
- Intimate systems must not be designed or deployed in ways that contribute to stereotypes, gender or racial inequality, or the exacerbation of human misery.
- Intimate systems must not be designed to explicitly engage in the psychological manipulation of their users unless the user is made aware they are being manipulated and consents to this behavior. Any manipulation should be governed through an opt-in system.
- Caring A/IS should be designed to avoid contributing to user isolation from society.
- Designers of affective robotics must publicly acknowledge, for example, within a notice associated with the product, that these systems can have side effects, such as interfering with the relationship dynamics between human partners, or causing attachments between the user and the A/IS that are distinct from human partnership.
- Commercially marketed A/IS for caring applications should not be presented as a person in a legal sense, nor marketed as a person. Rather, its artifactual (that is, authored, designed, and built deliberately) nature should always be made as transparent as possible, at least at point of sale and in available documentation, as noted in Section 4, Systems Supporting Human Potential.
- Existing laws regarding personal imagery need to be reconsidered in light of caring A/IS. In addition to other ethical considerations, it will also be necessary to establish conformance with local laws and mores in the context of caring A/IS systems.
## Further Resources
- M. Boden, J. Bryson, D. Caldwell, K. Dautenhahn, L. Edwards, S. Kember, P. Newman, V. Parry, G. Pegman, T. Rodden, and T. Sorrell, "Principles of Robotics: Regulating Robots in the Real World," *Connection Science*, vol. 29, no. 2, pp. 124–129, April 2017.
- J. J. Bryson, M. E. Diamantis, and T. D. Grant, "Of, For, and By the People: The Legal Lacuna of Synthetic Persons," *Artificial Intelligence & Law*, vol. 25, no. 3, pp. 273–291, Sept. 2017.
- M. Scheutz, "The Inherent Dangers of Unidirectional Emotional Bonds between Humans and Social Robots," in *Robot Ethics: The Ethical and Social Implications of Robotics*, P. Lin, K. Abney, and G. Bekey, Eds. Cambridge, MA: MIT Press, 2012.
(IEEE, 2019, p. 95)
## Recommendations
- Commercially marketed A/IS should not be persons in a legal sense, nor marketed as persons. Rather, their artifactual (authored, designed, and built deliberately) nature should always be made as transparent as possible, at least at point of sale and in available documentation.
- Some systems will, due to their application, require opacity in some contexts, e.g., emotional therapy. Such systems should remain open to inspection by responsible parties, but transparency may be withdrawn for operational needs.
## Further Resources
- R. C. Arkin, P. Ulam, and A. R. Wagner, "Moral Decision-making in Autonomous Systems: Enforcement, Moral Emotions, Dignity, Trust and Deception," *Proceedings of the IEEE*, vol. 100, no. 3, pp. 571–589, 2012.
- R. Arkin, M. Fujita, T. Takagi, and R. Hasegawa, "An Ethological and Emotional Basis for Human-Robot Interaction," *Robotics and Autonomous Systems*, vol. 42, no. 3–4, pp. 191–201, 2003.
- R. C. Arkin, "Moving up the Food Chain: Motivation and Emotion in Behavior-based Robots," in *Who Needs Emotions: The Brain Meets the Robot*, J. Fellous and M. Arbib, Eds. New York: Oxford University Press, 2005.
- M. Boden, J. Bryson, D. Caldwell, et al., "Principles of Robotics: Regulating Robots in the Real World," *Connection Science*, vol. 29, no. 2, pp. 124–129, 2017.
- J. J. Bryson, M. E. Diamantis, and T. D. Grant, "Of, For, and By the People: The Legal Lacuna of Synthetic Persons," *Artificial Intelligence & Law*, vol. 25, no. 3, pp. 273–291, Sept. 2017.
- J. Novikova and L. Watts, "Towards Artificial Emotions to Assist Social Coordination in HRI," *International Journal of Social Robotics*, vol. 7, no. 1, pp. 77–88, 2015.
- M. Scheutz, "The Affect Dilemma for Artificial Agents: Should We Develop Affective Artificial Agents?" *IEEE Transactions on Affective Computing*, vol. 3, no. 4, pp. 424–433, 2012.
- A. Sharkey and N. Sharkey, "Children, the Elderly, and Interactive Robots," *IEEE Robotics & Automation Magazine*, vol. 18, no. 1, pp. 32–38, 2011.
(IEEE, 2019, pp. 102–103)