Creators shall guard against all potential misuses and risks of A/IS in operation.
## Background

New technologies give rise to greater risk of deliberate or accidental misuse, and this is especially true for A/IS. A/IS increase the impact of risks such as hacking, misuse of personal data, system manipulation, and the exploitation of vulnerable users by unscrupulous parties. Cases of A/IS hacking have already been widely reported, with driverless cars, for example, and the Microsoft Tay AI chatbot was famously manipulated into mimicking deliberately offensive users. In an age when these powerful tools are readily available, citizens need a new kind of education to sensitize them to the risks associated with the misuse of A/IS. The EU’s General Data Protection Regulation (GDPR) provides measures to remedy the misuse of personal data.

Responsible innovation requires A/IS creators to anticipate, reflect, and engage with users of A/IS. Thus citizens, lawyers, governments, and others all have a role to play, through education and awareness, in developing accountability structures (see Principle 6) and in proactively guiding new technologies toward beneficial ends.
## Recommendations

1. Creators should be aware of the methods by which A/IS can be misused, and they should design A/IS in ways that minimize the opportunity for such misuse.
2. Raise public awareness of the potential misuse of A/IS technologies in an informed and measured way by:
- Providing ethics education and security awareness that sensitizes society to the potential risks of A/IS misuse. For example, provide “data privacy warnings” informing users that some smart devices collect their personal data.
- Delivering this education in scalable and effective ways, enlisting experts with the credibility and reach to minimize unwarranted fear about A/IS.
- Educating governments, lawmakers, and enforcement agencies about these issues so that citizens can work collaboratively with these agencies to understand the safe use of A/IS. For example, just as police officers give public safety lectures in schools, they could offer workshops on the safe use of, and interaction with, A/IS.
## Further Resources
A. Greenberg, “Hackers Fool Tesla S’s Autopilot to Hide and Spoof Obstacles,” Wired, August 2016.
C. Wilkinson and E. Weitkamp, Creative Research and Communication: Theory and Practice. Manchester, UK: Manchester University Press, 2016 (in relation to Recommendation #2).
Engineering and Physical Sciences Research Council, “Anticipate, Reflect, Engage and Act (AREA),” Framework for Responsible Research and Innovation, pp. 32–33.