AI education-research not-research-ethics
“We will consider “educational technology” (edtech) to include both (a) technologies specifically designed for educational use, as well as (b) general technologies that are widely used in educational settings.” (Cardona et al., 2023, p. 1)

“AI can be defined as “automation based on associations.” When computers automate reasoning based on associations in data (or associations deduced from expert knowledge), two shifts fundamental to AI occur and shift computing beyond conventional edtech: (1) from capturing data to detecting patterns in data and (2) from providing access to instructional resources to automating decisions about instruction and other educational processes. Detecting patterns and automating decisions are leaps in the level of responsibilities that can be delegated to a computer system. The process of developing an AI system may lead to bias in how patterns are detected and unfairness in how decisions are automated. Thus, educational systems must govern their use of AI systems.” (Cardona et al., 2023, p. 1)

“Understanding that AI increases automation and allows machines to do some tasks that only people did in the past leads us to a pair of bold, overarching questions:
1. What is our collective vision of a desirable and achievable educational system that leverages automation to advance learning while protecting and centering human agency?
2. How and on what timeline will we be ready with necessary guidelines and guardrails, as well as convincing evidence of positive impacts, so that constituents can ethically and equitably implement this vision widely?” (Cardona et al., 2023, p. 6)

“Below we address three additional perspectives on what constitutes AI. Educators will find these different perspectives arise in the marketing of AI functionality and are important to understand when evaluating edtech systems that incorporate AI. One useful glossary of AI for Education terms is the CIRCLS Glossary of Artificial Intelligence Terms for Educators.11 AI is not one thing but an umbrella term for a growing set of modeling capabilities, as visualized in Figure 3.” (Cardona et al., 2023, p. 11)

Figure 3: Components, types, and subfields of AI based on Regona et al. (2022).12 (Cardona et al., 2023, p. 11)

“12 Regona, Massimo & Yigitcanlar, Tan & Xia, Bo & Li, R.Y.M. (2022). Opportunities and adoption challenges of AI in the construction industry: A PRISMA review. Journal of Open Innovation: Technology, Market, and Complexity, 8(45). https://doi.org/10.3390/joitmc8010045” (Cardona et al., 2023, p. 11)

Human-Like Reasoning: “The idea of “human-like” is helpful because it can be a shorthand for the idea that computers now have capabilities that are very different from the capabilities of early edtech applications. Educational applications will be able to converse with students and teachers, co-pilot how activities unfold in classrooms, and take actions that impact students and teachers more broadly. There will be both opportunities to do things much better than we do today and risks that must be anticipated and addressed. The “human-like” shorthand is not always useful, however, because AI processes information differently from how people process information. When we gloss over the differences between people and computers, we may frame policies for AI in education that miss the mark.” (Cardona et al., 2023, p. 12)

An Algorithm that Pursues a Goal: “This second definition emphasizes that AI systems and tools identify patterns and choose actions to achieve a given goal. These pattern recognition capabilities and automated recommendations will be used in ways that impact the educational process, including student learning and teacher instructional decision making. For example, today’s personalized learning systems may recognize signs that a student is struggling and may recommend an alternative instructional sequence.” (Cardona et al., 2023, p. 12)

“Although this perspective can be useful, it can be misleading. A human view of agency, pursuing goals, and reasoning includes our human abilities to make sense of multiple contexts. For example, a teacher may see three students each make the same mathematical error but recognize that one student has an Individualized Education Program to address vision issues, another misunderstands a mathematical concept, and a third just experienced a frustrating interaction on the playground; the same instructional decision is therefore not appropriate.
However, AI systems often lack the data and judgement to appropriately include context as they detect patterns and automate decisions. Further, case studies show that technology has the potential to quickly derail from safe to unsafe or from effective to ineffective when the context shifts even slightly. For this and other reasons, people must be involved in goal setting, pattern analysis, and decision-making.15” (Cardona et al., 2023, p. 13)

Intelligence Augmentation: ““Intelligence Augmentation” (IA)17 centers “intelligence” and “decision making” in humans but recognizes that people sometimes are overburdened and benefit from assistive tools. AI may help teachers make better decisions because computers notice patterns that teachers can miss. For example, when a teacher and student agree that the student needs reminders, an AI system may provide reminders in whatever form a student likes without adding to the teacher’s workload. IA uses the same basic capabilities of AI, employing associations in data to notice patterns, and, through automation, takes actions based on those patterns. However, IA squarely focuses on helping people in human activities of teaching and learning, whereas AI tends to focus attention on what computers can do.” (Cardona et al., 2023, p. 14)

“To develop guidance for edtech, the Department works closely with educational constituents. These constituents include educational leaders (teachers, faculty, support staff, and other educators); researchers; policymakers; advocates and funders; technology developers; community members and organizations; and, above all, learners and their families/caregivers. Recently, through its activities with constituents, the Department noticed a sharp rise in interest and concern about AI.
For example, a 2021 field scan found that developers of all kinds of technology systems (for student information, classroom instruction, school logistics, parent-teacher communication, and more) expect to add AI capabilities to their systems. Through a series of four listening sessions conducted in June and August 2022 and attended by more than 700 people, it became clear that constituents believe that action is required now in order to get ahead of the expected increase of AI in education technology, and they want to roll up their sleeves and start working together. In late 2022 and early 2023, the public became aware of new generative AI chatbots and began to explore how AI could be used to write essays, create lesson plans, produce images, create personalized assignments for students, and more. From public expression in social media, at conferences, and in news media, the Department learned more about the risks and benefits of AI-enabled chatbots. And yet this report will not focus on a specific AI tool, service, or announcement, because AI-enabled systems evolve rapidly. Finally, the Department engaged the educational policy expertise available internally and in its relationships with AI policy experts to shape the findings and recommendations in this report.” (Cardona et al., 2023, p. 2)