AP Annual Conference 2024

A Values-Driven Approach to AI in the Classroom


In classrooms across the country, teachers are figuring out whether the boom in artificial intelligence will help or hinder their work of developing human intelligence. “We’re facing it on the front lines,” said Lisa Boyd, who teaches AP English and AP Capstone courses at Luella High School in Henry County, Georgia. “Students are navigating things, trying out new things, oftentimes using AI in ways that maybe are not the most positive for enhancing their learning. So we’re reacting.”

Boyd was speaking as part of a panel at the 2024 AP® Annual Conference about the impact of artificial intelligence on classroom practices and student learning. The launch of ChatGPT in the fall of 2022 kick-started a debate among educators and policymakers about whether generative AI—predictive algorithms that can produce essays, create art, summarize research, and perform other advanced tasks in response to user prompts—should be embraced or regulated in academic spaces.

Precise data is hard to come by, but surveys have found widespread use of AI programs to complete school assignments even as students and teachers remain deeply uncertain about the impact of the technology on learning. The rapid emergence of mainstream AI tools has left educators scrambling to set best practices and guide students on how to ethically employ the new technology.

“We really care about understanding the lived realities and what’s happening in the classroom,” explained Sophia Romee, head of the College Board’s newly launched GenAI Studio. “We’re committed to responsibly integrating generative AI into AP classrooms and supporting teachers. We know it’s here to stay.” The GenAI Studio is College Board’s incubator for new ideas and tools around artificial intelligence, designed to be a resource for AP educators and school leaders grappling with how to use AI to enhance learning.

The discussion at APAC focused on finding the right balance between setting rules for AI and allowing open experimentation. Armin Hamrah, a student at Claremont McKenna College who has done research on AI tools, said that school regulations should make room for students to test out new technologies and be open about what they are trying and learning. An open environment—instead of a prohibitive, punitive approach—is the best way for teachers and students alike to adapt, he said.

“Ultimately, it comes down to student responsibility,” Hamrah said. “We’ve got to really align the incentives, so we make it really easy for students to use these tools to accelerate their learning.”

Amit Patel, a managing director at the investment firm Owl Ventures and a former head of technology for Success Academy Charter Schools, emphasized that the most productive use cases for AI tools are likely to emerge from organic exploration by students and teachers. Given how new AI tools are, and how fast the field is developing, setting detailed policies at the school or district level may prove difficult.

“Any time you have a new technology like this, it’s going to be messy,” Patel said. “It’s messy in the corporate sector, in the government sector, in the legal sector. … It’s going to have to be iterative.”

He said that even the engineers building AI tools can’t agree on the best ways to use them, so educators and school leaders should give themselves some grace while trying new things and making inevitable mistakes. He encouraged teachers to get honest feedback from students about how they’re testing out AI tools and to make time for classroom discussion about the ethics and effectiveness of different approaches. “See what comes back in terms of surfacing the use cases that are most interesting and most helpful,” Patel said.

Greg Gazanian, the chief strategy and innovation officer for Arcadia Unified School District outside Los Angeles, echoed the call for broad experimentation. He recalled past concerns about internet plagiarism, the accuracy of Wikipedia, and even the ethics of using spell- and grammar-check functions in word processing software. “Technology always looks big and scary,” he said. “With any meaningful technology, there are great things that are beneficial and really damaging things we have to mitigate.”

He encouraged school leaders to come up with core values around the use of technology and to test any new policies against those broader goals. Arcadia’s technology policy, for example, holds that new tools should be used to help students “think and communicate effectively, empower and engage students in learning, and enhance students' opportunities to use information to solve problems and create new ideas.” Having those guiding principles makes it easier to weigh rules and classroom practices against the district’s core mission, Gazanian said.

Boyd called on policymakers to keep teacher feedback front and center in any conversation about new technology. Education will always be a fundamentally human enterprise, she said, driven by the strength of relationships between students and educators. “How are we driving students to have questions, to think, to have that independence?” she asked. “That’s our ultimate goal—for students to have questions and pursue those questions. Our goal is lifelong learners, not just completers of assignments.”

Across the AP Annual Conference, the sessions were designed to keep teacher and student voices at the center of the AI discussion. APAC was a showcase for the creative thinking of the AP community on everything from the ethics of AI in AP Art and Design and new tools for integrating AI into AP Computer Science to a hands-on workshop led by AI for Education that attracted over 300 educators.

“We’re very focused on listening to educators, listening to students, and making sure this technology is developed in a way that enhances education,” said Romee, describing the aim of College Board’s GenAI Studio. “We’re all going to have to learn how to use it well.”