AI Role in Education | Guest Article

What Schools Are Asking About AI — Part 2 (Eric Hudson)

Mar 27, 2024

Eric Hudson | LinkedIn

Guest Writer

Room full of teachers with questions

Eric Hudson is a facilitator and strategic advisor who supports schools in making sense of what's changing in education. He has worked with hundreds of schools and learning organizations around the world to think differently about when, where, and how learning can happen and, especially recently, has been coaching schools on how to approach learning with AI.

Hudson started his career in education as a classroom teacher for 12 years, where he developed a passion for working with students. Most recently, Hudson spent ten years at Global Online Academy (GOA), first as an instructional coach and ultimately as Chief Program Officer, where he championed initiatives around competency-based learning, human-centered practices, instructional design, and online facilitation.

Hudson currently serves on the board of the Association of Technology Leaders in Independent Schools (ATLIS). His Substack newsletter "Learning on Purpose" has been featured in The Marshall Memo and The Educator's Notebook. He has been an invaluable friend and mentor to the Flint team, providing candid and insightful feedback and guidance as we've developed our platform and partnerships. Now, we're excited to share some of his answers to current, common questions that schools are asking about AI.

Illustrated by AI, written by humans.


A few months ago, during my travels to schools and conferences, I wrote a post that answered some of the questions I was receiving from educators and students. I’m in the middle of another burst of AI engagements where new questions and insights are emerging as schools dive more deeply into AI. I was happy to collaborate with Flint on selecting a few common questions and offering some answers.


I have serious ethical concerns about using AI. Why should I still learn how to use it?

The ethical issues embedded in the design, growth, and maintenance of AI systems are deep and concerning: algorithmic bias, environmental impact, intellectual property concerns, labor practices, and data privacy are just a few of the problems we must confront when a tool this powerful develops as quickly as AI has. I’m meeting more and more people who are “conscientious objectors” to AI: they do not use AI because, for them, the ethical implications of embracing it outweigh any potential benefits. 

Teacher sitting at desk looking concerned over ethical implications of using AI

Objecting to AI on ethical grounds is valid and important. Ignoring it on ethical grounds is not. If the purpose of school is to prepare students for the world beyond it, then we have a responsibility to talk with students about AI and its impact on their present and future lives. 

The good news is that we don't have to use AI in order to teach students about AI. I invite teachers to bring their ethical concerns into the classroom to model and teach students how to become critical users of AI. Examining case studies, debating the issues, and critically evaluating AI-generated content are all important ways to help students develop their own positions and make better decisions about AI. Maha Bali has been writing about critical AI literacy for over a year and has gathered many useful resources, and Leon Furze has an excellent resource for teachers called Teaching AI Ethics.

What is the role of teachers in the future?

If AI does more and more work for teachers, what does that mean for the future of teaching?

In October, I was in the audience for a panel of Silicon Valley executives discussing the impact of AI on education. One panelist cited a scene of a Vulcan school from the 2009 Star Trek movie, in which students learn alone in pits lined with screens.

The utopian interpretation of this: a sophisticated, personalized tutor for every student. The dystopian interpretation: every student going to school in a pit lined with screens. However you feel about the clip, I think it reveals that the companies that design and maintain AI systems see tutoring and personalized learning as an important educational use case.

When we think about what this means for the future of teaching, we have to remember that school is much more than tutoring. As Matthew Rascoff of Stanford University has said, “School is learning embedded in a social experience.” An AI tutor may be able to effectively replicate some core functions of teaching like instruction, differentiation, or assessment, but it won’t be able to replicate the social relationships that come from going to school to be with friends, the mentorship that comes from collaborating with an adult who knows you and your vision for your future, the competencies gained from working on projects with others, or the learning that comes from applying knowledge to real-world experiences. 

The role of the teacher in an AI future may be to become the protector and steward of the human elements of education. The teacher's expertise might be used to identify where AI capabilities end and human ingenuity and empathy should take over. Or it might be used to help students apply what they learn from interactions with AI to collaborative projects that have a real-world impact. Or it might be used to provide mentorship and coaching in a way that weaves knowledge together with social-emotional learning.

The future is unclear, but I do think teachers should be learning how to work with AI rather than how to compete with it. 

How can teachers regulate what AI knows?

To what extent should they micromanage the information AI outputs?

As teachers learn more about how to use AI, they ask me more questions about how they can control it. They want to know if a bot can be trained to become their avatar, delivering instruction, feedback, and assessments in exactly the same way they do. They want to know if they can limit a bot's knowledge to only the materials they provide to it. They want to know if they can micromanage outputs to ensure they reflect what they want the student to see.

Unless you are building your own large language model from scratch, you cannot control what AI knows or what every output looks like. Generative AI doesn't have a preset warehouse of responses: you can input the same prompt into a model ten times and get ten different responses. Bots also have context windows that limit the amount of information they can take in and hold at once, so a bot won't be able to hold the same depth of knowledge and experience you have as a teacher.
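For the technically curious, here is a minimal sketch of that variability in Python, assuming the official openai SDK (v1+) and an API key in your environment; the prompt and model name are just placeholders. Send an identical prompt several times and you will usually get several different answers.

```python
# A minimal sketch of output variability, assuming the `openai` Python SDK (v1+)
# and an OPENAI_API_KEY set in the environment. Prompt and model are placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment
prompt = "Explain photosynthesis in one sentence."

for run in range(3):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # any chat model works for this demonstration
        messages=[{"role": "user", "content": prompt}],
        temperature=1.0,  # sampling is probabilistic; higher values vary more
    )
    # Each run samples fresh from the model, so the answers will usually differ.
    print(f"Run {run + 1}: {response.choices[0].message.content}")
```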

Teacher sitting atop a tall pile of books with robot looking up at them from a short box of knowledge.

Rather than trying to control AI, we are better off learning how to work with it in more targeted ways that augment, but don’t replace, our own teaching process: to synthesize data or documents, to refine our feedback, to generate questions or counterarguments, etc. When we understand AI’s limitations and capabilities (and ensure our students understand them, too), we will become more efficient and more effective in our work, even if we can’t fully control AI.

How should my school decide what AI tool(s) to invest in?

What should we be aware of when it comes to selecting, purchasing, and using the many AI tools available right now?

For this question, there are two perspectives to consider: what an individual user should do and what an institution like a school should do. 

Individual teacher playing around with technology on the left side and a school full of kids surrounded by glowing tech on the right.

For an individual, you should be omnivorous: explore widely and freely. The power and use cases of these tools are expanding rapidly, so we should be flexible and curious rather than focused on finding a single solution. And most of these tools still offer some level of access for free, so it's worth experimenting. Use the same prompt across different models and tools to see which generates the best output. Take the output from one bot and enter it into a different one to see if it can be improved or even reimagined. Keep an eye on the major models (GPT, Gemini, Claude, Llama, etc.) rather than trying to keep track of the dozens (hundreds?) of offshoot tools built on these models. Updates to the models will be a leading indicator of what more targeted AI tools can do.
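If you want to make that side-by-side comparison systematic, here is a rough sketch of sending one prompt to two different models, assuming the openai and anthropic Python SDKs with API keys in your environment; the prompt and model names are placeholders that will drift as the models update.

```python
# A rough sketch of comparing one prompt across two models, assuming the
# `openai` and `anthropic` Python SDKs and OPENAI_API_KEY / ANTHROPIC_API_KEY
# in the environment. Prompt and model names are placeholders.
from openai import OpenAI
import anthropic

prompt = "Write three discussion questions about the water cycle for 8th graders."

# Ask an OpenAI model.
gpt_reply = OpenAI().chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": prompt}],
).choices[0].message.content

# Ask an Anthropic model with the identical prompt.
claude_reply = anthropic.Anthropic().messages.create(
    model="claude-3-opus-20240229",
    max_tokens=500,
    messages=[{"role": "user", "content": prompt}],
).content[0].text

# Read the two outputs side by side and judge which fits your classroom better.
print("GPT:\n", gpt_reply)
print("\nClaude:\n", claude_reply)
```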

If you are going to spend money, spend it on the most flexible tool you can find. Right now, that is probably ChatGPT Plus at $20 a month, which gives you a variety of multimodal capabilities and access to GPT-4, one of the best models on the market. Even if you pay for an AI tool, stay open: I pay for ChatGPT Plus and still use Gemini, Claude, Adobe Firefly, and other tools regularly.

If you are trying to make decisions for your school, then you should have two things in place before purchasing an AI solution: a clear, well-articulated purpose and adoption strategy for that tool as well as a contingency plan for what to do when that tool is no longer relevant and/or its design and terms of use change. Schools are notorious for carrying a large number of apps and platforms and “edtech solutions” that are purchased and then languish, applied sporadically or by only a small group of power users. It would be worth reflecting on previous edtech investments and considering what went well and what did not go well, then applying that learning to decisions about adopting AI tools.

If nothing else, ask vendors some basic questions:

  • What is the involvement of education professionals in the development of and updates to this tool? Does the company partner with schools, educators, and/or students in making design decisions?

  • What advances in AI do you anticipate on the horizon? In what ways will those changes improve this tool? In what ways could they render this tool obsolete?

  • What is in the pipeline for updates and new features for this tool? How do you make those decisions?

How should we communicate with parents about our school adopting AI?

Namely, will parents think that we're outsourcing the work of teaching to AI? And what are they worried about when it comes to their students using AI (e.g., privacy)?

I am spending more and more time speaking to families about AI, and I have the same takeaway as when I work with educators, students, leadership teams, and boards: their knowledge of, experience with, and opinions about AI are all quite varied. My sessions with parents have helped me understand that any collaboration between schools and families on AI should have an educational component for both parties: aligning on definitions of terms, the current state of the industry and the technology, and concrete examples of AI applications at schools are really important to having a meaningful conversation. 

Parents do not have a positive reaction when I describe AI tools that help teachers grade papers or generate teaching materials. This is not unlike the reaction of teachers when I share how students use AI to get feedback on their work, refine their writing, clean up code, or study for tests. It’s also not unlike the reaction of students when I show them how teachers can use AI to write college recommendation letters or narrative reports. In other words, we are deep in a period of adjustment to AI’s impact, and we are all carrying skepticism and worry. It is going to take time to reset our expectations and understanding of school in an AI age, and demonstrating empathy and care should be our first priority. 

When engaging parents in conversations about AI at school, I recommend thinking about the DACI decision-making framework:

DACI framework showing how a project team needs a Driver to lead and assign roles, Approver(s) to approve or veto decisions, Contributor(s) who are consulted for advice, and the Informed who need to know the decisions being made.

Decisions about AI at school should be driven and approved by the education professionals who work at the school. Parents, however, can and should play an important role as contributors. First, they have an obvious and important stake in the decision: their children will be directly affected. Second, parents are an important connection to the world beyond school: their careers, networks, and experiences can offer a perspective on how AI will affect their children’s futures as citizens and professionals. 

Above all, families want clarity. They want to know that schools have a confident grasp of what AI is and why it’s important to both learning and wellness. They want to know what school expectations for student and teacher use of AI are. They want to know how they can help. 

What’s next for schools and AI?

I recently facilitated a strategic foresight workshop on AI for the board and senior leadership team of an international school, and I appreciated this comment from the board chair during one of our planning calls: “I want us to generate questions, not answers.” 

I think this could be a mantra for schools when it comes to the future of AI in school. We are not experts; we are investigators. I have written before about what an inquiry-based approach to AI can look like, and while I recognize and have deep empathy for the daily challenges AI raises in schools, I think we have to be open to the fundamental changes it might bring to the nature and design of our work. This means learning as much as possible about how early adopters are using these tools in and beyond school, capturing the questions that people are asking, generating big questions of our own, and recognizing that we are watching a major technological innovation unfold in real time. What are the skills we can bring as educators that help us and our students navigate uncertainty at a time of great change?

Spark AI-powered learning at your school.

Sign up to start using Flint, free for up to 80 users.

Watch the video
