How The York School used Flint to embrace student-centered AI use
An introduction to Justin Medved
Can you start by introducing yourself, your role, and how long you've been at The York School?
My name is Justin Medved. I'm the Associate Head of School—Academic Innovation, and that role sits at the intersection of technology, learning and teaching, innovation, and professional growth and development. So I kind of sit at the top level thinking about the future.
Initial considerations about AI
Back in October, or even before that, what was The York School thinking about AI? What led you to search for a platform?
We had obviously been paying attention to the larger conversation, and with ChatGPT rolling out last year, we felt ourselves quickly becoming outmatched: we were starting to see ChatGPT show up in student work, and kids were talking about it. We initially started down the professional development road around detection and policy. Those were the entry points, to the extent that we wanted to better understand what AI could do in support of teaching and learning and, obviously, students. But then we also wanted to better understand how it was showing up, maybe without being formally introduced by the teacher, and different ways that detection might be possible. So we took a look at our academic honesty policy and our plagiarism prevention scope and sequence, and we upgraded those policies.
After we did that, we needed to think long-term about the role that AI might play in education, which is profound. And so, we started down the road of thinking about professional development. So that's my role. We dedicated two days this year, one we've already had and one that's coming up in April, to take a deep dive into AI. From my perspective, we needed to coalesce around a tool that we could try together and then experiment underneath. And what I loved about Flint was that it provided both. It gave access to a kind of walled garden of the ChatGPT experience, and it also offered the educators an opportunity to extend that experience into the classroom with full control. Being able to play with the rules and the experience of the students, and, as I've learned having used it a lot now, the testing aspect were game-changing. You can really test the entire chat experience with yourself across different levels, across different criteria, iterate on the criteria, iterate on the kind of parameters, and then once you feel like you've got it down to a place that you can kind of sign off and you're happy with the experience, then you roll it out to the kids.
Do you remember what it was like when you and your school switched from the prevention side of things to looking more future-forward? Was that always a thought in the back of your mind, or was there something that triggered that response?
Yeah, I think it's quite easy to stay in that space of all the things that AI is going to disrupt and has disrupted. When the Internet was first created, or even when books were first created, I imagine teachers everywhere were immediately saying to themselves, well, how am I going to compete with all of this easily accessible information? How will I know what the kids know legitimately? So, it's required us to rethink assessment. It requires us to rethink what we're actually looking for and what kinds of skills we want kids to know and be able to do.
That's no different here. I think a much more productive conversation is one that's underpinned with optimism and a mindset of “if you can't beat it, join it”. I think that is a much more fertile ground for discussion, opportunity, and professional collaboration, which is what you would want as part of any professional development experience.
Requirements for an AI solution
We talked about how you were looking for a platform that would be more future-forward for students. Can you summarize what exactly you were looking for in an AI solution?
One of the biggest challenges I think schools will face is wanting to give students an AI experience, but they'll want to be able to control that experience to avoid hallucinations. The element of control is centered around safety. Right. So, if safety is at the core of the educational proposition, then the extent to which you can guarantee safety is something that you'd want to have built into a tool. Privacy, obviously, is another piece.
And, when you're thinking about putting students on platforms, you want to be able to see all of their interactions. So the beauty of Flint is that it captures those observations and conversations, so you get to see everything that's happening behind the scenes, individualized and personalized per student. The student's not just getting feedback—you've got those conversations and insights that you can go back to and use as evidence of learning later. So I think that's a very powerful combination of features.
Why TYS piloted Flint
I was curious about when you first found out about Flint. What was it that made it click for you and, as you got to know the platform, what made you decide to invest in having Flint compared to other tools that you were seeing?
It was one of the few tools that I felt had put the student at the center. While the teacher has access to all the tools, really what you're trying to create is an engaging experience for the student, whether it's through the essay assignment type, where you put your writing in and get feedback, or the chat experience, where you could give it any role and any set of parameters and it would create a really interesting experience for the students.
So I saw, even as Flint’s evolved over the last six months, a real emphasis on that experience, and it keeps getting better and better as the algorithms and the code that powers it are tweaked. I see an intentionality behind trying to create a personalized teacher that can teach just about anything and not necessarily give the answer right away, but nudge the learner along on an exploration of whatever the topic is.
And then I love the way that there's a kind of assessment/feedback piece. I've seen it firsthand with kids who really become engaged when they're getting immediate feedback. That has always been the hardest thing for teachers to deliver at scale with time and efficiency. Either you're talking to kids one-on-one or you're grading papers one at a time. The time-to-feedback loop has always been long, and everyone tries their best, but what Flint offers, and I imagine it will only get better with time, is that immediate feedback loop. And the teacher has created and tested the experience that the students will undergo.
Roll-out process to faculty and students
How did you go about rolling out Flint at your school?
I have a pretty unique position in that I sit at both the technical level, in that IT reports to me, and I sit at a professional development level, where we have embedded PD with all of our teachers during the week. So, I can roll out any tool within two weeks and see everybody. That's not always the case at schools. We take professional development quite seriously here, and so it's timetabled into everyone's schedule. And that's what I did.
I secured the partnership with Sohan and the Flint team, and did some testing myself. In phase one, I rolled it out to some of my early adopter teachers, who just started playing with it and using it. Once I saw how it was being used and got my head around it, I rolled it out to everyone else. Then we had an entire day devoted to professional development at the school, and I made Flint the default platform for the entire faculty and staff. We onboarded everyone by doing a whole workshop around using the AI Chat function and how to do prompt engineering for a whole bunch of different use cases, not just the educational context. So it was kind of a three-pronged approach.
What I loved about it is that we had a common platform—Flint. Everyone was using that platform when they were thinking about AI. And that allowed us to both honor safety and security, saying, this was a walled garden and your privacy is protected. And then we didn't have to pick up 300 GPT-4 Pro licenses, which was a huge opportunity. It was like a two-for-one.
Were there any issues you ran into, with either teacher sentiment around AI or teachers not quite understanding what the use cases of Flint are?
I think it really just takes one experience, and then very quickly you see the power. So I think there's a spectrum: there are early adopters who are like, "This is unbelievable!" And then there are those who, not that they see their job being threatened, have big questions about what their role is, given that AI can sometimes make an error, so you don't want to release complete trust to it, but at the same time, it can often give just as good, if not better, feedback, faster. So what does the partnership look like? Lots of great questions come out of that, and I think that's what we want. It's a healthy dialogue around what we can do, because we're in our infancy with AI in education.
And then it opens up larger conversations, given this technology's current iteration: what might things look like in ten years? That's what we really want to be thinking about and preparing for. So I think it's okay to have lots of different perspectives, because the future is coming, no matter what. And so we have to start preparing our thoughts and our attitudes for that eventuality.
Current state of Flint at TYS
After having had Flint for almost four months, what do you think teachers think? What is usage like?
Varying. Some love the feedback piece, and I've seen all kinds of weird and wonderful adaptations. I found that our arts teachers are loving the fact that kids can reflect in it. So they're building out chats where students have a conversation with someone who's asking them to reflect on a performance or a piece of music. When students were previously asked to just journal, you'd hear a groan. But if you ask them to do it in this, then suddenly it's a much more engaging experience, because it's like a Q&A opportunity. You can write answers in small chunks and you're prompted to dig deeper, which has been really exciting.
I'd say, where it hasn't won hearts and minds yet has been with math, because it will make mistakes, and math teachers do not like mistakes. They are running into issues when trying to have students go through a problem set or work through a problem. When Flint integrates with Wolfram Alpha or another engine, that is likely going to solve the issue, and that'll be awesome.
English teachers have adopted it quite quickly. They see the power in not only having students write in it, but the teachers will open up an assignment and then cut and paste student work in, and then they'll get the feedback, and then they'll put that feedback back into the Google Doc with their own nuances as well. The fact that Flint got them 80% of the way there, I think has been really cool.
The language departments have been slower to adopt it. They've seen the power; it's just going to take some time. But that's something I'm really excited about.
What do you see as the real benefit of Flint so far? There's a lot we can extrapolate from just looking at the feature set, but are there any benefits that you've seen actual evidence of?
I'd say engagement. When students get that feedback and then immediately get a grade, even if they know that grade isn't necessarily the be-all and end-all, they immediately chase wanting to improve it. For something like writing, that has never been possible with this kind of speed.
What I've been coaching the teachers on is to really be detailed. What I think Flint does really well is expose weaknesses in rubrics. If your assessment criteria aren't detailed enough, then that amorphous, broad-based language reveals a whole bunch of subjectivity that is inherent in assessment and that we don't like to talk about. We assume students can read between the lines. As soon as you start seeing that AI is inherently generous when it marks, you also realize how much subjective nuance you've been putting into your assessments. So, if you want it to mark like you mark, then you have to be really specific about what you're asking it to do. If you want flawless grammar and punctuation, if you want three examples as opposed to two or one, you'll have to build that graduated nuance into your rubrics, and then AI will do it. But if you don't, then AI will err on the side of, I think, generosity, if you want to call it that. And that, for me, has been a really useful conversation starter with teachers.
That's so insightful. I remember when I was a student, I always did best in the classes where I understood the teacher and had a personal relationship with them. I'm realizing it's because you understand what they're asking for when you know them as a person. But if you're a student who hasn't had the chance to establish that relationship, or you simply don't understand the teacher—maybe you both have very different views on the subject—then that creates an unfair playing field. And you're saying that AI is helping the communication there, at least, to be more fair.
Yes! It's a great conversation starter. When someone says, "Yeah, the AI is pretty generous, it gave high marks," I say, "Okay, well, how would you change your assessment criteria to mark more in line with how you would assess? And if it's too generous, what do you need to change so it's more accurate?" And then people go, "Ohhh, yeah." Some of this broad language that is meant to capture the graduated nuances needs to be more specific. If you're really asking for that, then you should tell them—the students and the AI.
Next steps for TYS
What is your future vision for York?
I mean, for myself, it would be to partner with a platform that offers students a safe and managed AI experience, to leverage all the best bits. The extent to which AI can be a personalized tutor, the extent to which AI can give feedback, the extent to which AI can help with the ideation and brainstorming phases—these are all the ways we want to support students. A tool that does all those things, but also allows the teacher to get evidence of the interaction so they can better understand their students and how much they're learning, is what we're continuously striving towards.
Is there anything else you want to add?
I haven't ever seen a tool—other than something called Hapara, which we've adopted as a Google suite add-on—I've never seen something become adopted so quickly. I've been in this space for a long time now, looking at tools and what's capturing the hearts and minds of the staff. And this is definitely something pretty special. I think you've nailed something pretty special.