On Teachers and AI Use

Published: 2024-03-28 4:05 PM

Category: AI | Tags: education, artificial intelligence, chatgpt


The Markup had a post this month looking at educators' use of AI tools. I took some time to read it this week, first on my phone and then printed out so I could mark it up and think through the material more carefully. I'm still working out my own feelings on large language model (LLM) AI, and I wanted to be sure to read and digest this before reflecting on it.

Edtech Influence

Edtech is a weird space. It's full of influencers and personalities wanting to be "a name." The influencer sphere is courted by big tech to try to create authenticity for what they're building. I have a really hard time staying neutral when reading these kinds of articles because those are the people you tend to see interviewed. The influencer vibe came out with a couple in particular (one even self-identified as an influencer), and I think it is important to note this perspective because they're the ones who are most heard when it comes to setting the baseline for tool use in education.

The low-hanging fruit is usually time. "AI will save you time!" is like a teacher dog whistle. Our time is stretched thin, and if only we had more, we could do all the things. AI is a tempting way to make up for lost time, but to me, it's the worst way to consider these tools.

[He] plugs the topic into MagicSchool AI, along with his estimation of how much class time that teacher has to teach the particular subject, and lets the AI generate a set of lesson plans. "When a teacher sees how fast the AI works, they immediately sign up."

Part of the work of teaching is to make sure lessons are cohesive, aligned to skill development, and appropriate for the students in the room. If teachers are encouraged to just type a topic and timeframe into an AI, they have effectively removed themselves from the most critical part of the job. There is no thought given to the progression of skill development, the connections to other ideas, or the scaffolding that has to happen to help students build understanding. At the other end of the spectrum, another early (and dangerous) recommendation for teachers trying AI is to allow it to grade student writing. Giving feedback on a student's demonstration of understanding is the highest calling, and we shouldn't trade meaning for usefulness.

It's flashy and impressive, but it's also dangerous when it isn't paired with evaluation and introspection.

Aside from generating lesson plans, there's this undercurrent in education that everything a teacher needs to use should be free.

Kids just deserve the best education they can get, and if that means borrowing lesson plans from a bot, I'll take it. If we're just teaching lessons, it doesn't really matter where we got it from.

Deep breaths.

Wanting to give students a good education is not license to use tools poorly or to condone the theft of materials. Large language models are already in hot water because they incorporated copyrighted material and tried the legal excuse of, "oops?" Sorry in the name of progress!

Children deserve a fair, equitable, and high-quality education. That does not mean teachers should use AI to write machine-developed, untested, underdeveloped, and low-quality lesson plans to save time. How we act and how we justify our actions matter just as much as how students act.

Besides, if it's just a paper, why does it matter where they get it from?

Brainstorm and Inform

Others are more nuanced in their approach to AI. Another interviewee, Kim Maybin in Alabama, mentioned using ChatGPT to develop multiple versions of the same prompt for differentiation and validity of assessment:

...she often found herself creating additional structure or "sentence starters" to help her struggling students...

This is paralleled by the desire to use tools to find patterns in data or to "rubber duck" through patterns and data. This is more closely aligned to how Simon Willison advocates making LLMs work for you, an approach I'm slowly coming around to using more.

In the last two weeks, I used ChatGPT to generate three or four questions on a particular learning standard. It was late and I just didn't have the brain power to write the quiz questions on my own. But I knew what specific skill I wanted to assess, so I was able to write a prompt which generated a couple of good starters. They were not scenarios I had used before, so they were novel to the students, but directly aligned to the content.
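For readers curious what that workflow looks like in practice, here's a minimal sketch of building that kind of prompt programmatically. The learning standard, question count, and wording are placeholders of my own, not the exact prompt I used:

```python
# Hypothetical sketch: assembling a quiz-question prompt for an LLM.
# The standard text and question count below are illustrative placeholders.

def build_quiz_prompt(standard: str, n_questions: int) -> str:
    """Build a prompt that asks for questions tied to one specific skill."""
    return (
        f"Write {n_questions} short quiz questions that assess this "
        f"learning standard: {standard}. "
        "Use scenarios students are unlikely to have seen before."
    )

prompt = build_quiz_prompt(
    "Explain how kinetic energy changes with velocity", 3
)

# The prompt would then go to a chat model; e.g., with the OpenAI
# Python client (requires an API key, so it's left commented out):
#
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-4o-mini",
#     messages=[{"role": "user", "content": prompt}],
# )
# print(reply.choices[0].message.content)

print(prompt)
```

The point of the structure is that the human specifies the skill; the model only supplies candidate scenarios to refine.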

And that's the difference between using an AI tool to do the work versus using an AI tool to refine the work. The corpus of information has patterns which match well-known topics in the sciences (good for me). I can reliably get some starting points and then move on from there. The difference between this and other wholesale approaches to AI in education is that the human (me) is more heavily involved in the process rather than less. I don't know if I saved much time starting with a brainstorm, but it got my mind running by priming the idea.

And maybe that's a better metaphor. Using an LLM to prime the thinking process can reduce some of the cognitive load of starting cold. I'm still working through my own apprehensions about using AI, including the larger impact of the resources it takes to produce these models at all. I hope that, at least in education, the narrative starts to move away from the "magic" of the tool and picks up more nuance about the implications and ramifications of AI.
