We got talking about ChatGPT yesterday at work and surprisingly, none of us have really been asked by teachers to block it from students. It could be because they haven't heard about it yet or because we're in final exam week and students aren't doing a whole lot of work aside from wrapping up for the semester.
Feelings among the coaches were mixed. We understand the anxiety that comes from the publicity ChatGPT has drummed up and from some of the examples we've tried out. At the same time, if you actually try using it yourself (requires a login), you'll quickly discover that it isn't as scary as it sounds.
A tool is a tool
At the end of the day, ChatGPT is a computer program which takes in a question and gives back a human-ish response. You can ask about anything (for the most part - it couldn't tell me about myself even though I'm on The Internet) and the site will give you a response summarizing the topic. The summaries were okay and, I will admit, the code samples were cool to see created on the fly.
But here's the thing - it's a summary machine. It gives these responses based on information it has already been given. If you're a teacher or student looking for an interactive method for summarizing information, this is a great tool because it can take natural language prompts ("Tell me about the solar system") and provide information quickly.
Finding teachable moments
If you're not looking for summaries and are more concerned about students making less-than-genuine submissions to your assignments, don't lose any sleep. The responses from the machine are very dry and are easy to spot. If you're taking time to actually read what's coming in, you'll be fine.
For other, less subjective submissions, here are some ways you can use ChatGPT to push your assignments up toward Synthesis and away from Knowledge on Bloom's taxonomy:
Since it's a summary machine, consider generating a summary via ChatGPT on your own and then using the response as a close-read and editing activity with your students.
- Is the summary factually correct?
- Is there extra, unnecessary information that can be removed?
- Is there context that should be added?
I also gave it a short prompt to write a story and it gave back a passable response. It was creative in the sense that it followed the prompt ("Write a story about a penguin named Sparky who moves to the rainforest.") and gave a story with a start, middle, and end. If your students do this, here are some questions to ask:
- Who owns a story once it's written? The person with the idea or the writer?
- Can this be edited to have a better story arc?
- Is the resolution satisfying? What makes a satisfying resolution?
One of the big breakthroughs with this model is the ability to generate code samples on the fly. Learning to code can be frustrating because we might not always have the mental model to do what we need to do. Giving ChatGPT a prompt like, "Write a program which generates random odd numbers in python" will give you a working program. Use this as a starting point:
- Is this the best way to accomplish that task?
- Can you refactor it into something more concise?
- How would this type of program be useful?
- If you work for a company and you use code from ChatGPT, who owns it?
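To make the refactoring question concrete, here is a sketch of what that exercise might look like. The first function is the kind of retry-loop solution ChatGPT (or a beginner) might produce for the "random odd numbers" prompt; the second is a more concise refactor. Both functions and their names are my own illustration, not actual ChatGPT output.

```python
import random

def random_odd_loop(low=1, high=99):
    # A typical first-draft approach: draw numbers and retry
    # until one happens to be odd. Works, but wastes draws.
    n = random.randint(low, high)
    while n % 2 == 0:
        n = random.randint(low, high)
    return n

def random_odd(low=1, high=99):
    # A more concise refactor: step through only the odd numbers
    # with randrange, so no retries are needed.
    # (Assumes low is odd so the sequence starts on an odd value.)
    return random.randrange(low, high + 1, 2)

print(random_odd_loop())
print(random_odd())
```

Comparing the two versions in class opens up exactly the questions above: both are "correct," but one shows a better mental model of the problem.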
Wolfram Alpha has had an equation solver for a while, but this goes a little further because you can ask ChatGPT to validate a proof or equation. I tried giving it a proof that "proves" 1 = 0 by including a subtle logical fallacy. The machine tells me it isn't valid, but it does a poor job of explaining why.
- Provide students an explanation of what is happening (created by ChatGPT) and then improve it.
- Give students a challenge and ask them to validate their responses using the AI.
- How could using AI to evaluate mathematics change the way we think about math?
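I didn't reproduce the exact proof I gave it above, but for reference, one classic version of a "1 = 0" argument (not necessarily the one I used) hides a division by zero:

```latex
\text{Suppose } a = b. \text{ Then:}
\begin{align*}
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b && \text{(dividing both sides by } a-b\text{)} \\
2b &= b \\
2 &= 1 \quad\Rightarrow\quad 1 = 0
\end{align*}
```

The flawed step is the division: since $a = b$, the factor $a - b$ is zero, and dividing by zero is invalid. Asking students to pinpoint that one bad line is a much richer task than asking them to verify a correct proof.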
I'm not going to lie, this one was cool. Google Translate already exists and we know students use it. ChatGPT differs in that I could prompt it for regional dialect and formal vs informal responses. It will even provide phonetic responses if requested.
- Quickly generate prompts for students to translate or to analyze in class (less work on you)
- Compare and contrast different methods for translating a piece of text
- Summarize rules for translation for reference
What do we really want to teach?
ChatGPT is a summary machine. It can be used as a way to quickly get information to use as a starting point, and that's the key. It does not cite its sources, and that's where the teaching comes in. Evaluation and synthesis are the next steps to actually do something with what we know. This is an emerging tool and we don't know what wider impact it will have in the future. For now, I would recommend thinking deeply about what we want to teach and how a powerful source of summaries could be used.
Don't forget that it is still susceptible to errors. Even a calculator will always give an answer, but that doesn't mean it is the correct one. Teach students to develop critical habits and to check what they're given to make sure it's factually correct. Train them to look for errors by inviting them to challenge ideas and ask questions.
ChatGPT is impressive and AI is only going to become more impressive. Take some time to think about the implications of these tools as they relate to your teaching practice. What kinds of questions are worth asking?