Note to punctuation geeks: Most style guides write it as AI without periods, but I think that’s visually confusing – “Who is this Albert guy you’re writing about?” – so I do it this way.
If you’re wondering whether the technologies known as “artificial intelligence” belong in schools, you’re way behind the curve. They are already being used — and not by students cheating on essays.
“I started by developing an A.I. literacy curriculum, helping teachers bring lessons about A.I. to their classrooms. In 2021 it was very niche; no one was interested. 2023? They’re knocking down the doors, saying, ‘Not only do I need my students to understand this, I need to understand this,’” said Randi Williams. Interest has only grown since then, she said.
Williams, who holds a Ph.D. from MIT and is about to become a professor at Carnegie Mellon University, was one of several presenters in a room full of New Hampshire educators at NHTI. The day-long summit on Tuesday, March 18, was sponsored by the state Department of Education to consider if, when, where and how teachers should use A.I.
“It’s less about the tool and more about the system you can put in place with the tool,” Williams said.
This isn’t entirely new in New Hampshire. Since last June, the non-profit Khan Academy has been piloting Khanmigo, its A.I. teaching assistant and tutor, in the state. The service was just extended for another year.
I admit to being a skeptic about A.I., but it’s changing so fast that it’s hard to be too dogmatic. These sessions weren’t shy about the drawbacks of artificial intelligence, which despite its name isn’t intelligent at all. Mostly, it’s a predictive engine that counts examples of what human beings have done in billions of online situations – like which word people have most often written after “elephant” in a sentence, which color they’ve used in a picture next to a black pixel or which change they’ve made to a particular tempo and tone in a song. Then, it uses that information to choose what to do in a similar circumstance.
Importantly, A.I. doesn’t know if it has made a wrong choice — it doesn’t “know” anything — so sometimes it makes terrible mistakes and reflects egregious bias. That’s why using A.I. is a bad idea in many circumstances.
With that limitation in mind, the presentation discussed various uses for A.I. in the classroom, both potentially and in real practice. Some were boring, like drawing up class schedules; some were helpful, like analyzing an exam and suggesting an alternate version to be given as a make-up for students who missed the first one; and some could be revolutionary.
“The early adopter teacher is using it to help run class projects, to help create rubrics, then having students augment the rubrics and having the students reflect on their own work. They’re just completely revolutionizing what they’re doing in the classroom,” Williams said.
For example, A.I. could create multiple difficulty levels of the same text to keep the whole class on the same track: third-grade level for this pupil, fifth-grade for that one, tenth-grade for the bored kid in the corner.
However, Williams added, “I’d say there are very few doing innovative things.” Most teachers and administrators are tip-toeing into the A.I. field — if they’re using it at all — a contrast with industry. “In business, everyone is using it: ‘If you can save me 10 minutes, you’re my guy.’”
Williams presented at Concord’s community college as part of Day Of A.I., a not-for-profit program created by MIT to help schools consider A.I. by giving them a blast of orientation on a single day. Sessions at NHTI included “fair and responsible A.I. in education” and “developing students’ A.I. literacy.”
Not much of the discussion concerned students using the technology to cheat. Jeffrey Riley, a former Massachusetts commissioner of education now on the Day Of A.I. staff, said, “We recommend that high schools address this issue head-on, by providing guidelines in their student handbooks… We believe that students should always disclose when they use A.I. in their work. At the same time, we also believe teachers should disclose if they want their students to use A.I. or not.”
Part of one discussion mentioned how A.I. could be used as a rudimentary teaching assistant, giving individual feedback to students. Done right, Williams said, this approach can “give students more agency and more support than a teacher can give as one person with a classroom of 30.”
That use touches on fears about A.I. being used to displace humans, fears that the presenters partly, but not entirely, assuaged.
“A.I. won’t replace people. But people who have A.I. skills will probably replace those who don’t,” cautioned Riley.
If nothing else, he said, schools can’t ignore A.I. any more than they were able to ignore calculators or word processors when those disruptive technologies came along.
“A.I. is already here,” he said. “We’re all going to have to grapple with it as a society.”
I’m currently reading “Bonk,” Mary Roach’s book about the “Curious Coupling of Science and Sex.” And since it was written back in the Dark Ages (2009), A.I. refers to “artificial insemination.”