Students need guidance on AI. Are Wichita schools ready?

Wichita and Andover schools have blocked many artificial intelligence tools on student devices as they figure out how to approach the technology constructively.

by Maria Benevento

  • Wichita-area schools are taking a cautious approach to letting students access new artificial intelligence tools, like ChatGPT, in the classroom. 
  • Many educators are focused on preventing students from cheating with AI, but others say they need more nuanced guidance on a technology that could pervade their lives. 
  • Some teachers in Wichita and Andover are using AI as a time-saver and discussing how to help students use it in constructive ways. 

Gwyneth Pic should have plenty of time to figure out her career. 

But the sophomore at Andover Central High School already has a nagging worry.

“What if,” she said, “my job was taken over by a robot?”

That’s a practical concern as advances in artificial intelligence — particularly generative AI tools such as ChatGPT that can create text and images in an eyeblink — let computers perform more human tasks. 

Gwyneth’s lessons don’t cover which career paths might become obsolete. And she credits only one teacher, Mark Fleske, for helping her understand AI in a nuanced way instead of simply warning her about using it to cheat. 

“I’m wondering if (AI will) be something that you’ll kind of use with your job, even,” she said. But “we’re obviously not taught how to work with it, because there are cheaters.”

Wichita-area schools know they need to figure out AI. How it could save teachers time or help students learn. How to avoid AI plagiarism. How to prepare students for a future where AI might be everywhere. 

Some school districts, like Andover, are intentionally taking things slowly to avoid mistakes. 

But even districts like Wichita that say they’re enthusiastic about AI have yet to get all teachers on board — or grant students direct access to AI on school devices. 

Barriers such as rapidly changing technology and overwhelmed educators are real, but students urgently need their schools’ guidance on AI, said Joseph South, chief innovation officer at ISTE/ASCD, an education nonprofit focused on technology and innovative learning. 

“It’s imperative that we prepare our students to engage with (AI) in a way that they have control,” he said. “That’s only going to happen if they understand how it works, and how it can benefit their lives, and how it can also work against their interests. … We can’t just ignore it.”

Andover’s approach to AI in schools

The Andover district isn’t ignoring AI, said Jill Lachenmayr, assistant superintendent of academic affairs. But it’s wary of being too cutting-edge. 

“We know that it will eventually permeate everything that we do as a society,” Lachenmayr said. “But we really want to take a slow and thoughtful approach.” 

Andover has blocked many AI tools, even for teachers, as district administrators try to “wrap our heads around” the potentials and pitfalls of the technology and decide how to best use it, she said. 

Educators who experiment with AI, such as Fleske, a facilitator for gifted education at Andover Central High School and Andover eCademy, and his principal, Amanda Grier, have led some discussions. They helped with a presentation Lachenmayr made at an October school board meeting. 

The district hasn’t sorted out how much AI assistance adds up to plagiarism or cheating and whether some AI use could violate privacy or copyright, Lachenmayr said. 

Grier said her school has led teacher training and discussions. Many teachers talk to students about academic dishonesty and AI. Some have students experiment during lessons, such as a teacher who had students use AI to rewrite their essays in a “yellow journalism” style.  

If the school eventually wanted certain lessons about using AI responsibly to reach the whole student body, Grier said they could be part of “seminar time,” a class period all students have. 

Wichita’s approach to AI in schools

Wichita Public Schools drew national attention for embracing AI in schools. 

Technology coaches spread AI skills to teachers, showing them how to write better prompts and suggesting ways AI can help them plan lessons and assignments that meet curriculum goals and serve students with different needs. 

Using AI well is “actually having a conversation, like you’re brainstorming with the best people that you know,” then vetting the information, said Tana Ruder, a member of the district’s education technology team.  

Administrators recently used AI to analyze feedback on the district’s budget in a way they considered more objective than solely human analysis, helping to steer leaders toward closing schools rather than cutting staff. 

But even Wichita hasn’t yet given AI access on student devices. 

The district is working on a policy, chief information officer Rob Dickson said, but expects it will take time to educate parents and school board members on how schools intend to keep students safe, prevent cheating and protect privacy. 

And word of the AI initiatives hasn’t spread throughout the whole district, said Mike Harris, United Teachers of Wichita vice president. 

“If this is something that’s a big push for them, and is a big thing that they’re proud of, it’s escaped conversations,” he said. 

Training on AI tips and tricks is available, but not mandatory for all teachers, said Jake Thimesch, an English teacher at Wichita Southeast High School. He opted to go to two training sessions but said the district hasn’t given broad guidance on AI strategy. 

“There’s no expectations on how to use it or how to keep the kids from using it, maintaining any kind of integrity when using it,” he said. “They kind of expect us, as teachers, to know if we’re using it correctly or not.” 

Some of his colleagues aren’t interested in adopting yet another new technology, he said. 

Thimesch is more comfortable using it in his own work than encouraging students to use it. He’s afraid they won’t handle it appropriately and could pick up bad habits. 

“We have to make it clear that plagiarism is anything that they didn’t generate themselves,” he said. “You can get kicked out of college. You could lose scholarships.”

Teaching with AI

Imagine an English teacher thinks a student plagiarized an essay with an AI tool. 

The teacher tries an AI detector, which returns an ambiguous result. 

That leaves the teacher on their own to decide whether to accuse the student and defend that accusation to parents, administrators — maybe even the school board. 

Navigating AI plagiarism in one batch of essays can add as much as a full workday, without extra pay, to the plate of an already overworked teacher, said Fleske, the gifted-education facilitator. 

He previously taught English for two decades and said his current role gives him more time to follow his curiosity about AI. But he understands why many of his fellow teachers are wary. 

There’s a sense that “until we can get our brains around the fact that students right now are using it for unethical uses, we can’t even have the energy to get excited about the creative and cool uses,” he said. 

How to stop AI-enabled cheating isn’t the most important question, South said. But teachers need support to ask better questions, like how to show students the ways AI algorithms shape their lives and how to prepare them for the future.

Support could include helping teachers experiment with AI or switch to methods of assessing student learning that highlight students’ thought processes rather than the final product, he said. Schools could also emphasize AI’s time-saving potential and train teachers to use a tool that could be ubiquitous and inevitable in the 2030s. 

Using AI for things like creating multiple versions of quizzes and adapting assignments for special-education students makes his job easier, Thimesch said. But he said it requires expertise to recognize when AI makes mistakes. 

“I’m very confident in my ability,” he said. But “when it’s the night before a lesson has to be made … I’m a teacher. I’m a parent. I don’t have time to always generate those questions.”

Using AI in that way can ease the workload of overburdened teachers, said James Basham, a professor in the special education department of the University of Kansas School of Education and Human Sciences. 

“Yeah, it’s going to cause you grief in some areas,” he said, “but it can also help support you in others.”

Learning with AI 

When one teacher asked his son’s class to write first drafts with AI, Basham thought it was an “extraordinary” idea.

Whether it was more effective than a traditional assignment is an open question, he said. 

“For some students, I’m sure that really helped them accomplish the task. For other students, I don’t know if it helped them or not,” he said. “When you edit something, are you actually learning as much as just writing it?”

Researchers like Basham are studying how AI can best be used to help students learn. 

One application for the technology could be special education, he and several others wrote in a paper published last year. AI could help adapt lessons for different learners or function like a “prosthesis” to help students communicate in ways they otherwise couldn’t. 

But some teachers also want to ensure AI doesn’t prevent students from learning basic skills. 

“The next Emily Dickinson or Walt Whitman, the next Shakespeare, the next Maya Angelou, I don’t think AI is going to thwart them,” Fleske said. But for a regular person, “I think AI is going to devalue the practice of writing … and the understanding of language to be able to craft it yourself.”

AI might also strengthen students’ writing, Fleske said. He likes the idea of using it to brainstorm ideas, collect examples of figures of speech or identify passive voice. 

Gwyneth, the Andover student, said she mostly avoids using AI despite learning from Fleske about some of its positive applications.

“I know there are many ways I can use it for good,” she said. “But I’m always afraid that it’ll be seen as bad, so I just don’t use it at all.”

But some of her classmates don’t have the same qualms. She knows they use it to cheat on assignments or write messages. 

And while generative AI — large language models like ChatGPT or Google’s Bard — has recently captured the public’s imagination, South said other forms of AI could prove even more dangerous partly because they’re less visible.

Students need to be able to see, for example, how social media algorithms drive polarization or create social pressure to throw the perfect birthday party. 

Harris, the union vice president, said he isn’t worried about student access to AI in schools because most can already use it on personal devices or bypass the school’s software. 

He expects success in the workplaces of the future will be based on who’s best at using AI. 

And even districts like Wichita, which he believes is ahead of the curve, are lagging behind new developments, he said. 

“We can’t be quick enough,” Harris said, “because the technology is changing too fast for us to keep up.”

This article was republished here with the permission of: The Beacon