AI is everywhere. How should schools handle it? Teachers’ different approaches show its potential — and limits

Author: Kristen Taketa
Published on: 2024-10-29 12:00:53
Source: Technology – San Diego Union-Tribune

In Jeff Simon’s math class at Sage Creek High School, students are not only allowed but encouraged to use AI.

Simon introduces students to artificial intelligence tools that explain, step by step, how to solve a math problem — all they have to do is take a picture of it. There are AI tools that graph equations and math chatbots that students can ask for help on a problem, as if asking a teacher. With AI tools, students can quickly check their answers before turning in classwork or homework.

To Gabriel Raposo, a 15-year-old sophomore in Simon’s Intermediate Algebra class, “it’s kind of like a private math tutor.”

Simon, a teacher of 31 years with an electrical engineering background, is unique among Gabriel’s teachers in how much he embraces AI.

But more teachers are embracing it, or at least incorporating it into their jobs. Schools around the country, including in San Diego, have been ramping up efforts to train teachers on using AI in the classroom — both in showing teachers how they can use AI to make their jobs easier, and in teaching students the proper and ethical use of AI.

With AI tools already prevalent not just in daily life but in the workforce, schools now find themselves needing to navigate several questions: When is it appropriate for students to use AI, and when is it not? How will schools teach students about AI literacy, including how to detect whether something is AI-generated or human-generated?

How can teachers prevent AI cheating and plagiarism and ensure students are truly learning, rather than relying on AI to come up with answers? How can schools and students monitor the content AI produces for bias and inaccuracy? And how can teachers incorporate AI without compromising human relationships and creativity?

“We’re trying to teach the ethical use of these new tools that are just going to keep on growing,” Simon said.

California doesn’t yet mandate any policies or rules for schools about AI use or teaching, but state guidance says students and teachers should be taught about safe and appropriate AI use and how AI produces content. A new law approved last month requires the state’s Instructional Quality Commission to consider incorporating AI literacy into curriculum frameworks for math, science and social science.

“In this age of AI, it is essential that both educators and students demystify this technology and grasp how it produces output,” state guidance says. “A conceptual knowledge of the benefits and potential risks of computing technologies is increasingly relevant for our students and educators alike.”

‘Give me all the tools’

Simon introduces AI tools to his students at the beginning of every course. He shows students how to use them, and for what purposes. He collects their feedback on the tools and updates a list on his website of their favorites.

“People are so afraid of this stuff, and where’s our future going to go with this?” Simon said. “I’m just of a certain orientation where I’m like, ‘Screw that, give me all the tools out there.’”

Discussing them up front with all his students takes away any “sneakiness” factor. AI also helps level the playing field for students, Simon said.

Before AI, individualized help and resources such as private tutoring were largely available only to those who knew about them or could pay for them. Now, everybody has somewhere to check their answers or get help, Simon said.

AI also addresses a key problem inherent to Simon’s classroom: He has about 40 students and only 70 minutes per class, so there’s not enough time to help every student individually. AI tools can answer students’ questions when he is unable to.

Katelyn McNamara, 15, compares her math solving steps with a classmate’s at Sage Creek High School on Tuesday, Oct. 22, 2024, in Carlsbad. (Nelvin C. Cepeda / The San Diego Union-Tribune)

“At first, I didn’t really like AI. It was always seen as a bad thing,” said Katelyn McNamara, a sophomore in the same class as Gabriel. “But now that Mr. Simon introduced us to so many tools, it’s very helpful in math … I kind of like it now. It helps me a lot when I’m confused.”

If they’re running short on time to complete a math assignment, Simon’s students said they will sometimes use AI tools just to generate the correct answers to their remaining math problems so they can turn them in.

That’s not against the rules in Simon’s class — but it still has its own consequences.

“Instead of punishing us by lowering our grade or giving us a zero, the punishment is us not learning and then, on the test, getting a horrible score,” Gabriel said.

Simon’s grading rubric places far more weight on tests than on busy work such as homework and class assignments, and during tests, students cannot use any AI. So it’s during the tests that students are held accountable for their learning, Simon said.

Luka Esquer, who is also in Simon’s class, doesn’t use AI just for math. He also uses it for English — not to generate essays, he said, but to get feedback from ChatGPT on how to improve his drafts. He asks ChatGPT to grade his essay draft using his teacher’s rubric, then edits his draft himself before submitting.

Katelyn uses AI when performing research for English or history papers. Tools like ChatGPT will quickly summarize the main bullet points about a topic for her.

But AI tools are not perfect. Sometimes AI will suggest ways of solving math problems that are confusing and don’t align with what Simon taught them, the students said.

Katelyn and classmate Lucca Chao also said they are wary of relying too heavily on AI. They worry that if they use it too often, they won’t challenge themselves enough.

“Sometimes you just use it easily to show steps, even though you probably could figure it out,” Katelyn said. “It kind of takes away from you pushing yourself to try to look at different solutions for the problem.”

“You don’t want to use it on every problem,” Lucca said. “Sometimes I do feel guilty.”

There is one thing that has surprised Simon about AI: He had expected it to improve student grades. Instead, he found that his classes’ performance stayed relatively flat after he introduced AI tools, in line with previous years’ results.

“Math is a tough one to move the needle,” Simon said.

‘A passing score at best’

The freedom to use AI differs by teacher and by subject. In English teacher Katrina Waidelich’s classes, its use gets more scrutiny and restriction.

The Carlsbad High School teacher needs to make sure her students don’t submit AI-generated content for their essay assignments. Waidelich said such AI use would fall under Carlsbad Unified’s academic dishonesty policy, which doesn’t explicitly mention AI but bars students from copying work and receiving “improper assistance on an assignment meant to be individual work.” In other words, students can only use AI tools when and in the ways that their teachers allow them to.

Also, as an Advanced Placement Seminar teacher, Waidelich is required by the College Board to check the authenticity of her students’ work. The College Board says students can use AI tools to explore topics, search for sources, check their writing for grammar and tone and confirm their understanding of a complex text, but they must read primary and secondary sources and analyze texts on their own.

To monitor student work for AI cheating, Waidelich uses tools like Classwork Zoom and Draftback, which show the history of students’ writing in a Google Doc, including when they were typing in the doc and even the history of their keystrokes. So if Waidelich suspects a student copied a paragraph from AI or elsewhere, she can look in their doc history to see if that paragraph suddenly appeared in one keystroke. She also uses plagiarism checkers from Google Classroom and TurnItIn.

Waidelich shows her students not only in which situations using AI is against the rules, but also that AI won’t necessarily do a good job of writing essays for them.

To show students the limitations of AI, Waidelich has taken AI-generated paragraphs and shown her students how they would fare under her grading rubric.

She had students analyze a sample course welcome letter and discuss how AI was unable to reflect the audience’s needs or the author’s bias. The letter was also missing key specifics like the name of the teacher or the name of the course.

“It’ll get you maybe to a passing score at best. But it’s not going to get you to an A or a B,” Waidelich said.

While Waidelich’s classes bar students from submitting AI-generated essays, Waidelich still lets students use AI tools to get feedback on their drafts. For example, she showed them one bot that can read their proposed research question and advise them on how to narrow or focus the question. And she prefers tools that suggest feedback for students, rather than tools that automatically change or correct their work.

‘Something teachers shouldn’t be afraid of’

Both Simon and Waidelich also use AI in their own work as teachers.

Simon developed his own AI platform, called HappyGrader, which launched this year. It grades students’ tests and provides feedback on them.

It’s cut his grading time in half. He can now give students their scores the same day as the test and give everyone detailed feedback — both impossible before.

“It’s giving us this extra strength that we didn’t previously have,” Simon said.

Jeff Simon works with one of his students on her classroom math assignment at Sage Creek High School in Carlsbad. (Nelvin C. Cepeda / The San Diego Union-Tribune)

Another bonus of using AI, he said, is that it is more consistent and less biased than human grading.

“Everybody is getting graded by the same standards. We’re all grading the same way, because it’s machine-assisted,” he said.

Waidelich is a technology coach for Carlsbad Unified and has led sessions for teachers on how to use AI. She personally uses AI tools to give students feedback, but she avoids using them to assign grades to student work.

“Ultimately, I feel responsible. If a student asks, ‘Why did I get this grade?’ I should be able to articulate (why),” she said.

San Diego County’s largest school district doesn’t yet have a policy or guidelines about AI. But San Diego Unified administrators are working to dispel teachers’ fears about AI and show them the many ways it can benefit them, said Julie Garcia, the district’s senior director of instructional technology.

For example, AI tools can analyze students’ answers to a math problem, then tell the teacher which students struggled with which aspects of solving it. AI could suggest next steps for teachers, such as whom to pull into small groups.

“Imagine a teacher having that kind of data at their fingertips,” Garcia said. “It has potential. It’s something teachers shouldn’t be afraid of.”

She cited the example of MagicSchool, a popular AI platform that can handle many key aspects of a teacher’s job. It generates lesson plans, slide presentations, assignments, quizzes, tests and even special education plans for individual students. It grades papers and tests and gives students individualized feedback. The platform says it has 3.5 million teacher users.

MagicSchool cautions that teachers should still provide a human touch to the content its AI tools generate. And it requires teachers to sign off on several guidelines before using its platform, including that they will check AI content for bias and accuracy, use it only as a starting point and protect student data privacy.

“We don’t want AI to replace our teachers. We want AI to enhance what our teachers do,” Garcia said.

Garcia said that this past summer, San Diego Unified held an AI expo where it taught more than 150 educators how to use AI tools, check AI content for bias, redesign class assignments to reduce potential AI “cheating” by requiring student creativity, and teach students about ethical AI use and AI proficiency, such as how to distinguish AI content from human content. The department has also been discussing how to keep students’ data private and safe when they use AI tools.

The district is convening a task force that will draft district guidelines for AI use by June of next year.
