AI use has become more common, but unclear guidelines leave many confused

By Eloise Kolofer

kolofere@findlay.edu

As artificial intelligence becomes more common in college classrooms, students and professors say the real issue isn’t AI itself, but confusion over how it’s allowed to be used.

While the University of Findlay has guidelines on AI permission levels in courses, many students are unaware of them, leading to confusion.

Andrea Simmons, an assistant professor of teaching in marketing who teaches courses about AI, has seen students focus on what individual professors allow rather than on any overarching rules.

“As a whole, the university does not have a policy, but guidelines,” Simmons said. “Students are less knowledgeable about an overarching rule from the university. They’re most familiar with what their professor tells them is possible.”

UF student Zoe Cuvo said AI helps her better understand material she struggles with in class.

“I mostly use it to help me study,” Cuvo said. “It helps explain things that are hard to understand, and I like it for asking for more examples when I’m practicing problems.”

A common concern among professors is that students may rely on AI to complete assignments, undermining both the learning experience and academic integrity.

“There’s concern that students might see AI as a quick and easy way to get their homework done without critical thinking,” Simmons said. “We want students to critically think and learn the academic content, not replace that with AI.”

Simmons also said that inaccurate information and plagiarism are risks of careless AI use, and that AI responses generated for multiple students can be nearly identical, increasing the chance of being flagged for plagiarism.

For students, the lack of consistency in AI policies is a major frustration. Cuvo said she often hears complaints from classmates about differing rules.

“A lot of students get annoyed because some classes have different rules,” she said. “Some people feel like they’re getting in trouble even when they’re using AI the right way, or sometimes when they don’t use it at all.”

Student Elaina Smith said she follows AI rules closely but understands why students want clearer guidance.

“Most students are requesting fewer restrictions because they see AI as a helpful academic tool and want clearer guidelines that allow them to understand more,” Smith said.

Smith also said she uses AI after completing assignments to check her work for clarity and make sure she followed rubrics, a practice she sees as responsible and educational.

AI policy enforcement ultimately relies on human judgment. While tools such as GPTZero and Turnitin claim to flag potential AI use, Simmons pointed out that those tools are only a starting point.

“I never take the monitoring tool at face value,” she said. “I look at it, then I communicate with students. The human element is always important.”

Simmons said she plans to incorporate AI more into her digital marketing and leadership courses as the technology evolves, while students predict schools will focus more on teaching responsible use rather than banning tools outright.

As AI continues to shape education, clearer, more consistent guidelines could help bridge the gap between innovation and academic integrity, allowing students to benefit from the new technology without sacrificing their education.