
A look at anti-AI perspectives

Bella Patton, Campus Carrier features editor

Ava Jarrell, Campus Carrier asst. features editor

As artificial intelligence (AI) programs continue to advance, their use in education, daily life and the workplace is still a controversial topic. Students and faculty across institutions all over the world have to deal with the rapidly growing realm of AI chatbots. These platforms are quickly becoming something students rely on to draft essays and work on projects, raising the question of how AI is impacting students’ education and faculty’s experiences when dealing with academic dishonesty. 

On one side of the debate, some students and faculty oppose the use of AI and see dangers in tools like ChatGPT and Copilot. Senior Scarlett Biggers, a library research specialist, is one such person who avoids AI; her work gives her a firsthand perspective on how these tools affect student work.

“People will just have a whole source list of made-up sources that don’t exist that ChatGPT or Copilot or whatever they used had just pulled out of thin air,” Biggers said. “And now, you ask AI to get an article for you and provide a summary of it, but when you try to access it yourself, it’s behind a paywall. The issue there is that these are researchers that are not getting paid for their work, and research publications that are no longer getting paid because we have this sneaky little robot coming in behind them and taking work.”

Her objections also extend beyond academia. As a student, a young person and a future practitioner of communication and law, Biggers worries about the mental toll that extended AI usage can take on our ability to function and think in day-to-day life.

“I personally cannot stand AI in the academic setting or in a casual everyday setting,” Biggers said. “I am disturbed by the increasing casual use of AI. Why are we using AI to make a grocery list? Why are we asking Chat a question that we could type into Google? For that matter, why have Google, TikTok, Instagram and every other major media platform suddenly incorporated AI assistants?”

Ruby Dailey | CAMPUS CARRIER
Scarlett Biggers serves as a library research specialist, assisting students in finding sources and compiling research in the lobby of Memorial Library. Students can normally find her at the desk on weeknights from 6 p.m. to 8 p.m.

She warned of the greater societal implications of integrating so much of our lives with AI. 

“It’s a gateway to making the masses dependent and unwilling to think for ourselves,” Biggers said. “It’s a willing illiteracy.”

Many syllabi across departments have begun to limit students' ability to opt out of using AI, with justifications often pointing to projected applications in the workforce and a fear of being left behind. Biggers acknowledged this reasoning but said that ethical and environmental considerations should also be taken into account.

“I think there at least needs to be discussion when we use AI of the implications, both ethical and environmental,” Biggers said. “Implications of how much water it takes, the land that’s being cleared to support these data centers, as well as the ethical implications of stealing other people’s research, not giving credit to the researchers, and neglecting certain aspects of the research in favor of confirmation bias.”

Biggers said that the surest way to avoid dependence on AI is simply to not use it, arguing that work containing human error is still better than work created by generative AI.

“We have to be willing to do things the hard way,” Biggers said. “I know that it’s a whole lot easier said than done, and a lot of us have several classes and we put things off at the last minute, and it’s easy to prompt AI to give us what we need; however, sometimes it’s better to turn it in late if it means you did it yourself.” 


On the educator side of the AI conversation, many professors are also worried about the implications of AI in academic spaces. Over the last three years, most professors have had to create an AI policy specific to their classroom. Victor Bissonnette, associate professor of psychology, said that while he can appreciate some aspects of AI, academic dishonesty and the loss of learning skills are serious issues posed by AI.

In his research classes, Bissonnette has students do literature reviews where they visit the library and search for articles. Bissonnette said that the only time he allows his students to use AI is to find research articles. 

“Where I draw the line, and the rest of my policy is, students may not use generative AI in any way to generate text for the writing assignment,” Bissonnette said. “For example, if they’re reviewing a journal article, they’re responsible for reading the article themselves, capturing the major points in the article in their own notes and then writing their own summary themselves. I’m really strict. I tell the students, ‘You must put your fingertips on the keyboard and type every word. You may not use generative AI.’”  

Bissonnette said that a lot of the time, students want to use AI on their assignments because they don’t recognize the academic value in completing the work themselves.  

“Think about why we assign you that paper,” Bissonnette said. “Was it just to produce a paper? No. It’s not about the product. Now, I know someday in a workforce, you might have to produce a product, like a memo or a paper, a summary or something, and AI might be useful in saving you time and doing some of that work. But as a student, of course, the reason we assigned that wasn’t about producing the paper. It’s about you producing the paper. It was about the intellectual activity of you understanding what you’re reading. Your ability to organize your thoughts and your ability to express your thoughts.”

Bissonnette also said another reason students could be drawn to AI is because they want to avoid receiving negative feedback on assignments.

“The other big issue, I think, why we’re getting so much AI abuse, is that students cannot handle their own imperfection,” Bissonnette said. “As student writers, you all are developing your writing skill, and for the most part, when we’re undergraduate students, we’re all pretty terrible. We are not gifted writers when we’re at that age, at that level of education. Learning to write is a lifelong journey of growth and getting better over time. So, I think when students face a difficult challenge, and they don’t quite understand what they’re doing, and they’re really not certain they’re doing good work, they fear getting negative feedback.”


Bissonnette further said that using AI on assignments is cheating, a word he uses very purposefully with his students. He said that last semester, over half of his meetings with students about academic dishonesty concerned the use of Grammarly to rewrite papers. Even though Grammarly is advertised as an AI-powered platform, he said many students were surprised by his accusation of academic dishonesty because they didn’t view using Grammarly as an AI violation.

“They said, ‘Well, I did not use AI, I used Grammarly,’” Bissonnette said. “Which is an AI platform, and I had to explain that. Go to the website, and look at their advertisements. They are advertising that they use AI to rewrite your paper. It is absolutely an academic dishonesty infraction.” 

Looking forward, Bissonnette hopes that educational spaces can adopt AI in ways that don’t encourage cheating in the first place and that future instructors will find ways to be enthusiastic about assignments. 

“I’m hoping to find ways where AI will be a partnership in the learning process rather than a simple shortcut to circumvent the learning process,” Bissonnette said. “This is difficult, and we’re trying to be open minded. We’re trying to be fair. I don’t want to just be an old fuddy duddy who says, ‘No, this new technology’s evil. Make it go away.’ No, no, no. We’ve got to live in the world, we’ve got to live with the students. We got to work with the students we have. We got to meet you where you’re at. So we’ve got to be constructive and find collaborations between new technologies and the underlying learning goals that we have.” 
