Abigail Dunagan, Campus Carrier features editor
Cammie Wilks, Campus Carrier asst. features editor
It is no secret that technology impacts everyday life, especially for college students. Years ago, everything was done with pen and paper, but now students use the internet to email professors, write essays and submit assignments online. A newer development being talked about more often is artificial intelligence (AI). AI use has become normalized in recent years, and now students rely on it to the point where faculty at Berry have taken notice.
Every professor at Berry sets their own AI policy, allowing as much or as little use as they see fit. Professor of Communication Brian Carroll takes an optimistic view of AI in both the workforce and the classroom. He sees AI as a tool that can be used for good or ill, but believes it is important for students to become familiar with its uses. To that end, Carroll assigns projects that require students to use AI. In his COM 415 class, he encourages students to use it to begin research for a literature review. That work is not turned in, but it helps students get a jump-start on identifying previous research on their subject.
“I call them AI-eligible projects,” Carroll said. “The learning objective is to become facile with generative AI. In the same way when the internet was born, we were offered this vast new universe of tools and knowledge, so why wouldn’t we?”
According to Carroll, ethics must be considered when utilizing AI. While the boundaries of ethical use are still being worked out, Carroll said it must involve transparency and disclosure, along with an understanding of when AI can be beneficial and when it should not be used. Although AI is good at transcription and at summarizing large amounts of information, it can also produce confident but false output, often referred to as “AI hallucinations.”
“AI does not seem to be particularly good yet at specific sourcing,” Carroll said. “So, if you are writing a term paper and it requires that you cite each and every one of your sources, generative AI is going to let you down.”
Although professors’ individual approaches to AI in the classroom differ, Associate Professor of Teacher Education Chang Pu aims to teach her students how to use AI in a manner that will benefit their future students. Many K-12 school districts use a program called “Magic School AI,” along with a variety of other tools, to assist teachers in the classroom. Some teachers even use AI to evaluate student test scores or to communicate with parents. These tools can save teachers a lot of time, but it is still important that teachers understand how they work.
Not only do students use computers in class, but most assignments are completed and submitted online.
“I constantly urge my students who are going to be teachers to think about how they are going to perceive their bot,” Pu said. “How are you going to interact with your robot? How are you going to train your AI model to work for you, while at the same time not losing that human connection?”
As with any new technology, there is a concern that students will use it to cheat on their homework. While this is certainly a possibility, Pu believes that this can be prevented if the teachers are familiar with the way that AI works so that they are prepared to educate their students.
“If you ask ChatGPT to create an essay, it doesn’t actually help you,” Pu said. “AI can be a great writing tutor. It can generate different types of essays in different tones. There is a lot of research in the field about how to leverage AI to help students develop writing skills. But of course, plagiarism and cheating are things that concern teachers a lot. It is about educating students.”
Since the ethics of AI can be difficult to navigate, some professors have elected to stick to traditional teaching methods. Associate Professor of English Christina Bucher has been teaching at Berry for nearly 30 years. Outside of grammar and spell checking, her policy prohibits AI use on any assignment. Most of her classes are for English majors and require students to write literary essays, and she has encountered situations where students used AI to write their papers. Despite this, she tries to trust her students to write their own work.
“I want them to learn to write themselves,” Bucher said. “I want them to learn to make an argument alone and to polish their style on their own.”
Even though Bucher has a strict AI policy, she still allows its use in class within reason. Students may use it to summarize readings if they are confused; however, they must explicitly state what they used, or they will not receive full credit. Although Bucher acknowledges that she cannot always tell when a student has used AI on an assignment, she still tries to trust her students to use it for the right reasons.
Many students use their computers to take notes in class, and sometimes are encouraged to use AI. Students work on their computers in Assistant Professor of History Kelsey Rice’s class in Evans 156.
Bucher has experimented with AI in the past. While she does not use it for lesson planning or teaching, she once asked AI questions about the class readings to see whether it would answer correctly. It got the first two questions right but missed the last one. Since then, she has been doubtful about AI.
“I’m nearing the end of my career,” Bucher said. “I don’t really have the time or the energy to invest in learning AI, even if I was interested in doing it, and I’m not when it comes to writing. I want students to write using their brains.”
Associate Professor of History Christy Snider has helped introduce AI to Berry’s faculty and staff. She leads a group for those interested in learning about AI, its new developments and how to solve problems with it. A group of Berry faculty is also participating in a discussion series hosted by the American Association of Colleges and Universities (AACU) on using AI in faculty, staff and administrative work. Snider, along with other faculty members in the group, created a survey for students. According to their results, out of 262 participants, four out of five students reported that at least one professor had encouraged them to use AI.
According to Snider, students and faculty have come up with creative ways to use AI, and she has built assignments that teach her students how to use it ethically and to simplify their work. Once, she even used AI to translate a difficult reading into a comedic speech so that her students would understand it better. While there are worries about AI, she is confident it can make life simpler.
“I do have some concerns,” Snider said. “However, I don’t think there’s a way to put AI back in the box, so I think it’s better to try and teach people how to use it well than to say you can’t use it at all.”
Despite the range of opinions educators hold on the topic, it is clear that AI is not going away anytime soon. It is crucial that society expands its knowledge of the tool and learns to use it ethically, without compromising academic integrity.
