Artificial Intelligence changes educational perspective

Generative Artificial Intelligence (AI) is a relatively new and emerging tool that allows users to create various forms of content, including text, visuals, and code. In academia, AI has raised questions about how students and faculty may use the tool.

One Minnesota State faculty member who has been monitoring AI advances is Assistant Professor of Geology and Soil Science Beth Fisher. She attended a training in 2022 about AI to explore the tool’s possibilities.

“As I learned more, I realized that I needed to make a decision about how I use this with my classes,” said Fisher. “I’m a parent of teenagers and they use the internet in a massively different way than I do. And I figured there are ways to use these tools productively and there are ways to use them unproductively.”

One of the most controversial ways to use AI is to have AI do the work students are supposed to do. Plagiarism and academic dishonesty are serious issues on a college campus. And things got worse during the COVID pandemic.

According to National Public Radio News, there were 1,077 reports of academic misconduct at Virginia Commonwealth University in 2020-21 — triple the amount from the previous year. Cases of cheating at the University of Georgia more than doubled, from 228 cases in the fall of 2019 to more than 600 last fall. At Ohio State University, cheating incidents are up 50% over the year before.

Nevertheless, Associate Professor and Graduate Coordinator for Data Science Rajeev Bukralia states that AI is not always as useful as many believe. He stated that ChatGPT provides confident responses to users’ prompts even when the response is incorrect.

“When students don’t know the material, I can see it by how confident they are,” said Bukralia. “But AI is very confident even if the information it is providing is not relevant.”

Fisher is encouraging her students to explore and learn about emerging tools in a way that allows them to play with AI without violating class policies. 

“Yes, students are going to cheat. They’re using all sorts of things on the internet to cheat,” said Fisher. “And I don’t think it’s a constructive, productive job as a professor to prevent cheating. I think it’s my job to teach in a way that it’s not even a relevant task for students to go try to cheat.”

Moreover, Fisher says she’s even used AI to create part of her syllabus on using AI for class purposes. The statement reads, in part: “Feel free to discuss AI-generated ideas with your peers, encouraging collaborative learning. However, be cautious not to directly reproduce the generated content in your submissions.”

A similar approach was discussed at a faculty training by English professor Kelly Moreland. She explained that it would be wise for teachers to indicate the degree to which they allow AI use, on a spectrum ranging from “never use AI” to “always use AI.”

“Both feel equally ridiculous to me to say that we can never use it or that we always have to use it. It’s not that realistic in either capacity,” said Moreland. “We were talking about how a syllabus statement should say something in between, where you use it to help get your writing started, but then you always tell your teacher how you used it and how you revise based on the initial suggestions that you got from AI.”

Sophomore Aruzhan Betigenova, a computer science major, said AI has only been briefly mentioned in her classes.

“Usually, it’s not that we learn about it, but it gets mentioned, especially in my computer science classes. What usually is mentioned there is that we should not use AI for class purposes because it’s plagiarism,” said Betigenova. “But in my computer science classes, we usually talk about the industry and how it’s changing and growing. So it just depends on the class.”

The changing industry is also a topic of discussion among students in engineering classes. Professor of Integrated Engineering Rob Sleezer believes that his role is to prepare students for their future careers, which will involve the use of innovative tools such as AI.

“If my student engineers are not trying to use these tools to be more effective with their time I am disappointed. We have a duty to ensure that our graduates are well prepared for their futures,” said Sleezer. “If we are going to be competitive in the future job market, the job market where these tools are adopted and have transformed society, we need to be well versed in them.”

Sleezer agrees that students can be more productive in academia and in their future careers by using generative AI. However, he also believes that students still need to learn the basic concepts.

“For example, I’m going to teach my children how to do basic arithmetic, even though calculators have largely displaced the need for that,” said Sleezer. “Do they have to be as good at it as a student in the 50s or the 60s? Probably not. Do they need to understand the core concept? I believe so. And distilling those core concepts and sometimes making it clear where we’re exercising those muscles versus exercising our broader capability to lift something is an analogy that I think about.”

Moreland said that AI depends on human contributions to function the way it does now. She believes there will never be a point when human intervention is unnecessary.

“The way that AI works is jumbling together writing based on texts that it has been fed,” said Moreland. “Which means that somewhere along the line, texts still need to be produced by humans in order for AI to be smart enough to do what it does.”

Bukralia said he believes the fine line between using AI as an assisting guide and violating academic honesty lies in communication and transparency with professors.

“Transparency means make sure you look at the syllabus of the course. Make sure you talk to the professor in advance about whether or not they think GPT use is allowed,” said Bukralia. “As long as a student uses critical thinking, I am OK with that. If I tell students that for this work you can use GPT, but not for other types of work, then they should follow those guidelines.”

Header photo: Technological advances in A.I. such as ChatGPT have made it both easier and more difficult for professors to grade students’ work. 

Write to Amalia Sharaf at
