
With the relatively recent rise of common-use Artificial Intelligence (AI) such as ChatGPT, there has been an ongoing debate over the ethics of using it for educational purposes.
Online sources do not always agree on how to define AI. Most definitions hold that AI is a technological tool performing complex tasks that would normally require human problem-solving, but that description still casts a wide net over what AI is and what it does.
Many people similarly struggle to say what counts as AI and what does not. While people might not think of grammar-checking tools as AI, many grammar checkers have recently incorporated AI to restructure sentences or correct grammar.
The difficulty of defining AI becomes apparent when it comes to creating rules and regulations around it. Some strongly believe that using AI is plagiarism yet do not think of using Microsoft’s grammar-checking tools or Grammarly as using AI.
At GCSU, there is an ongoing discussion about the use of AI in the classroom. Multiple GCSU faculty members mentioned committees being formed to handle AI and rules possibly in the works to address AI at a university-wide level.
An AI Policy Review Committee was recently created, giving faculty and staff a way to weigh in on how AI will affect the college.
Craig Callender, an associate professor of English, considers using AI to write a paper to be plagiarism and wants to remind students that one type of plagiarism is taking an outside source’s ideas without giving credit.
“While I do think using AI to write a paper is plagiarism, I consider it ethical to brainstorm ideas or check a text for grammar and style with AI,” Callender said.
Nicholas Creel, an associate professor of business law who teaches business ethics classes, believes that for idea generation and grammar checking, AI is acceptable, but that anyone using it for more complex work without disclosing its use is in dangerous territory.
“In a period of a lot of ambiguity, I want a lot of clarity,” Creel said.
Jim Berger, the director of the Center for Teaching and Learning, believes there is no clear line when it comes to plagiarism and AI. Like other faculty, he suggests professors be proactive about addressing AI usage, whether they are for or against it.
Some professors have made tests more difficult, for students and teachers alike, in an effort to prevent AI usage. Professors have also employed LockDown Browser to block AI use and cheating.
Many professors, however, do not use AI-detection tools because of their accuracy issues.
“Any percentage of false positives opens up the door to someone being accused of something they didn’t do,” Berger said.
One measure that can protect both students and professors from AI plagiarism disputes is for students to write their assignments in Google Docs or a similar program that tracks progress and edits. A student’s ability to answer follow-up questions about their work can also verify its originality and safeguard against accusations of AI use.
Saad Akhtar is a computer science major who believes that many students will cheat regardless of AI, since many were cheating before AI existed. Akhtar believes that using AI is plagiarism because AI is generative and collects its information online.
“It is not the same as putting things in your own words,” Akhtar said.
Akhtar also raised the issue that students with better grammar skills and a wider vocabulary are more likely to be flagged for AI plagiarism, which can lead students to produce lower-quality work to avoid being accused of using it.
“Learn about AI, because they will want it in jobs,” Callender said. “But be ethical about it.”
Faculty are also increasingly using AI in their classes. Callender says he uses AI for proofreading and to create Kahoot questions. Berger thinks AI has made some work tasks more efficient.
“Use AI to supplement, not replace, traditional learning,” Berger said. “Just like any tool, be aware of its advantages and disadvantages. If folks who use AI avoid the learning process, then they avoid the work it takes to get to the end product.”
Professors were not concerned about their jobs being taken by AI, but they did think AI would forever change the job market and might cause disruption. Akhtar and other students thought that while AI may not entirely eliminate jobs, it could greatly reduce the number of computer science or editing positions.
“There will be massive displacement in white-collar jobs,” Creel said. “It will affect education significantly, but AI can’t replace human interaction. It makes our role all the more important.”
Professors and students alike are concerned about the accuracy of AI. Creel and Akhtar both said that AI output always needs to be double-checked and its sources verified, just as one would do with a Google search.
“AI can be used as a tool but it needs rules and regulations, just like other tools,” Akhtar said. “There’s no ethical boundaries in the world of AI currently.”
Akhtar gave music sampling as an example, explaining that sampling is allowed only under certain conditions to prevent plagiarism. Akhtar believes AI should be similarly regulated.
“Everybody has to learn to grapple with it,” Creel said. “You can’t pretend it’s not there anymore.”