Nov. 30 marked the first anniversary of a program that has changed the world of artificial intelligence, or AI.
Created by OpenAI, a prominent AI research and deployment company, ChatGPT is a large language model chatbot that produces text on demand in response to user prompts.
Andrew Perrault, an assistant professor of computer science and engineering at Ohio State, said ChatGPT’s existence offers clear advantages and disadvantages in the world of academia. Many higher-education institutions — including Ohio State — have taken steps to define the working relationship students can have with ChatGPT and AI technology at large, university spokesperson Chris Booker said in an email.
“I still primarily view [ChatGPT] as a negative, certainly in the context of teaching,” Perrault said. “In the context of research, it potentially enables a lot of really cool projects.”
Perrault said ChatGPT is an example of a transformer, a type of deep neural network. Transformers are trained on a wide range of data drawn from the internet, meaning the model builds its “new” output from material that is already available online.
“GPT is a ‘generative pre-trained transformer’ that is trained on a huge amount of data, basically all of the text that is available on the internet, to be as good as possible at predicting the next word in a sentence,” Perrault said. “So when you input something into ChatGPT, it tries to complete it in the most probable way possible.”
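For readers curious what “predicting the next word” looks like in practice, the idea can be sketched with an open model. ChatGPT’s own model is not public, so the example below uses GPT-2, an earlier and much smaller OpenAI model available through the Hugging Face transformers library; the prompt is purely illustrative.

```python
# Illustration only: GPT-2, an open predecessor of ChatGPT, predicting the next word.
# ChatGPT is far larger, but the underlying idea is the same.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Ohio State is a university located in"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every word in the vocabulary

# Turn the scores for the final position into probabilities for the next word
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(idx)).strip():>12}  {float(p):.1%}")
```

The model simply ranks every possible next word by probability; chaining those predictions one word at a time is what produces the fluent responses users see.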
Perrault said what sets ChatGPT apart from other AI programs is its ability to interact with people, which is a direct result of the sheer amount of data it was trained on. Previous “goal-driven” AI programs required strict and precise input commands, which limited their ability to replicate human interaction, he said.
“The kind of major breakthrough that GPT has represented was [that] we could give them lots of data from human interactions, and the internet created this enormous pile of human interaction data in natural language,” Perrault said.
As a result of such human interaction data, Perrault said ChatGPT can successfully perform a wide variety of tasks such as paraphrasing, creating computer code and altering preexisting text to be delivered in a different tone or style. He said one area in which ChatGPT struggles, however, is fact recall.
“They have some ability to recall facts about the world, although this is where they’re possibly weakest,” Perrault said. “At the level of many undergrad courses, they really are pretty strong, but when you start getting into more expert-level topics, they start to make pretty bad mistakes.”
In Ohio State’s “Code of Student Conduct,” academic misconduct is defined as “any activity that tends to compromise the academic integrity of the university or subvert the educational process.”
Booker said students have been using ChatGPT in ways that correspond with this definition, which has moved the university to revise its code to address improper AI use.
“Obvious cases of misconduct include submitting a paper a student has not written,” Booker said. “The most recent revision, approved in November, included several changes, among them clarifying that the unauthorized use of generative artificial intelligence systems or similar technologies to complete academic activities would be an example of academic misconduct.”
Booker said some students might not understand that relying on ChatGPT or engaging in other forms of academic misconduct harms their intellectual growth.
“The acquisition and development of knowledge and skills are fundamental to an Ohio State education, and those goals are undermined by academic misconduct,” Booker said.
Beyond students, Perrault said ChatGPT has made it increasingly difficult for instructors to evaluate student work, as it can be hard to discern whether certain answers are the result of students’ hard work or AI technology.
“Typically in most undergrad courses, ChatGPT is going to be performing better than most of your students just because it has seen so much information about the subject,” Perrault said. “It’s really hard to know on a homework or take-home exam whether the student came up with an answer on their own or whether it’s produced by one of these systems because they’re just so good, so honestly it’s a real pain and I wish it didn’t exist in that sense.”
Booker said a noticeable lack of sentence fluidity, drastic writing improvements, repetition of words found in prompts, circular writing and a lack of in-text citations are all indicators of AI use in students’ submissions.
“There is not one thing alone that makes an instructor suspect the use of AI,” Booker said.
Perrault said yet another shortcoming of ChatGPT is its uncredited use of sources. While it relies on online information to produce results, he said its actual sources receive virtually no recognition or compensation.
“If you look at [OpenAI’s] research papers, they upweight the samples from reputable sources, so Wikipedia and books are actually the things that get the most emphasis, and right now that is being done without any kind of compensation to the authors of those documents,” Perrault said. “To me — this is not me speaking as an AI researcher, but just as a citizen who’s interested in people producing useful things — this seems really unfair.”
Despite the negative outcomes models like ChatGPT can create, Perrault said AI programs offer considerable utility through their ease of use; notably, they have transformed the computer science field by reducing the need for humans to complete complicated, detail-oriented coding tasks.
“Being able to just write something in natural language and have a GPT translate that into software inputs, and then get out the picture or the model or the chunk of code that does things you want, just seems like something that would be tremendously useful, right?” Perrault said.
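ChatGPT’s website handles this translation for most users, but developers can do the same thing through OpenAI’s API. The sketch below is one minimal way to do so, assuming the official openai Python package (version 1 or later) and an API key stored in the OPENAI_API_KEY environment variable; the model name and prompt are illustrative.

```python
# A minimal sketch of turning a natural-language request into code via OpenAI's API.
# Assumes the openai package (v1.x) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; any available chat model works
    messages=[
        {
            "role": "user",
            "content": "Write a Python function that takes a list of exam scores "
                       "and returns the average, ignoring any None values.",
        }
    ],
)

print(response.choices[0].message.content)  # the generated code, returned as text
```

The request is ordinary English; the model returns working code the developer can then review and run.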
Perrault said ChatGPT’s language, editing and translation abilities have made it easier for students who aren’t native English speakers to feel more confident in unfamiliar academic settings.
“For students that English is not their first language, which was a huge portion of the computer science and engineering graduate students, I think it’s had a very major and positive impact in that way,” Perrault said.
Perrault said that, with the AI field advancing rapidly, an innovative next step is developing models that can replicate humans and their behaviors.
“There’s some research showing that you can use them as stand-ins for real people in some kinds of experiments,” Perrault said. “You can imagine this could be a really useful tool for developing new software interfaces, for running behavioral experiments at a huge scale with low cost, I think these kinds of things are potentially really exciting.”
Perrault said OpenAI is pioneering the world of AI with not only ChatGPT but also its March 2023 release of GPT-4, which is even more advanced than its predecessor. He said these releases are historic moments in the AI realm as a whole.
“ChatGPT and GPT-4 are probably some of the most important events in artificial intelligence in the last decade,” Perrault said. “I read all [OpenAI’s] papers the day they come out, I use all the models myself and try to figure out what they feel like and [how] they have transformed the research field.”
More information about using AI in an academic setting can be found on the Ohio State Teaching & Learning Resource Center’s website.