ChatGPT was released by OpenAI on November 30, 2022. Milligan faculty, staff and students have responded to the language-generating software with caution, with optimism about how it might be used creatively and with a focus on keeping people at the center.

Since its release, ChatGPT has been the subject of discussion about its implications and how to respond to it. OpenAI's newest release, GPT-4, has recently drawn concern, including from Elon Musk and Steve Wozniak, about the capabilities of artificial intelligence.

Some Milligan faculty began conversations about ChatGPT in the fall semester following its release. This semester, Director of Writing Heather Hoover spoke at the January faculty meeting about an addendum the composition faculty had added to the syllabus.

The technology raises many concerns and possibilities for academics in particular, so The Stampede asked some of Milligan's faculty, staff and students about their perspectives on it.

Caution

One major concern for education is the effect the software will have on classwork.

Academic Dean Rich Aubrey, who holds a doctorate in educational leadership, said that he has explored websites that can analyze writing to see if it has been generated by artificial intelligence, but that is not his primary response. 

“I think it’s responsible to consider those, and see if they might help, consider the cost, and so on,” Aubrey said. “But ultimately, the responsibility is between Milligan faculty, Milligan administration and Milligan students to have an open line of communication.”

Aubrey said he sees a danger in trying to anticipate every shortcut someone might take, especially when that means penalizing students who do things the right way.

“That’s not to say we want to make it easy to cheat,” Aubrey said. “I think there’s a balance there that we’re trying to find. And every new month or week might bring a new challenge. So it’s just wise of us to stay aware of what’s happening.”

Sam Arendale, a sophomore majoring in history and pursuing secondary education licensure, compared it to other tools such as Quizlet, which allows users to make and share flash cards online. He sees the potential for misuse, such as looking up answers for homework on Quizlet rather than using it to study.

He said that in his future classroom he would consider banning ChatGPT, and that a school may need to implement policies to prevent abuse of the technology.

Haley Brannan, a senior who is double majoring in computer science and information systems, said that she has heard concern that people would use ChatGPT to plagiarize.  

“At some point you’re just cheating yourself out of the money you’re paying to be here,” Brannan said. “I don’t think I’d trust a computer to do homework.” 

Dr. Blackburn, professor of church history, agreed to an experiment to test the capabilities of ChatGPT. He graded a one-page writing assignment written by this reporter, a student in his class, and another written by ChatGPT, which was prompted to answer the assignment's discussion questions. At first he was surprised.

“It is a little frightening,” Blackburn said. “I thought I wouldn’t have to work at it.”

Blackburn said that the writing was at a high level, so he was not able to identify which response came from ChatGPT based on grammar. But he did successfully identify which was written by ChatGPT. The assignment was to interpret short excerpts of a reading in light of the reading as a whole, and the lack of that context is what gave ChatGPT away.

“You might be able to infer that (a statement made by ChatGPT), but it doesn’t seem to be a theme throughout the reading,” Blackburn said. “What is completely missing is a contextual understanding of that (the quote) within the whole work.”

Blackburn said that the software might require professors to raise standards in grading student responses. He also said it might affect how he asks questions.

“If anything it will prompt us to ask more imaginative questions,” he said.

Assistant Professor of Cybersecurity Joseph Carpenter considers critical thinking to be an important part of working in the technology field. He expressed concern for how the software could diminish the need for people to actually learn. 

“It’s very concerning from a lifelong learning capacity,” said Carpenter. “I don’t think the person [using it] is aware of the adverse effects.”

Carpenter said that, like anything, it can be used for good or for bad. Another concern he has is privacy.

“It’s constantly learning, which is very intrusive from a privacy standpoint,” said Carpenter. He said that the privacy agreement is very important to read, particularly because the software uses information from the user.

Although she expressed concern about the ways the software could be misused, Brannan does not think there should be fear surrounding it.

“I definitely think it could be taken too far,” said Brannan. “I think a lot of people are scared of it (AI). But it’ll only do what you tell it to do.”

Creativity

“I think there is always fear surrounding new technological tools, but I don’t think we have to approach it that way. I think we can be cautious and creative at the same time,” said Hoover.

Aubrey and Hoover said they think that the best approach is to be in communication with students.

“What I told my colleagues is that I’m actually not that worried about it because of the way we teach writing,” Hoover said. “We really focus on process. We do a lot of writing in class by hand. We talk about having a conversation with the material, with each other.”

Arendale said that he thinks it is helpful for students to be taught how to use technology well. He recalled that his high school provided each student with a computer, and that allowed him to learn how to use that technology productively.

“I learned how to use it as a tool and not as a crutch,” he said.

Arendale also sees possible learning benefits from the technology, such as getting a summary of a book after reading it in order to grasp its main idea. He said he would hate for such an interesting technology to be blacklisted so soon when it could be used in beneficial ways.

“Quite frankly, my initial response is it opens a new world for abusing technology, but it also opens a new world for using technology,” said Aubrey.

Hoover has read about teachers who have their students use ChatGPT to develop outlines that they then use to write their own papers. 

“You’re able to take that tool to build your own essay,” Hoover said. “I think that’s brilliant.”

Aubrey said he read an article from a professor who uses the tool in the writing process. In her class she has students take an essay from ChatGPT, and edit, critique, and grade it. Aubrey said he thinks this is a very creative way to use this sort of tool while still teaching the writing process. 

“You can use it to give feedback on papers,” said Hoover. “I think AI may be threatening on some level, but I think it will push us to be better, at some point.” 

Brannan said that one of her favorite things about computers is the way they allow her to be creative. Carpenter shares that appreciation for creativity and said one of his primary concerns is the effect AI could have on authentic creativity.

“Technology has the potential to take away from authentic creation,” said Carpenter. “There’s no substitution for that.” 

Need for people

“There’s totally creative ways to approach it. But I don’t think it’s an educational silver bullet that’s either going to kill it or save it,” Hoover said. “And really I don’t think people are obsolete. I think relationships are really where the magic happens.”

The addendum to the academic integrity policy in the composition class syllabus focuses on the importance of students learning the skills of writing.

“In my academic integrity policy it says that’s a tool we can use, but not right now,” Hoover said. “Let’s learn how to do these processes ourselves. Then maybe at some point we may interact with it (ChatGPT). But first we need to learn how to write in a good way ourselves.” 

Aubrey recognizes a continuing need for what people offer when it comes to gaining new information.

“Human beings are at the center of this process because, while artificial intelligence can go out and gather information, it cannot create information,” Aubrey said. “The information is there, and it’s been put there by human beings. So we’re at the center of this process, and we need to make sure we continue to be at the center of it.”

Brannan recognizes a similar need in the process of computer programming. Although ChatGPT can write code, Brannan points to parts of the process that an impersonal AI could not accomplish.

“There has to be a person involved. It can’t be completely impersonal even if it creates a slightly more impersonal experience,” she said. “If you’re programming for people to use it, you have to think about the user… You have to think about the person and making it comfortable to use.”

Aubrey said that through his years as a teacher he has shifted toward focusing more on concepts rather than facts.

“At some point, in any assessment I give, I want my students to have to process those concepts, to make sense of them and to apply them to real life situations,” he said. “I think that in some ways we can use technology as a tool to gather information. But then we as humans need to process that information together.”

Headline Image created by iblnews.org
