ChatGPT sparks conversation about campus usage
When associate professor Mathew Muether input his physics test into ChatGPT, the chatbot answered about 95% of the questions correctly. On the qualifying exam for physics doctoral students, it reached 100%.
In the past few months, Wichita State faculty and staff have had various concerns and questions about the use of AI. In early April, the Faculty Senate discussed Wichita State’s policy on ChatGPT without coming to a concrete decision.
The interest in artificial intelligence (AI) has increased with the spread of ChatGPT, a chatbot that generates text based on a prompt. According to Digital Trends, ChatGPT is built on a large language model, trained on large amounts of various internet content.
John Jones, Media Resources Center executive director, organized an AI Interest Group so faculty could discuss using AI in classes and educating students on the tool.
“We wanted to be able to try and have conversation to compare notes … and talk to folks about how they’re using it for personal projects, how they’re using it for the institution, how they’re talking to students about it,” Jones said.
During its first meeting in April, the AI Interest Group discussed the need to train faculty on how AI works.
Academic dishonesty and authorship
According to Muether, some faculty have expressed concerns about academic honesty, like students passing off AI-generated work as their own.
Muether said GPT-4, the latest ChatGPT model available for a monthly subscription, makes it difficult to discern that text is AI-generated. Microsoft also plans to integrate ChatGPT into its Microsoft 365 suite, which WSU offers to students for free.
“(ChatGPT is) not copying and pasting. It’s generating (text) on its own,” Muether said. “These tools are going to be getting more and more powerful.”
English professor Darren DeFrain said the English department has already had a couple of students submit papers generated by ChatGPT.
He said a friend at another university uses AI to create artwork, and that it can be hard to tell when students have used AI for portions of their papers.
“It can be a difficult thing to really articulate that in a way that’s fair to the student, to the other students in the university,” DeFrain said.
According to guidance the United States Copyright Office issued in March, a work must be created by a human to qualify as a copyrightable “work of authorship.” The office stated it won’t register works produced by a machine “without any creative input or intervention from a human author.”
While the AI Interest Group is not creating AI guidelines for the university, members like Muether and Imran Musaji emphasized that teachers must clearly state their policies on AI, such as whether students should credit AI or avoid using it at all.
“If you’re given an assignment, and you use a tool for an assignment that you were told not to, that’s not okay,” Musaji, assistant professor of communication sciences and disorders, said.
DeFrain said AI could potentially help with brainstorming or writing in a specific voice for essays, but students should credit AI.
“The thing that students have to do is to acknowledge in the body of their paper: ‘GPT made this,’” DeFrain said.
Educating students on AI
Jones said AI is already used in fields like finance, business and health care, and it is important for faculty to educate students on its uses.
“The advent of ChatGPT is the tip of an iceberg,” Jones said. “We need to be connected well … to know what tools are being used and how they’re being used (in different fields), so we can build that into our curriculum.”
Musaji said when Google first emerged while he was in college, his biology professor taught students how to do a Google search, something he plans to emulate with ChatGPT.
“(My professor) recognized it was a disruptive technology that was going to become important in all of our lives,” Musaji said. “I think back on that, and I’d like to do that for this next crop of students (with ChatGPT).”
DeFrain said he also plans to educate his English graduate teaching assistants on AI in the fall.
Musaji and Muether both said people should be aware of privacy concerns with AI. Musaji said there isn’t an understanding of how companies store and save data entered into their models, which he said greatly concerns him as a health care provider.
Musaji also said people should realize AI is biased from the content it was trained on, which can affect its responses.
“(If) I said, ‘I want (ChatGPT) to generate some text in the style of teenagers,’ and it generates a bunch of texts for me, where is it getting that training data from — and is it accurate?” Musaji said.
AI on campus
According to outgoing Faculty Senate President Susan Castro, the use of AI will likely vary by field.
Muether said physics and other disciplines on campus already use other types of AI to synthesize, classify and analyze research data. He said ChatGPT can assist with coding or generating practice problem sets, which a few physics students have already used it for.
“I can imagine that publishers are going to go to using this tool once we vet that they’re credible,” Muether said. “Since (ChatGPT is) generating (a problem set) on the fly, the problems could be responsive specifically to things students are struggling with.”
Jones, who has master’s degrees in English and fiction, said he doesn’t expect AI to be used in literature classes. He said using AI for writing can undermine learning important skills.
“People who make the mistake of using the tools irresponsibly don’t develop the skills that they’re going to need later, and so they hurt themselves,” Jones said. “It’s part of our job as instructors to help our students understand the value in developing these skills themselves.”
DeFrain said using AI for papers can be problematic when students don’t realize the answers are inaccurate. While playing with ChatGPT, he noted some instances when the chatbot couldn’t connect separate ideas and just “fills it in with whatever it can reach for.”
“It’s sort of a parallel with some of the plagiarism cases I see over the years, where people just grab something wholesale and copied into their essay, and they don’t really look at it,” DeFrain said.
Jones said with ChatGPT being used for writing, he expected some classes would switch to in-class writing assignments or oral exams. When asked how this would affect disabled students who struggle with those tasks, he highlighted accommodations and restrictions for AI websites.
“If we have a student who has a learning disability like dyslexia, and for them, having to handwrite an assignment is disproportionately difficult … (using a computer is) an accommodation that (Office of) Disability Services can arrange,” Jones said.
DeFrain said that with many English classes online, faculty have also been looking at other approaches to track writing while balancing trust in students.
“We want students to recognize that this is something that clearly is going to be a part of their world, and they need to find ethical ways to use it,” DeFrain said.
Muether said the university should act fast in establishing guidelines.
“At minimum, we need to have some resources to train faculty and students about these tools for fall … because these tools will be widely deployed by that point,” Muether said.
AI in the job market
Some students might be worried about securing jobs due to ChatGPT. Jones said certain jobs may disappear or change, but new technology has also led to new opportunities, like content creation on YouTube and TikTok.
“One of the great things about programs like ours, where we have a lot of focus on entrepreneurship and innovation, Wichita State’s positioned really well to help students find a path in a changing world,” Jones said.
When asked how ChatGPT could affect jobs based on writing, DeFrain said it might impact how people approach writing, but the “human element” is always crucial, especially in memoirs or expressions about specific people.
“The computer can give me the facts, and it can even gloss it up and make it look like it was written by a real human,” DeFrain said. “But if it’s not, then there’s fundamentally something that’s just wrong with that picture.”
Musaji said he encourages everyone to play with ChatGPT.
“Both the students and the faculty need to have basic understanding,” Musaji said. “People don’t realize how powerful (AI is).”