Happy birthday to the AI elephant in the FGCU classroom – https://fgcu360.com/

5-minute read

It may be difficult to believe, but ChatGPT, the wildly popular natural language processing (NLP) chatbot, is only a year old. Developed by OpenAI, the free version of ChatGPT launched Nov. 30, 2022. Its younger sibling and paid upgrade, GPT-4, rolled out the following March. OpenAI’s products use generative artificial intelligence to create human-like text and images based on textual inputs known as prompts.
Chrissann Ruehle is the provost faculty fellow on AI at Florida Gulf Coast University. The Lutgert College of Business instructor has been tapped by Mark Rieger, provost and vice president of Academic Affairs, to serve as a strategist helping FGCU adopt AI.
Ruehle helps address the elephant in the FGCU classroom: How can AI tools like ChatGPT be used responsibly and ethically at FGCU?
“I see my role as being a champion for the AI initiatives at FGCU,” she says. She leads the AI Task Force and works to increase AI literacy for faculty, staff, students and Southwest Florida community leaders through education and micro-credentialing.
“As educators and staff, we need to proactively consider the ethical implications of ChatGPT as we plan our courses rather than blindly designing the courses and treating ethics reactively as an afterthought,” Ruehle says. One of the ethical considerations she suggests be included in the planning process revolves around transparency and effectively disclosing the use of generative AI, so as not to lose student or user trust and credibility.
Artificial intelligence isn’t new. The most ubiquitous AI-powered tools — Siri, Alexa, email filtering and “frequently bought together” recommendations — are hardly seen as AI anymore.
Ruehle sees generative AI, including ChatGPT, Google’s Bard and Anthropic’s Claude, as being on a path similar to other AI tools. “However, I think the adoption curve is substantially accelerated and shortened. Research has shown that ChatGPT had 1 million users within the first week and accelerated to 100 million users within the first two months.”
Language and Literature instructor Lori Cornelius teaches creative writing, composition and literature courses. In a September panel involving faculty, students and staff, she addressed the concerns of academic integrity and generative AI.
“I’ve tried using ChatGPT for actual work a few times, but I don’t think it does as good a job as I do in outlining main points or creating content,” Cornelius says. “I’m still playing with it to see if it has any value to me. Given its propensity for error and misattribution, I doubt that it has significant value for the kind of research in which I engage.”
Like many people, Cornelius utilizes common AI tools such as GPS navigation and “what to watch next” recommendations within streaming services. She uses Achieve, an online learning system with game-like adaptive quizzing that helps students master writing concepts with customizable questions. She also allows her students to use some of Grammarly’s tools before submitting writing assignments.
However, she notes that Grammarly, which uses machine learning and natural language processing to correct grammar and mechanics, often suggests changes so drastic to the structure of a sentence or passage that a student’s narrative voice may be lost. Cornelius’ course objectives revolve around students gaining creative writing skills and proficiencies. With the upsurge and ease of generative AI tools, she fears those objectives cannot be met.
“Text predicting programs are a barrier to creativity, especially in creating unique voices in narratives or poetry,” she says. “There’s a big difference between thinking creatively and asking a machine to think for me.”
“There are many amazing things happening in this field as a whole right now, but we cannot forget that before students are given calculators, they know how to do basic mathematical functions, they understand what the numbers represent,” Cornelius adds. “Replacing the critical thinking and writing processes by ‘feeding the right question to the machine’ will not properly prepare students to engage in the full range of interpersonal or professional experiences that they need to succeed.”
Communication major Emory Cavin is FGCU’s student body president and a Board of Trustees member. He provided tech support during the September panel on academic integrity and generative AI.
“At FGCU, we pride ourselves in being young, nimble and innovative,” Cavin says. “We always strive to help students become good, well-rounded citizens while ensuring they have the tools they need to have a great career once they graduate.”
Cavin believes ChatGPT and Grammarly are probably the most prevalent AI tools used by FGCU students, and he understands the fears surrounding students’ use of generative AI. “I can definitely see a future where tools like these diminish students’ ability to learn and replace humans in the workplace. However, with the right approach, they also present us with amazing opportunities to learn, grow and develop.”
The AI Task Force has researched and distributed a variety of AI policies and syllabus statements for faculty to use, ranging from ‘not permitted at all’ to ‘high use levels with citation,’ according to Ruehle.
“We are committed to maintaining academic integrity while still providing faculty with the academic freedom to use or not use these tools inside their classrooms,” she says.
This is the first in a series of stories about AI at FGCU.

