ChatGPT is everywhere; it may even be open in your browser right now as you ask it what to write for your history paper. It may seem like a simple tool to lighten a heavy workload. However, how artificial intelligence (AI) interacts with higher education — and whether it will be an effective tool for students and professors — remains a question Macalester is trying to answer.
There is an overwhelming sense of anxiety regarding AI’s integration into our society. With ChatGPT’s launch in Nov. 2022, the world has had to grapple with what AI will look like in education and beyond. Questions have come up pertaining to job security, plagiarism and the ethical dilemmas of AI’s biased algorithms. Macalester is uniquely interested in this question, and AI is even the subject of our upcoming International Round Table.
A study published by the Pew Research Center revealed that 52% of Americans are more concerned than excited about AI. However, the study also found that Americans with more access to education generally value the technology more. This again raises the question of AI’s use in higher education on Macalester’s campus.
In response to these questions, Provost Lisa Anderson-Levy and Jenn Hass, vice president for information technology services (ITS) and CIO, sent an email to the entire student body on Monday, Aug. 21.
“In order to ensure responsible use of these tools, we want you to be aware of the following principles and resources,” the email read. “We recognize that the increasingly widespread availability of these tools raises both legitimate concerns and interesting possibilities that will necessarily vary according to context for use, learning goals, academic discipline, etc. We will continue to communicate as the technology evolves.”
The email was modeled closely on the Harvard letter written this July. Macalester’s email laid out guidelines created by the AI Literacy Working Group, as well as amendments to Macalester’s academic integrity policies, which now include “[t]he unauthorized or unacknowledged use of generative AI tools” under plagiarism and cheating.
The working group was formed last summer to field questions from faculty and create educational resources. Britt Abel, associate professor of German studies and director of writing, and Mozhdeh Khodarahmi, associate director of the library, chair the group. They worked with faculty and staff from across campus — including the Macalester Academic Excellence (MAX) Center, the DeWitt Wallace Library and the philosophy and educational studies departments — to create the AI Literacy and Critical Thinking libguide, a resource for students and staff navigating the ever-puzzling questions of AI in higher education.
Their goal is to give Macalester students and faculty the tools they need to decide when and how to use AI.
“ChatGPT and Bard [Google’s version of ChatGPT] are not traditional IT issues,” Tam Perlman, associate director for academic technology services, academic information associate for fine arts and languages and member of the working group, wrote in an email to The Mac Weekly.
“We can’t shut it off and we cannot recommend a reliable way to detect AI use. We can only share what we know about the technology and support the Macalester community as they grapple with the limitations, ethics and opportunities it provides.”
So, what does this all mean for Macalester students who have easy access to AI, and for professors who have to decide if, and how, their students should use it? Macalester’s guidelines, and its lack of blanket policies, emphasize faculty autonomy over AI use in the classroom. The working group created sample syllabus statements to help professors approach AI: one allowing AI to be used ethically to enhance one’s own work and ideas, one permitting limited use, such as in the brainstorming phase, and one forbidding AI entirely.
Professors have taken different approaches but, for the most part, are leaning into the cautious and, hopefully, ethical use of AI.
Andrew Latham, professor of political science, has decided to include AI in the syllabus for his first-year course, “Foundations of International Politics: Western and Non-Western Perspectives,” and to teach it as a tool students can use in their writing.
“As much as some people might want to wish it away, it isn’t going away … We have a responsibility to help our students think about how to use it productively, constructively and ethically,” Latham said.
Latham plans to talk about how students can use AI to help brainstorm topic ideas if specific-enough prompts are entered.
Sonia Mehta, assistant professor of educational studies, is also using this tool in her class.
“In my class, which is about education [in] global perspectives, I am allowing the use of AI absolutely, but with the understanding that it cannot replace personal reflection, critical analysis or generational wisdom,” Mehta wrote in an email to The Mac Weekly. “It is a tool (and a wonderful one) that can be misused as well as lead us towards better learning.”
Paul Cantrell of the mathematics, statistics and computer science (MSCS) department tasked students with asking ChatGPT to write a paragraph for their final paper and then critiquing its output. The results were incredibly mediocre and, although grammatically correct, lacked much critical thought.
Cantrell explained this phenomenon, saying that large language models such as ChatGPT have no way of knowing whether their output is “correct.” They are simply skilled at identifying patterns and relationships. Thus, they can create coherent, grammatically correct paragraphs, but not ones with much critical thought or introspection.
“So, it turns out that something that uses the appropriate language structure and language forms in a particular context and produces grammatically correct outputs really looks a lot like intelligence to us,” Cantrell said. “And people are now sort of figuring out the limitations of this particular technology.”
There are, however, large benefits to this technology beyond brainstorming. AI can be incredibly helpful for multilingual writers whose primary language is not English. By asking AI to correct grammar and spelling — what Jake Mohan, writing support coordinator/instructor in the MAX Center, calls “surface errors” — students can stop stressing over smaller things and focus on what matters more: their thesis, argument and the logical flow of the paper.
Macalester faculty and staff have also stressed AI as a tool that comes with concerns but also opportunities to grow teaching skills.
“If cheating is something that a computer can do, or if cheating is even a good idea, what are we even teaching?” Cantrell said.
It even poses existential questions about writing.
“Why do we have students write?” Mohan asked. “What purposes does writing serve in college? And I think, for some, [AI] has forced us to ask questions like, do I really need to assign writing in all the ways that I traditionally have? Are there ways I can make it more meaningful, more efficient, more relevant for my students?”
In response to how students should use AI in classwork, Mohan urges the idea of “humans in the loop.”
“[Humans in the loop] is this principle of keeping a human being involved in whatever processes an AI is doing,” Mohan explained. “So if an AI writes a rough draft, a human should still look at it … Because otherwise, we might end up with a closed system where an AI writes it and then an AI grades it, and then is anyone actually doing anything?”
The working group and subsequent learning communities are looking for student input on how AI is being used and what support they can give. The Jan Serie Center for Scholarship and Teaching is also holding sessions for faculty in the same vein, to see what questions can be answered and how people are thinking about AI at Mac. The questions remain and the work continues, Mehta explained.
“Our work now, collectively, is to be able to tell the difference and to purposefully tether the amazing reach of ‘artificial’ intelligence to the work of furthering human intelligence: learning, wisdom and compassion,” Mehta said.