AI and the humanities
Art by Jeff Chase. Photos courtesy of Persephone Braham, Ju-A Hwang, Meghan McInnis-Domínguez, Alison Terndrup and University Library, Museums and Press Special Collections.
September 17, 2024
Humanities professors use AI to help students learn more about themselves
Ju-A Hwang doesn’t consider herself particularly tech-savvy or an early adopter of new technology. Nevertheless, she added ChatGPT to her introductory writing syllabus after the platform's public release in 2022.
“Some colleagues think that I’m excited about this technology and I’m embracing it,” said the assistant professor of English. “That’s actually not true. I’m trying to understand it.”
Across UD, professors are adapting their teaching in uncharted, unexpected ways, working to explore how generative AI can strengthen instruction, enhance education and boost learning.
This tech-forward approach may seem like a paradox for the humanities, where machines can never fully replicate the human experience. But as Blue Hen professors are quickly discovering, AI also presents novel opportunities to enhance curiosity, creativity and cultural understanding—the very attributes a humanities education seeks to foster and enrich.
How? By beginning with the end in mind.
Prompt: Help English-as-an-additional-language users build confidence and find their voice.
Hwang teaches English 110: First-Year Writing to international, immigrant and first- or second-generation students who use English as their second language.
After a class discussion on AI technologies, she asked her students to read and summarize four passages about academic integrity and plagiarism, then prompt ChatGPT to edit their text. Students had to accept, modify or reject ChatGPT’s suggestions, providing a rationale for each choice. Hwang did not grade the assignment, but it helped her understand the students’ thought process.
At the outset of the class, Hwang expected that some students might accept every suggestion, assuming the bot produced higher-quality text. To her surprise, she found that students were discerning, often choosing to maintain their own voice, even if the AI suggestion used more formal, grammatically correct language.
Peovthida Huot enrolled in Hwang’s class in spring 2024 during her first semester at UD. The business analytics and accounting major grew up in Cambodia speaking her native Khmer. And though she has used ChatGPT in one of her business classes to help analyze research reports, she cautions against relying on AI to generate text.
“If you’re using it constantly, you can over-rely on it [and] lose a sense of critical thinking and your own original thoughts,” she said, “especially when it comes to humanities subjects where you have to express your own opinions.”
This takeaway is precisely what Hwang hoped students would gain from the experience.
“I don’t want my students to accept revisions without thinking,” said Hwang. “I want to emphasize their agency as a writer.”
Prompt: Help art students strengthen their research capabilities while assessing the ethical implications of AI.
In Alison Terndrup’s art history classes, students use a large language model like ChatGPT to create an “object essay,” a formal analysis of a piece of art selected from UD’s collections. The collections do not house famous works like the Mona Lisa or Starry Night. Instead, their pieces range from the chromolithograph “Standing Woman in a Ball Gown” by the 19th-century print studio Imprimerie Rigo to the 1838 lithograph “Recette pour Guérir la Colique” by the French artist Honoré Daumier.
Terndrup then asked students to reverse the process by inputting their analysis into an image generation tool (like NightCafe) to see how AI visually interpreted their object essays. This helped students learn how to ask AI for the results they wanted (for example, a prompt for the Daumier print could be, “A satirical caricature showing an exasperated theater director coaxing his star actor onto the stage by promising ever larger sums of money, rendered in black-and-white lithographic style with expressive lines.”).
Terndrup’s original goal was to better understand the strengths and limitations of AI. If technology can make the research process more efficient, she thought, then scholars will have more time to devote to analysis.
She soon found the process to be a man-machine collaboration, with students learning how to phrase their prompts to receive better results, then interpreting and assessing the resulting text.
“Students were not so much doing the writing part as they were the editorial part — the actual analysis,” said Terndrup.
Sophomore chemistry major Gage Walker found AI’s initial errors helpful in learning how to use the tool more effectively.
“Most of the time the AI wouldn’t do what I wanted it to do, so I would keep adding details and guiding it in the correct direction until I was able to get a result I felt was suitable,” he said.
But as students learned how to refine their prompts, questions of ethics inevitably arose.
“Students, especially studio art students, understandably have reservations when it comes to the use of tools trained on other artists’ work,” Terndrup explained. The visiting assistant professor permitted students to opt out of AI-based assignments if they had ethical objections, offering alternative, more traditional options.
Terndrup noted that the art world has wrestled with new technology, originality and issues of authorship before. “We discuss the controversial history of photography as art,” she said, “as in, ‘Should photography be considered art or technology?’”
Most importantly, she added, AI can’t help students interpret the human element in art, the aspects that elicit an emotional response.
“We’re essentially playing with AI like a pack of crayons or a paintbrush,” she said. “It’s a tool. But at the same time, it's an incredibly strong tool that we don't fully understand.”
Even as faculty guide students through AI-based exercises to learn more about themselves, everyone is learning more about the technology as both the platforms and their users change.
Prompt: Help explain complex terms in centuries-old texts.
A team of Spanish faculty in the Department of Languages, Literatures and Cultures has been working together to implement AI-based assignments in their upper-level literature courses.
Professor Persephone Braham worked as an IT specialist before coming to UD and draws on that background to explore the more robust language capabilities of AI programs, alongside associate professors Meghan Dabkowski and Meghan McInnis-Domínguez.
The trio has found that AI can scan text and provide summaries, which is especially useful when studying literature that contains archaic terms.
Braham has students input older text and ask AI to provide definitions, saving time looking up unfamiliar vocabulary: “It gives them instant access to engage directly with the ideas of the text—just like a book with a glossary.”
AI also helps explain literary movements and historic context, according to McInnis-Domínguez, who is using AI to have students in her Spanish literature classes create images of characters they read about. To get the image they want, students must also explain their thought process, forcing them to write better prompts.
“Students are visual learners, so this helps them see how others might be imagining a certain scene,” she said. “It’s the traditional class, but in a different way.”
Prompt: Educate students on responsible tech use and prepare them for the jobs of the future.
McInnis-Domínguez’s interest in AI stemmed from concerns around cheating and plagiarism, which she said “will always be an issue—with or without AI.”
Teaching students to understand the risks and the appropriate, ethical uses is therefore critical.
“I thought that if I incorporated large language models and showed students how to use them responsibly, it might actually help curb plagiarism rather than promoting it,” said McInnis-Domínguez. “And that was the case.”
A baseline understanding of AI is also a professional expectation for many jobs.
“A lot of companies train their employees to use these kinds of technologies to enhance their work outcomes and productivity,” said Hwang. “So creating this transfer between the academic context and the professional setting is really important.”
Humanities professors agree that not all courses lend themselves to AI-based assignments. Small class sizes help foster critical discussions around ethics and provide more time to review the AI process. Terndrup said she would avoid AI-based assignments in online classes, because face-to-face interaction is vital.
Still, even basic AI programs have proven to be valuable tools. Terndrup’s class was life-changing for senior Bradley Graybeal, an art history major with minors in museum studies and art.
“Genuinely, this was the first time I had ever used ChatGPT, and it forever changed my approach to writing,” he said. “For me, starting is always the hardest part, and programs like this are helpful in just getting something on the page.”