[While disclosing that the conversational AI ChatGPT was used to compose an important message is more ethical than concealing its use, the reaction to a message from the deans of the Office of Equity, Diversity, and Inclusion (EDI) at Vanderbilt University’s Peabody College of Education and Human Development regarding the recent shootings and deaths at Michigan State University confirms that there are some contexts in which presence-evoking technologies are not acceptable. The story below from The Vanderbilt Hustler provides details, and a follow-up story has the headline, “Peabody EDI deans to temporarily step back following ChatGPT-crafted message about MSU shooting.” The author of an Inside Higher Ed opinion column titled “Boilerplate and AI. This almost certainly will happen again.” writes this:
“For a senior leader with shaky prose skills and a tight schedule, I could imagine making the mental leap from ‘I’ll have my assistant write it’ to ‘I’ll have ChatGPT write it.’ They’re both versions of delegating. In this case, clearly, that was an egregious miscalculation.”
–Matthew]
[Image: The Peabody Administration building, captured on Oct. 24, 2022. Credit: Hustler Multimedia/Claire Gatlin]
Peabody EDI Office responds to MSU shooting with email written using ChatGPT
The email stated at the bottom that it had been written using ChatGPT, an AI text generator.
Rachael Perrotta, Editor-In-Chief
February 17, 2023
UPDATED: This piece was updated on Feb. 17, 2023, at 7:30 p.m. CST to include an apology from the EDI Office. It was further updated on Feb. 17, 2023, at 9:05 p.m. CST to include a statement from the university about its use of AI in generating messages.
A note at the bottom of a Feb. 16 email from the Peabody Office of Equity, Diversity and Inclusion regarding the recent shooting at Michigan State University stated that the message had been written using ChatGPT, an AI text generator.
Associate Dean for Equity, Diversity and Inclusion Nicole Joseph sent a follow-up apology email to the Peabody community on Feb. 17 at 6:30 p.m. CST. She stated that using ChatGPT to write the initial email was “poor judgment.”
“While we believe in the message of inclusivity expressed in the email, using ChatGPT to generate communications on behalf of our community in a time of sorrow and in response to a tragedy contradicts the values that characterize Peabody College,” the follow-up email reads. “As with all new technologies that affect higher education, this moment gives us all an opportunity to reflect on what we know and what we still must learn about AI.”
The initial email emphasizes the importance of maintaining safe and inclusive environments amid ongoing gun violence across the country. It states that “respect” and “understanding” are necessary for doing so.
“We must recognize that creating a safe and inclusive environment is an ongoing process that requires ongoing effort and commitment,” the initial email reads. “We must continue to engage in conversations about how we can do better, learn from our mistakes, and work together to build a stronger, more inclusive community.”
The body of the initial email mentions “Peabody” once and does not use any other Vanderbilt-specific terms. It also refers to multiple “recent Michigan shootings”; however, only one incident occurred.
Laith Kayat, a senior, is from Michigan, and his younger sister attends MSU. He stated that the EDI Office’s use of ChatGPT in drafting its email is “disgusting.”
“There is a sick and twisted irony to making a computer write your message about community and togetherness because you can’t be bothered to reflect on it yourself,” Kayat said. “[Administrators] only care about perception and their institutional politics of saving face.”
Kayat called on Vanderbilt administrators to take more action in preventing gun violence and to be more intentional with these actions.
“Deans, provosts, and the chancellor: Do more. Do anything. And lead us into a better future with genuine, human empathy, not a robot,” Kayat said.
Senior Jackson Davis, a Peabody undergraduate, said he was disappointed that the EDI Office allegedly used ChatGPT to write its response to the shooting. He stated that doing so is in line with actions by university administrations nationwide.
“They release milquetoast, mealymouthed statements that really say nothing whenever an issue arises on or off campus with real political and moral stakes,” Davis said. “I consider this more of a mask-off moment than any sort of revelation about the disingenuous nature of academic bureaucracy.”
Samuel Lu, a sophomore and another Peabody student, stated that this use of ChatGPT is disrespectful to gun violence victims. Like Davis, he stated that he was disappointed in Peabody’s actions.
“It’s hard to take a message seriously when I know that the sender didn’t even take the time to put their genuine thoughts and feelings into words,” Lu said. “In times of tragedies such as this, we need more, not less humanity.”
Vice Provost and Dean of Students G. L. Black also sent an email to students on Feb. 14 regarding the shooting. His email did not include a note regarding ChatGPT. The university has not yet responded to The Hustler’s request for comment about whether ChatGPT has been used to generate other university statements.
OpenAI, the company that created ChatGPT, launched a free, online tool on Jan. 31 to distinguish between human-generated and AI-generated text. The tool labels text as very unlikely, unlikely, unclear if it is, possibly, or likely AI-generated. When tested by The Hustler, the body of the EDI Office’s initial email was rated as likely AI-generated, while Black’s email was rated as unlikely AI-generated and Joseph’s apology email was rated as very unlikely AI-generated. However, OpenAI warns that this tool is not always accurate; it correctly identifies AI-generated text as likely AI-generated only 26% of the time.
Joseph Sexton, also a senior and a Peabody student, questioned in what other contexts the university would use ChatGPT to generate messages.
“Would they do this also for the death of a student, faculty, or staff member?” Sexton said. “Automating messages on grief and crisis is the most on-the-nose, explicit recognition that we as students are more customers than a community to the Vanderbilt administration. The fact it’s from the office of EDI might be the cherry on top.”
In a message to The Hustler, a university spokesperson said the university does not use ChatGPT or other AI systems to generate messages.
“We believe all communication must be developed in a thoughtful manner that aligns with the university’s core values,” the email reads.
Sexton added that using ChatGPT is presumed to be cheating in academic settings, adding to the “irony” of the EDI Office’s use of the tool. The university did not respond to The Hustler’s request for comment on whether using ChatGPT for academic assignments violates the university’s plagiarism policy.
“Who knows precisely what was paraphrased. This kind of parenthetical citation would get points off in any class I’ve taken here,” Sexton said. “Nonetheless, this feels definitively wrong, whether ChatGPT provided a single word or the entire spiel.”