My teacher used ChatGPT for my recommendation letter. What should I do?
“I’ve recently heard a rumor that a teacher I asked to write my college rec used ChatGPT to write their letters. I am upset. If I knew they were going to use ChatGPT I would have asked someone else. I thought hard about who to ask, and I chose a person I respected and who I thought respected me. Now it’s too late to ask another teacher. Should teachers have to disclose if they use ChatGPT to write their letters? Is it okay for students to ask them to promise not to?”
I understand your frustration. Asking a teacher to write a recommendation letter is an enormously vulnerable experience. You’re trusting them to represent you as a student to the best of their ability, beyond what a college can see in the other data it receives, like transcripts and test scores. The reason using ChatGPT to write a recommendation letter feels so wrong is that these letters are inherently personal; they’re meant to tell a story about a real human and the student-teacher relationship. ChatGPT, by contrast, is inherently robotic: it opens up the possibility of plugging a name and a descriptor into the engine and pumping out a letter. The risk is that the letter becomes less likely to reflect a student’s character or individuality, and that the relationship between teacher and student becomes transactional rather than genuine.
ChatGPT can be a powerful tool to enhance writing. It can save time, help overcome language barriers, and streamline the writing process. Indeed, it can even level the playing field among teachers with different writing abilities, such as those writing in their second or third language. The same could be said for students; but if students don’t disclose their use of ChatGPT or get permission to use it, they risk academic integrity violations. Those rules exist so that teachers can authentically assess a student’s writing and thinking; they’re about honesty and integrity. And that goes both ways.
Therefore, irrespective of whether ChatGPT makes a letter of recommendation better or worse, teachers should publicly disclose their use of it before it’s too late for students to ask someone else. If knowing that a teacher uses ChatGPT would change a student’s decision about whom to ask, disclosure is all the more essential, and it must happen while students can still switch teachers. Students have the right to make decisions that best align with their application and their values. While students can switch between teachers, teachers retain the right to refuse students. Public disclosure, made while switching is still possible, therefore remains justified.
Given that a teacher has publicly disclosed their use of ChatGPT, it is unfair for a student to ask them to promise not to use it. Teachers are entitled to their own writing process. They have limited time, and they aren’t compensated for these letters; writing one is ultimately a favor. Because of this, if a teacher decides to use ChatGPT, that is their decision. Students can’t ask for their application to be handled differently; they can’t be the one who gets the non-ChatGPT letter, because that compromises the ideal of equal treatment among all students.
Because these tools are so new, we, as a community, must develop norms surrounding AI use in college recommendation letters. There’s a difference between using ChatGPT to generate a whole letter from just a name and three adjectives, which most people would find unacceptable, and asking it to help polish a final draft. Because we’re in the early stages, we don’t have clear ethical guidelines, and that is part of what makes disclosure so vital to maintaining trust between teachers and students, and, for that matter, the colleges that receive these applications. If using ChatGPT is perfectly ethical, there should be nothing wrong with disclosing it; if disclosing its use feels wrong, the use itself might be unethical.
As AI weaves itself into more and more aspects of our lives, human writing itself comes under threat. When colleges begin to ask for bulleted lists of information instead of thoughtfully written recommendations, they call into question the point of human writing in the first place. If we are all reduced to our SAT scores or GPAs, data points on paper, when can we truly showcase our humanity?