When Abby Miles (MSB ’26) applied to study abroad in Spain for the Spring 2025 semester, she had everything in order — two teacher recommendations, answers to the application questions and a 500-word essay in Spanish to demonstrate her foreign language literacy. Miles, who had received an A in every Spanish class she had taken at Georgetown University, was more than excited for the chance to study outside the United States.
Weeks later, however, Miles received a surprising email request from the program advisor: Could she explain why she incorrectly used masculine gender pronouns instead of feminine ones in her application essay?
“I immediately wrote back saying that I use she/her pronouns, and it might have been Google’s autocorrect feature because Google will frequently autocorrect feminine pronouns to masculine,” Miles told The Hoya. “His subsequent email immediately deferred me to the Honor Council and suspected me of using ChatGPT or some form of AI to write my study abroad essay.”
“My initial reaction was complete shock. To this day, I have never used ChatGPT to write an essay,” Miles added.
The use of artificial intelligence (AI) is ubiquitous among college students. Around a third of college students use AI regularly for coursework, according to research by OpenAI, the company behind the large language model chatbot ChatGPT.
As this use has become more widespread and differentiating between AI and human writing becomes increasingly difficult, universities have had to adapt their policies to combat AI cheating and plagiarism, among other academic violations.
At Georgetown, the process of detecting and responding to AI usage has generated confusion, but also innovation, among students and professors alike. Discussions about AI’s role in education are only just beginning.
The Institutional Framework
At Georgetown, approaches to AI usage in the classroom are intentionally decentralized. Official university guidance stipulates that professors have discretion over how students interact with AI in their classrooms.
The Georgetown Honor Council’s Honor System policies state that “the question of how to acknowledge AI-generated intellectual work and whether it is to be allowed at all is answered by individual course policies.”
A university spokesperson similarly said professors have the final say on AI use.
“Students should refer to their professor’s syllabus policies on AI usage to inform their work,” the spokesperson wrote to The Hoya. “These policies take precedence over personal practices or variations in faculty allowances. Students should adhere to the guidelines provided in each course syllabus.”
Molly Chehak — the director of digital learning at the Center for New Designs in Learning and Scholarship (CNDLS), Georgetown’s home for pedagogy — said Georgetown’s lack of a central AI policy apparatus in the classroom is to give professors creative and academic freedom when teaching their courses.
“The reason that Georgetown doesn’t have a policy that governs everyone equally is that we wouldn’t want a faculty member in one field or discipline being constrained in their use if they need it for teaching,” Chehak told The Hoya. “The university is an incredibly diverse place.”
In the classroom, professors in Georgetown’s English department, for example, hold a wide range of AI policies.
Nathan Hensley, an associate professor in English and the director of undergraduate studies in English, prefers students to avoid using AI tools in their writing. In one of his classes, Hensley said he makes students critique an AI-generated essay to look for stylistic flaws.
“I have students articulate what about the large language mechanics they see reflected stylistically in the, for example, vacuous prose of the large language model, its inability to create accurate citations or its inability to produce accurate, close readings of texts,” Hensley told The Hoya.
Hensley said these assignments act as a way to critically interrogate AI, framing the tasks as a dialogue between the student and AI rather than an integration of AI into the classroom. Hensley said this approach encourages students not to be afraid of taking intellectual risks and making mistakes.
“We get a false sense of security from typing things into the machine and having it tell you what’s the point of Sigmund Freud, for example, and what it gives you is some watered-down set of cliches,” Hensley said.
On the other hand, Phil Sandick, an associate teaching professor in the English department and the director of the creative writing minor, said he is more open to the idea of students using AI in the classroom, including in the required writing course WRIT 1150.
“I decided in the fall to turn all of my WRIT 1150 sections, for better or for worse, into a class thinking about each assignment in some way engaged with generative AI,” Sandick told The Hoya. “We didn’t get too technical in it, but it revolved around big projects where students had to write a guide for other students about how to use generative AI in ways that are sort of meaningful, ethical.”
Seth Perlow, an associate professor of English currently on research leave, said he has been considering how to address AI when returning to the classroom in 2026.
“I don’t think that there’s much question that ChatGPT, among other tools, is now capable of producing coherent, expository prose such that it is indiscernible from human writing,” Perlow told The Hoya. “I do think it’s worth thinking about the wide range of possibilities of assessing student learning.”
“These tools are a part of daily life now, and I don’t think it’s any likelier that they will not be used in some capacity for writing than it is that we won’t use calculators for arithmetic,” Perlow added.
Navigating AI Policy
While Georgetown doesn’t have a university-wide framework for AI usage in classrooms, it has a consistent approach to addressing suspected instances of AI misuse.
In the 2022-23 academic year, roughly half of the alleged instances reported to the Honor Council in violation of the Honor System, the university’s academic expectations set forth in the student handbook, dealt with plagiarism.
Without a central policy or mechanism for detecting AI use in assignments, each professor has the final say when they believe a student has used AI in their work without authorization. However, there is no firm consensus on what unauthorized AI use looks like.
Hensley said the repetition of a three-point structure, a lack of close reading skills and the overuse of the same stylistic trends clue him in to potential AI use. Perlow said he looks for hallucinated citations and stock phrases inconsistent with a student’s writing level, and Sandick said that while AI writing is hard to detect naturally, he looks more generally at whether a student’s writing has emotional depth that AI cannot reproduce.
The Honor Council’s Honor System policies, which apply to all undergraduate students, state that while professors regulate AI-generated coursework, “if you didn’t generate the words yourself, say so by quoting and citing the source; if you generated the words but not the content and ideas, say so by citing the source.”
Ali Whitmer, associate vice president in the Office of the Provost and executive director of the Honor Council, said the Honor Council handles unauthorized AI cases no differently than other violations.
“When an allegation comes in from a faculty member or TA or someone else, the first thing that happens is I review all the case materials, and then I ask one of our faculty members from the Honor Council faculty assembly to do an investigation,” Whitmer told The Hoya. “The investigation generally includes a conversation with the professor and then a conversation with the student, and then a written report is submitted back to the Honor Council.”
Regarding Miles’ case with the Honor Council, Miles said the faculty member who reviewed her essay noted the errors of referring to herself with the masculine “un estudiante” instead of the feminine “una estudiante.”
Miles also discussed with the faculty member her use of “e” instead of “y” to mean “and,” an advanced Spanish conjunction that seemed at odds with the gender errors, which students learn to avoid as early as introductory courses. The two also reviewed Miles’ editing history on Google Docs to see how fast she typed the essay.
Whitmer said that these preliminary conversations serve as one component when deciding if a complaint will go to a full hearing.
“It is a conversation that we have with students asking them to explain things about their assignment, things about their exam, how they did their work,” Whitmer said. “The investigator might ask them to show some previous drafts of their work or ask them how they approach the material.”
After the faculty member met with Miles, the Honor Council decided to hold a full hearing, a decision Miles said came as a surprise.
“There were elements in it that I felt could not have been recreated by AI, so I was frankly extremely shocked that anyone could assume I had used ChatGPT,” Miles said.
At her full hearing, Miles said the Honor Council cleared her of suspected wrongdoing.
“I think once the Honor Council saw my editing history, it was pretty clear that I didn’t use AI, because it showed how I had typed things and gone back to edit them,” Miles said. “My other explanation was that I hadn’t taken Spanish at all over the summer, so I was kind of rusty, and I had never had to write a personal essay since getting to Georgetown.”
AI Policy and the Future
Moving forward, Georgetown students and faculty have different ideas on how to factor AI into students’ learning and classroom experience.
Hensley said some in the English department feel AI may act as a crutch to learning, stifling students’ creativity and reinforcing the emerging transactional nature of education.
“If you can create a situation of trust where students feel that they can take risks and that they can fail and that’s okay, and that that’s how you actually accomplish things that are new and real,” Hensley said.
Cal Newport, a professor of computer science and a contributor to The New Yorker, wrote an October 2024 article titled “What Kind of Writer Is ChatGPT?” in which he followed students using AI to write essays.
Newport said many students used generative AI in fragments to tweak or edit assignments, yet this crutch sometimes led to other problems in their writing ability.

“To outsource core aspects of writing to AI would be like wanting to be a better athlete but outsourcing some of the gym time to a machine to lift the weights for you,” Newport told The Hoya. “Yes, technically, the workout was done, but do you think you’re any stronger? So what was that? What was the point of that?”
Others, like Sandick, are more willing to embrace AI as a classroom tool. Sandick said his introduction of AI in assignments was to fight back against traditional biases regarding AI, acknowledging the technology in real time and expanding the depth of conversations in his classroom.
“Passion, I think, is the most important thing that happens in a college humanities class. It’s the debates that you had, the disagreements, the feeling that things started to click,” Sandick said. “So more and more, I’m going to probably have to redo how we do class discussions to really look deeper.”
In the summer of 2024, Georgetown offered grants ranging from $3,000 to $5,000 to professors who proposed pedagogical ways of using AI in their courses, according to an email sent to faculty that The Hoya obtained. Funding for these initiatives comes largely from the Baker Trust for Transformative Learning, which supports Georgetown in value-based endeavors, and the Sonneborn Innovation Fund, which funds new pedagogical practices. Each organization is a partner of the Red House, Georgetown’s educational research and development unit.
“The writing program faculty who were interested had an interest meeting about the program, and we talked in a workshop about what kinds of ways we could teach this,” Sandick said. “The spectrum ranges from wanting to adopt it into the classroom to too many environmental costs, don’t want to do it, don’t really want to engage with it, it’s too ethically fraught.”
CNDLS offers financial support to carry out the Initiative on the Pedagogical Uses of AI (IPAI), a program to support faculty experimentation with AI across disciplines and foster innovation in teaching and learning at all levels.
A university spokesperson said the different grants and opportunities the university offers through IPAI align with the university’s values of expanding educational opportunities.
“The first round of proposals received approximately 90 ideas, with 40 projects funded and implemented. These projects span three themes: ‘Rethinking Ways of Teaching,’ ‘Researching New Ways of Working,’ and ‘Improving the Student Experience,’” the university spokesperson wrote. “The IPAI exemplifies Georgetown’s commitment to integrating AI into education while remaining true to its mission of forming future leaders.”
“IPAI awarded $160,000 to 19 projects in its first round of funding. A second round in 2024 expanded this to $224,000 across 33 grants, including student ‘X-grants,’” the spokesperson added.
Jeanine Turner, a professor with a joint appointment to the McDonough School of Business (MSB) and the communication, culture and technology (CCT) program, said she has used AI in her MSB and CCT classes to better help students in their educational efforts.
“I’ve had students use AI to help them with their writing, help them build their presentations, help them interact with it as a way to roleplay a challenging conversation,” Turner told The Hoya. “I think it’s critical that we integrate it in the way that we teach and think about how we use it with learning.”
Turner said she wanted to introduce AI into the classroom because of its popularity among students.
“The second that I saw it, I said this is exactly what students are going to be using. I would be using it too as a student,” Turner said. “I think what AI has done has helped us all think about learning in a different way. It’s become so integrated into knowledge, so now we have to think about the role of the faculty member. I think we’re moving away from a very old model of teaching.”
Some classes have fully integrated AI into their curriculum, including “Music in the Age of AI,” taught by Benjamin Harbert, chair of the department of performing arts and the director of undergraduate studies in music. The course culminated in an AI music festival held in the Gonda Theater April 23.
Sophia Dorr (CAS ’27), a student in the class, said the course changed her view of AI as a personal and educational tool.
“It really made me a lot more open to using AI in my everyday life,” Dorr told The Hoya. “I now use AI a lot more for all my work than I did before this class. My understanding of AI has become so complete after taking this class, and I think it was really useful in that way.”
“It’s completely different from any other class I’ve taken, because we use it for everything from composition to brainstorming to organizing. Some of my classes, you’re allowed to use AI, but it’s not encouraged or embraced like this class,” Dorr added.
While the university’s AI policy in classrooms continues to evolve, students continue to have varied experiences.
Miles’ story did not end after her Honor Council hearing. To remove the conditional status from her study abroad application and avoid taking a Spanish language class abroad, Miles took a Spanish placement test to further demonstrate her familiarity with the language. Miles said she scored a C1, which translates to near fluency.
Miles has happily spent this semester in Spain with her host family, but the AI accusation could have derailed her college experience.
“I was very frustrated at this point, because I had been through what is essentially six hours of meetings and also prepared a full written statement,” Miles said. “Who wants to spend two hours taking a placement exam for a language they’ve clearly already demonstrated their position in because of an AI accusation?”
For professors, these conversations mark the beginning of a long road toward a lasting solution.
“Although I want my students to be intellectually honest and not find ways of cheating on assignments, I also don’t want to organize the student-faculty relationship around surveillance or the kind of detective work and enforcement that structures questions around plagiarism,” Perlow said.
“We as humanities instructors might respond to these AI tools by thinking less about the quality of student writing and more about the quality of students’ ideas,” Perlow added.