Not All AI Wins Make Headlines, and That's Okay!

An Interview with Dr. Meera Gatlin, Assistant Teaching Professor, Department of Infectious Disease and Global Health, and Clinical Assistant Professor, Public Health and Community Medicine, at Tufts Cummings School of Veterinary Medicine
By Mehek Vora
When we think about AI in classrooms, we might picture slideshows made with AI or essays edited with AI. But what does it look like when you bring generative AI into a veterinary public health classroom? According to Dr. Meera Gatlin, Assistant Teaching Professor in the Department of Infectious Disease and Global Health and Clinical Assistant Professor in Public Health and Community Medicine, it looks a lot like playful experimentation, pedagogical curiosity, and a whole lot of trial and error.
Dr. Gatlin first tried ChatGPT in her kitchen, just to make a packing list and see what it could do. Soon she was using it for emails and SMART goal planning. It got her wondering: if this works for me, could it help my students too?
In the DVM/MPH program at Tufts, Dr. Gatlin teaches a small cohort of students. Her "Integration" class, a small seminar blending public health and veterinary medicine, became the testing ground for introducing AI into her teaching.
One memorable assignment focused on disease eradication. Instead of pairing students up with classmates, students were asked to "partner with AI," using ChatGPT to research topics and then critique the results. The exercise sparked dynamic discussion. "It made for a really interesting conversation," she recalled.
Other classroom experiments included summarizing research papers with Claude, identifying recent public health events to highlight AI's outdated training cutoff, and mind-mapping federal research work.
Despite these small wins, things shifted when Dr. Gatlin tried AI in a larger, lecture-based class. “Two-thirds of my students had no interest in using it,” she noted. “It was honestly kind of shocking.”
This stark resistance raises important questions:
Why were students, especially those in graduate or clinical programs, so hesitant to adopt AI?
Was it a lack of training or trust in the tool?
Were they unsure how to use it meaningfully in such a specialized field?
Interestingly, the tide may be turning. “My first-years this year used AI in their undergraduate programs. When I asked them to research a public health topic, they casually said, ‘Oh yeah, we asked ChatGPT about it.’ That never happened before.”
This shift prompts more questions for Dr. Gatlin:
Is familiarity with AI in earlier education shaping comfort and curiosity?
Will incoming students expect AI integration in all their courses, including medicine and veterinary science?
Dr. Gatlin sees the generational transition as pivotal. "The first cohort I introduced AI to had already been in vet school when it launched. Now students arrive having used it already, so what does that mean for me?"
For Dr. Gatlin, one key message stands out: AI is just another learning tool coming our way.
"It's not this all-knowing force," she says. "It deserves the same kind of evaluation and critical thinking as any other resource, like a book, a peer-reviewed journal, or a website. If we're going to manipulate it in a way that's useful for us, then we need to do our due diligence, like we would with any other source."
She emphasizes the importance of transparency and integrity, and admits to worrying that she may not always know when students are using AI, even though her class policy openly encourages its use, disclosure, and citation.
Though her classroom experiments with AI have paused, Dr. Gatlin remains hopeful about clinical applications. From reading pathology slides to drafting client-facing communications, AI has promising use cases in diagnostics and workflow efficiency. But these applications, she notes, call for caution.
Dr. Gatlin's story is less about triumph or failure, and more about experimentation, context, and growth. Her experience reminds us that AI isn't a one-size-fits-all solution. It's a tool that sometimes works and sometimes doesn't, but is always evolving.
As AI continues to shape higher education, stories like Dr. Gatlin’s are essential. They show us that successful integration isn’t about flashy tools. It’s about thoughtful teaching, honest reflection, and meeting students where they are.