Rethinking AI’s Role in Learning and Human Connection

An interview with Dr. Carie Cardamone, Senior Associate Director, Center for the Enhancement of Learning and Teaching

By Mehek Vora

In a world where AI conversations often focus on individual adoption stories, Dr. Carie Cardamone offers something different: a campus-wide perspective on how an entire university community navigates artificial intelligence together. As Senior Associate Director at Tufts’ Center for the Enhancement of Learning and Teaching (CELT), she has been facilitating conversations about AI across disciplines, departments, and roles for nearly three years, beginning before ChatGPT made AI a household topic.

Her journey with AI began in “another lifetime,” as she puts it, when she saw how machine learning could help classify galaxy shapes and illuminate their evolution. Then, “Around three years ago, our director was thinking about the future of education, and one of the things she thought was going to have a big impact was AI,” Carie recalls. Before chatbots became mainstream, she formed an AI learning community for staff to begin discussing the potential affordances of, and concerns with, various AI tools.

That early start has given Carie a unique vantage point as AI has exploded into public consciousness. Today, she leads faculty and staff learning communities, organizes campus-wide forums, and facilitates conversations between students, faculty, and staff—all while staying current with AI’s rapid evolution. “I was just talking to a professor this morning about professional writing in engineering,” she notes, “and she was discussing how different it was this year than writing with AI last year.”

Forklift or Weightlifting? A Framework for AI in Learning

How do you know when AI is helping your learning versus hindering it? Carie finds that framing AI use as either operating a “forklift” or going “weightlifting” is a helpful way to understand how its impact on student learning differs from its impact in the professional workplace.

A forklift handles routine, mechanical tasks when the purpose is simply to accomplish the task efficiently. Weightlifting, however, only benefits you if you do it yourself—and the effortful work of learning requires grappling with ideas to build understanding. This can’t be outsourced to AI.

“If you’re writing the methods section for an analytical paper that has a structured linguistic style and format that you need to conform to, maybe free writing what you did and having AI put that into the language is a forklift,” she explains. This kind of AI use handles routine formatting while preserving human thinking.

But the same technology becomes counterproductive when applied to learning-focused tasks: “If you’re trying to find your voice or think about how you understand a matter through writing, then having AI generate a draft actually prevents you from really understanding what you think. It’s like having AI lift weights for you. The piece of writing then has no benefit and it was a waste of time. Why would you have AI go to the gym in your place?”

This framework helps students and faculty navigate when AI serves as a productivity tool versus when it replaces the cognitive work that drives learning—a distinction that becomes crucial as AI becomes more sophisticated and tempting to use for everything.

Connecting People and Pedagogy to Navigate AI Strategically

Beyond translating AI’s capabilities for specific classroom uses, Carie plays a vital integrative role—helping Tufts as a whole navigate what responsible and meaningful AI use looks like in higher education. She connects not only disciplines, but also people, questions, values, and visions.

“I think of my role as helping create the connective tissue across campus,” she reflects. “Not just between faculty in different fields, but between values and practices, between policy and pedagogy, between what’s technically possible and what’s educationally wise.”

Rather than advocating for or against AI, Carie focuses on cultivating a shared language and a culture of inquiry around it—hosting design sprints, organizing inclusive forums, and working side-by-side with faculty in engineering, arts, medicine, and beyond. “Some of the most energizing conversations I’ve had have been in departmental meetings—hearing how different disciplines are uniquely wrestling with the challenges and opportunities of AI,” she says.

Her approach honors the complexity of the moment. “What we’re facing isn’t just a tech adoption issue—it’s a question of who we are as educators and what we want learning to mean.”

Through this work, Carie helps Tufts not just respond to AI, but reimagine learning with it: “We’re not trying to bolt AI onto old ways of teaching—we’re asking, what does learning look like when we stay grounded in what humans do best, and use AI to support, not replace, that?”

The Subtle Dangers of Uncritical Use

While Carie sees AI’s potential for making education more interactive—she personally uses it as a collaborative writing partner, having it reflect ideas back to help refine her thinking—she’s deeply concerned about AI’s invisible impacts. She shares a striking example: researchers enrolled an AI in a fully online master’s health program, where it earned a 97% average by completing all assignments. The only human intervention was having a student actor participate in live discussions and verifying that sources were real.

“That to me says that we’re going to have students who are AI chatbots, not actually students trying to take some of our courses,” she warns. “If you can get a degree online but you didn’t actually interact in any way with any of the content, what does that degree mean?”

This concern extends to more subtle scenarios: “I’m most worried about cases where AI does something so well that it looks right, but it’s not right in detail. It’s not the glaring errors that undermine learning—it’s the quiet ones that slip through when no one’s looking.” She gives the example of using AI to reformat research data: if it correctly handles 995 out of 1,000 entries but quietly introduces errors in the other five, the researcher has unknowingly compromised their work.

“These kinds of issues show why we need to approach AI in education as a design problem,” she explains. “It’s not about banning or embracing it—it’s about intentionally designing assignments and assessments that help students build awareness of what AI can do, and where human judgment is essential.”

Think About Your Reader

When asked for the one thing she most wants the Tufts community to understand about AI, Carie offers practical wisdom: “When you’re using AI to create something like a piece of writing, stop and think first about the person reading that piece of writing.”

She illustrates this with scenarios many can relate to: “If you’re writing an email to a professor and you’re generating it with AI, imagine you’re the professor and you receive 50 emails from students a day, and they all sound the same because they’re generated with AI. To you, that email sounds beautiful. But to the professor reading it, it doesn’t.”

The same principle applies to faculty writing recommendation letters with AI assistance. “You think, ‘Oh, this sounds really great, that’s better than I could have written.’ But now you’re the person on the scholarship committee reading these letters and you can just tell they’re all AI generated and they don’t sound authentic.”

Reimagining Education’s Future

Rather than seeing AI as a threat to education, Carie views it as an opportunity to clarify what makes human-centered learning irreplaceable. From her perspective, “What does the faculty member bring that is unique? It’s the ability to have those human-to-human interactions and to think more in terms of personalization.”

“At Tufts, we have an incredible opportunity—not just to adapt to AI, but to lead in showing what thoughtful, human-centered integration can look like,” she emphasizes. “If you’re just looking for information, AI can give you that. But what Tufts can offer is something more: a chance to learn in community, to build relationships, to be challenged and supported by real people who care about your growth.”

As AI continues reshaping education at unprecedented speed, Carie’s work reminds us that the most important conversations aren’t about the technology itself, but about preserving what makes learning fundamentally human. Her message is both pragmatic and hopeful: understand AI’s capabilities and limitations, use it thoughtfully when it serves genuine purposes, and never forget that education’s greatest value lies in the connections we make with each other.

Note: This interview was drafted using Claude 4.0 Opus based on a transcript of our conversation, and subsequently edited by Carie for clarity and accuracy.