
What we are learning about learning from the latest AI studies
by Carie Cardamone, Sr Assoc Director, CELT

AI is reshaping our world, but for educators what matters is its impact on learning, not just on productivity. Recent studies of AI in the classroom confirm what the learning sciences already tell us: real learning happens through effortful mental work and social interaction.
One helpful analogy is the difference between using a forklift and lifting weights. As a productivity tool, AI can be a forklift: it moves faster and carries more weight than we could alone. But when the point is learning, or building “mental muscle,” having AI do the lifting defeats the purpose, just as it would at the gym.
So the key question for learning isn’t whether students should use AI, but how they use it: the same tool can either support or short-circuit the learning process.
AI Can Short-Circuit the Hard Work of Learning
Students learn best when they grapple with material: putting in cognitive effort, wrestling with problems, and working through productive struggle.
Many recent studies show what happens when AI short-circuits this process. In one experiment, students wrote essays under three conditions: with no tools, with Google, and with AI (Kosmyna et al. 2025). EEG recordings showed the strongest brain activity when students wrote unaided, less with Google, and even less with AI. Students who began with AI often couldn’t recall what they had written, suggesting they bypassed the deep engagement that leads to learning. However, those who first wrote independently and then used AI to revise preserved their learning.
Other studies echo this finding: while students may generate stronger essays or solve more complex problems with AI, their critical thinking, self-regulated learning, and independent performance all decline when AI replaces their own mental struggle (Gerlich 2025; Fan et al. 2024; Bastani et al. 2024).
The lesson for us as teachers is clear: when AI acts like a tutor (prompting reflection, pacing information, adapting content, and providing practice opportunities), it can enhance learning (Kestin et al. 2024). But when AI simply outputs finished assignments or removes the desirable difficulties that come from grappling with ideas, it harms learning.
Preserve the Power of Human Connection
Learning is enhanced by social interactions; in college, we learn from peers, faculty, advisors, and the broader academic community. When AI replaces these connections, students lose motivation and their sense of belonging.
In recent interviews with computer science students, many described how AI had begun mediating their peer interactions (Hou et al. 2025). Instead of asking classmates for help, they turned to AI. Over time, this eroded their sense of peer support and motivation. A survey of nearly 400 students confirmed the risk: heavy AI users reported greater loneliness and lower belonging, even when feeling “socially supported” by AI (Crawford et al. 2024).
But AI doesn’t have to be isolating. In one classroom project, students used collaborative whiteboards enhanced with AI prompts (Liu et al. 2025). Because the tool supported team-based work, students reported higher engagement, creativity, and teamwork. AI enhanced human collaboration rather than replacing it.
How to Guide AI Use Towards Learning
True learning requires human effort, curiosity, and connection. AI can be a partner in that process, but never a replacement. In our role as educators, we can guide students toward using AI in ways that support learning.
- Think First, Then Prompt – Ask students to generate their own ideas before turning to AI. Position AI as support that builds on their thinking, not a shortcut.
- Engage AI in Dialogue – Guide students to interact with AI through conversation: sharing their ideas, asking for counterarguments, or directing the AI to act as a tutor or debate partner.
- Design for Process, Not Just Product – Keep attention on how learning happens, not just what gets turned in, by layering assessments with multiple touch points, e.g., drafts, outlines, or reflections in addition to the final product.
- Prioritize Human Connections – Keep peer review, group projects, and office hours central. AI should supplement, not replace, human-to-human learning.
- Integrate Reflections – Build in opportunities for students to critique AI outputs, compare them with their own work, and reflect on how AI shaped their process.
References:
Bastani et al. (2024). Generative AI Can Harm Learning.
Crawford et al. (2024). When Artificial Intelligence Substitutes Humans in Higher Education: The Cost of Loneliness, Student Success, and Retention.
Fan et al. (2024). Beware of Metacognitive Laziness: Effects of Generative Artificial Intelligence on Learning Motivation, Processes, and Performance.
Gerlich (2025). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking.
Hou et al. (2025). ‘All Roads Lead to ChatGPT’: How Generative AI Is Eroding Social Interactions and Student Learning Communities.
Kestin et al. (2024). AI Tutoring Outperforms Active Learning.
Kosmyna et al. (2025). Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task.
See Also
- Artificial Intelligence Resources for Tufts Faculty and Staff (CELT)
- Addressing Academic Integrity in the Age of AI (Teaching@Tufts)
- Designing Courses in the Age of AI (Teaching@Tufts)
- Generative Artificial Intelligence (AI) (TTS)