
Project-Based Learning With AI
By Nick Covaleski, Assistant Director at CELT
This interview is with Thomas Van de Velde, a lecturer at the Gordon Institute and former tech executive who now teaches entrepreneurship courses such as Intro to Making. Van de Velde is building TeachPilot, an AI-powered platform designed to serve as a tool for experiential, project-based learning. Nick spoke with Professor Van de Velde to learn more about the platform, his motivations for building it, and his advice for other Tufts faculty interested in using AI for experiential education.
Can you tell me a little bit about yourself and your Intro to Making course?
Thomas: I’m passionate about democratizing technology and empowering students to become makers, regardless of their background. My Intro to Making course is designed to show students that they can harness technology themselves through hands-on making – without fear or prior knowledge.
The course focuses on four core learning outcomes: building confidence that anyone can be a maker, developing proficiency in tools from 3D printing to connected devices, learning to collaborate in multidisciplinary teams, and acquiring an entrepreneurial mindset through the “build-measure-learn” loop.
We don’t teach these skills in isolation. For example, when students report they’re losing plants to irregular watering, they build real IoT monitoring systems to water those plants. They prototype, test with actual plants, get peer feedback, and iterate until they have something that works. It’s not just about learning tools: students develop the confidence and mindset to tackle any challenge with technology.
In simple terms, what is the AI-powered learning platform that you’re building for this course, and what were your motivations as a teacher for building it?
Thomas: The platform is a project-based learning environment that mirrors real-world engineering tools but acts more like a skilled coach than a solution provider. When students get stuck, it doesn’t give them code to copy – instead, it helps them understand the concept(s) they’re missing and guides them to build their own solution through reflection. This virtual workspace optimizes for deep learning rather than quick results.
For example, when a student reads sensor data incorrectly, the AI doesn’t just fix their code. It might say: “I see you’re trying to read data from a sensor. Think about what type of signal a moisture sensor provides – is it digital (on/off) or analog (range of values)?” Then it provides a mini-lesson on analog vs digital pins, shows the pattern for reading analog values, and has them apply it themselves to their specific sensor. The decision about what to do stays with the student, not the AI.
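For readers who want to see that distinction concretely, the pattern the AI is nudging students toward looks roughly like the sketch below. This is an illustration only, not code from the course or from TeachPilot; it assumes MicroPython on a Raspberry Pi Pico with the moisture sensor’s analog output wired to pin GP26, and the pin number and dryness threshold are arbitrary choices for the example.

```python
# Illustrative MicroPython sketch: reading a moisture sensor on an analog pin.
# Assumes a Raspberry Pi Pico with the sensor's analog output on GP26 (ADC0);
# the pin and threshold below are assumptions for this example, not course code.
from machine import ADC
import time

moisture = ADC(26)        # analog pin: returns a range of values, not just on/off
DRY_THRESHOLD = 40_000    # raw reading above which we treat the soil as dry

while True:
    reading = moisture.read_u16()   # 0..65535; many resistive sensors read higher when dry
    print("moisture reading:", reading)
    if reading > DRY_THRESHOLD:
        print("soil looks dry -- time to water")
    time.sleep(1)                   # sample once per second
```

The board and language will vary by course; the point is that students reason about the kind of signal the sensor provides before they write the read call.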
My motivation came from seeing how ChatGPT and similar tools were short-circuiting real learning. Students would paste in their broken code, get a “fixed” version, and learn nothing. They’d become dependent on AI rather than developing their own problem-solving skills. I wanted technology that makes students better builders, not just faster copy-pasters.

What are the main benefits that this technology offers to your students? What do you hope they will gain most from using it?
Thomas: The key benefit is that students develop genuine problem-solving skills through guided struggle. ChatGPT risks removing the struggle entirely; I want to maintain productive challenge while preventing destructive frustration.
I see students gaining at least five key things from using this experiential learning platform:
- Deep understanding through guided discovery: Students figure problems out themselves with just enough scaffolding, which strengthens conceptual mastery.
- Confidence in agency: Every solution is genuinely theirs, built through their own reasoning.
- Resilience through productive struggle: Students learn that being stuck is temporary and solvable with the right approach, building persistence in problem-solving.
- Metacognition through team collaboration: AI-guided reflections position the platform as a team coach, facilitating discussions about what worked, what didn’t, and how processes can improve.
- Safe practice of professional communication: Students rehearse interviewing and requirements-gathering with AI-generated stakeholders before engaging with real partners, reducing fear and enabling thoughtful iteration.
The AI acts like training wheels that gradually come off. Early on, it might provide more structured guidance. But as students demonstrate competence, it pulls back – maybe just asking “What have you tried so far?” or “What do you think that error means?”
Most importantly, students can’t “game” the system by getting the AI to do their work. There’s no shortcut to copy. They have to engage with the concepts, understand the patterns, and build their own solutions. When they present their plant monitor to the greenhouse manager, they can explain every line of code because they truly wrote it themselves.

What recommendations do you have for other Tufts faculty who are interested in using AI as a tool for experiential learning?
Thomas: Scaffold the AI interaction thoughtfully and make the learning progression explicit.
In my software leadership course, for example, I have students first write user stories on index cards – no AI allowed. Then they peer-review in groups, learning from each other’s approaches. Only after they understand the fundamentals do they review their work with AI, and eventually use AI as an assistant for writing new stories. The sequencing matters enormously, as students who start with AI often become dependent on it. This progression ensures they develop core skills before augmenting them with AI. As I tell students: “You’re learning to write user stories, not learning to prompt AI for user stories.”
For AI tools specifically, look for ones that guide rather than give. The AI should:
- Ask clarifying questions instead of assuming intent
- Provide concepts and patterns, not complete solutions
- Encourage students to explain their thinking
- Gradually reduce support as competence grows
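To illustrate what “guide rather than give” can look like in practice, here is a minimal sketch of a coaching-style wrapper around a chat model. It is not TeachPilot’s implementation – just a generic example that assumes the OpenAI Python SDK, an API key available in the environment, and a model name chosen purely for illustration.

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

# A system prompt that encodes the "guide rather than give" principles above.
COACH_PROMPT = (
    "You are a project coach for student makers. Never hand over complete, "
    "copy-pasteable solutions. Instead: ask a clarifying question about the "
    "student's intent, explain the underlying concept or pattern, ask the "
    "student to explain their own reasoning, and offer less guidance as the "
    "student demonstrates competence."
)

def coach_reply(student_message: str) -> str:
    """Return a guiding (not solution-giving) response to a student's question."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": COACH_PROMPT},
            {"role": "user", "content": student_message},
        ],
    )
    return response.choices[0].message.content

# Example: the reply should probe the student's understanding rather than fix the code.
print(coach_reply("My moisture sensor always reads 0. Here is my code..."))
```

The specific SDK matters less than the constraint itself: any chat model can be instructed to coach rather than solve, and the scaffolding can be loosened as students demonstrate competence.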
Van de Velde emphasizes that AI should be introduced with careful scaffolding and always in service of deeper learning, not shortcuts. For faculty exploring experiential approaches, the challenge is to design learning sequences where students first build foundational skills and then use AI to extend their capabilities.
This is just the beginning – TeachPilot represents one experiment in how AI can support project-based learning, but its future will be shaped by educators themselves. Faculty who are interested in exploring or contributing can visit TeachPilot.ai to get involved.