Jayanthi Mistry – Teaching with Technology

Jayanthi Mistry, Ph.D. (Eliot-Pearson Department of Child Study & Human Development) undertook an Instructional Technology Exploration Program (ITEP) project in Fall 2017 with the goal of improving how her students interactively process and analyze video segments that illustrate concepts and points highlighted in lectures.

Jayanthi Mistry, Ph.D., Eliot-Pearson Department of Child Study & Human Development

Teaching challenge I wanted to address:
I teach an undergraduate class titled ‘Childhood across Cultures’. The course typically enrolls 50-70 students, and the general teaching challenge I face is engaging students in discussion and interactive learning in class so that they can apply their knowledge. In Fall 2017, in consultation with David Grogan of Tufts’ Educational Technology Services, I explored the use of technology to foster a deeper, more lasting understanding of the course learning objectives via in-class video-viewing and reflection exercises.

To address this challenge, David and I designed four different video-viewing exercises to implement and compare over the course of the term. For the first exercise, I used the traditional method I had relied on previously (i.e., students viewed a video segment and then immediately completed a response sheet). For the subsequent three exercises, we used the Qualtrics Survey tool to embed questions at three or four time points within the video segment. I tried out various Qualtrics question formats that catalogued student responses and presented overviews of them, so I could both gauge students’ understanding and guide further discussion.

Educational Technology

Qualtrics is a versatile survey tool provided for free to Tufts faculty and students.

What worked well and lessons learned:
After trying the traditional method from my past courses (written responses completed after viewing the video segment) as the first exercise, followed by the Video-with-Embedded-Questions format, it was immediately apparent that the Embedded Questions format elicited more interactive discussion of, and reflection on, the video segments. Students commented that the embedded questions helped guide their attention while viewing the segments. Student feedback comparing the two methods highlighted the benefit of seeing aggregated responses (e.g., word clouds, heat maps), which enabled students both to validate their own reflections and to note varying perspectives.

However, there were challenges that I will need to address when I use this method again. The issue of not having enough time to respond to open-ended questions came up consistently. It was also clear that thoughtfully designing questions to take advantage of the varied formats for sharing aggregated student responses is critical, and it was perhaps the most time-consuming aspect of the project.