Is the traditional college classroom and its main instrument, the lecture, under siege by the forces of change? Given all the research on learning in the last twenty years, how archaic does the classic image feel of an instructor at a podium, or pacing back and forth, with a series of overheads (or these days PowerPoint presentations) projected on a large screen while he or she imparts wisdom to an audience numbering in the hundreds?
And yet, for all the talk of “disruptive education” and for all the estimable experiments going on at colleges across the country, change seems only to be nipping at the heels of tradition.
Take, for example, MOOCs, often heralded as the vanguard of a revolution in higher education. No one can deny their phenomenal growth. By one authoritative count there are over 4000 MOOCs being offered to a billion students worldwide. One must applaud the sheer numbers and marvel at the striking desire on the part of humans to educate themselves. However, if you scratch the surface, what do you have? A billion students being lectured to.
Is it purely a function of the medium, the fact that MOOCs are videotaped or streaming presentations? Perhaps. Certainly the signal from a single fixed camera is infinitely easier to set up and control — compared, say, to three cameras (one wide shot of the room, one audience shot, one close-up of the speaker) and the requisite editing or live switching that a multi-camera setup necessitates.
Or is it a function of cost? The typical single-fixed-camera setup is by far the least expensive and can be easily and cheaply reused for any number of courses. In a number of instances, such a setup is totally automated. The speaker simply pushes a button.
Clearly, common sense tells us that both these factors come into play. And yet, I submit that they’re not the essential elements driving this deeply embedded reliance on lecturing.
So what are the drivers? For MOOCs and for much of what passes for teaching at brick-and-mortar colleges?
First of all, the lecture is a form of teaching designed for people who aren’t trained to teach. I know this sounds like a strange and perhaps harsh thing to say about my colleagues. But most university faculty are trained to write scholarly papers, articles, reports, and books. Few if any received guidance or critique on teaching when they began their careers. Left to their own devices, what did they do? They emulated their professors. From their training in research, they knew very well how to organize and assess information and craft an argument. And lecturing allowed them to package research as presentation. A perfect fit!
Even more central to the persistence of lecturing is its status as a “teacher centered” form of education. A power relationship is established, one that subordinates the student to the teacher. The teacher controls that which is of value, doling it out in small pieces as he or she sees fit. Even when the teacher allows questions, they’re almost always sandwiched in at the end of the day’s lecture, almost always for clarification purposes, and almost never to call into question the teacher’s mastery.
Is there hope for the future? Will lecturing slowly give way to more interactive, participatory, and experiential forms of teaching and learning?
Perhaps so, if more educators take seriously the research being done on learning. For starters, there’s a 2014 study conducted by investigators from the University of Washington and the University of Maine that found “students in traditional lecture courses are 1.5 times more likely to fail than students in courses with active learning.”
Sooner or later, rather than requiring students to memorize and then regurgitate information, shouldn’t we be teaching them how to think?