In the last few days a fierce battle has been waged among professors about whether the lecture still has an important role in undergraduate education. This tempest might seem to be a perfect illustration of the claim that academics fight fierce battles over the smallest of stakes. However, in this case, I think the stakes are high indeed, as this battle reveals deep divisions among professors about what undergraduate education is all about.
The battle broke out when University of North Carolina-Chapel Hill history professor Molly Worthen published a New York Times op-ed lamenting the decision of so many professors to abandon lectures in favor of “student-led discussion” and technology-driven learning. (So many professors have quit lecturing, she noted, that classrooms are no longer equipped with lecterns! When Professor Worthen insisted on having a lectern for her courses, it took a week for UNC to track down and deliver one to her classroom.)
But what makes the lecture format so important? In Professor Worthen’s view, it’s because it teaches students important skills: to take notes, to build their attention span, to synthesize complex material, and to absorb an argument:
Lecture courses [are] an exercise in mindfulness and attention building, a mental workout that counteracts the junk food of nonstop social media.
Professor Worthen’s defense of the unfashionable practice of lecturing to undergraduates provoked heated responses from many quarters.
Damon Linker, writing at The Week, weighed in, arguing that yes, lectures are crucial to undergraduate education, but not only because students need to acquire a certain set of skills. Instead, lectures are crucial because students need to master a certain body of knowledge:
Why do students of history need teachers who will stand at the front of a classroom and lecture? Because history is hard. It presupposes the knowledge of thousands of facts (names, dates, events) and how they fit together into an enormously complicated, multi-dimensional causal sequence. Until the students absorb those facts and grasp that causal sequence, “group work” and other forms of interactive learning are premature.
Others retorted that lectures had been abandoned for a very good reason: today’s undergraduates simply aren’t up to listening to an hour-long lecture. Slate’s Rebecca Schuman, for example, derided defenders of the lecture such as Professor Worthen and Mr. Linker as simply naïve about the capacities of most students:
Many undergraduates learn about Plato—or, more realistically, they zone out during lectures on Plato. Often, they’re zoning out because their professors have planned that lecture for—speaking of Plato—the “ideal” student, the perfect one, the one who does synthesize as he or she learns. … But real students are often too busy (working 40 hours a week, taking care of family members, or even just being college students) to [appreciate a lecture].
The only way to reach these students, Dr. Schuman argues, is exactly the sort of student-centered, technology-driven methods that Professor Worthen dismisses.
As someone who spent 11 years as a college professor, it seems to me that none of these stances on the place of the lecture in undergraduate education quite gets it right.
Of course, Dr. Schuman is right that too many students zone out during a lecture on Plato—but the students who can’t get engaged in a (well-crafted, well-delivered) lecture on Plato simply don’t belong in a liberal arts college.
Lectures do have an important place in an undergraduate education—but not for the sake of building up skills such as note taking, as Professor Worthen argues, nor for building up a collection of facts, even facts arrayed in a multi-dimensional causal sequence, as Mr. Linker seems to suggest. Acquiring skills and a collection of facts are not the ends of a liberal education.
The 20th century British philosopher Michael Oakeshott, in his essay “The study of ‘politics’ in a university,” offered the best explanation I have read about the ends of liberal arts education:
What a university has to offer is not information but practice in thinking; and not practice in thinking in no manner in particular but in specific manners each capable of reaching its own characteristic kind of conclusions. And what undergraduates may get at a university, and nowhere else in such favourable circumstances, is some understanding of what it is to think historically, mathematically, scientifically or philosophically, and some understanding of these not as ‘subjects’, but as living ‘languages’ and of those who explore and speak them as being engaged in explanatory enterprises of different sorts.
Learning to speak the language of historians, or of mathematicians, scientists, or philosophers, gives one a sense of what it means to explain the world as interpreted and understood within one of these academic disciplines. And just as one learns a foreign language not by simply starting to speak it but by first listening to someone fluent in it, students learn the language of history by listening to historians, by listening to how historians come to explain events and tie them together into an explanatory account—and that listening, at least in introductory classes, means lectures.
As undergraduates listen to lectures and start to acquire the language of an academic discipline, they will surely develop important skills and become acquainted with many facts—but those accomplishments are secondary to the accomplishment of learning to explain the world in a manner characteristic of one of the academic disciplines. Of course, as undergraduates develop skills and build up a sufficient body of knowledge, they will be ready to converse in the language of an academic discipline, which is why seminar courses are appropriate for upperclassmen.
Yes, undergraduates still need to listen to lectures: not to acquire skills or learn facts, but to begin to become at least passably fluent speakers of a disciplinary language.