This past fortnight, there’s been a lively blogosphere debate about whether college students should be allowed to use their laptops in the classroom. At Slate, University of Missouri-St. Louis instructor Rebecca Schuman argued in favor of letting students use laptops if they wish, even though, as she writes:
having a laptop open in a large lecture is basically like wearing a sign on your head that says, “I’ll be spending the next hour shopping for shoes. . . .”
Schuman wrote that telling students they can’t use laptops “infantilizes these 18-to-22-year-olds.”
Contra Professor Schuman, Dartmouth computer science professor Dan Rockmore made the case for banning laptops at The New Yorker:
The act of typing effectively turns the note-taker into a transcription zombie, while the imperfect recordings of the pencil-pusher reflect and excite a process of integration, creating more textured and effective modes of recall.
Case Western Reserve University School of Law professor Jonathan H. Adler made much the same case over at The Volokh Conspiracy, arguing that the use of laptops encourages
verbatim transcripts of class sessions [that] are not particularly useful. . . . The efficiency that laptops facilitate comes at the expense of thinking.
The debate about laptops in college classrooms is an extension of the debate about whether primary school students should be taught cursive and penmanship—because if primary students aren’t taught penmanship, how can university students write rather than type classroom notes?
Note that there’s complete agreement about the superiority of the old-fashioned pen-and-paper approach to college note-taking—Professor Schuman is just as persuaded as Professors Rockmore and Adler that students learn more when they take notes by hand and without the distractions of the internet.
So, what’s the source of disagreement between Professor Schuman and Professors Rockmore and Adler?
It’s a disagreement that extends over many more issues than just this minor skirmish about laptops: a disagreement over the extent to which faculty owe their students guidance during their college years.
In recent decades, colleges have taken an increasingly hands-off approach to students, abandoning in loco parentis and treating students as consumers whose preferences should direct the college experience.
But, really, should college be a do-it-yourself, figure-it-out-on-your-own experience? Even when the sticker price at some elite private colleges exceeds $60,000 per year, and public university tuitions have risen far faster than inflation over the last couple of decades? In no other sector of the economy would people accept a do-it-yourself product at these prices.
No, universities and their faculty owe their students guidance about how to think at the college level. As British philosopher Michael Oakeshott wrote:
What a university has to offer is not information but a practice in thinking . . . . [W]hat undergraduates may get at a university, and nowhere else in such favourable circumstances, is some understanding of what it is to think historically, mathematically, scientifically, or philosophically, and some understanding of these not as "subjects," but as living "languages" and of those who explore and speak them as being engaged in explanatory enterprises of different sorts.
You can’t gain an understanding of academic disciplines as living languages if you’re a mere “transcription zombie.” You can gain this understanding only if you’re making the language your own by writing it down in your own words—which means, in effect, with pen and paper, not a laptop. Telling students that they can’t use a laptop in the classroom doesn’t “infantilize” them—instead, it gives them a chance to learn to enter into learned discourse and thus to join the conversation of mature, well-educated adults.