As a professor at one of the many schools that have responded to the coronavirus threat by canceling in-person classes, I find myself wondering whether our students will discover what many of us secretly fear: that the classroom isn’t really necessary for learning.
Observers have been pointing to all sorts of ways in which the current moment might change the world forever. From the serious — the twilight of the handshake or the permanence of the face mask — to the trivial — the end of locker-room access after sporting events — life is expected to be different even after the virus burns itself out. The same may be true of traditional classroom instruction.
As critics have pointed out for years, the technology of learning has scarcely budged in two centuries. Students still congregate in a space run by a teacher, an authority figure whose knowledge they seek and whose rules they must follow in order to get it.
Yes, there have been innovations in classroom design, including some quite important ones (like the larger windows that became fashionable in the years after World War II) and some minor gimmicks (like having students perch on medicine balls rather than in chairs). But such changes have preserved the traditional vision of how learning works. It’s that tradition that a few weeks or months of at-home study will likely challenge.
Let’s start with the simplest consequence. When young people stay home, watching the teacher on a computer screen, they are likely to be doing other things, too. There’s no longer room for doubt that distraction by the internet during class has a negative effect on learning, which means that the current trend toward banning laptops has a rational basis.
With schools closing everywhere and students “attending” from home, internet use during class time will go up. So it’s easy to predict that students will learn less from their teachers than they would during live instruction. Surely this is an argument for getting the classrooms up and running as swiftly as possible.
Or is it? Perhaps the conclusion that internet use interferes with learning is itself an artifact of the classroom model. Consider: How do we know that freedom to browse interferes with learning? We give the experimental subjects examinations and record the scores. At first blush, this approach seems sensible. Although the idea of a single end-of-term test has been declining at the college level, written examinations of some sort remain the standard for determining classroom performance. So when we say that the internet interferes with learning, what we really mean is that it causes students to perform less well on tests.
But suppose, as many scholars argue, the notion of the “test” is itself outdated. At the very least, the closed-book examination is arguably a holdover from the days when the ability to remember was most important. If you look at the world outside the classroom, however, memory retrieval is becoming much less important than skill at getting a digital device to tell you what you need to know.
Preliminary results from the National Institutes of Health’s massive Adolescent Brain Cognitive Development Study, which follows thousands of test subjects, tell us that the more time young people spend looking at screens, the less well they perform at tasks involving memory and cognition. A depressing trend, to be sure, and one likely to be reinforced over the next few weeks or months as more and more students take their classes online.
But before we resign ourselves to contemplation of a general decline in cognitive ability, let’s consider what such results as these don’t tell us: how well young people who are glued to their screens will perform at tasks involving memory and cognition when aided by a digital device. That’s the sort of world for which the technological revolution is training the young. Those who find this vision attractive are unlikely to see the traditional classroom model as anything but a hindrance.
Perhaps the future lies in models like the Agora school in the Netherlands, where teenagers decide for themselves how to spend their time. There are no teachers, but there is plenty of space for undirected play, much of it with tools aimed at helping them learn. Young people are free to use computers and phones as they wish, because, after all, both will exist in the outside world for which education is preparing them. Advocates insist that the students do as well as or better than their more traditionally instructed counterparts.
Or perhaps the future lies along a different path. But it’s likely to be a path away from the traditional model. The authority of the classroom teacher has been in decline for some time, as students increasingly see themselves as the proper judges of the legitimacy of a teacher’s words and actions. As early as sixth grade, students question the teacher’s authority. It’s not uncommon to hear young people in college and professional school wonder aloud why they need to go to class at all. Why can’t they just read the assignments and take the test?
Maybe soon they will.
All of this was anticipated during an earlier and larger crisis: World War II. The armed forces, faced with an influx of recruits whose level of education varied, put out an urgent call for materials the soldiers could use to teach themselves everything from mathematics to English composition to bookkeeping. Scholars of the day speculated about “the implications which this extensive program of self-teaching materials may have for civilian education.”
Back then, the enormous inertia of classroom teaching proved more than sufficient to restore the traditional model once the war ended. But the digital revolution is now pushing hard in the other direction. I don’t think this is a battle tradition can win.
Don’t get me wrong. I don’t expect the traditional classroom model to vanish in my lifetime. But right now the whole world is experimenting with a different method of instruction, and that experiment is bound to cast further doubt on how we’ve been teaching for centuries.

(Bloomberg)