There is no doubt that formal education today is based on an information-scarcity model. Its methods are firmly rooted in the past, with lectures still accounting for over 90% of all formal learning events in higher education. Quoting from an earlier post:
As Graham Gibbs recently wrote in the Times Higher:
More than 700 studies (referring to Gibbs's work) have confirmed that lectures are less effective than a wide range of methods for achieving almost every educational goal you can think of. Even for the straightforward objective of transmitting factual information, they are no better than a host of alternatives, including private reading. Moreover, lectures inspire students less than other methods, and lead to less study afterwards.
For some educational goals, no alternative has ever been discovered that is less effective than lecturing, including, in some cases, no teaching at all. Studies of the quality of student attention, the comprehensiveness of student notes and the level of intellectual engagement during lectures all point to the inescapable conclusion that they are not a rational choice of teaching method in most circumstances.
A 2010 review by Hughes and Mighty reinforced Bligh's damning indictment of lecturing as a learning event, written over 40 years earlier. A recent article in The Atlantic by Corrigan examines the debate about lecturing and says of those defending and supporting it:
In some ways these apologia accentuate the dividing line in the lecturing debate. They praise various aspects of lecturing, while criticizing alternative methods. These rhetorical moves reinforce the idea of a two-sided debate, lecturing vs. not lecturing. Their skirting of the research on the subject puts them on the less convincing side, in my view.
This is appalling: for some educational goals, no alternative has ever been discovered that is less effective than lecturing, including, in some cases, no teaching at all.
And yet, lecturing still accounts for over 90% of all learning events in higher education. I can only speculate about primary and secondary education, but I would be surprised if the figure were much lower.
That we live in an age of unprecedented and ubiquitous information abundance (at least in the developed world) is beyond argument. Given that, why do we cling to learning models developed a thousand years ago?
MOOCs – the next big thing from about 2001 to 2013 – have been offered to, and taken up by, hundreds of thousands of students as an alternative to formal education. As with most (again, over 90%) of the available online provision, educators have taken the worst we have to offer (no alternative has ever been discovered that is less effective than lecturing) and dished it out in larger and larger portions. There have been a few shining examples that take advantage of an information-abundance model (cMOOCs), but they are few and far between.
There are a number of stakeholders involved, and I will consider each of them in turn over the next few weeks. I may miss some, but the ones I will consider are: the institutions (and their administrators), the teachers, the students, the parents, and the employers. As a spoiler, I will say right now that the biggest factor is expectation.