Willful Blindness & Education

Both the education and the higher part of higher education are broken. Research is the only game in town, and as it relies more and more heavily on private (read: commercial) funding, the research game becomes more and more private (and trivial).

In my last post, I presented the sorry state of affairs in equipping our graduates with thinking skills. The ability to engage in formal operational thinking may be inherent, but the skills necessary to use it must be taught. With up to 40% of our graduates unable to engage in formal operational thinking, we aren’t doing a good job of teaching it. This is what the higher in higher education stands for: higher thinking skills.

The education part of higher education refers to the methods we use to teach our students the higher thinking skills that higher education stands for. Constant pressure to crank up the research output means that, more and more, teaching counts for less and less. Efficiency in teaching means large (or maybe small) lectures. The evidence tells us that around 90% of teaching in higher education is done through lectures. Lectures don’t work! As Gibbs writes:

More than 700 studies (referring to Bligh’s work) have confirmed that lectures are less effective than a wide range of methods for achieving almost every educational goal you can think of. Even for the straightforward objective of transmitting factual information, they are no better than a host of alternatives, including private reading. Moreover, lectures inspire students less than other methods, and lead to less study afterwards.

For some educational goals, no alternative has ever been discovered that is less effective than lecturing, including, in some cases, no teaching at all. Studies of the quality of student attention, the comprehensiveness of student notes and the level of intellectual engagement during lectures all point to the inescapable conclusion that they are not a rational choice of teaching method in most circumstances.

Corrigan looks at the debate about lecturing and says about those defending and supporting lecturing:

In some ways these apologia accentuate the dividing line in the lecturing debate. They praise various aspects of lecturing, while criticizing alternative methods. These rhetorical moves reinforce the idea of a two-sided debate, lecturing vs. not lecturing. Their skirting of the research on the subject puts them on the less convincing side, in my view.

Lectures don’t work to teach higher order thinking skills. I can’t tell you the number of times I hear – “But my lectures are different!”

Given all of the evidence demonstrating that lectures don’t work to teach our students how to think, why do we still use them? Unless a working academic has managed to avoid every conversation about teaching in the last 30 years (and I daresay there will be some), they will have heard that lectures don’t work. Given that Bok reported (in “Our Underachieving Colleges”) that fewer than 5% of working academics will read anything about teaching in a given year, is it any surprise that nothing changes?

The story of Libby, Montana, best illustrates the concept of willful blindness – I’ve provided a link, but reprint the story here because it is important to know:

The town had a vermiculite mine in it.

Vermiculite was used for soil conditioners, to make plants grow faster and better. Vermiculite was used to insulate lofts, huge amounts of it put under the roof to keep houses warm during the long Montana winters. Vermiculite was in the playground. It was in the football ground. It was in the skating rink. What she didn’t learn until she started working this problem is vermiculite is a very toxic form of asbestos.

When she figured out the puzzle, she started telling everyone she could what had happened, what had been done to her parents and to the people that she saw on oxygen tanks at home in the afternoons. But she was really amazed. She thought, when everybody knows, they’ll want to do something, but actually nobody wanted to know.

In fact, she became so annoying as she kept insisting on telling this story to her neighbors, to her friends, to other people in the community, that eventually a bunch of them got together and they made a bumper sticker, which they proudly displayed on their cars, which said, “Yes, I’m from Libby, Montana, and no, I don’t have asbestosis.”

But Gayla didn’t stop. She kept doing research. The advent of the Internet definitely helped her.

She talked to anybody she could. She argued and argued, and finally she struck lucky when a researcher came through town studying the history of mines in the area, and she told him her story, and at first, of course, like everyone, he didn’t believe her, but he went back to Seattle and he did his own research and he realized that she was right. So now she had an ally.

Nevertheless, people still didn’t want to know.

They said things like, “Well, if it were really dangerous, someone would have told us.” “If that’s really why everyone was dying, the doctors would have told us.” Some of the guys, used to very heavy jobs, said, “I don’t want to be a victim. I can’t possibly be a victim, and anyway, every industry has its accidents.” But still Gayla went on, and finally she succeeded in getting a federal agency to come to town and to screen the inhabitants of the town — 15,000 people — and what they discovered was that the town had a mortality rate 80 times higher than anywhere in the United States.

That was in 2002, and even at that moment, no one raised their hand to say, “Gayla, look in the playground where your grandchildren are playing. It’s lined with vermiculite.”

This wasn’t ignorance. It was willful blindness.

It is easy to say that what happened in Libby has nothing to do with higher education. Academics ignoring the evidence about lecturing and not teaching students higher order thinking skills, even defending their practices in the face of overwhelming evidence that those practices are just plain wrong, is willful blindness. But nobody dies – do they?

I would argue that they do. One definition of these higher order thinking skills illustrates what I mean:

  • purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based (Facione, 1990, p. 3)

People who do not or cannot engage in higher order thinking don’t grasp the use of evidence in argumentation. Evidence means nothing.

What do you think it was that allowed the residents of Libby to keep on denying what was happening in their town in the face of overwhelming evidence? To them, evidence means nothing!

What do you think it is that allows normal everyday people (some with higher education degrees) to keep on denying global climate change in the face of overwhelming evidence? To them, evidence means nothing!

Because of our almost exclusive focus on delivering information – with most of us (and our students) carrying around most of the world’s information in our pockets – we ignore our duty to teach people to think. We willfully ignore the evidence around us, and it is costing people their very lives, not to mention the enormous cost to society when the majority of the people on our planet cannot or will not engage in formal operational thinking.

The cost of our willfully ignoring what we know to be true is beyond imagination. We can do better than that. We must do better than that.


How could we take something as natural and wonderful as learning and turn it into education?

Cognitive Development and Higher Education

Cognitive development across the lifespan throws up an interesting problem for us here in higher education.

There is fairly widespread agreement that Piaget got his developmental stages pretty close to the mark as he described how people develop from infancy through to adulthood. Although there is some argument about the details, and some adjustments have been made here and there, the basic premise has pretty well stood the test of time.

The quandary faced by the higher education community lies in the final stage of cognitive development proposed by Piaget: the formal operational thinking stage that emerges in adolescence. As a normally developing child grows, they will reach a cognitive developmental milestone, acquire whatever skills are attached to that stage of thinking, and move on.

As an example, one of the early childhood stages is called egocentrism. Simply put, in this stage (which finishes at about age four), a child thinks that everyone sees and experiences the world the same way that they do. If a child in this stage were viewing a scene and asked you about something they were seeing, they wouldn’t be able to conceive that you could not see exactly what they were seeing, regardless of where you were standing. However, once a child passes through the stage, that thinking doesn’t happen again in their lifetime. I doubt very much that you have experienced it recently, because once the stage is passed, the new perspective is simply the way you think.

This fairly linear developmental pattern holds true for virtually every cognitive developmental stage that we go through. However, it does not hold for the final, formal operational thinking stage. Although the ability to think in a formal operational way emerges during adolescence, thinking in this way requires teaching and practice. This is the only stage of cognitive development like this. All of the rest of the stages we simply acquire, but the formal operational stage only bestows on us the ability to think that way, not the thinking itself.

Why is this a quandary for higher education? Because the higher part of higher education refers to the thinking that has to be developed for the expression of formal operational thinking. It doesn’t just happen; it has to be taught and practiced. We tend to call this thinking critical thinking and expect that our students arrive with this ability in place and ready to be fully expressed during their higher education. When it doesn’t happen, we are filled with disappointment and blame the secondary school system or the students themselves for not being prepared.

The research demonstrates that only a few (about 10%) of the adult population are ever fully equipped with formal operational thinking skills – whether or not they have received any higher education. Between 30% and 40% of the population completely lack the ability to engage in this type of thought. The remaining 50 to 60 percent have some formal operational thinking skills, ranging from barely demonstrating that they have any to usually, but not always, using them.

Given that we are now educating about 40% (or more) of the general population, how can it be that we are only seeing about 10% able to consistently use formal operational thinking skills to solve problems and analyze information? Because our model of “sit down, shut up, face the front, memorize, and regurgitate” used in 90% (or more) of higher education classrooms neither teaches nor requires the use of formal operational thinking skills.

The skills I’m talking about would include some of the following:

  • a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture (Bacon, 1605)

  • the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action (Paul, 1987)

  • self-guided, self-disciplined thinking which attempts to reason at the highest level of quality in a fair-minded way (Elder)

  • the mental processes, strategies, and representations people use to solve problems, make decisions, and learn new concepts (Sternberg, 1986, p. 3)

  • the propensity and skill to engage in an activity with reflective skepticism (McPeck, 1981, p. 8)

  • reflective and reasonable thinking that is focused on deciding what to believe or do (Ennis, 1985, p. 45)

  • thinking that is goal-directed and purposive, “thinking aimed at forming a judgment,” where the thinking itself meets standards of adequacy and accuracy (Bailin et al., 1999b, p. 287)

  • judging in a reflective way what to do or what to believe (Facione, 2000, p. 61)

  • skillful, responsible thinking that facilitates good judgment because it 1) relies upon criteria, 2) is self-correcting, and 3) is sensitive to context (Lipman, 1988, p. 39)

  • the use of those cognitive skills or strategies that increase the probability of a desirable outcome (Halpern, 1998, p. 450)

  • seeing both sides of an issue, being open to new evidence that disconfirms your ideas, reasoning dispassionately, demanding that claims be backed by evidence, deducing and inferring conclusions from available facts, solving problems, and so forth (Willingham, 2007, p. 8)

  • purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based (Facione, 1990, p. 3)

I have written extensively about the state of higher education today, but our failure to deliver on our historical core purpose beggars belief. We can do better than this.


How could we take something as natural and wonderful as learning and turn it into education?

How good are these techniques?

Scholarship of Learning

Dunlosky et al. (2013) published a brilliant paper that looked at a number of techniques that are used to learn material in an academic setting. They tested the various techniques, and produced a pretty good assessment of just how good the techniques were. The techniques ranged from the testing effect (a very good technique) to highlighting what you want to remember (a poor technique for learning). I have reproduced a summary of their ratings below for you to have a look at.
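In brief, the utility ratings from the published paper break down as follows:

  • High utility: practice testing, distributed practice
  • Moderate utility: elaborative interrogation, self-explanation, interleaved practice
  • Low utility: summarization, highlighting/underlining, the keyword mnemonic, imagery use for text learning, re-reading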

I think they might be mistaken in their rankings. This feeling is based on anecdotal experience and how often each of these techniques is used in the learning process. I think re-reading and highlighting are by far the most useful for learning – based on how often they are used as the principal method of learning 🙂

The entire concept of the Scholarship of Learning is based on just how wrong we are about…


Reflective Practice

We had our annual Teaching & Learning Conference here at Bangor, with the Honourable Steve Wheeler as our keynote, and had a great time. I also gained new insight into the meaning of reflective practice during the event.

I led a workshop on curriculum planning in the 21st century. My plan was to introduce what we know about lecturers, how the world has changed, and then what we should be doing about it in the classroom – it never really unfolded like I was hoping.

As I introduced what we know about lecturers, I talked about teacher cognition, and our current curriculum planning techniques. The easy one is current curriculum planning – we plan our curriculum based on the way we were taught. After all, the evidence for the quality of the programme is incontrovertible – it is me!

Teacher cognition is somewhat related, and is succinctly summarised by a former student, Dan, in one of his blog posts:

  • teachers’ cognitions can be powerfully influenced by their own experiences as learners;
  • these cognitions influence what and how teachers learn during teacher education;
  • they act as a filter through which teachers interpret new information and experience;
  • they may outweigh the effects of teacher education in influencing what teachers do in the classroom;
  • they can be deep-rooted and resistant to change;
  • they can exert a persistent long-term influence on teachers’ instructional practices;
  • they are, at the same time, not always reflected in what teachers do in the classroom;
  • they interact bi-directionally with experience (i.e. beliefs influence practices but practices can also lead to changes in beliefs).

In other words, by the time you get to be a lecturer, you already know what a good teacher is, and that view of teaching is almost impossible to change.

As the discussion began, teacher cognition became the primary philosophy. I know what I am doing, and I’m good at it, and I can see no reason to change… was the prevailing theme of the discussion. In spite of the overwhelming evidence as to the ineffectiveness of lectures as a method of learning, lectures were idealised and worshipped by some of the participants. Lectures are to inspire (future post) and uplift students, not simply convey information (Chris’s comment here says it all: “…I have enjoyed learning, and been inspired to learn more, but this hasn’t happened in the lecture. I don’t think I have ever really been inspired by a lecture”).

I had a couple of students present during the session, and one of them said that he had really thought teacher cognition was a theoretical construct with little bearing on how things are really done, but that he now realises how real it is.

This exchange with practitioners who are engaged enough in teaching to spend a day at our conference made me realise that I had misunderstood what was meant by reflective practice. I now realise that reflective practice is largely about practitioners reflecting away from themselves anything that might suggest that what they are doing isn’t ideal. Reflective practice is about defensiveness. Reflective practice is about teaching practice – it has nothing to do with learning at all.

Teaching and learning are completely divorced in HE, with teaching focused almost entirely on the teacher. Laurie Taylor’s (“Accustomed as I am”) take on lecturers would be a whole lot funnier if it didn’t strike so close to home.

How have we made something as exhilarating as learning, as oppressive as education?

Lecturing

I have written and argued with colleagues about the value of a lecture for many years. However, one of my colleagues came and sat down with me the other day, and we had a great discussion about lecturing.

She managed to pin me down on exactly where I stand, and so I thought I’d share my thoughts with you.

I’m not against giving or listening to a great talk. I have done both. I’ve received awards for my speaking ability, and I’ve been inspired by talks I’ve heard over the years. In my class feedback, I am repeatedly asked to give more lectures (I deliver a few) because I am a great lecturer (I won’t).

What I’m against is lecturing. By lecturing, I mean what is traditionally done in university classes the world over. The lecturer prepares material from some source, condenses it, organises it, prepares bullet-point slides highlighting all the important points, and then stands up and talks about the points. Sometimes, lecturers will take great pleasure in saying that they won’t use PowerPoint, as though that makes the experience better. The lecturer then repeats this process for the requisite number of hours across a semester for what we call a module or a class. There are variations on the theme, but it is essentially a let-me-tell-you-what-you-need-to-know approach, with the teacher doing all the work and the students passively having knowledge poured out upon their heads from on high. One of the variations included the use of clickers in order to make the experience truly two-way, with the students actively engaged and taking control of the learning process (whoever sold that idea should have been in marketing – they could have made a lot more money). All of this is lecturing – the kind of lecturing that I think is poor as a teaching tool.

However, I have been to lectures that really make you think. Lectures that present the world in a way that causes cognitive dissonance, or that present a viewpoint I have never considered before. A good keynote does that, and keynotes are occasional events. Presenting this kind of talk (lecture) week after week to the same group of people, and expecting them to go away inspired and scratching their heads in thought every time they listen to you, is unrealistic. Pulling one off occasionally is practical, but filling three hours a week for 12–15 weeks like that just doesn’t happen.

In other words, I am not opposed to occasional lectures to students that are inspiring and powerful, conveying a message that makes the students think. I am opposed to lectures that simply go over material that a student is expected to learn. There are better ways to foster information exchange; we just don’t use them. We use lectures because they are easy.

Teacher Centred Pontification

I had a chat with a friend of mine who loves to lecture – and does a great job of it. We were discussing the merits of a lecture, and I found myself wondering: what is the biggest single problem with lecturing as a form of teaching?

It is the perspective.

When I (or anyone else) stand up and lecture, the information is presented from my own perspective. The examples are from my own perspective. The interpretation is from my own perspective. The thinking is from my own perspective.

Is this, necessarily, a bad thing? I don’t think so, if you are talking about the auditory transference of information. When I go to a keynote (a form of lecture), I expect to get a point of view. It is presented to make me think about something in (usually) a different way.

If I go to a talk at a conference (usually about 20 minutes), I am learning something new that a researcher has done, and can ask questions to gain understanding.

This is not the case with university lecturing. Everything about a university lecture is teacher centred. The perspective, the experiences, the understanding. A university lecture is rarely a one-off event (like a keynote), but is a series of talks to transmit information. The students are passive participants in the process. They look forward and take notes on what is said. And they do this over and over again.

The roots of lectures lie in the days before printing, when you had to listen to an (or the) expert and write down everything they said – because that was the only way to find out. Until relatively recently (the last 150 years or so), lectures were not given to hundreds at a time, and the atmosphere made them more like seminars.

The model for lectures is based on religious sermons. Gather together and listen to the authoritative, infallible word. We still (in the UK) cling to the notion that academic judgment is final. When we are lecturing, we are ministers of the word, and what we say is unassailable by the congregation.

What a pathetic way to learn. Especially when there are alternatives.

If not lecturing, then what?

I have posted several articles about lecturing, lecture theatres, and information abundance – all of which have a common thread. Given the changes that are happening all around us, how can we (HE) adapt to maintain our purpose?

Keith Hampson writes about Christensen’s disruptive innovations, attempting to bring precision to the concept. I prefer to keep the concept loose, and look at the disruptive innovation we are facing in HE as the move from information scarcity to information abundance. In the face of a complete shift in the foundation upon which we are built, what do we do now?

In my earlier post on information abundance, I stated that I thought we need to focus on teaching and assessing higher cognitive and academic skills wrapped around some content that will engage students.

I have been teaching my Science of Education module for a couple of years now based on a philosophy of information abundance and maximal engagement (using evidence-based practice). The results have been brilliant (from both my own and the students’ point of view – click to read their unabridged comments).

The ideas I have drawn on to develop my teaching according to my understanding of information abundance come from a presentation given by Peter Nicholson back in 2009 (a couple of decades ago by online standards – but still great). I have expanded his three principles to five, and adapted them to fit.

The principles underlying the design of my teaching are 1) the ubiquitous nature of information today, 2) the change in the perception of knowledge today from there being a “stock” of knowledge to there being a “flow” of knowledge, 3) the emergence of crowd-sourcing as a source of expertise, 4) the change in emphasis from content to skills (especially higher thinking skills), and 5) the importance of academic reputation and academic expertise in the accreditation and recognition of formal learning.

This, in combination with the research on how to engage adult students in a learning environment (see Jones’ MUSIC model of engagement: eMpowerment, Usefulness, Success, Interest, and Caring), has led to a highly successful method of teaching, and to a new post-graduate programme based entirely on my philosophies.

We’ll have to see how well it works out, but if it is half as successful as the single undergraduate module, it will be immensely popular, and it will be available as a residential, online, or blended learning experience.