Willful Blindness & Education

Both the education part and the higher part of higher education are broken.

Research is the only game in town, and as it relies more and more heavily on private (read: commercial) funding, the research game becomes more and more private (and trivial).

In my last post, I presented the sorry state of affairs in equipping our graduates with thinking skills. The ability to engage in formal operational thinking may be inherent, but the skills necessary to use formal operational thinking must be taught. With up to 40% of our graduates unable to engage in formal operational thinking, we aren’t doing a good job of teaching it. This is what the higher in higher education stands for: higher thinking skills.

The education part of higher education refers to the methods we use to teach our students the higher thinking skills that higher education stands for. Constant pressure to crank up research output means that teaching receives less and less attention. Efficiency in teaching means lectures, whether large or small. The evidence tells us that around 90% of teaching in higher education is done through lectures. Lectures don’t work! As Gibbs writes:

More than 700 studies (referring to Bligh’s work) have confirmed that lectures are less effective than a wide range of methods for achieving almost every educational goal you can think of. Even for the straightforward objective of transmitting factual information, they are no better than a host of alternatives, including private reading. Moreover, lectures inspire students less than other methods, and lead to less study afterwards.

For some educational goals, no alternative has ever been discovered that is less effective than lecturing, including, in some cases, no teaching at all. Studies of the quality of student attention, the comprehensiveness of student notes and the level of intellectual engagement during lectures all point to the inescapable conclusion that they are not a rational choice of teaching method in most circumstances.

Corrigan looks at the debate about lecturing and says this about those defending and supporting lecturing:

In some ways these apologia accentuate the dividing line in the lecturing debate. They praise various aspects of lecturing, while criticizing alternative methods. These rhetorical moves reinforce the idea of a two-sided debate, lecturing vs. not lecturing. Their skirting of the research on the subject puts them on the less convincing side, in my view.

Lectures don’t work to teach higher order thinking skills. I can’t tell you the number of times I hear: “But my lectures are different!”

Given all of the evidence demonstrating that lectures don’t work to teach our students how to think, why do we still use them? Unless a working academic has managed to avoid every conversation about teaching in the last 30 years (and I daresay there will be some), they will have heard that lectures don’t work. Given that Bok reported (in “Our Underachieving Colleges”) that fewer than 5% of working academics will read anything about teaching in a given year, is it any surprise that nothing changes?

The story of Libby, Montana best illustrates the concept of willful blindness – I’ve provided a link, but reprint it here because it is important to know:

The town had a vermiculite mine in it.

Vermiculite was used for soil conditioners, to make plants grow faster and better. Vermiculite was used to insulate lofts, huge amounts of it put under the roof to keep houses warm during the long Montana winters. Vermiculite was in the playground. It was in the football ground. It was in the skating rink. What she didn’t learn until she started working on this problem is that vermiculite is a very toxic form of asbestos.

When she figured out the puzzle, she started telling everyone she could what had happened, what had been done to her parents and to the people that she saw on oxygen tanks at home in the afternoons. But she was really amazed. She thought, when everybody knows, they’ll want to do something, but actually nobody wanted to know.

In fact, she became so annoying as she kept insisting on telling this story to her neighbors, to her friends, to other people in the community, that eventually a bunch of them got together and they made a bumper sticker, which they proudly displayed on their cars, which said, “Yes, I’m from Libby, Montana, and no, I don’t have asbestosis.”

But Gayla didn’t stop. She kept doing research. The advent of the Internet definitely helped her.

She talked to anybody she could. She argued and argued, and finally she struck lucky when a researcher came through town studying the history of mines in the area, and she told him her story, and at first, of course, like everyone, he didn’t believe her, but he went back to Seattle and he did his own research and he realized that she was right. So now she had an ally.

Nevertheless, people still didn’t want to know.

They said things like, “Well, if it were really dangerous, someone would have told us.” “If that’s really why everyone was dying, the doctors would have told us.” Some of the guys who had worked very heavy jobs said, “I don’t want to be a victim. I can’t possibly be a victim, and anyway, every industry has its accidents.” But still Gayla went on, and finally she succeeded in getting a federal agency to come to town and to screen the inhabitants of the town — 15,000 people — and what they discovered was that the town had a mortality rate 80 times higher than anywhere else in the United States.

That was in 2002, and even at that moment, no one raised their hand to say, “Gayla, look in the playground where your grandchildren are playing. It’s lined with vermiculite.”

This wasn’t ignorance. It was willful blindness.

It is easy to say that what happened in Libby has nothing to do with higher education. Yet academics ignoring the evidence about lecturing, failing to teach students higher order thinking skills, and even defending their practices in the face of overwhelming evidence that those practices are just plain wrong – that is willful blindness. But nobody dies – do they?

I would argue that they do. A definition of these higher order thinking skills illustrates what I mean:

  • purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based (Facione, 1990, p. 3)

People who do not or cannot engage in higher order thinking don’t grasp the use of evidence in argumentation. Evidence means nothing.

What do you think it was that allowed the residents of Libby to keep on denying what was happening in their town in the face of overwhelming evidence? To them, evidence means nothing!

What do you think it is that allows normal everyday people (some with higher education degrees) to keep on denying global climate change in the face of overwhelming evidence? To them, evidence means nothing!

Because of our almost exclusive focus on delivering information, with most of us (and our students) carrying around most of the world’s information in our pockets, we ignore our duty to teach people to think. We willfully ignore the evidence around us and it is costing people their very lives, not to mention the enormous cost to society when the majority of the people on our planet cannot or will not engage in formal operational thinking.

The cost of our willfully ignoring what we know to be true is beyond imagination. We can do better than that. We must do better than that.

How could we take something as natural and wonderful as learning and turn it into education?

Cognitive Development and Higher Education

Cognitive development across the lifespan throws up an interesting problem for us here in higher education.

There is fairly widespread agreement that Piaget got his developmental stages pretty close to the mark as he described how people develop from infancy through to adulthood. Although there is some argument about the details, and some adjustments have been made here and there, the basic premise has pretty well stood the test of time.

The quandary faced by the higher education community lies in the final stage of cognitive development proposed by Piaget: the formal operational thinking stage that emerges at adolescence. As a person develops through childhood, a normally developing child will reach a cognitive developmental milestone, acquire whatever skills are attached to that stage of thinking, and move on.

As an example, one of the hallmarks of early cognitive development is called egocentrism. Simply put, in this stage (which finishes at about age four), a child thinks that everyone sees and experiences the world the same way they do. If a child in this stage is viewing a scene and asks you about something they are seeing, they cannot conceive that you are not seeing exactly what they see, regardless of where you are standing. However, once a child passes through the stage, that thinking never returns. I doubt very much that you have experienced this recently, because once the stage is passed, the new perspective is simply the way you think.

This type of fairly linear developmental pattern holds true for virtually every cognitive developmental stage that we go through. However, this is not true of the final, formal operational thinking stage. Although the ability to think in a formal operational way emerges during adolescence, thinking in this way requires teaching and practice. It is the only stage of cognitive development like this: all of the other stages we simply acquire, but the formal operational stage bestows on us only the ability to think that way, not the thinking itself.

Why is this a quandary for higher education? Because the higher part of higher education refers to the thinking that has to be developed for the expression of formal operational thinking. It doesn’t just happen; it has to be taught and practiced. We tend to call this thinking critical thinking and expect that our students arrive with this ability in place, ready to be fully expressed during their higher education. When that doesn’t happen, we are filled with disappointment and blame the secondary school system or the students themselves for not being prepared.

The research demonstrates that only a few (about 10%) of the adult population are ever fully equipped with formal operational thinking skills – whether or not they have received any higher education. Between 30% and 40% of the population completely lack the ability to engage in this type of thought. The remaining 50 to 60 percent have some formal operational thinking skills, ranging from barely demonstrating that they have any to usually, but not always, using them.

Given that we are now educating about 40% (or more) of the general population, how can it be that only about 10% are able to consistently use formal operational thinking skills to solve problems and analyze information? Because our model of “sit down, shut up, face the front, memorize, and regurgitate”, used in 90% (or more) of higher education classrooms, neither teaches nor requires the use of formal operational thinking skills.

The skills I’m talking about would include some of the following:

  • a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture (Bacon, 1605)

  • the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action (Paul, 1987)

  • self-guided, self-disciplined thinking which attempts to reason at the highest level of quality in a fair-minded way (Elder)

  • the mental processes, strategies, and representations people use to solve problems, make decisions, and learn new concepts (Sternberg, 1986, p. 3)

  • the propensity and skill to engage in an activity with reflective skepticism (McPeck, 1981, p. 8)

  • reflective and reasonable thinking that is focused on deciding what to believe or do (Ennis, 1985, p. 45)

  • thinking that is goal-directed and purposive, “thinking aimed at forming a judgment,” where the thinking itself meets standards of adequacy and accuracy (Bailin et al., 1999b, p. 287)

  • judging in a reflective way what to do or what to believe (Facione, 2000, p. 61)

  • skillful, responsible thinking that facilitates good judgment because it 1) relies upon criteria, 2) is self-correcting, and 3) is sensitive to context (Lipman, 1988, p. 39)

  • the use of those cognitive skills or strategies that increase the probability of a desirable outcome (Halpern, 1998, p. 450)

  • seeing both sides of an issue, being open to new evidence that disconfirms your ideas, reasoning dispassionately, demanding that claims be backed by evidence, deducing and inferring conclusions from available facts, solving problems, and so forth (Willingham, 2007, p. 8)

  • purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based (Facione, 1990, p. 3)

I have written extensively about the state of higher education today, but our failure to deliver on our historical core purpose beggars belief. We can do better than this.

How could we take something as natural and wonderful as learning and turn it into education?

Mistakes are Useful?

The most alarming part of this post is in the middle where we find out that learners “…are more concerned with grades than they are with learning. This causes the supposedly smart students to take less risks in order to get better grades. Students that take more risks are punished with bad grades.” How many symphonies have not been written in order to protect a GPA?


If you have heard the phrase that “we learn from our mistakes”, you may wonder why mistakes are unacceptable in schools – the very places we go to learn. In school, the more mistakes you make, the more you are scorned. Only the students who happen to give the teacher the exact answer that they want seem to succeed in the current system. However, this is not how learning works in the real world. When we make mistakes, we learn not to repeat them, and we find out what does work and what does not.

According to Tugend (2011), in our current education system children are more concerned with grades than they are with learning. This causes the supposedly smart students to take fewer risks in order to get better grades. Students who take more risks are punished with bad grades. So in other words, Tugend (2011) is saying…

View original post 284 more words

Conformity and Education

A new post I just put up on my other blog, Scholarship of Learning.

Scholarship of Learning

I have written before about the drive for conformity in education. Given the massification of education, which has led to huge classrooms with, literally, hundreds of students being taught, conformity is essential. It has become, unabashedly, one of the central and core tenets of education. When I wrote about conformity three years ago, I focussed on the loss of creativity in the learning process. However, I now believe that there is a much greater cost to our society than the simple loss of creativity. I now believe that the greatest cost society bears as a result of the enforced conformity imposed on the youngest to the oldest students in education is a personal tragedy borne by, literally, millions of students and former students.

That students of all shapes and sizes are forced into a mold by the educational “system” is beyond dispute. Students, at least for a significant portion of…

View original post 572 more words

How We Know

Scholarship of Learning

I know that this blog post will be old news to most of us, but I think it needs reiterating within the present context of my thinking – how do we find out what we believe in, or what are the methods of knowing?

According to Peirce (1877), there are four methods of knowing: the method of tenacity, the method of authority, the a priori method, and the scientific method. I will review each one of them and consider how they impact us in our society today. I will consider the method of tenacity and the a priori method first.

In both the method of tenacity and the a priori method, there is often no way to identify where knowledge of a belief came from; it just is. The fundamental difference between the two is the willingness to change a belief.

In the a priori method, the belief is there because it…

View original post 1,169 more words

Reason and Moral Development

Last week I posted about the lack of ability to engage in deductive reasoning in the general adult population. As well as the problems I highlighted there, one aspect that deserves further attention is the effect that has on moral development.

Piaget assumed that all people, when they reached adolescence, would progress naturally from his “concrete operational” stage to the “formal operational” stage of cognitive development. The formal operational stage is where we see deductive reasoning emerge. However, research since Piaget’s proposal has let us know that not all adults (in fact, only a minority) reach the formal operational stage of cognitive development. This is because it does not emerge naturally but must be taught, and in our test, test, test world of education today, there is no room for teaching students how to think.

Moral development relies directly on the ability to reason, with Kohlberg’s moral development stages tied neatly to Piaget’s cognitive development stages. What this means is that the majority of people do not move beyond a concrete operational stage of moral reasoning. Here is an outline of Kohlberg’s stages of moral development:

  • Preconventional level – Stage 1: obedience and punishment; Stage 2: self-interest and exchange

  • Conventional level – Stage 3: conformity to interpersonal expectations; Stage 4: law and order – maintaining rules, authority, and the social order

  • Postconventional level – Stage 5: social contract; Stage 6: universal ethical principles

Concrete operational thinkers don’t progress beyond stage 4 in their moral development. As the next table shows, there are few adults who progress beyond Stage 4 in their moral reasoning.


Why is this a problem? If you read the description of stage four moral development, you can see that there is little thinking involved. At this stage, people simply follow the rules. Right and wrong are defined by the law, and the highest moral authority is the government of the day. Whatever laws are passed define the morality of the day for the vast majority of people.

Think of Nazi Germany and the laws they passed targeting a group of people. With stage four moral reasoning, because it is written in law, it is the right thing. Institutional racism or bigotry becomes not only okay but right, because it is legal. Simply look out on the events of today and you can see the same thing happening again, both in North America and in parts of Europe.

One piece of evidence of the lack of reasoning ability in America today is the emergence of Donald Trump as the frontrunner in the Republican race for the Presidency. Given how politics in the USA tends to swing between parties, this means that he is likely to be the next President. He is using the same language and techniques to target and oppress Muslims in America that Hitler used on the Jews 70 years ago.

Because of the failure of education to train people to think, there is an inability to engage in the moral reasoning that would stop both the current and the onrushing atrocities hurtling toward us. If what is on the horizon actually happens, we will have to face the fact that we, as educators, were complicit in shaping the society that allowed it to happen.

As the most powerful force shaping society today, we need to do better. We need to break out of the memorize-and-regurgitate model of education and teach people to think. In the age of information abundance, we don’t need to focus exclusively on content, and yet, for all the innovations in education over the past ten years, that is still our predominant model. When are we going to really engage in meaningful discussion to fix what is broken?

The Dearth of Reason

Thinking is the hardest work there is, which is probably the reason why so few engage in it.

Henry Ford

Reasoning has been divided into two basic types – inductive reasoning and deductive reasoning.

Inductive reasoning is universal, emerges at a very young age, and is fundamentally attuned to the structure of the brain and how memory is stored. Inductive reasoning is the emergence of a general principle from the experiences of a person. A toddler shows basic inductive reasoning when, after touching several hot surfaces, they decide that hot surfaces burn. After this, there is an almost universal reaction to telling them something is hot – they clutch one hand with the other and, with a very concerned look, say “hot” (or something like that). Inductive reasoning at its best.

Deductive reasoning, on the other hand, is not natural and must be learned. The cognitive capacity necessary for deductive reasoning – the ability to engage in abstract thought – develops during adolescence. However, deductive reasoning is difficult to carry out and normally becomes evident only after formal instruction. This is the type of thinking that Henry Ford was referring to.
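
The contrast between the two directions of inference can be sketched in a toy example. Everything here – the function names, the data – is mine and purely illustrative, not a model from the reasoning literature; it only captures the direction of each step: induction runs from repeated specific experiences to a general principle, deduction from a general principle to a conclusion about a new case.

```python
# Toy sketch of the two kinds of reasoning (illustrative only).

def induce(observations):
    """Induction: generalize from repeated specific experiences.
    If every hot surface touched so far burned, a general principle emerges."""
    if observations and all(burned for _surface, burned in observations):
        return "hot surfaces burn"
    return None

def deduce(general_rule, case):
    """Deduction: apply a general rule to a specific case.
    If the premise holds for the case, the conclusion follows necessarily."""
    premise, conclusion = general_rule
    return conclusion if premise(case) else None

# Induction: the toddler's specific experiences -> a general principle.
touches = [("stove", True), ("kettle", True), ("radiator", True)]
principle = induce(touches)
print(principle)  # prints: hot surfaces burn

# Deduction: a general principle -> a conclusion about a case never touched.
hot_surfaces = {"stove", "kettle", "radiator", "iron"}
rule = (lambda surface: surface in hot_surfaces, "it will burn you")
print(deduce(rule, "iron"))  # prints: it will burn you
```

Note that the deductive step concludes something about the iron, which was never directly experienced: the conclusion follows from the premise alone. That guarantee of the conclusion given the premises is exactly what the inductive step lacks, and it is the part that has to be taught.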

Unfortunately, the number of adults who ever learn to reason deductively is not high. Studies in the 1960s and 1970s demonstrated that only about 40% of North American adults are able to use deductive reasoning to solve problems and understand the world, with the ability being directly linked to educational attainment. More recent studies have suggested that the proportion who can engage in deductive reasoning has dropped from about 40% to as low as 20%. This is alarming for a number of reasons.

First, it demonstrates a serious shortfall in the education system. With education’s obsession with memorizing more stuff and finding the right answer, there is no room left for teaching people to think.

Why is this a problem? After all, with so large a proportion of the population unable to use deductive reasoning, society is still functioning – or is it?

Being unable to use deductive reasoning means that an individual is unable to follow the logic used to reach a conclusion based on deductive reasoning. It is not that a person doesn’t want to; they are simply unable to because of a lack of training.

Why does this matter? Because there is a growing chasm between the scientific world and society in general. Most of the members of our society are cognitively unable to follow the arguments scientists use to demonstrate what they are finding, and scientists can’t understand why the members of our society just don’t look at the data and come to the same, obvious conclusions that they have. The lack of deductive reasoning means that members of society are simply unable to follow the logic, and so must turn to other sources to find out the truth.

Think about climate change, or immunization. Within the scientific community, and among generally well educated members of society (and there is a strong correlation) who can engage in deductive reasoning, there is confusion about how there can even be a controversy. For those who can use deductive reasoning, there is no controversy: the facts speak for themselves when they are followed through the logical sequence that leads to a conclusion. The science is absolutely solid.

The lack of ability to engage in deductive reasoning for a majority of participants in a Western Democracy is problematic, to say the least.

Another reason, which will have to be dealt with in a future blog post, is the effect that the lack of deductive reasoning ability (or formal operational thinking in developmental terms) has on the development of moral reasoning.

We can do better than this – if we are willing to look closely at ourselves and embrace the necessary changes.