Cognitive Development and Higher Education

Cognitive development across the lifespan throws up an interesting problem for us here in Higher Education. There is fairly widespread agreement that Piaget got his developmental stages pretty close to the mark as he described how people develop from infancy through to adulthood. Although there is some argument about the details, and some adjustments have been made here and there, the basic premise has pretty well stood the test of time.

The quandary faced by the higher education community lies in the final stage of cognitive development proposed by Piaget: the formal operational thinking stage that emerges at adolescence. Throughout childhood, a normally developing child will reach a cognitive developmental milestone, acquire whatever skills are attached to that stage of thinking, and move on.

As an example, one of the early stages in young children is marked by egocentrism. Simply put, in this stage (which finishes at about age four), a child thinks that everyone sees and experiences the world the same way they do. If a child in this stage is viewing a scene and asks you about something they can see, they cannot conceive that you might not see exactly what they do, regardless of where you are standing. However, once a child passes through the stage, that kind of thinking doesn’t return. I doubt very much that you have experienced this recently, because once the stage is passed, the new way of thinking is simply the way you think.

This fairly linear developmental pattern holds true for virtually every cognitive developmental stage we go through. However, it does not hold for the final, formal operational thinking stage. Although the ability to think in a formal operational way emerges during adolescence, thinking in this way requires teaching and practice. It is the only stage of cognitive development like this: all of the other stages we simply acquire, but the formal operational stage bestows on us only the ability to think that way, not the thinking itself.

Why is this a quandary for higher education? Because the higher part of higher education refers to the kind of thinking that has to be developed before formal operational thinking can be expressed. It doesn’t just happen; it has to be taught and practiced. We tend to call this thinking critical thinking and expect that our students arrive with the ability already in place, ready to be fully expressed during their higher education. When it doesn’t happen, we are filled with disappointment and blame the secondary school system, or the students themselves, for not being prepared.

The research demonstrates that only a few (about 10%) of the adult population ever become fully equipped with formal operational thinking skills – whether or not they have received any higher education. Between 30% and 40% of the population completely lack the ability to engage in this type of thought. The remaining 50% to 60% have some formal operational thinking skills, ranging from barely demonstrating any at all to usually, but not always, using them.

Given that we are now educating about 40% (or more) of the general population, how can it be that we are only seeing about 10% able to consistently use formal operational thinking skills to solve problems and analyze information? Because our model of “sit down, shut up, face the front, memorize, and regurgitate”, used in 90% (or more) of higher education classrooms, neither teaches nor requires the use of formal operational thinking skills.

The skills I’m talking about would include some of the following:

  • a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture (Bacon, 1605)

  • the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action (Paul, 1987)

  • self-guided, self-disciplined thinking which attempts to reason at the highest level of quality in a fair-minded way (Elder)

  • the mental processes, strategies, and representations people use to solve problems, make decisions, and learn new concepts (Sternberg, 1986, p. 3)

  • the propensity and skill to engage in an activity with reflective skepticism (McPeck, 1981, p. 8)

  • reflective and reasonable thinking that is focused on deciding what to believe or do (Ennis, 1985, p. 45) 

  • thinking that is goal-directed and purposive, “thinking aimed at forming a judgment,” where the thinking itself meets standards of adequacy and accuracy (Bailin et al., 1999b, p. 287) 

  • judging in a reflective way what to do or what to believe (Facione, 2000, p. 61) 

  • skillful, responsible thinking that facilitates good judgment because it 1) relies upon criteria, 2) is self-correcting, and 3) is sensitive to context (Lipman, 1988, p. 39) 

  • the use of those cognitive skills or strategies that increase the probability of a desirable outcome (Halpern, 1998, p. 450) 

  • seeing both sides of an issue, being open to new evidence that disconfirms your ideas, reasoning dispassionately, demanding that claims be backed by evidence, deducing and inferring conclusions from available facts, solving problems, and so forth (Willingham, 2007, p. 8)

  • purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based (Facione, 1990, p. 3)

I have written extensively about the state of higher education today, and our failure to deliver on our historical core purpose beggars belief. We can do better than this.


How could we take something as natural and wonderful as learning and turn it into education?

The Dearth of Reason

Thinking is the hardest work there is, which is probably the reason why so few engage in it.

Henry Ford

Reasoning has been divided into two basic types – inductive reasoning and deductive reasoning.

Inductive reasoning is universal, emerges at a very young age, and is fundamentally attuned to the structure of the brain and how memory is stored. Inductive reasoning is the drawing of a general principle from a person’s experiences. A toddler shows basic inductive reasoning when, after touching several hot surfaces, they decide that hot surfaces burn. After this, there is an almost universal reaction to telling them something is hot – they clutch one hand with the other and, with a very concerned look, say “hot” (or something like that). Inductive reasoning at its best.

Deductive reasoning, on the other hand, is not natural and must be learned. The cognitive functioning necessary to engage in deductive reasoning – the ability to engage in abstract thought – develops during adolescence. However, deductive reasoning is difficult to carry out and normally becomes evident only after formal instruction. This is the type of thinking Henry Ford was referring to.

Unfortunately, the number of adults who ever learn to reason deductively is not high. Studies in the 1960s and 1970s demonstrated that as few as 40% of North American adults were able to use deductive reasoning to solve problems and understand the world, with the ability directly linked to educational attainment. More recent studies have suggested that the proportion of people who are able to engage in deductive reasoning has dropped from about 40% to as low as 20%. This is alarming for a number of reasons.

First, it demonstrates a serious shortfall in the education system. With education’s obsession with memorizing more stuff and finding the right answer, there is no room left for teaching people to think.

Why is this a problem? After all, with so large a proportion of the population unable to use deductive reasoning, society is still functioning – or is it?

Being unable to use deductive reasoning means that an individual cannot follow the logic used to reach a conclusion based on deductive reasoning. It is not that the person doesn’t want to; they are simply unable to because of a lack of training.

Why does this matter? Because there is a growing chasm between the scientific world and society in general. Most members of our society are cognitively unable to follow the arguments scientists use to demonstrate their findings, and scientists can’t understand why people don’t just look at the data and come to the same, obvious conclusions that they have. The lack of deductive reasoning means that members of society are simply unable to follow the logic, and so must turn to other sources to find out the truth.

Think about climate change, or immunization. Within the scientific community, and among generally well-educated members of society who can engage in deductive reasoning (and the two are strongly correlated), there is confusion about how there can even be a controversy. For those who can use deductive reasoning, there is no controversy. The facts speak for themselves when they are followed through the logical sequence that leads to a conclusion. The science is absolutely solid.

That a majority of participants in a Western democracy lack the ability to engage in deductive reasoning is problematic, to say the least.

Another reason, which will have to be dealt with in a future blog post, is the effect that the lack of deductive reasoning ability (or formal operational thinking in developmental terms) has on the development of moral reasoning.

We can do better than this – if we are willing to look closely at ourselves and embrace the necessary changes.