The Dr is in the room

As I have been thinking about the shape and future of HE, I have been considering the role of expertise in tomorrow’s teaching.

In this week’s Times Higher, Quintin McKellar mentions the long-established link between research expertise and teaching, stating that “Although the evidence for the contribution of research activity to teaching excellence is thin, what exists is largely positive…”. Does this thinking belong to an age of information scarcity?

What I mean by this is that an expert researcher is expert at creating information, knowledge, and understanding – content, in HE parlance. Given that information and knowledge are readily available all around us, and given that the research game leads to hyper-specialisation in knowledge fields, can we still argue that knowledge generators are the best teachers?

I look at my colleagues – many of whom are top researchers in their fields – and I see a passion for something so specialised and esoteric that I wonder how inspirational they sound to students looking for a degree. HE lecturers with strong research backgrounds aren’t interested in teaching – they do it as part of their job, but their passion is in asking the questions and seeking the answers. They teach according to the time-honoured tradition of the profession – by lecturing. Even when they are faced with a group of six students, they still stand at the front reading their PowerPoint slides out to the group – punctuated with an occasional anecdote.

I was talking recently to the head of another department here at the University, and he said to me that everyone can basically teach pretty well. He is wrong – dead wrong. Everyone can stand up and read lecture slides. We even get some teachers who can perform well while they are reading their slides (they might even have the information memorised) – but it is still the same basic delivery.

I have proposed a Masters course about applying the principles of psychology to education, taking into account the emergence of information abundance, along the lines espoused by Roger Schank (so far, approval in principle – hooray). As I sat with a few of my colleagues – all of whom are deeply sceptical about anything that doesn’t involve PowerPoint slides, essays and formal exams – I asked them about their understanding of the principles of psychology (all of them hold PhDs in psychology). They were somewhat offended that I would even ask, and when I pressed them on applying the principles they understood so well to their teaching, I got an amazing brush-off – as though I hadn’t even asked.

Anyway, I believe that the use of hyper-specialists to deliver excellence in teaching has had its day. They want to research – we need good and dedicated teachers (not the wannabe administrators I find in so many teaching institutions).


Disruptive Innovation or Paradigm Shift

Thomas Kuhn (1962) introduced the concept of paradigm shift as a way to describe how a prevailing understanding in science is replaced with a new understanding in light of overwhelming evidence that the explanations currently being used to understand the world are no longer adequate. Although originally limited to science, the idea of a paradigm shift has been applied to other areas of understanding (e.g. the impact of the internet on economics).

A disruptive innovation, as espoused by Clayton M. Christensen, is an innovation that changes the way something is done; the term has specific reference to economics. A disruptive innovation introduces a change that allows new markets and ideas to emerge – usually (almost universally) resisted by existing institutions – but which can, when properly exploited by a new player, undermine and eventually destroy those institutions. Kodak is a recent example of a massive institution that expended too much energy defending old technologies while failing to embrace the disruptive innovation represented by digital imaging.

So what is the difference between disruptive innovations and a paradigm shift in today’s HE world? From my perspective, we are facing a number of disruptive innovations in education that, taken as a whole, represent an underlying paradigm shift.

The Higher Education sector is facing upheaval. Innovations in how information is stored, organised, transmitted and retrieved have been disruptive to traditional libraries and publishing houses. We are beginning to see innovations in communication have a significant impact on traditional scholarly publication (finally). The way information is packaged and delivered to students is being jostled about in traditional institutions. Non-traditional ways of reaching students have emerged in the form of online classes, courses and programmes (even if they simply try to replicate traditional teaching methods). Psychology is just beginning to disrupt the ways we think about teaching and learning. New models of thinking about the organisation and purpose of teaching and learning are being discussed. Gamification, experiential learning, skills-based approaches, problem-based learning, social networking and social learning – all innovations that challenge our ways of thinking about teaching and learning in the C21.

I would classify all of these as disruptive innovations. Taken individually, they represent new ways of thinking about or doing something that provide opportunities and new markets for education. Private providers are rushing to grab a piece of the emerging (and lucrative) educational pie that has resulted from these innovations (and others) and the inevitable upheaval that has followed.

The paradigm shift that I think is taking place in education is an overarching change in the way the world works, and it is represented in many of the innovations we find so disruptive. Digitisation has moved us from a world of information scarcity to one of information abundance, and the implications of that shift are enormous. We (society) have invested heavily in, and (eventually) embraced, every innovation that has made information more abundant, and many of these innovations are milestones in the development of civilisation: the great libraries, the discovery of inexpensive papermaking techniques, the invention of the printing press, the transmission of sound and pictures, the advent of computing, and the emergence of the internet. We no longer live in a world of information scarcity; we are watching the emergence of a world of information abundance.

The implications of this paradigm shift are only just beginning to dawn on a few of us. What does it mean to learn in a world where all there is to learn is freely available to you right where you are? Now!

You don’t have to go somewhere. You don’t have to be told what there is to be learned. You don’t have to be constrained by the availability of an expert. You don’t have to have your information filtered by anyone. You don’t have to have your learning organised by anyone else.

In this world of information abundance, why do we need to gather at centres of learning? Why do we need to listen to an expert when all they have to say is freely available? Why do we need to buy a textbook of organised and re-presented information? Why do we need to organise learning around a single subject? Why is memorised knowledge still the key to attainment in education when access to knowledge is ubiquitous? Why do we sit and listen, by the hundred, to what we need to learn so we can parrot the information (and a bit more) back?

What is the value added?

There was a time when information was scarce, and the methods used to learn were appropriate to it. That time has passed, and we are participating in a paradigm shift. Embrace it; it’s going to happen anyway.

Education is Gamified

In very general terms, let me describe a game to you.

You enter the game and decide on a number of scenarios to participate in. Within each scenario, you have a number of time-limited tasks to accomplish, for which you receive varying levels of reward points, depending on how successful you are at the task. The individual task points accumulate within each scenario to provide you with an overall prize for that scenario. Many of the scenarios have various levels that you can complete, following roughly the same model of rewarded tasks. Finally, if you manage to successfully complete a number of scenarios, you finish the game and get your big final reward. A brilliant game that can be a lot of fun, loaded with rewards and prizes along the way and a big one at the end.

Isn’t this our education system?

I have read a number of blogs (e.g. here and here) and heard a number of talks in the last year about the gamification of education. On the surface, it sounds like a great idea; however, it strikes me that the principles of gaming have been derived from education, not the other way around. I think that some of the fundamental problems with education can be attributed to the built-in gamification.

Students get caught up in the tasks, scenarios, and levels looking for the big reward at the end as though that is the only reason they are participating. Learning disappears into the pursuit of grades for assignments, classes and finally, a GPA or degree classification. In a game, it is the points and rewards that keep players engaged. Unfortunately, in education, it is the grades and degrees that motivate too many of our students.

So I ask, why should we be trying to introduce more gamification into education when it is already responsible for eclipsing learning for so many students?

Bloom’n Skills

Bloom’s Taxonomy has guided educators for more than 50 years, and has survived as a useful categorisation of learning (at least in the cognitive domain) in spite of numerous critics. In the original work, the cognitive domain was split into six skills: knowledge, comprehension, application, analysis, synthesis, and evaluation. In the age of information abundance, where do these skills fit, and how are they taught in HE?

The focus of too many classes in HE is on teaching knowledge and comprehension, with lip service paid to analysis. In fact, academic snobbery has led some areas (basic STEM subjects) to abjure application as unworthy of consideration. Research funding and academic kudos favour basic research over applied science. Synthesis and evaluation are considered desirable, but beyond the reach of most undergraduates. I believe that this is because content rules. The argument usually given (by teachers) for not really getting to grips with synthesis and evaluation is that the students don’t grasp the basics to a level that would allow them to practise the higher order cognitive skills. That argument would be easier to buy if the understanding of the function of sodium-potassium adenosine triphosphatase (the sodium pump) in the firing of a neurone weren’t considered basic knowledge.

There is too much knowledge today, and the problem is growing exponentially. I noticed an advert in this week’s New Scientist that said something like: 90% of what will be known in 50 years hasn’t been discovered yet. To make matters worse, in the age of information abundance, all of that knowledge will be (and is quickly becoming) readily available. I can’t wait until, sometime in the bright content-laden future, I can take a 15-week final year undergraduate class on how the tertiary structure of sodium-potassium adenosine triphosphatase affects the intracellular calcium balance in a centre-surround inhibitory neurone in primary visual cortex. Once I fully grasp the process (and demonstrate memorisation by regurgitation in a high-stakes final), I’m sure I’ll have enough basic knowledge to begin to synthesise what I know across my field of study (psychology). I’m not certain how that is going to help me organise the recreational time at the seniors’ home where I will get a job in management because of my good undergraduate degree (it is all psychology, isn’t it?).

When are HE educators going to realise that their job isn’t to make clones of themselves, but to equip students with higher order thinking skills that will make them valuable contributing members of society?

We don’t have to focus on the minutiae of our research specialisms in our teaching. In fact, I believe that in doing so, we are endangering the foundations of the institutions we live in. Content! Content! Content! We need to refocus our teaching on the higher order skills, with content (and it doesn’t matter what content) as a wrapper. The skills need to be the learning priority.

Students should be practising analysis, application, synthesis and evaluation over and over. They need to learn and practise the skill of finding and organising knowledge – something the information scarcity model says we need to do for them. We write textbooks, condense them into slides, and present them as re-usable learning objects for the students to learn from. And now, thanks to digitisation, we can replicate our entire information scarcity model, easily and cheaply, online. Now we can produce and distribute textbooks and slides from every class ever taught by every lecturer. We can record all of our mind-crushingly dull lectures for endless replay online. We can (and do) produce more content than ever before. We live in content heaven!

When are we going to start teaching the students to find, evaluate, organise, synthesise, and present information for themselves? When are we going to evaluate the process, the skills, instead of the content that they regurgitate (along with a couple of other tidbits they pick up from further reading) in coursework or a formal exam?

Ever more detailed knowledge with which greater numbers of graduates have demonstrated a passing familiarity isn’t what society needs. What is needed are higher order cognitive skills that can be applied to unfamiliar problems.

We can teach analysis, synthesis, and evaluation; it is just easier to grade knowledge.

Skills or Experience

I was reading Roger Schank’s blog last week, and loved his free online courses post. There are a number of points that I found refreshing to read (it is good not to be alone); however, I wonder about his complete faith in experiential learning.

The reason I wonder about it has to do with my own experience. Fifteen (or so) years ago (once you have been doing this long enough, everything was just a few years ago – even the ’80s), when the skills agenda was all the rage here in the UK, I was tasked with developing a skills programme. At the time they were key skills, and they slowly transformed into employability skills. I don’t think anyone cared what they were called; they were all the same thing. It was one of the first times I felt a real disillusionment with education (prior to that, I just got on in my own little world).

According to the skills agenda, we were to embed skills into the curriculum for the students to acquire. As I went to workshops, seminars, conferences etc. etc. etc., I found that this was almost universally done in one of two ways: require the students to give a talk as an assessment in a module, and you could tick off the oral communication skill (one-off embedding); or run a key skills module for all students in their first year, and they would have the necessary skills to succeed at university (and hence, life).

Both of these approaches are phoney, for entirely different reasons. Giving someone the experience of doing something (public speaking) does not constitute teaching; skills are, by their very definition, something that you need to acquire at some baseline level and then improve on over time. By the same token, having a single class in the first year that covers a wide range of key skills does not mean that you are skilled at anything; it means that you have begun acquiring a skill at some baseline level. My attitude toward skill development is that it needs to be taught and then guided. It takes time and energy. Too many of the learning opportunities that focus on experiential learning fail to ensure that the students are properly skilled at a baseline level before being exposed to an experience that requires the use of that skill. There is too much reliance on problem-based learning without adequate background preparation.

That is why I see experiential learning as a vital and necessary component of an educational future that focuses on skills. Teach the skills explicitly (at least the basics), then guide learners through their experiential learning environments, supporting them (reminding them of the skills they have begun to acquire) and incrementally withdrawing that support while they hone their skills in a safe learning environment.

I think the skills have to be the fundamental focus, with experiences planned and implemented to strengthen and support the skill development process. Not a world away from Roger’s thinking, but a difference subtle enough to be easily missed. A person is valuable to society for what they can do, not for what they have experienced.

Marking Experts

Earlier this week, I published a blog about the application of behaviourist principles to education, in which I mentioned that I was going to blog about expertise.

Within the context of my blog, I was talking about expertise in marking. I think that scholars who spend their lifetimes studying a topic become experts; however, I have serious doubts about those same scholars becoming experts at marking students’ work in their respective fields.

There is good psychological research on gaining expertise, and if we have a look at some of it, we see that becoming an expert entails more than being assigned the 11:00 a.m. slot on Tuesday mornings for the next 12 weeks to wax lyrical about your favourite subject. The acquisition of expertise is a complicated process.

First of all, it takes time. In The Cambridge Handbook of Expertise and Expert Performance, Ericsson estimates that it takes about 10,000 hours of doing something to become an expert. It isn’t just the time element that makes you an expert (Ericsson & Lehmann, 1996); there are other elements as well.

Just by the first requirement, the number of years it would take to become an expert teacher is high. If I teach a class for 45 hours per semester (3 hours/week for 15 weeks), I would have to teach 222 classes to reach 10,000 hours of teaching. Over a 40-year career, that would be about 5.5 classes per year. I know that there are lecturers who teach that much, but that means that becoming an expert takes 40 years. And that’s becoming an expert in teaching, not an expert in marking.

Every year, as the semester draws to a close, I have a significant amount of marking to do. It feels like it takes at least 10,000 hours every year, but if I look at it objectively (difficult to do in May), I really spend around 40 hours each semester marking my students’ work. That means I would need 250 semesters of marking to reach the number of hours required for expertise. I won’t live that long, nor do I think I would want to, given the excitement involved in marking just one more script.
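
For anyone who wants to check my back-of-envelope sums, here they are laid out as a quick sketch (the only inputs are the figures already quoted above):

```python
# Back-of-envelope arithmetic for the expertise estimates above.
# The inputs are the figures used in the text; nothing else is assumed.

HOURS_FOR_EXPERTISE = 10_000  # Ericsson's rough estimate

# Teaching: 3 hours/week for 15 weeks = 45 hours per class.
hours_per_class = 3 * 15
classes_needed = HOURS_FOR_EXPERTISE / hours_per_class
print(f"Classes to expertise: {classes_needed:.0f}")                 # ~222
print(f"Classes per year over 40 years: {classes_needed / 40:.1f}")  # prints 5.6 - near the 5.5 quoted above

# Marking: roughly 40 hours per semester.
marking_hours_per_semester = 40
semesters_needed = HOURS_FOR_EXPERTISE / marking_hours_per_semester
print(f"Semesters of marking to expertise: {semesters_needed:.0f}")  # 250
```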

That’s just the time element. The practice element makes me certain that there are very few real experts at marking out there. One of the hallmarks of expertise is the “…seek(ing) out particular kinds of experiences, that is, deliberate practice” (Ericsson, Krampe & Tesch-Römer, 1993). I have never had a colleague ask if they can do some of my marking just for fun, or just because they want to gain more experience. Starkes & Ericsson (1993) note that deliberate practice is one of the primary predictors of the attainment of expertise. I have a number of colleagues who have urged me towards expertise by offering me the opportunity to pick up a bit of practice with their marking loads (although grateful for their thoughtfulness, I have always politely declined), but I have never really met anyone who has sought out opportunities to mark.

We may have expertise in our fields of study, but we are certainly not experts at marking work. In my previous blog, I talked about how difficult marking can really be.

In a typical essay, we are expected to make observations on a number of dimensions of writing. This week’s example included: build a well structured argument, use strong sources, show evidence of critical thinking, throw in a bit of originality, write in an easy-to-read style, use proper spelling, punctuation and grammar, and get the content right. As I said, marking then entails judgement on a hypercomplex problem, providing a simultaneous evaluation of the multifaceted dimensions and awarding an appropriate level of credit for the work.

Besides the fact that, cognitively, we really can’t do this – we have a very real limit on the number of things we can hold in our minds at the same time – we don’t spend time willingly engaged in the process so that we can become much better at it. At best, we become good at the job, and at worst, we just get through any way we can.

We are forced to make simultaneous judgements on the work, simply because the individual elements we evaluate are interspersed and woven into a complex piece of work. The research shows us that reliability between markers is low, and we try to find ways to increase it (e.g. better marking criteria). When are we going to admit that we are never going to be anything better than okay at marking?

Going back to my previous blog: if we isolate the behaviours we want to focus on, and then simply focus on one or two of them, we could do the job, and the students would learn in a more focused manner.

Given the difficulties in marking, I am always relieved to find, year after year as I attend examination boards where the work of individual students is reviewed before they graduate, that the grades awarded across the variety of classes, assessments and years end up being fairly closely related. Even if we aren’t experts, we do a pretty good job of picking out the good, the bad, and the ugly.

Behaviourism and Learning

I know that a number of psychologists will tell you that behaviourism and learning are just two names for the same thing, and they might be right in some way, but the learning I am talking about isn’t based on rats running a maze, but on students learning higher thinking skills. After all, higher thinking skills are what the ‘higher’ stands for in Higher Education. The rats running a maze are, however, where the behavioural principles I want to focus on come from. And for those who thought there was a cognitive revolution in psychology, behaviourism is still very much alive and kicking (as it should be – even though I am a party to the revolution). The principles are still as valid today as they were when they were articulated 50 years ago.

So what does behaviourism have to say about learning (as in education)? Actually, it can tell us quite a bit. I have been reading and writing about assessment this past year, and it astounds me that assessment methods, for all the talk of innovation, are still very much based on tradition. Given how much we know about how we learn and master a skill, why are assessments still all done, basically, the same way?

Let me explain. Behaviourist interventions are the best interventions we have for changing behaviours (I’m not talking about a philosophy here, but about what the evidence says). The basic principle for a behaviourist intervention is as follows: identify a behaviour that needs changing, isolate that behaviour, take a baseline measure of the behaviour, introduce the intervention, and then measure the effect of the intervention. It can be a bit more complicated than that, but not much. You could think of education and learning as changing thinking and behaviours; after all, that is essentially what we are trying to do. Looked at through a behavioural prism, we don’t do a very good job of changing behaviours and thinking. It isn’t that the students (our subjects) aren’t willing; it is just that they struggle to figure out what we want.
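
To make that cycle concrete, here is a minimal sketch of the intervention loop in code. The measure and intervene functions are hypothetical placeholders rather than any real system – the point is only the shape of the procedure: isolate one behaviour, take a baseline, intervene, then measure again.

```python
from statistics import mean

# A minimal sketch of a simple A-B behaviourist intervention.
# `measure` and `intervene` are hypothetical callables supplied by the teacher.

def run_intervention(measure, intervene, baseline_sessions=5, intervention_sessions=10):
    """Take a baseline for one isolated behaviour, intervene, then measure the effect."""
    baseline = [measure() for _ in range(baseline_sessions)]   # phase A: baseline measure
    for _ in range(intervention_sessions):                     # phase B: the intervention
        intervene()
    follow_up = [measure() for _ in range(baseline_sessions)]  # measure the behaviour again
    return mean(follow_up) - mean(baseline)                    # crude estimate of the effect
```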

What I mean by that can be understood if you think about a student submitting an essay for grading. You might say that innovative assessment means an essay is only one tool from a large toolbox, but the point I am making applies to most of what is available and used. In an essay, we expect that the student will build a well structured argument, use strong sources, show evidence of critical thinking, throw in a bit of originality, write in an easy-to-read style, use proper spelling, punctuation and grammar, and get the content right. That means we are evaluating (at least) nine dimensions on a single piece of work. We expect the student to produce a multidimensionally superb piece of work that we then take in for marking. We, in as little time as possible (given that there are 183 scripts sitting on the desk, due back next Tuesday), become an expert judge (a future blog on this concept) of a hypercomplex problem, providing a simultaneous evaluation of the multifaceted dimensions and awarding an appropriate level of credit for the work. And then we wonder about the lack of reliability between markers.

So much for isolating something that we need to change, introducing an intervention, and then measuring the outcome of that intervention. Why can’t we design a higher level learning environment that isolates the skill or content being targeted, and then introduces an intervention that continues until a mastery level of achievement is reached before introducing something more complicated?
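
Here is an equally minimal sketch of what such a mastery-based design could look like (the names are illustrative, not a real system): take one isolated skill at a time, keep the intervention going until a mastery criterion is met, and only then introduce the next skill.

```python
# An illustrative mastery loop: one isolated skill at a time,
# practised until a mastery criterion is met, then move on.
# `practise` and `assess` are hypothetical callables.

def mastery_loop(skills, practise, assess, mastery=0.8, max_attempts=20):
    results = {}
    for skill in skills:
        for attempt in range(1, max_attempts + 1):
            practise(skill)           # the intervention: focused practice on one skill
            score = assess(skill)     # measure the outcome of the intervention
            if score >= mastery:      # mastery reached: move on to the next skill
                break
        results[skill] = (attempt, score)
    return results
```

Notice that the spread of marks disappears by design: what varies is how many attempts each learner needs, not whether they get there.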

In my Science of Learning module, I focus the students on providing evidence from an acceptable source for their blog entries and incorporating that evidence into a well structured argument – two higher thinking skills that the students practise over and over (at least 14 times) during the semester. One of the criticisms I faced the first time I taught this way is that final year undergraduates shouldn’t all be able to get top marks in a class; there needs to be more spread, recognising their varying abilities. However, I asked our examination board: if the students meet the criteria that I set, and they meet them well, shouldn’t they receive full credit for having done so? The exam board agreed with me. They also agreed that the learning outcomes (providing evidence from an acceptable source and incorporating that evidence into a well structured argument) were more than appropriate for a final year class. Just because the bulk of the students master it shouldn’t make their accomplishment any less.

I think it is a shame that this kind of learning is so rare in education. It doesn’t have to be.