researchED 2017

My obligatory annual missive from behind the supposed enemy lines…

Prologue

“How many of you are on Twitter?” asked Tom Bennett in his opening address. “And if not, why not?”

Well, FWIW… obviously I can’t speak for anyone else, but having stuck pretty effectively to a self-imposed Twitter and blogging ban for a fair few months now, I can report a significantly increased sense of general well-being, not to mention productivity in, well, more productive areas. The ban also had the advantage that I came to this researchED conference only dimly aware of the inevitable school-holiday eduTwitter strife, which seems to have become increasingly hostile and intolerant of opposing views in recent months. And there was me thinking we’re supposed to be teachers…

Bananas!

With a grim sense of obligation, I had circled Nick Gibb on my programme for session one; however, when the time came my feet steered me instead to Christian Bokhove’s entertaining talk, entitled ‘This is the new M*th!’ Unusually for a researchED talk about myths, Christian was not concerned with a feverish search for the “next Brain Gym”. Rather, this turned out to be a cautionary tale about the folly of shooting myth-busting rounds from the hip before checking you aren’t packing a mouldy banana.

There is something kind of adorable about people who think they’re myth-busting while peddling unsupported counter-myths of their own. Christian ended by echoing Labaree’s (1998) suggestion that perhaps in social science we should learn to accept a “lesser form of knowledge” than can be achieved in the natural sciences – a notion that resonates with my own session on the complex nature of social reality, and its consequences for school leaders and education researchers, later in the day.

A doozy for another time

In session 2 I went to see Lucy Crehan talk about her guided tour of the world’s educational high flyers. I had to duck out early to get set up for session 3, but what I heard made me want to read her book, Cleverlands. The extent to which educational (and cultural) practices can be imported from other cultures is a doozy of a question that I suspect we will be wrestling with for many years to come.

Pure passion and dedication

In session 3 I introduced a presentation on collaborative inquiry with my UCL Institute of Education colleague Mark Quinn and four teachers from City of London School. This year, Mark and I have worked with a group of teachers at the school, helping them investigate some aspect of their practice through collaborative inquiry. It was a fascinating session, summed up by Vivienne Porritt – who helped start the IoE/CLS collaboration in the first place – when she said: “The point is not always to raise standards. The point is to provide a vehicle for professional learning and development.” This is worth underscoring. There are some who seek to dismiss things like action research and lesson study on the basis that they don’t always lead to immediately discernible gains in student outcomes. But in my experience – both of my own teaching practice and that of others – negative findings are often far more interesting, and far more valuable, enabling far richer professional development, than some contrived project designed so you can say “look how effective I am”. The talk seemed to go down well, at least if this tweet is anything to go by.

Here’s a video of our session (mercifully the camera was only turned on once I had shut up), in case you’d like to see it.

More questions than answers

After lunch I went to Carl Hendrick and Robin MacPherson’s session, enticingly entitled “What does this look like in the classroom? Bridging the gap between research and practice.” Unfortunately, however, the session did not quite fulfil its promise and I left with many more questions than answers.

Carl and Robin began by listing some ‘barriers to research’ – fairly familiar territory for researchED aficionados: jargon, paywalls, time, money, workload, one size fits all… of which more later. They also included the intriguing suggestion that sometimes research evidence is a “solution in search of a problem” – though I could have done with some worked examples, because when I look at the education system I see mainly the opposite.

Carl then made the strong assertion that doing things like lesson study and action research is not worth teachers’ while, on the basis that he’d read something by Dylan Wiliam outlining doubts about them. I’d like to read this – the only comment I’ve seen from Dylan Wiliam on lesson study is the following, from a keynote he once gave, in which he seems to lend it his strong support:

Genuine peer observations, working with people at the same level as you in the system, also work well, when the agenda for the peer observation is set by the person being observed, rather than the person doing the observation. I think this is a model for lifelong teacher development. It’s what the Japanese teachers do through “lesson study”. It’s not something that you do in order to get good and then stop doing. The great thing about teaching is that you never get any good at it; you never crack it. That’s what makes it so frustrating, so challenging, and yet so rewarding.

Instead of trying to work out for ourselves what we should be doing through systematic inquiry, Carl suggested that teachers should just read the things he’s been reading in recent years. I didn’t see the slide for long enough to capture the whole list, but it wasn’t very long and it included Barak Rosenshine’s article on ten principles of instruction, Daisy Christodoulou’s book Seven Myths about Education and, I think, the Kirschner, Sweller and Clark paper on minimally guided instruction: in other words, pretty much the standard reading list of your common-or-garden self-flagellating former prog who has seen the light and taken to the rooftops proclaiming neotrad tropes with all the zeal of a recent convert.

To be fair, they did suggest more than just reading these works: they suggested reading them collaboratively in journal clubs. Carl said they had recently run a one-year project at Wellington in conjunction with Harvard University – a “trifecta” of research literacy, with fortnightly journal clubs for students, teachers and senior leaders. He didn’t share any data to show what the impact of this programme was – indeed, it’s not clear whether the programme included an evaluation of itself. One would assume not, because that would be action research, and the whole point of this is that it’s not action research. The fact that it lasted only a year is interesting – what happened afterwards? It’s one of several questions I would have liked to ask at the end, but Q&A did not appear to be on the menu, despite the session finishing with a few minutes to spare.

Obviously I have no problem with teachers reading research – it’s really important that we do. But reading stuff is not enough. In case it really needs pointing out, reading research and engaging in research are not mutually exclusive activities: you can do both, and indeed they complement one another in powerful ways. At the IoE we have a phrase for this – we encourage teachers to engage both with and in research. The value comes from combining the two in focused, small-scale research enquiries that enable you to become simultaneously a critical consumer of research and a critical observer of your own practice. These twin activities gently rearrange parts of your professional identity that the mere reading of research (or indeed of one-sided polemics) cannot reach.

I have two main objections to this reading-based, bargain-bucket vision of professional development. The first concerns the claim that the evidence for action research and lesson study isn’t really there. As Vivienne Porritt pointed out above, action research and lesson study are not necessarily about improving outcomes in an immediately measurable sense. Obviously, we all want to raise standards (although in a zero-sum system in which grade proportions are pre-determined, we can only do so at the expense of other students – a conundrum for another time). But what’s really valuable about things like action research and lesson study is that thinking hard about how to connect the evidence from the literature with the evidence from your own data – in other words, really seeking to answer the question ‘what does this mean for my classroom?’ in a rigorous, systematic way – requires deep reflection, thoughtful deliberation, grasping the nettle of implementation, and all the soul-searching, self-doubt, problem-solving and collaboration that drive powerful professional learning and development. That is the point of it.

Second, it is worth returning to one of the barriers to research that Carl and Robin pointed to at the start of their talk: the folly of adopting a one-size-fits-all mentality to bridging the gap between research and practice. John Hattie might say “teachers shouldn’t be Researchers”, but since he also says “teachers should be rigorous evaluators of their own impact”, the distinction is largely semantic. I remain more convinced than ever that carrying out research with a small ‘r’, as a basis for evaluating your own impact, should be the cornerstone of teachers’ professional development at every level of the system. Because, as I pointed out in both my sessions on Saturday, if we just follow what, for example, the EEF Teaching and Learning Toolkit tells us about “what works”, there’s an almost 50:50 chance that we’ll be making things worse. And in the absence of some form of systematic inquiry to evaluate the impact of our own practice, we can’t even know which it is.

No doubt the forthcoming book by Carl and Robin will clear up some of these messy matters – I very much hope so.

Brilliant timing

In session 4 Lauren Ballaera, Director of Research and Impact at the Brilliant Club, talked about “Using cognitive science to evaluate impact”. This fascinating session made my head spin somewhat, because it offered some serious challenges to the talk I was about to deliver in session 5.

In case you aren’t aware, the Brilliant Club is a widening participation charity which has mobilised the PhD community into action. They run two main programmes – Scholars (doctoral students running sessions for disadvantaged kids) and Researchers in Schools (doctoral grads in the classroom). Lauren’s unenviable task is to evaluate the impact of these programmes across a range of student outcomes, and through time – no mean feat, not least when you’re surrounded by hundreds of people with PhDs!

Lauren walked us through a number of technical and logistical issues involved in such work, carefully picking apart the problems and outlining potential ways around them. This was where I started to think that maybe – just maybe – it is possible to overcome some of the challenges I was about to outline in the next session, which was entitled: Silver bullets, magic wands and tooth fairies: Why school leaders and education researchers need to embrace complexity. The video is below.

All that complexity left me feeling quite dizzy, and so I skipped the final session.

Until the next time… Hasta la Victoria Siempre! (Until victory, always!)

References

Labaree, D. F. (1998). Educational researchers: Living with a lesser form of knowledge. Educational Researcher, 27(8), 4–12.
