Paul MacLellan digs into the problem with the help of researchers from Durham, a secondary school teacher and a journal editor
In June I gave a presentation on the gap between teaching and education research at the ResearchED Maths and Science conference. This is the transcript of that talk.
Last year, three education researchers from Durham University tried an interesting experiment. They carried out an intervention with several primary schools – an intervention that should have had a positive impact on their students' learning. But the researchers weren't interested in the outcome for the students – at least, not directly. What they were interested in was how the teachers implemented the intervention.
The intervention was based on research published in 2007 by John Hattie and Helen Timperley on enhanced feedback. Their research highlighted the characteristics of effective feedback and formalised those characteristics into a structure teachers could apply in their teaching. The effectiveness of enhanced feedback itself was not in question in the Durham study: several previous studies have shown an effect size of around 0.6.
One of the Durham researchers, Stephen Gorard, says: 'You could imagine, though I'm not keen on this translation, that this is about four months extra progress in reading or maths.'
Stephen and his co-workers, Beng Huat See and Nadia Siddiqui, were looking to see if teachers could take a piece of research with proven benefits and apply it in a way that had a positive outcome for their students.
Beng Huat See explains: 'The aim of the project was to see if the schools could engage with research evidence and use it to inform their instruction. So the schools came together – the heads, lead teachers and subject leaders – and they read the paper by John Hattie and Helen Timperley, extracting information to help them understand the different types of feedback strategies to be used in a classroom. Afterwards, they went back to their schools and cascaded the training to the other teachers.'
Nine primary schools took part, involving all Year 2–6 pupils over a whole academic year.
Once the training had been given to all the teachers, the researchers went into the schools to evaluate how enhanced feedback was being implemented. 'Our job as evaluators was to go and observe how the training was being implemented in the school and the challenges the teachers faced,' Beng Huat explains.
The researchers observed lessons and interviewed teachers, subject leaders and pupils. They also evaluated the academic achievement of students in the schools and compared it with that of other schools in the borough and nationally.
So, what did they find? Well, attainment didn't improve. A quote from the study: 'Overall, the data indicate that there is no convincing evidence of a beneficial impact on pupil outcomes from this intervention.' This is quite surprising given that this technique has been shown to be effective over several previous studies.
What went wrong? Beng Huat has some ideas: 'I think there are several reasons. One is the academic language in which the paper was written is not accessible to practitioners. Another is that there were not enough examples of how to apply enhanced feedback for the teachers to use.'
The problem was in the delivery of the enhanced feedback model. Initially, the teachers were confused about the structures around feedback the research put in place. 'They thought that they knew what feedback was,' says Beng Huat. 'At the first training session they said, "We are already using feedback in school. Any good teacher already uses feedback. Why are we doing this?" When they were given the paper to read, which explains the different levels and processes of feedback, they became confused.'
During the initial training, one teacher said, 'I agree with [Hattie] about the impact of feedback, but this is what we all do in our classes. This is what we're doing already.'
More tellingly, the research leads from the schools struggled to understand the research paper. One teacher commented, 'I need a translator to understand what this article is saying. I just cannot understand what [Hattie] means and what he wants us to do.'
'Throughout the intervention the teachers were excited and keen to use the research. But there was a lot of misunderstanding and misinterpretation of the evidence,' says Beng Huat. The teachers found the language of the research article impenetrable – it was not written for an audience of practitioners.
Then there was the other problem: the research article didn't provide examples or resources practitioners could use in a class. These could be as simple as clarification of technical terms with several examples, or as in-depth as videos of successful implementations of the intervention for use in training.
Research papers are hard to read
But Stephen still feels the biggest hurdle to teachers implementing education research is just how difficult research articles are to read. 'The biggest problem with using evidence with teachers is how badly written the papers are. I don't mean they leave out content that practitioners want. I mean the academic English researchers write in is completely unnecessary and makes it really hard for everybody – including academics – to read. And, of course, it makes it especially hard for teachers.'
This is a problem we already know about. In May, the Education Endowment Foundation published two independent reports concluding that teachers aren't willing to invest the time to engage with education research and senior leaders aren't willing to invest the resources to support them in doing so.
Gary Davies, a physicist and soon-to-be physics teacher, commented that this seems absurd. If education research can make teachers better at teaching, the reports suggest that teachers don't want to improve. Their managers don't want them to, either.
Gary suggested that it's more plausible that teachers and managers don't think they can effectively use the outputs of education research to improve their teaching. Research isn't worth their time.
And to be fair, their intuition is bang on – the Durham study shows that. There's a gap between the results of academic education research and teaching practitioners.
Teachers don't have time
Kristy Turner is a chemistry teacher with 11 years’ experience of secondary school teaching. She is a school teacher fellow at the University of Manchester and Bolton School Boys' Division. Kristy spends three days a week teaching organic and general chemistry in Manchester, and two days a week teaching in school.
I asked her about a typical teacher's engagement with education research. 'I think for my colleagues teaching full-time in school, with the associated pressures of marking, assessment and planning, there's very little time to engage with any educational research. There are some formal mechanisms in some schools for disseminating research – my school, for example, has a learning development group that meets every half term. But that's only one hour each half term.'
So the pressure from a teacher's side is time. But, as we've already seen, there are other barriers to using research.
Kristy says: 'If you want to explore research by reading a full journal article there can be some access difficulties depending on the journal. You might encounter a paywall.
'If you manage to overcome that barrier, there's the difficulty of getting your head around the way it's written – quite often the language in the publications is very difficult. They're almost written to make the author sound clever to other people within that area of research, and not written to make it easy for someone who wants to apply it.
'So, there's a lot of time spent just trying to understand what the research is about before you even get round to thinking about how you can apply that to your classroom. Quite honestly, unless it's easy to digest, I don't think teachers have the time to spend decoding something that in the end might not be of any use to them.'
Kristy highlights three problems. First of all, access to research. If you've spent any time trying to read research literature, you will be familiar with coming up against paywalls. Secondly, it's difficult for a teacher to know if a piece of research will be relevant to their teaching. And lastly, of course, the problem of language we've already encountered.
I asked Kristy if she herself reads education research. She said: 'I do, but I don't think I'm typical as a teacher. For my school work, I tend to read it quite informally – I mainly discover things through Twitter or other social media. That way I can get the bite-size summary, rather than reading the full publication. But that's different to how I use research in my higher education role.'
I also asked if she applies education research results in her classes. Kristy said she uses a few flipped learning techniques that came from some studies she's read, but not much beyond that.
But Kristy's role as a part-time university lecturer and tutor gives her a slightly different perspective on education research. Last year she decided to get her hands dirty. 'Since I started here at Manchester, I've started doing educational research myself. I published my first paper recently in the Journal of Chemical Education, which is about a modelling activity I developed for the classroom. Producing that paper gave me a lot of insight into how other people are generating and communicating their ideas. It's quite a simple activity, but when I wrote it up to fulfil the needs of a scholarly publication it ended up being 14 pages long.'
Since that paper came out, Kristy has been working on translating the work for practitioners herself. 'Publishing the paper made me see how my work fitted into a wider context,' she says. 'However, now I'm distilling what I did into a teacher version to put on Learn Chemistry, because I know teachers aren't going to read the paper. The best way of me communicating my idea to teachers is either to present it face-to-face with someone or put it in a short, open access document.'
What about the pressures researchers face? Kristy mentioned that her article on a simple classroom activity became 14 pages long simply because it had to be published as research.
I spoke to Keith Taber, editor of the journal Chemistry Education Research and Practice and professor of science education at the University of Cambridge, and asked him what the purpose of education research is.
'The purpose of educational research is to support teaching and learning, primarily through teachers. The idea of educational research is to provide a knowledge base and expertise to support effective teaching and learning.'
Researchers and teachers are different groups, and papers are written for researchers
Keith is very clear that education research must have an impact on the classroom. But he also agrees research articles tend to be written for an audience of other researchers, rather than teachers.
'I think education is perhaps different to some other fields. So, if you have a field like biochemistry or transition metal catalysis, for example, then when people write research articles they will be written for other researchers in the field. In educational research, that tends to be the case as well. It's not necessarily that researchers couldn't write for a different audience of practitioners, but the research journal is there to inform and develop its field of research.'
He explained that researchers have to convince editors and reviewers that their studies contribute something new. 'The main purpose of a research article is to make a case for new knowledge. So when somebody publishes a research article, they are saying, "I have found out something new." Therefore, the article has to have the technical structure that another expert in the field reading it can say, "Yes, I understand what you've done. I understand the nature of the data you've collected and the analysis you've done. And I can see there is a logical argument that, given that data and analysis, we do now know slightly more about the topic than we did before."'
This is, of course, quite obvious. We know research has to be rigorous and stand up to scrutiny. 'I think the research article, by its very nature, has to support the kind of writing that makes a technical argument for new knowledge,' says Keith. 'That's not necessarily the best format for communicating the significance of research to practitioners.'
Perhaps the problem of getting education research into classrooms comes down to that disconnect between the two audiences: the researchers for whom the research articles are written, and the teachers who want to apply the research in their teaching. As Keith points out, this isn't a problem faced by other areas of academic inquiry, where the audience and the practitioners are the same group.
It looks like the role of translator that the teacher in the Durham study called for might be an essential part of solving the problem. I asked Keith if he agrees: 'Oh, absolutely, yes. Ultimately, researchers have to be specialists. You can't expect all research to be understandable to a general practitioner. So it does need to be interpreted.'
Researchers don't have the incentives to translate research for teachers
I suspect that when teachers struggle to find those interpreters of research, they either give up or try to bridge the gap themselves. I asked Keith if researchers bridge the gap from their side – what do they do to engage with practitioners? 'I think some do more than others. And to some extent, that depends on their own professional context. So, a number of people who do research will interact directly with existing teachers or student teachers as part of their work. Some people do it by writing other kinds of material – you write a research article, then you also write a practitioner article. And, in a sense, if you only ever write research articles, and you never also write for teachers, then you're only doing half a job.'
But here's where another problem becomes clear. Keith acknowledges the professional pressures that researchers face are related to their research output in high-impact journals. Researchers may want to do the full job, but they're only being professionally evaluated on half of it.
Where does this leave us? If teachers are becoming more aware of the benefits of evidence-based teaching, what needs to be done to help them apply it?
There need to be conduits for education research
Many feel researchers need to do more to reach out to practitioners. This might be by including many teacher-focused examples of any intervention within the research article itself, as suggested by the Durham researchers. Or by avoiding jargon in research articles. Kristy offered one possible improvement to academic literature: 'I would really like to see all papers in an educational journal have an abstract for teachers – not the academic abstract that papers already have, which is written in the same way as the rest of the paper. I'd like a short, clear summary that distils out the important points for teachers.'
Keith raised the possibility that in an ideal world teachers themselves could bridge the gap: 'If I was in charge of the education system, I would say we want fully professional teachers who are able to engage with research and who are given time and space to engage with research and try things out for themselves. I think that would be the ideal. In England at the moment that is certainly not the direction things are going. So we need to make sure we do more to communicate the implications of research to teachers.'
Wishful thinking aside, Stephen Gorard feels there's a gap that needs to be filled: the role of translator. 'We need something that converts research evidence into something practitioners can use.'
This is a role we think Education in Chemistry can help with, at least for chemistry teaching. But we can do more, and others need to do more, as well. Publishers, researchers, school leaders and maybe even teachers need to pitch in to create conduits for education research.