On observations and inset

In common with a large number of schools around the country, we had an inset day today.  And in common with a lot of schools, I am sure, some parts of the site felt more like this

Damp cave

than a warm, welcoming place of work!  Never mind.

Happily, we spent most of the day in the warmest part of the school on what was one of the most productive and interesting inset days that I can remember.  The publication of the new Ofsted inspection framework led us, as an SMT, to think carefully about the conversation that we would have with an inspection team about the quality of teaching and learning.  It’s clear that in an environment where inspectors are not simply making their own deus ex machina judgement of the quality of lessons, the SMT need to have a convincing, evidence-based picture of teaching and learning to start the conversation.  There is a range of pieces to the jigsaw – exam results, work scrutiny, student perception surveys – but there is no escaping the fact that lesson observations play a part too.

So since September, as an SMT, we have been into more lessons than we did in the previous two or three years combined.  This has been a big culture change in a school that has always been pretty ‘light touch’, and it has not been an easy one.  What we have really tried to emphasise, however, is that a greater level of observation leads to more effective, targeted professional development.  Happily (after a pretty rigorous debate) we decided not to grade the lessons we observe as an SMT, but rather to provide direct feedback to teachers after the lesson (or in as timely a manner as possible) and to actively feed our findings back into professional development.  That’s where today’s inset picked up.

Our lesson observations focus on three things (the Big Three, as we call them):

  • the lesson environment: are students challenged, engaged, passionate about their learning?  Are they being given the chance to develop independence of thought?
  • ‘Closing the Gap’: what strategies are visible, in the lesson and in exercise books or notes, to make marking and feedback effective?
  • the use of rafls: this one is particular to a school using the excellent realsmart platform

Since September we have been building a clearer than ever picture of best practice in these three areas.  Today’s inset was about sharing this practice.  All departments were encouraged to bring along examples of effective Closing the Gap and rafl strategies and we set up a marketplace in the library.  Staff worked in cross-departmental groups, so as to further facilitate the sharing of ideas and practice, and spent the morning on a carousel around the displays and presentations of best Closing the Gap and rafl practice.  They were encouraged to ask questions and make suggestions but, most importantly, to pick up ideas to take back to their department time in the afternoon.  There was a great atmosphere and a wealth of fascinating material to look at.

Inset Geography feedback

But this brings me to the trickiest part of the observation/professional development equation.  It is pretty clear to us that there are some fantastic classroom practitioners out there and some teachers, at all stages of their career, who could benefit from some targeted training.  It seems pretty pointless for a member of SMT to pop into (for example) the Head of English’s lesson for the rest of the year: we know that we are going to see relentlessly high expectations, stretching material and fantastic probing questioning.  What would be far, far more valuable would be to get a couple of teachers for whom questioning is a known issue into her lessons.  But here is where, across the board, we are hitting a logistical and, to an extent, cultural barrier.  On the one side there is the understandable reluctance to give up a scarce and valuable free period, and on the other there is the quite justified sense that this adds to an already large burden of work – taking on responsibility for an aspect, no matter how small, of another teacher’s professional development just seems like too much, particularly when you’d need to go into the other teacher’s lesson in one of your own scarce free periods to really complete the cycle…  There can also be that awful sense of ‘what’s in it for me?’  For now I’m not sure how we are going to get over this one – any suggestions and thoughts are welcomed!


What’s the state of the evidence on…? Teaching Latin

I studied Latin at GCSE. My younger brother studied it to A level and then as part of a Classics degree. I really enjoyed studying Latin and have always rather regretted the fact that I didn’t carry on with it in the sixth form. Much to my delight, the comprehensive school in which I completed most of my teacher training offered Latin! The state grammar school in which I currently work does not teach Latin, having dropped it as a subject about a decade ago. I have always maintained that it should offer Latin, but I have decided to challenge my position on this. What’s the evidence on the impact of Latin? Does it suggest that there are facts to back up my conviction?

From the Cambridge Latin Course: to my dismay they seem to have abandoned the original illustrations…

Latin teaching

The teaching of Latin is a minority pursuit. In 2014 there were 9,129 entrants for GCSE Latin (only one board, OCR, offers it), compared to, say, 75,992 for GCSE Drama. It is overwhelmingly concentrated in the private sector – 70% of entrants were in fee-paying schools. The key reasons offered by organisations promoting the teaching of Latin, such as the Joint Association of Classics Teachers (JACT), can be summarised as:

  • It develops the ability to learn and use your native language
  • It enhances understanding and aids the acquisition of modern European languages such as French, Spanish, Italian and German
  • It develops ‘higher order’ skills such as logic, reasoning and mathematics

Does evidence exist to support these claims?

Overall, there is evidence to support the first of these claims: studying Latin does appear to aid the acquisition of native language skills. Much of the research on this has been conducted in English-speaking countries, although research by Reiss and Reiss (2005) found similar effects on the development of German speakers’ native language skills. Another study, that of Sparks et al (1995), had similar findings, purporting to show that Latin learners out-perform non-Latin learners in tests of vocabulary and reading. There are weaknesses to these claims, however. One criticism is that there is an issue with selection bias: the evidence shows that Latin learners have a higher level of prior attainment than students of other languages. In 2012, 83% of Latin GCSE entrants had achieved above the expected level at the end of Key Stage 2, compared to 42% for French and Spanish. Furthermore, there is no directly comparable evidence to examine whether the same benefits accrue from studying another, alternative language. Finally, there is interesting research that suggests that the mechanism of teaching Latin is key to improvements in native language skills. In short, to teach Latin effectively to an English student you have to teach them about English grammar and rules (Mavrogenes, 1977).

There is also some evidence to support the second claim: learning Latin may well help learning modern foreign languages. However, evidence from Germany (Haag and Stern, 2003) suggests that it is less effective at supporting the learning of a modern foreign language than learning a further modern foreign language (e.g. learning French is a better aid than Latin to learning Spanish). Finally, there is little evidence to support the claim that learning Latin helps develop mathematical or reasoning skills.

The place of evidence

Ultimately, a question such as this one exposes the limits of evidence. Robert Peal has written eloquently about the way in which evidence can inform opinion, but it cannot replace it. Latin is a case in point (which case? Ablative? Dative? Locative?). Just as the importance of Shakespeare to English language, culture and history means that I would advocate teaching it in secondary schools even if it could be empirically proved that doing so reduced attainment in Maths, for its supporters Latin has a cultural and historical significance that goes beyond the bounds of evidence.

References

Haag, L., and Stern, E., (2003), ‘In search of the benefits of learning Latin’, Journal of Educational Psychology, 95(1)

Mavrogenes, N., (1977), ‘The effect of elementary Latin instruction on language arts performance’, The Elementary School Journal, 77(4)

Reiss, W., and Reiss, C., (2005), ‘The State of Latin Instruction in Germany Today’, The Classical Journal, 101(2)

Sparks, R., Ganschow, L., Fluharty, K., and Little, S., (1995), ‘An Exploratory Study on the Effects of Latin on the Native Language Skills and Foreign Language Aptitude of Students with and without Learning Disabilities’, The Classical Journal, 91(2)

Reflections on researchED 2014


The start of term has meant that I have not turned this around as quickly as I would have liked but here, nonetheless, is my personal view of researchED 2014.

Before anything else is said, a massive dumper truck of praise must be emptied on top of Tom Bennett for his work on leading the push towards research in education, on Helene Galdin-O’Shea for organising the logistics of the day, on the staff of Raine’s Foundation School for hosting and on the hundreds of teachers who are so engaged and committed to their profession that they travelled from far and wide on the first Saturday of a new term! The atmosphere throughout the day was one of positivity and hope. Except for when it seemed for a horrible few minutes that the canteen would run out of food.

Having picked up my bag and pen (I think I finally left with my third bag of the day, having contrived to leave two of them behind in various Maths rooms along the way!) I started my day in the main hall to listen to Tomsett, Quigley and Coe. Their talk provided some nice ideas, an overview of what they were doing and a broad introduction to why research is important but there was so much mic changing – here’s Alex, now John, now Rob, now John again! – that it ended up being a touch hard to follow. John Tomsett was convincing in his conviction that research won’t embed within schools unless there is a leadership culture that is absolutely committed to it. A good start to the day.

My second session was with Katie Ashford. As a History teacher I am absolutely convinced by the power of the narrative, but my conviction is only a personal, anecdotal one and I went along with high hopes that I would hear about the evidence underpinning narrative-based teaching. I have to confess to being a little disappointed. Katie’s talk was anecdote and personal experience driven, rather than research-led.


My take on the presentation could well have been clouded by the fact that I had gone too long without eating anything and the hand tremors had set in! I had to sneak out to grab a bite to eat so I may well have missed the true research heart of the presentation. Meanwhile, back in the Hall, Dylan Wiliam was in full flow and Twitter was in raptures. Half of me wished I had stayed in the hall – but sneaking out would have been a nightmare.

Recovered and ready for session three I grabbed a seat in a maths room to listen to Dave Weston. Simply fantastic. Without pitching his talk as a sell for NTEN, Weston set out a compelling, research-backed vision of the place of enhanced professional development in schools. Full of ideas and wisdom, the crowd was visibly and audibly enthused. I really hope that his organisation grows and prospers – I’ll be signing up as soon as I can!

Daisy Christodoulou on assessment was very interesting indeed. Her visualisation of the domain and the sample raised some interesting points, and she was honest and open about her changing ideas and attitudes towards assessment. Daisy was sceptical about the policy of “teaching to the test” and dismissed as unrealistic and undesirable the suggestion that a test could be designed that should be taught to: you can’t realistically test the entire domain, only ever a small sample.

Most interesting of all, particularly from my perspective as a History teacher, was her support for the idea of multiple choice tests.


This is something that I am going to read more about and take back to my department and discuss in detail: I think that there is real potential in this approach. After a good lunch (school dinner with pud!) I spent 40 minutes in the company of Andrew Old and his crisp, clear logic. It provided a helpful guide to approaching Twitter discussions and what to look out for in the arguments.

Then, in a break from my original plans, I went to listen to Sam Freedman. As I have already mentioned in an earlier post, I am currently working one day a week in the DfE, so the view from the outside of a former insider was too good to pass up! It proved to be both instructive and entertaining, and certainly offered a perspective on the limits of research and evidence in a policy framework often driven by ideology or whim!

Whilst I was with Andrew Old and Sam Freedman the key political talks were taking place downstairs in the Hall. My decision to miss them was not one I regretted…

I finished my day listening to Rebecca Allen and her PhD student Sam talk about how to get journal clubs up and running. Sam combined references to ‘House of Cards’ with a slightly cheeky bit of research of his own (read these abstracts and rate them for accessibility and interest) whilst the sounds of a steel band floated through the open window – a useful and thought-provoking end to the day.

Show me the research!

It has now been a week since the event. It was a great day and I will certainly be there again next year (there will need to be a hunt for a larger-scale venue: classrooms are not going to be big enough anymore!). The direction of research as a trend within education remains contested and uncertain, however. I remain worried that the value of well-designed, quantitative, large-scale studies could be put at risk by a mass push towards teacher involvement in research. The first step towards making schools more research-led organisations is absolutely not to make more teachers do research. Rather, it is all about facilitating the better communication and uptake of the findings of high-quality, professionally-led research. At its best, that’s what researchED and its new website are about.

researchED 2014: my plan for the day

It’s less than a week now until researchED 2014 and I’ve even remembered to pre-order my lunch.  I’ve spent a little bit of time over the last couple of days looking at the schedule for the day, working out which sessions I plan to attend and thinking about what I want to get out of the day. 

Looking at the schedule I really hope that Tom and his contributors avoid that ‘difficult second album’ phenomenon.  Although I couldn’t make it last year there was, by all accounts, a genuine buzz about the programme; a joyous shaking off of shackles and an optimistic, even zealous, sense of energy.  Tom laid into Brain Gym and Learning Styles and the crowd roared with laughter.

Since September 2013 the issue of research in schools has developed.  The EEF has expanded its programme and BBC News even covered the appointment of Wellington College’s Head of Research!  The debate about research in schools is now less about puncturing the myths and mocking the nonsense that teachers have swallowed and more about coming up with workable new ways of doing things. 

So with this in mind, here is my individual and subjective route around the day’s sessions and my personal thoughts on what I’ll be there for and what I’ll catch up with later online.

researchED schedule

Session 1            

Spoilt for choice here.  I’m going to definitely give the Donald Clark session a miss – its title suggests a retrospective look at mistakes of the past rather than  thinking about a new direction.  Likewise the Mroz and Shaw session in the Chapel – a little too specialised on journalism for me.  A session at one conference on the importance of another conference sounds like something I can watch on the highlights package: sorry, BELMAS.  Buch could well end up preaching to the choir and the Muijs session sounds a little Donald Rumsfeld to me.  The Lagrange session sounds interesting, but I’m torn between the Coe, Tomsett, Quigley session in the Old School Hall and the Steve Higgins session on professional development and research.  Overall, although I think that there is a massive role for CPD in embedding research practice I’m taking the hint from the venue and I’ll be watching the trio on the main stage in session 1.

Session 2            

Some excellent sessions here, with some real specialisms on offer.  I’ll give phonics a miss, being a secondary teacher myself.  Dylan Wiliam is an obvious draw and he’s always fascinating: his title suggests controversy and a thought-provoking 40 minutes.  John David Blake and Robert Peal are going to be giving their particular theses a run-out.  Both are well worth listening to.  I’m going to be going to Katie Ashford’s session, though.  The historian in me wants to hear more about the power of narrative in the classroom!

Session 3            

No contest here for me.  Although Andrew Old’s session with Mike Cladingbowl and Sean Harford will be a pain to miss, I have been meaning to hear David Weston speak for a while.  His NTEN model of teacher development is fascinating and I need to find out more.

Session 4            

A couple of really niche sessions here between the two big draws in the Hall and the Chapel.  Although I am sure that Martin Robinson will be on fine form, it’s the hard-nosed focus on assessment of Daisy Christodoulou that will see me through to the lunch break.

Session 5            

I’m going to give Tristram Hunt a miss.  I want to keep my focus on school-based ideas and projects, ideally from practising teachers or academics.  If he says anything controversial or daft it’ll be all over Twitter in an instant anyway… I’m going to give Bob Harrison a miss too, unless I spot Crispin Weston filing into the Science Studio audience.  Nope, having decided to miss his interview in Session 4, I am going along to hear Andrew Old on rational argument in the Chapel.

Session 6            

In government he may be, but Nick Gibb is not likely to do anything more than offer platitudes and encouragement to the audience so I won’t be in the hall for his session.  Cladingbowl on inspections will be fascinating, I’m sure – he is developing a great reputation as a man who listens to teachers and shows real sense.  I’m not sure, however, how he’ll be focusing on the place of research so I’ll be off to listen to Katharine Birbalsingh talk about how evidence shows that performance related pay does not work.

Session 7            

I could be flagging by this stage!  Ben Goldacre will no doubt be fascinating but, with little clue on the schedule, I am going to guess he’ll be offering a re-hash of his thoughts on how teaching can become, like medicine, an evidence-based profession.  Part of me really wants to go and listen to Tom Sherrington, but this is a man at the end of his first week in a new headship: surely prep for this session won’t have been his biggest priority over the last few days!  Of the remaining sessions I’m torn between Rebecca Allen’s and Laura McInerney’s.  On balance, because I think journal clubs are a fascinating idea, I’ll be finding out more about them.

Phew!  Tom has given himself the final slot for thanks and some light comic relief and then it will be out, blinking, into the early evening and a massive Twitter catch-up session on what I’ve missed!


What’s the state of the evidence on…? Oral Language Interventions

What are oral language interventions?

Oral language interventions, often considered under the rather unattractive term oracy, can be described as “what the school does to support the development of children’s capacity to use speech to express their thoughts and communicate with others, in education and in life” (Alexander, 2010: 10). Although this has clear and obvious synergy with the teaching of literacy, or English at secondary school, the impact of such interventions goes beyond this subject. The development of effective oral communication skills has a learning benefit across all subjects because talk is a feature of all lessons. Oral language interventions can, in this respect, have a meta-cognitive aspect: they seek to develop not just the ability to communicate but also to think.

Is there consensus about its impact?

The EEF Toolkit rates the research in this field highly. It notes that there is extensive evidence, including a range of recent randomised controlled trials and meta-analyses, pointing towards oral language interventions having a moderate impact. In its translation of effect sizes, the EEF characterises such interventions as having a +5 months impact.

Key research in this field

Comparative research into classroom talk in the UK, USA, India, Russia and France carried out in 2000 suggested that in the UK the quality of student participation in discussion was low. The report characterised student talk as being dominated by short answers to teacher questions; a closed process (Alexander, 2000). Following this report, a group of academics at the University of Cambridge set out to develop and trial methods of improving the quality of talk in UK classrooms. Their approaches became known as ‘Dialogic Teaching’ and ‘Thinking Together’.

Robin Alexander

Aside from the work of some independent academics in the United States (the work of Lauren Resnick is of particular note), the most significant research into the effectiveness of oral language interventions has been carried out by Cambridge academics such as Neil Mercer and Robin Alexander, the figures who developed the strategies. A relatively small number of key names therefore feature very heavily in the literature on oral language interventions.

Examples of these studies include Neil Mercer’s work showing the positive impact of oral language interventions on verbal reasoning (Mercer, 2000) and attainment in science (Mercer et al, 2004). In 2006 Mercer and Sams published research showing a significant impact of oral language interventions on maths attainment (an effect size of .59). All of these studies were methodologically comprehensive and designed with strict controls. The studies share a common conclusion: “improving the quality of children’s use of language for reasoning together would improve their individual learning and understanding” (Mercer and Sams, 2006: 526). Furthermore, the authors make the significant claim that this research presents hard evidence of the effectiveness of group work – although they couch it in the most impenetrable edu-babble imaginable (“validity of a sociocultural theory of education by providing empirical support for the Vygotskian claim that language-based social interaction has a developmental influence on individual thinking”, p.526)!

A great deal of the most up-to-date and relevant research in the field of oral language interventions is a legacy of the 2008 Bercow Report into provision for children with speech, language and communication needs (SLCN). The government’s response to this report led to the formation of the Better Communication Action Plan. A key strand of this plan was a comprehensive review of the literature, identifying what research suggested about best practice in developing oral language skills (Law, Beecham and Lindsay, 2010).

Research into the impact of oral language interventions is generally organised into three ‘waves’. Wave 1 interventions are general, whole-class strategies for developing oral language skills. Wave 2 consists of targeted interventions, focusing on pupils with an identified oral language need. Wave 3 concerns specialist interventions, aimed at the very small minority of students with particular, often unique, oral language needs. The bulk of research focuses on Wave 2 interventions.

Talk of the Town

Where to go next?

The fact that there is already convincing research in this field has encouraged the EEF to fund a range of large-scale studies to further develop the evidence base. The particular focus of continued research is on classroom-based interventions to tackle SLCN. Key projects ongoing are:

  • Talk of the Town: A 62-school effectiveness trial testing the impact of the Talk of the Town framework. This project will report in 2016.
  • Language for Learning: An effectiveness trial involving 34 schools testing the impact of a specific 20-week programme, delivered by trained teaching assistants to improve the communication skills of students entering primary school with poor oral language skills. This has the potential to be a low-cost, easily replicable programme to tackle poor communication skills at the start of a child’s school career. It reports in Spring 2015.

If you teach in a school with significant numbers of children with SLCN and you are not familiar with The Communication Trust’s website, then hot-foot it there! It provides an excellent model for how teachers can use evidence to make informed choices about a range of interventions and what to look for in terms of outcomes.

References

Ainscow, M., Gallannaugh, F., and Kerr, K., (2012), An Evaluation of The Communication Trust’s ‘Talk of the Town’ Project, 2011-12. Available online here.

Alexander, R., (2000), Culture and Pedagogy: International Comparisons in Primary Education, Oxford: Blackwell

Alexander, R., (2012), Improving oracy and classroom talk in English schools: achievements and challenges. Available online here.

Law, J., Beecham, J., and Lindsay, G., (2012), Effectiveness, costing and cost effectiveness of interventions for children and young people with speech, language and communication needs (SLCN). Available online here.

Mercer, N., (2000), Words and Minds: how we use language to think together, London: Routledge

Mercer, N., Dawes, L., Wegerif, R., and Sams, C., (2004), ‘Reasoning as a scientist: ways of helping children to use language to learn science’, British Educational Research Journal, 30(3), pp.367-385

Mercer, N., and Sams, C., (2006), ‘Teaching children how to use language to solve maths problems’, Language and Education, 20(6), pp.507-528

What’s the state of the evidence on…? Peer tutoring

This is the first in what will become a series of brief outlines (second briefing, on Oracy, now available) of the state of research in key areas of school policy and teaching practice. One objective of these posts will be to provide a summary definition of the intervention or policy, and to highlight judgments of their efficacy. A second objective will be to highlight the research that has been conducted in the field, to highlight its strengths and weaknesses, and to offer suggestions as to future research directions. I am not aiming to produce a  comprehensive review of literature and research, but if you feel there is an important study that has been omitted, please comment or contact me on Twitter.

What is Peer Tutoring?

Peer tutoring is a blanket term used to describe a range of specific approaches that share as a common feature learners working with one another to provide mutual support and instruction. The most commonly used peer tutoring approaches are cross-age learning, peer-assisted learning and reciprocal peer tutoring.

Peer Tutoring (image credit: Maswlsey Community Primary School)

Cross-age tutoring

This involves the pairing of an older student in a tutor role with a younger student as a tutee.

Peer-assisted learning

This is a structured programme in which participants take part in two or three sessions of approximately 30 minutes each week, generally in order to develop their maths or reading skills. It is also extensively used in higher education establishments.

Reciprocal peer tutoring

Reciprocal peer tutoring sees students of similar age and ability working collaboratively, alternating between the roles of tutor and tutee.

Is there consensus about its impact?

There is a broad consensus that peer tutoring is an effective strategy. The EEF Toolkit rates peer tutoring highly, suggesting that it has a significant impact on outcomes, and ranks it as having a “+6 months” impact.  Research suggests that tutors can benefit from peer tutoring as much as tutees.  Hattie, in Visible Learning, concludes that peer tutoring has an effect size of .55.

Key research in this field

In 2013 a meta-analysis of the academic impact of peer tutoring was conducted by a team of American academics led by Lisa Bowman-Perrott. Her research summarised the key reasons why peer tutoring programmes tend to have a positive impact: the core components of peer tutoring (frequent opportunities to respond, increased time on task, regular and immediate feedback) have each been “empirically linked with increased academic achievement” (Bowman-Perrott et al, 2013: 39).

Bowman-Perrott et al based their study on an analysis of multiple single-case pieces of research: investigations conducted with a limited number of students in a single institution. They limited their analysis to investigations meeting “standards for high-quality single case research design”: of 1,775 articles focusing on peer tutoring published between 1966 and 2011, only 26 (1.46%) met these criteria (Bowman-Perrott et al, 2013: 44). In total, their meta-analysis included 938 participants.

This meta-analysis found that the mean effect size of peer tutoring was .69 at elementary grade level and .74 at secondary grade level. “The overall effect size was found to be moderately large, indicating that greater academic gains were achieved by students engaged in peer tutoring interventions than nonpeer tutoring instructional arrangements” (Bowman-Perrott et al, 2013: 49). It should also be noted that the study calculated effect sizes using TauU, a “relatively new effect size measure” that is not directly comparable to the more familiar Cohen’s d effect size calculation (Bowman-Perrott et al, 2013: 41).

John Fantuzzo

As with any area, there are some researchers and academics who have made peer tutoring their specialism. John Fantuzzo, currently at Penn Graduate School of Education, has dedicated much of his career to investigating the impact of peer tutoring. As well as the academic impact of peer tutoring, Fantuzzo has written about the positive social and behavioural impact of peer tutoring programmes (Ginsburg-Block, Rohrbeck, Fantuzzo, 2006).

Keith Topping is another researcher whose work on peer tutoring has been influential and robust. Topping’s particular area of expertise is the use of peer tutoring to develop reading, although he has also been part of studies examining the impact of peer tutoring in maths. In 2011 Topping led an RCT in 80 Scottish primary schools that investigated the impact of peer tutoring on reading. Topping found that peer tutoring led to significant gains in reading and in tutor and tutee self-esteem (Topping et al, 2011: 3). The findings of this study led the EEF to undertake a follow-up project in England. Topping notes that effective tutor training is a significant factor in the success of peer tutoring programmes. His works, particularly 2001’s Thinking, Reading, Writing, are the key starting point for any teacher considering implementing a peer tutoring reading programme.

Keith Topping

Where to go next?

There is a great deal of evidence from a range of well-designed and rigorously analysed single case studies suggesting that peer tutoring is effective. The cumulative nature of this research lends weight to the findings but, as Bowman-Perrott et al note, “missing from the peer tutoring literature are recent reviews that report effect sizes with confidence intervals” (Bowman-Perrott et al, 2013: 40). Building on the work of Topping et al (2011), the EEF is currently carrying out an RCT that seeks to develop research in this field, involving multiple schools in a range of contexts. Its focus is on a cross-age tutoring programme targeting improvements in Year 7 and Year 9 literacy. This study will publish its results in Spring 2015 and it should prove to be a major contribution to this field.  Likewise, the EEF is also funding an RCT exploring the impact of peer tutoring in maths.  The Shared Maths project also builds on work led by colleagues of Topping in Scotland, and it will publish results in Autumn 2015.

References

Bowman-Perrott, L., Davis, H., Vannest, K., Williams, L., Greenwood, C., Parker, R., (2013), ‘Academic Benefits of Peer Tutoring: A Meta-Analytic Review of Single-Case Research’, School Psychology Review, 42:1, pp.39-55

Ginsburg-Block, M., Rohrbeck, C., Fantuzzo, J., (2006), ‘A meta-analytic review of social, self-concept and behavioral conduct outcomes of peer assisted learning’, Journal of Educational Psychology, 98, pp.732-749.

Topping, K., (2001), Thinking, Reading, Writing: A Practical Guide to Paired Learning with Peers, Parents and Volunteers, London: Continuum

Topping, K., Miller, D., Thurston, A., McGavock, K., Conlin, N., (2011), ‘Peer tutoring in reading in Scotland: thinking big’, Literacy, 45:1, pp.3-9

Performance Management Panic

It’s performance management time again. All around the country, teachers are sitting down with their line managers to review the year that is drawing to an end and to set targets for 2014-15. With a greater than ever focus on the place of research in schools you can bet that from Newquay to Newcastle, Canterbury to Carlisle, there are teachers who have been set the target of leading research in their schools. For some this will have been expected and requested; a logical focus, based on their skills and interests. For others, however, this new responsibility will be unexpected, and will cast them beyond their existing experiences or skills. Where, then, should such a teacher start?

Firstly, it would be a great idea to become more familiar with the background story of research in schools and to develop a sense of its worth and value. Getting hold of a copy of Tom Bennett’s Teacher Proof (get the school to buy it from the CPD budget!) will open eyes to the unevidenced snake-oil that has been peddled to schools over recent years and should foster a greater sense of scepticism. A contrast to Bennett’s humorous, machine-gun delivery is Building Evidence into Education, the DfE-commissioned report by Ben Goldacre (the Guardian’s ‘Bad Science’ columnist). It outlines how teaching can become an evidence-driven profession in the way that medicine did in the final decades of the twentieth century.

Secondly, reach out for help. Find out what research experience there is already within the school; look for colleagues who are already interested or receptive. You will be more likely to be successful if you have a team of people working with you, sharing ideas and spreading the load. Look to other local schools to see what they are doing about research. If there is a Teaching School nearby then it has a duty to be engaging and leading research: get in touch with them to find out what help and support they can offer. Finally, if you’re not already doing it, start to read and follow the range of bloggers, almost all practicing teachers, who are writing about their research experience. Alex Quigley’s blog would be a great place to start.

Next, you need to get clarity about what the school’s expectations of research are. I have written about the different types of research that exist in Degrees of Engagement. By research, does your head mean classroom-based action research, exploring small scale changes to practice? If so, then studying the example of schools that have embedded an action research culture is a great place to start. Read the blogs written by headguruteacher on the subject or look at an example of a school’s action research journal at www.sandagogy.co.uk.

Alternatively, they may want to see a more rigorous, quantitative piece of research that explores the impact of pupil premium money in the school, say. As I have previously written, I think that this is where schools should really be focusing their attention and resources. It is only through wider engagement with methodologically sound, quantitative research that we can build a secure evidence base for the profession. If this is what the aim is (and I hope it is!), then the EEF’s DIY evaluation guide is the place to start. This document is absolutely superb. As an introductory guide to undertaking high-quality research in a school it really cannot be faulted.

Finally, be realistic. Aim for quality, rather than quantity. One well-planned, rigorous and convincing piece of research will have more use and value than a glut of ad hoc, disorganised and anecdotal classroom studies. Try to focus on something that can be repeated next year, building a more convincing and useful data set. Think about targeting an intervention or policy that others schools are also likely to be using (and investigating), to maximise the relevance of the work. Stick at it. Don’t give up. Good luck, and let me know how it goes!

Degrees of engagement, part 2: producers

Engagement with research is about consumption and production.  In my previous post I argued that barriers to being effective consumers of research can, and really should, be overcome.  The effective production of research is a tougher challenge where the lack of ‘research literacy’ within schools is a more significant obstacle.

Educational research can take a range of forms.  One level, not uncommon in schools, is to undertake action research.  There are lots of resources available online and in print that help teachers to plan and carry out small-scale projects to investigate the impact of changes to practice.  Sometimes teachers conduct these independently, sometimes as part of a school-wide professional learning community.  There are plenty of great examples of schools where this practice is an embedded part of teaching and professional development.

Research purists can get sniffy about the value of this.  Certainly, most of it is qualitative rather than quantitative and, by its very nature, action research is limited in terms of wider applicability.  All of this is true, but offsetting these concerns is the fact that it represents an effort to think critically about one’s practice, to be open-minded to change and innovation.  If the worst-case scenario of small-scale action research is that it’s a waste of time, then the best case is that it can be a direct route into more meaningful and rigorous research.

By this I mean a determined effort to produce a piece of research that is quantitative, methodologically sound and reliable.  For schools receiving the pupil premium this is a must.  The two questions that Ofsted is guaranteed to ask about pupil premium are: what are you spending it on? and what impact is it having?  Small-scale, anecdotal, qualitative research won’t cut the mustard in response to these questions.  But it’s not just applicable to schools with intakes targeted by pupil premium.  All schools have a duty to deliver value and to monitor the impact of what they do.

However, designing a piece of research that assesses the impact of an intervention demands a real level of research literacy.  For a school that has no one on its staff with the skills and experience of designing controlled research, this is a problem.  Notwithstanding the fantastic resources produced by the EEF to help schools design investigations into the impact of interventions, there is a real need for research literate teachers to act as guides and mentors to schools beyond their own.  Hopefully this can become a feature of the school-to-school support networks that are now taking off around the country. If you’ve got the skills you’ve got a duty to share them!

Finally, there is the participation in the very highest level of research.  By this I mean the sort of multi-school investigation that might involve thousands of students.  Here, engagement in the production of research is a partnership between research literate teachers in schools and educational researchers with the logistical experience and know-how to conduct mass experiments.  Again, the work of the EEF is ground-breaking here.  Partnerships such as those recently announced by the EEF are at the heart of answering the question I posed at the end of my blog about consumers of research.  If research is going to be used to inform decisions, then consumers in schools have to see its relevance, applicability and value.

So, a hierarchy of research in schools in terms of usefulness and value: action research at the bottom; quantitative, rigorous single-school studies in the middle; and multi-school, professionally coordinated studies at the top of the pile.

Making sense of effect sizes

In my previous post I wrote about the importance of teachers becoming effective, informed consumers of research.  I mentioned the work of John Hattie in the post, citing his work as an example of the kind of research that schools should, as a bottom line, be engaging with.  It goes without saying that Hattie’s meta-analysis of educational research findings in Visible Learning has become one of the most talked-about and influential books in teaching.  For some teachers, however, the presentation of the findings in terms of effect size is a real barrier to understanding.  What does an effect size of 0.29 mean?  What does that ‘look like’?  What is an effect size?!

The calculation of effect sizes had its origins in the desire to compare studies that focused on a similar problem but used different measures.  For example, imagine that there are two studies both focusing on the impact of reducing class sizes.  One reports its findings in terms of the difference in measured reading age of primary school pupils, while the other focuses on the impact of smaller class sizes on American SAT scores.  The calculation of an effect size for each study allows their findings to be directly compared.  In layman’s terms, an effect size is a quantification of the difference between two groups: the control group and the test group.  It is reported as a number, and the higher the effect size number the bigger the visible difference between the control and test groups.  But reporting effect size as a number does not necessarily aid easy understanding.  Efforts have therefore been made to ‘translate’ the raw number into something more easily visualised.
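For the numerically minded, here is a minimal sketch in Python of the most familiar version of this calculation, Cohen’s d: the difference between the two group means divided by their pooled standard deviation.  The reading-age scores below are invented purely for illustration.

```python
import statistics

def cohens_d(test, control):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(test), len(control)
    mean_diff = statistics.mean(test) - statistics.mean(control)
    v1, v2 = statistics.variance(test), statistics.variance(control)  # sample variances
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return mean_diff / pooled_sd

# Invented reading ages (in years) for a smaller-class group and a control group
smaller_classes = [9.2, 10.4, 9.8, 11.1, 10.6, 9.5, 10.9, 10.3]
control = [9.6, 10.0, 8.9, 10.8, 10.2, 9.1, 10.5, 9.7]
print(round(cohens_d(smaller_classes, control), 2))  # -> 0.56 for these invented scores
```

Because the difference is expressed in standard deviations rather than in reading ages or SAT points, the resulting number can be compared directly with an effect size from a completely different test.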

The EEF Teaching and Learning Toolkit is based on effect size.  Here, the effort to ‘translate’ the effect size data is based around the idea of ‘months’ progress’.  Effect sizes are mapped to a scale that suggests how many extra months’ worth of student progress an intervention is equivalent to, compared to average student progress.  On this scale an effect size of 0.27 to 0.35 is considered to be the equivalent of four months’ additional progress, and an effect size of 0.70 to 0.78 the equivalent of eight months’ additional progress.  This is useful, and certainly makes things easier to understand, but when I first came across the Toolkit I was still left looking for another way of visualising the concept.
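As a rough illustration of how the mapping works, something along the lines of the lookup below converts an effect size into a ‘months’ progress’ figure.  Note that it encodes only the two bands quoted above: the EEF publishes a fuller scale covering the whole range.

```python
# Partial effect-size-to-months lookup, encoding only the two EEF bands
# quoted in the text above (the published scale has many more bands).
EEF_BANDS = [
    ((0.27, 0.35), 4),  # four months' additional progress
    ((0.70, 0.78), 8),  # eight months' additional progress
]

def months_progress(effect_size):
    for (low, high), months in EEF_BANDS:
        if low <= effect_size <= high:
            return months
    return None  # falls outside the bands reproduced here

print(months_progress(0.29))  # -> 4
```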

What makes most sense to me now when thinking about what an effect size ‘looks like’ is the rule-of-thumb comparison first suggested by Jacob Cohen in 1988.  It is, unapologetically, a simplification, but like many simplifications it breaks down a barrier and facilitates a deeper understanding.

Cohen, a psychologist, pointed out that an effect size of 0.2 is as visible as the height difference between 15- and 16-year-old girls: something that could not be detected by the naked eye.  Asked to decide whether a Year 11 girl was 15 or 16 based simply on how tall she was, you’d scarcely have a better than 50-50 chance of being right.  I’d be pretty sceptical about introducing something if the difference it made would only be that obvious!  Moving up the scale, an effect size of 0.5 (into the >0.4 zone that Hattie calls the ‘zone of desired effects’) is more visible.  Cohen’s rule-of-thumb comparison points to this equating to the height difference between 14- and 18-year-old girls.  Spotting whether girls were in Year 13 or Year 10, say, based on their height would strike me as more obviously achievable: indeed, you’d have a 60% chance of being right if you always said that the taller girl was in Year 13.  Introducing something with this order of effect size (developing more effective teacher questioning skills, say) should lead to a pretty visible difference.  An effect size of 0.8, then, is more obvious still: the difference between 13- and 18-year-old girls.  A shift from 14 years to 13 years doesn’t sound like it should lead to much of a difference, but this is about when the pubescent growth spurt really kicks in.  As a rule of thumb, 80% of Year 8 girls will be shorter than the average Year 13 girl.  An intervention that is going to lead to a difference that’s as obvious as the difference in heights between Year 8 and Year 13 girls has to be worth trying, right?
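For anyone curious where rule-of-thumb percentages like these come from, they can be approximated from the standard normal distribution.  The sketch below computes two standard translations: the chance of correctly picking the older girl from a randomly chosen pair (the ‘common language’ effect size) and the proportion of the younger group falling below the older group’s average (Cohen’s U3).  It reproduces the roughly 80% figure for an effect size of 0.8; for 0.5 the pair-guessing chance comes out nearer 64% than 60%, but the order of magnitude is the same.

```python
from statistics import NormalDist

def pair_guess(d):
    """Common language effect size: chance a random member of the
    higher group beats a random member of the lower group."""
    return NormalDist().cdf(d / 2 ** 0.5)

def u3(d):
    """Cohen's U3: proportion of the lower group below the higher group's mean."""
    return NormalDist().cdf(d)

for d in (0.2, 0.5, 0.8):
    print(f"d={d}: pair guess {pair_guess(d):.0%}, below average {u3(d):.0%}")
# d=0.2: 56% / 58%   d=0.5: 64% / 69%   d=0.8: 71% / 79%
```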

Degrees of engagement, part 1: consumers

There is an undeniable grass-roots interest in educational research building today.  However, many schools, including those that are keen and interested in this movement, cite the lack of ‘research literacy’ amongst staff as a barrier to real engagement.  How can this be tackled?  I’m making no original statement when I say that engagement with research can be divided into issues to do with consuming research and producing research.  This post will focus on the first of these aspects.

Firstly, from a personal perspective, having completed an MA in Education, I would strongly advocate that school leadership teams encourage staff to undertake further academic research.  The process of re-engaging (critically!) with research, studying how to design effective research and how to evaluate findings, was invaluable.  In my opinion, heads should make this a priority: if you’ve no one on the SMT with a further degree in education or school leadership then you should look at getting someone, or yourself, signed up!  But this is not a ‘quick fix’ and it’s not an option that is possible, either financially or logistically, for some people.  How, then, to develop research literacy in a more direct manner?

As a basic standard, senior leadership teams should be active consumers of research, referring to it in their decision-making process.  For schools in receipt of the pupil premium, accountable for how they decide to spend their allocation, this is vital.  Here, the hurdle to initial understanding is not a significant one.  It is hard to imagine that there can be many leadership teams that have not heard of the work of people such as John Hattie or seen the resources produced by organisations such as the EEF.  The first step towards becoming research literate is to become more sceptical.  Less inclined to make a ‘gut decision’.  Less credulous in the face of simple solutions to complex problems.  In short, I think that at a strategic level there is no significant obstacle to engaging with research as a consumer.

Solve all your problems instantly!

For an individual classroom teacher I would argue that there is perhaps a greater challenge.  Much of the accessible evidence available to teachers does focus on the relative impact of whole-school interventions rather than changes to an individual’s practice.  However, there are clear research findings that focus on teaching approaches that would allow a research-minded classroom teacher to evaluate their practice and make decisions about self-improvement.

In many respects the most significant hurdle to embedding the effective use of research within teaching over the coming years could be the paucity of relevant, coherent research!  A future blog post will consider the production side of the issue.