As another school year commences, many schools will be embarking on 1:1 programmes, either as part of a pilot or as its culmination. They will hopefully have done their due diligence and research, honed their vision for technology, put a robust project plan in place with a range of key performance indicators and CPD to support the change, and be clear about the total cost of ownership. Nevertheless, there will be those within the school community who are more sceptical about the use of technology, asking, "Where's your evidence that technology has an impact?" See, for example, Tom Bennett.
Where's your evidence (1)?
Perhaps one of the biggest surprises for me in rolling out a 1:1 programme was the strength of feeling from parents about the need for evidence of the impact of mobile technology in the classroom. The evidence they were referring to was, invariably, that of improved examination results and, admittedly, the evidence here is somewhat thin on the ground. However, this doesn't take into account what the school is trying to achieve by using technology (the 'why'). Neither does it take into account the impact (related to the school's context) technology has on everyday learning and teaching, assessment, and collaborative work. Anecdotal evidence will reveal that technology can have all sorts of impact on student outcomes. If you set out on a 1:1 journey with a plan to improve the percentage of students achieving A*-C, then you will, most likely, be embarking on a fool's errand. How would you know it was the technology that made the difference, with so many different variables at play in any given year group? Ultimately, be clear about your why and let that be your driver.
The parents demanding 'the evidence' led me to conclude a number of things for schools embarking on a 1:1 roll-out. Firstly, be clear about what evidence is out there, what it says, what its limitations are, and how you have used it in putting together your educational case for going 1:1. You will then be better placed to justify the investment, to help fend off criticism and, ultimately, to show you've done your homework and, to put it bluntly, know what you are talking about. It will also help steer you as you hone your 'why' and detail the key performance indicators (KPIs) for your 1:1 programme. The following list is a good starting point for current 1:1 research, with an emphasis on tablet devices.
What's your why?
Secondly, and perhaps more importantly, as this is about your school and your context, be clear about why you are embarking on a 1:1 programme. Simon Sinek's book Start with Why provides a useful starting point for thinking about your 'why' (see also the TED video). It is the answers to this question that will feed into the KPIs (or 'benefits review plan') for your programme and provide you with the evidence to start to assess the impact of your 1:1 programme. This will, invariably, be a mixture of qualitative and (harder to prove) quantitative data. This is an important and necessary process to go through. This is a snapshot of how one of our schools has gone about assessing the impact of their 1:1 programme. The benefit description outlined here is closely tied to the learning technology vision (the 'why') and makes no mention of improving examination results, although clearly that is hoped for.
Where's your evidence (2)?
Having a plan is one thing, but compiling the evidence to evaluate what you are doing is another thing altogether. The following post provides a good starting point for the tricky part: compiling the evidence, the data, to evaluate your KPIs. In our experience it doesn't hurt to produce a bank of teacher-based evidence from the 'coalface', so to speak (interviews, videos, student work). We recently had a number of teachers speaking passionately about the impact our new primary curriculum was having on student outcomes. They didn't produce reams of data to back this up but rather spoke passionately about what they had observed. Should this type of evidence be dismissed out of hand? We think not. The results of the benefits review plan can then feed into the evaluation of the programme and help you set the next steps. I'll leave you with one thought: if you don't show any evidence of impact then, ultimately, aren't you just another person with an opinion?