So, the World Cup is over, we’ve had a couple of weeks to recuperate in the sun, and many of us are already back now in pre-season training. Before we know it, next season (or as some of you insist on calling it, next “academic year”) will be upon us. Before it is though, I thought I’d put pen to paper to remind myself of one of the big moments of last season – moving our Ofsted grade from 3 to 2.
The usual disclaimer applies: these are my personal thoughts, not necessarily shared by my organisation or colleagues. But I was there, I have the T-shirt, and this is what I think about it, and what lessons might be learned from it for how Ofsted interacts with colleges in the future.
As most of you will know, Learning and Skills inspections are based around three measures: Outcomes for Learners; the quality of Teaching, Learning and Assessment (TL&A); and Leadership and Management. Alongside these aspects, there is graded inspection of a number of subject areas, focusing only on TL&A. This framework is different, then, from the one for schools.
Under the previous inspection regime, it felt like everything was data driven. It seemed that data-based judgements had been made long before the inspection ever began and that what inspectors were really looking for was confirmation of their own grading hypothesis. This time round, it did feel that there was much more of an emphasis on teaching and learning. Data was still important, but this time it felt that data was confirming judgements made on teaching and learning, not the other way round.
Our outcomes data last time suffered, I believe, from legacy systems issues resulting from the merger of two colleges which were formerly good and outstanding. A key action post-inspection then was to get to grips with the data, to ensure it was owned, agreed and acted upon by all staff at all levels.
For me there were three main messages from the data this time round which set the context for the inspection and, I think, gave confidence to inspectors that their teaching and learning judgements were justified. Firstly, across both age groups and all levels, there was a clear upward trajectory of long qualification success rates (QSR), from where they had been to where you would now expect them to be. When the impact and sustainability of our more recent innovations were being challenged, this three-to-four year picture of sustained QSR improvement was a convincing riposte.
Secondly, there was the SEPI (Socio-Economic Performance Indicator) Report which showed that our cohort, from the 5th percentile of most deprived/disadvantaged students in the country, were out-performing students from the top quartile, i.e. the leafiest suburbs. And thirdly, we had destinations reporting that showed a very high percentage of completers moving on to higher levels of study or into employment, including apprenticeships.
This new inspection regime gives primacy to teaching and learning, and we always knew that this was where we would stand or fall in inspection terms. The new regime is much less easy for a college to ‘manage’. What inspectors see that week in terms of TL&A is much less easy to control than a ten-month old data set that you know back-to-front and inside-out. And those inspectors get everywhere now, both in, and outside of classrooms!
If there’s anything I’ve obsessed about in the past two years, it has been our focus on teachers, and giving the quality of our teaching and learning the highest priority. But doesn’t that sound like a ridiculous thing for a college leader to say?! It’s like the Chief Operating Officer of Toyota saying, “Hey chaps. You know what? For the next while we’re going to really focus on how we make motor cars – what parts we put into them and how we put them together.” Ludicrous.
And yet it’s right isn’t it? As Mike Sheehan recently said to the GMCG Conference, the priority discussions at corporation meetings and in Senior Management Teams since incorporation have been about Finance, Funding, Capital build programmes, data… basically, ABT – anything but teaching. All those things are still important, managing and developing a solvent college in the current funding environment is a challenging job, but if our “product” isn’t right, then there isn’t a good reason for a college to exist in the first place, solvent or otherwise.
So in the last couple of years, we have done what I’m sure many other good colleges have done. We’ve re-shaped and retrained our observation team, mirroring as closely as possible Ofsted practice (including, as I discussed here, graded observations). We’ve created space in each working week for teachers to meet, just to talk about teaching, sharing challenges and good practice. We’ve introduced peer coaching and a comprehensive rolling CPD programme, as well as holding a two day Teaching and Learning Conference for 1,200 of our teachers.
The outcome of all this activity has been a more accurate and a much improved lesson profile. Ofsted’s “Good or better” lesson observation profile matched our own to within 1%. Inspectors spoke of vibrant and interesting lessons, and teachers who were ambitious for their learners, and with whom they had highly productive working relationships.
Our Leadership and Management grade has been confirmed as Good, although personally, I feel that what we have achieved is something a bit more spectacular than that. I know that size shouldn’t matter, but to say that we are Good feels akin to equating the competence required to bring a Pedalo back to its mooring in the boating lake with that needed to safely lead HMS Ark Royal through a six month tour of duty in the Gulf.
We have conducted a far-reaching Strategic Review looking at what we do and why we do it. We have re-formulated our mission, vision and values, articulating what the point of the college is, where we are trying to get to, and how we are going to behave on the way there. Our day-to-day business-as-usual work has involved systematically reviewing how we do absolutely everything that we do. It has not always been a comfortable process to be part of, but the Ofsted report makes clear that we have transformed the place and reinvigorated the staff.
So what are the lessons I’ve taken from this process with regard to Ofsted?
Firstly, we need an agreed agenda. Ten years ago Ofsted did not seem interested in the social and socio-economic impact of colleges. Five years ago there was movement towards ‘wider outcomes’ but QSRs dominated the discussion. Now it feels like we are getting close to a balanced agenda, focussing on the quality of teaching, learning and assessment, but in the context of impacts on progress and on learner destinations.
Secondly, we need a joint understanding of that agreed agenda. As an old-fashioned ‘Satisfactory’ Grade 3 college, we asked to be part of the Ofsted pilot to support colleges requiring improvement. It was the best thing we could have done. It gave us the opportunity to sense-check our proposed solutions and perceived progress by working with a number of senior inspectors, who advised on sector best practice and helped us reflect on how the Ofsted framework might see us.
Thirdly, we need to keep working to find a way to ensure that inspections do not usurp the proper and sensible running of colleges. Short- or no-notice inspections don’t achieve that. They just ensure that Ofsted-readiness becomes a perpetual, year-round industry for colleges liable to be inspected. We had 16 inspectors going through us like a dose of salts for a week, plus another few learning and visiting. It would be difficult to arrange the logistics of that at no notice, and however sensitively managed (and ours was) I don’t see how that model can be anything but disruptive to the normal running of a college.
There is a fourth thing too: the inspection teams themselves. Ofsted inspectors, like college managers, don’t do their job backed by divine right or papal infallibility. They are an honest bunch of people who do their best, but sometimes that best is too constrained by the bubble in which they live and work. There have been lots of suggestions on Twitter that inspectors should be compelled to teach as part of their Ofsted contract. But why stop there? Why not undertake periodic secondments onto college management teams, or even as college nominees?
I am convinced that we need external scrutiny of the quality that colleges offer, and external challenge on behalf of learners to any lack of ambition in that respect that colleges might show. It feels though that it should be a continuous quality improvement model where challenge and consultancy are undertaken in collaboration with, not sitting outside of, providers of learning and skills.