
Ofsted exists to keep children and learners safe, raise standards of their education and care, and improve their lives. That is our core purpose and driving mission. We know we must deliver our mission with professionalism, courtesy, empathy and respect for the professionals we work with. They – like us – dedicate their lives to improving the lives of children and learners. We are determined to do more to improve how we work with these dedicated professionals and provide parents and carers with the information they want.
Each and every day, our inspectors see brilliant practice in the nurseries, childminders, children’s homes, colleges, all forms of schools, young offender institutions, teacher training providers and the myriad other providers of education and care we inspect and regulate. We champion this brilliance through our reporting to parents and carers, government and the public.
Sadly, despite the best efforts of professionals, we also identify unacceptable practice. It is our job to call out practice that undermines children’s safety or robs them of their one chance at an education that allows them to thrive. Identifying poor practice is the first step in delivering urgent and necessary improvement and support.
All too often, we inspect education providers where aspirations for children and learners – especially those from the most disadvantaged backgrounds – are too low: where behaviour is unacceptable and where children’s one chance at fulfilling their potential is squandered. We regularly find local areas failing to provide adequate support for children and young people with special educational needs and/or disabilities (SEND). And, tragically, we still uncover early years providers where children’s lives are at risk because of unacceptable standards of care.
It is our duty to call these unacceptable standards out.
The proposals we consulted on in recent months were developed in line with this mission – as well as the tens of thousands of pieces of feedback received through the Big Listen; the largest consultation Ofsted has ever carried out. In the aftermath of the tragic death of Ruth Perry, headteacher of Caversham Primary School, and the subsequent Coroner’s inquiry, we were determined to hear feedback from everyone connected with our work. We remain equally determined to maintain good working relationships with education professionals, while inspecting on behalf of children and their parents.
In the Big Listen, we heard clearly from parents and professionals that they wanted a move away from a single overall grade. We heard that parents wanted more information, with clear grades across the range of areas we inspect when we visit nurseries, childminders, schools, colleges and other education providers. Independent research found that this proposal was supported by a majority of the professionals. However, there was also a clear desire for more granular assessments of providers, taking into account context and showing clear areas for improvement.
The Ofsted report card combines the preferences of parents and professionals, delivering a system that continues to identify brilliance, drives high and rising standards across all providers and calls out unacceptable practice. In addition, it gives more nuance – both at a glance and for those interested in delving into greater detail.
It gives a more detailed picture of the strengths and identifies more precisely the areas for improvement. It increases the incentive to improve; the new ‘exceptional’ grade will identify the very best education practice in the country.
And our new approach improves accountability, rather than lessens it. Where standards are not yet high enough, we will return quickly to check that progress is being made.
We hope these measures, including the significant improvements made as a direct result of the feedback from professionals, will be welcomed. We will continue to engage constructively with all professionals, so we can learn from their feedback.
We know the vast majority of professionals understand the importance of accountability in keeping children safe and improving their lives. They want Ofsted to perform that vital role, but they want us to improve how we work. We believe our plan delivers on that ambition.
However, a small but vocal minority are calling for reduced accountability or removing grading altogether. We do not agree. And parents and carers do not agree either.
The changes we are introducing are fair and empathetic for professionals, but without losing sight of our core purpose to raise standards.
Our new report cards will include the nuanced content parents want. Our improved inspection practice – built on a new methodology, updated evidence-gathering processes, and bolstered training – will change the look and feel of inspection. We will embed transparency through all our work, clarifying our inspection practices and processes through the new operational guides, publishing our training, and being clear about the data we use. And we will add an extra inspector to inspection teams for schools to boost inspection capacity, to support leaders and to make sure we gather the evidence we need.
These reforms are further proof of how we are resetting Ofsted’s culture through our actions in response to the Big Listen, working with the education workforce to raise standards. Ultimately, children only get one childhood. This is why we are putting disadvantaged and vulnerable children and learners at the heart of what we do, as we continue to strive to keep them safe and improve their lives.
We are undergoing a major cycle of reform. This consultation response sets out our biggest and most consequential changes.
Our response to the Big Listen in 2024 set out 132 actions to reform and change our work. We publicly track these actions in our Big Listen action monitoring reports. That response set an ambition to reset our relationship with those we regulate and inspect, working collaboratively with them to put children and learners first. As part of this ambition, we committed to revising our inspection framework and introducing inspection report cards.
On 3 February 2025, we launched a consultation on our proposals to improve education inspections and our new report cards for providers. The proposals covered:
The consultation ran from 3 February 2025 to 28 April 2025. It was open to the public and promoted widely through the media, our website and social media channels. We sought the views of key stakeholders and interested parties through a variety of methods.
The Department for Education (DfE) ran a parallel consultation to gather feedback on its approach to, and the principles of, school accountability; the introduction of school profiles; and its approach to improvement and support for state-funded schools.
We published the draft toolkits for early years, state-funded schools, non-association independent schools, further education and skills, and initial teacher education (ITE) as part of the consultation. We have now published the updated toolkits and operating guides. You can view them below:
The findings in this response are based on the feedback gathered through:
In total, we received more than 6,500 responses to the consultation. We heard from people from every education sector that we inspect. The education professionals were the largest group of respondents (75%), followed by parents/carers (21%). There was some overlap between these groups. The technical annex gives more detail on the methodology across different strands of data collection.
This evidence base has informed the final drafts of our toolkits and operational guides for education inspection, which we have published alongside this response.
We have listened. We have listened to parents and children through independent polling, focus groups, commissioned research, and the over 1,300 responses that parents and carers returned to the consultation. We have listened to over 4,900 professionals in the education sectors we inspect, through the responses they returned to the consultation, through our engagement events, through our meetings with education unions and representative bodies, and by hearing directly from leaders and inspectors during test inspections. And we have listened to experts, from our 7 external reference groups, from the , and from our direct engagement. We have listened and we have made changes as a result of their responses.
As a reminder, we consulted on a range of proposals to renew our inspection framework, and to make changes to our inspection materials and methodology, in order to increase transparency and build stronger relationships with those we inspect. The proposals were:
We proposed changes to a 30-year-old, well-established, extensively copied and well-understood approach to reporting on the quality of education provision.
Before we stopped using the overall effectiveness judgement last year, about 90% of schools were graded ‘good’ or ‘outstanding’. As it stands, 98% of early years settings and almost 80% of further education and skills providers are graded ‘good’ or ‘outstanding’. This means most of the highly diverse range of education provision in England is summed up by 2 simple descriptors, which is unhelpful to parents and unfair to providers.
Research carried out for us as part of the Big Listen found that only 3 in 10 professionals (29%) and 4 in 10 parents (38%) supported single-word judgements for overall effectiveness.
By adding a 5-point grading scale across a range of evaluation areas, we can offer more differentiation and therefore more information for parents and providers. Report cards first and foremost have been, and should be, designed for parents and carers. They have multiple purposes, but they are primarily to advise parents and carers on who to trust with the care and education of their child, or to inform learners about their critical educational choices. This is why we listen to parents and carers first when it comes to reporting.
Parents have given us overwhelming backing for these reforms. We commissioned YouGov to carry out independent research on the views of parents on the proposed report cards. Two thirds (66%) of parents of school-age children independently polled by YouGov in a nationally representative sample told us that they want Ofsted to continue to grade schools using a scale, regardless of whether it was a 4- or 5-point scale (only 10% said they were opposed to this). Almost two thirds (64%) told us they agreed with the addition of an ‘exemplary’ grade (9% disagreed) – the fifth point on the 5-point scale. Two thirds (67%) told us they prefer the new report card to the way we currently report (15% said they preferred current reports). In focus groups YouGov held with parents of children in schools and nurseries, we heard similarly strong support for our plans.
Parentkind, a network for parents of school children, told us:
We are delighted that our new approach is widely viewed by parents as an improvement.
According to a More In Common poll commissioned by Schools Week, 71% of parents said they felt that the new grading system is fairer on teachers (17% said the current system is fairer). We agree.
Professionals, sector representatives and education experts offered a mix of feedback on our proposals: praise for our plans to improve inspection practice, and criticism of propositions they thought were unnecessarily complex or under-explained. We also received challenge from teacher and headteacher unions that opposed our proposals to continue to give grades as part of inspections.
The polling feedback from parents tells us our proposals on report cards were broadly right. When it comes to inspection practice, it is crucial that we prioritise the views of those who experience inspections. This is why we have listened closely to the constructive feedback from education professionals that we have heard from the consultation, from direct engagement, and through our testing visits, to make many changes to inspection toolkits, methodology, and broader changes to our approach to inspection.
Following feedback from the consultation, we are taking the following actions:
The 2 extremes then sit on the following page:
The ‘expected standard’ is in the middle of the page of the toolkit because this is what we would typically expect to see on inspections. It covers the statutory, professional and non-statutory guidance that providers are already expected to follow.
The ‘strong standard’, with its tighter definitions, looks for evidence of practice to be consistent, embedded and highly impactful. It sits to the right of the ‘expected’ standard.
An evaluation area will be graded ‘needs attention’ when the ‘expected standard’ of the particular evaluation area is not met because weaknesses or inconsistencies in practice have a negative impact on children, pupils and learners in general or on a particular group (see ‘inclusion’ section).
If we identify that standards for children and learners must be urgently improved, we will not hesitate to call it out.
We have further information on how we will support providers that receive this grade in the sections on:
Following feedback from the consultation, we are taking the following actions:
In either case, we will take proportionate regulatory action to ensure that the setting meets the relevant statutory requirements. This may include giving the setting actions. If any areas are graded as ‘needs attention’, we will reinspect within 12 months. If any areas are graded as ‘urgent improvement’, we will reinspect within 6 months.
The early years operating guide sets out the processes in more detail.
Figure 1: Placing a school into a category of concern
Following feedback from the consultation, we are taking the following actions:
At the end of each monitoring inspection, inspectors will check whether the school is ready to be removed from a category of concern. This will determine whether monitoring continues or whether the school has improved enough to have a full inspection, at which the category of concern can be removed. We will set out the findings of each monitoring inspection and publish these in the report card.
If inspectors consider that a school in a category of concern has improved so that it can be taken out of a category of concern, they may deem that monitoring inspection to be a full inspection. They will then complete all the activities of a full inspection and produce a full report card, with updated grades.
We have taken steps to address these concerns, and will do more. So far, we have done the following:
We are taking the following actions:
We heard powerful feedback on how we propose to implement the government’s plan for new report cards for school inspection, which we are also applying across other education inspections. This has helped us to make changes to our report card design, toolkits and methodology (see the ‘Summary of changes’ section).
The response to our plans from parents was resoundingly positive.
We asked YouGov to carry out independent quantitative and qualitative research with parents. This found that parents were very familiar with Ofsted. Of the parents that YouGov surveyed:
However, parents also told YouGov in surveys and focus groups that the information we provide should be better. The findings of the YouGov research we commissioned suggest that what we proposed is a big improvement:
These findings were consistent with a More In Common poll commissioned by Schools Week in February 2025. In this:
We also asked YouGov to research how useful the new reporting format was. This found that, of the parents surveyed:
We commissioned YouGov to run a series of focus groups with parents of school-age children and children in early years settings to find out their views on the report card format, the grading scale, and the evaluation areas.
Parents of school-age children were very positive about the 5-point scale:
YouGov also ran a focus group with parents who worked in education, mostly teachers in primary and secondary schools. This showed us that there was not an acute divide between parents in general and parents who are also education professionals on how we report. Parents who were education professionals also welcomed the changes:
In the YouGov focus groups with parents of children in early years settings, we saw a contrast in views. Parents who sent their children to group-based early years providers (such as nurseries) were far more positive about the report card format and scale than parents whose children were looked after by childminders. For parents of children who were looked after by childminders, the choice of provider was informed by factors other than Ofsted’s reporting, such as their relationship with the childminder being more personal.
The YouGov research also found that 84% of parents surveyed said the colour-coding system from dark green to red was useful. However, in its qualitative research, YouGov found that, while ‘the traffic-light colour approach is universally understood’, there were some challenges around accessibility. We found the same challenges in our user research, as the proposed 3 different shades of green did not have enough colour contrast for users to differentiate between them. Some alternative colour options were proposed for us to consider based on parents’ insights, which we have now adopted following extensive direct user testing.
In polling, 86% of parents surveyed said the labelling of the evaluation areas was useful. However, in focus groups, we were able to get specific views on the terms we had used. We found that terms such as ‘developing teaching’, ‘secure’, ‘causing concern’ and ‘exemplary’ were misunderstood or challenged. In response to these concerns, we renamed the evaluation areas and grading scale terminology.
The YouGov research was designed to hear from a representative sample of parents. Our online consultation attracted more negative views. Online respondents were not as in favour of the 5-point scale or the report card format as those in the YouGov polling sample. Many were also critical of the number of evaluation areas proposed. Those who responded positively to the consultation said that the report card was ‘easy to use for everyone’ and appreciated the parent-friendly layout that allows them to find relevant information easily (unlike a full report in the previous style). Parents frequently asked for context alongside data in the report cards to help inform their views. Some were also concerned that early years providers were being evaluated using school-based metrics.
The proposals generated a mixed and sometimes negative reaction from early years professionals, headteachers, teachers and providers.
Education professionals welcomed some of our changes through the consultation feedback, such as removing the overall effectiveness grade across all remits and the greater nuance and detail in the report cards. They also saw the value of publishing data but stated that any performance data would need to be accompanied by contextual information, such as data on the demographics of providers and insights on issues such as inclusion from the inspections.
However, we also heard many concerns about our proposed reporting system. We heard a range of views on different approaches to reporting, particularly different forms of grading scale. A common thread in the feedback was the preference for a more narrative-based inspection report, or a ‘met or not met’ grading system, as opposed to a scaled grading system.
Respondents from the early years and schools sectors offered similar views, but early years representatives were distinctly more positive than schools representatives. A major childcare and early years provider representative said they believe the new grading system will help providers to identify areas for improvement more precisely, because the 5-point scale gives more granular feedback.
Organisations representing school professionals, including headteachers and teachers, had a strong negative reaction to the report card proposals. This included a media campaign opposing our plans, which criticised: the principle of grading schools; the grading scale; the increased number of evaluation areas, despite also having concerns about one overall effectiveness grade; and the proposed use of colours.
Submissions from these organisations expressed strong concerns that there would be high-stakes pressure and increased workload associated with the proposals, including by using any form of grading scale. They were also concerned about how consistent inspectors’ grading would be across the proposed increased number of evaluation areas, and the practicalities of inspectors using the toolkits across a 5-point scale, due to weaknesses in the descriptors within them.
Consultation responses from early years and school professionals also shared concerns about the implications for workload, fairness, the grading scale, and the complexity of evaluation criteria.
Some school professionals liked the concept of sharing best practice and the recognition they would receive through the new ‘exemplary’ grade (now ‘exceptional’). However, they were hesitant about the workload of the proposed case-study submission approach. We have now removed this approach and will instead encourage schools and other providers to share their ‘exceptional’ practice in other ways (see toolkits for detail).
Through the user research we carried out, we were able to further explore some people’s preference for a more narrative-based inspection report. We were able to confirm the level of detail and narrative that users would like from the report cards, and have made changes in light of this.
We had far fewer respondents from the further education and skills and initial teacher education (ITE) sectors. Of those we did hear from, their views on our reporting specifically were similar to those from the early years and schools sectors.
One of the main concerns specific to the further education and skills sector was that data on issues such as completion could be misinterpreted by people who have limited knowledge of the sector.
Further education and skills respondents also cautioned that there is no single source of achievement data that we could use for all provision types. Some said that specialist providers would not be well represented by qualification data because some or all learners would be working towards personalised learning goals, rather than external accreditation. Representatives of nominees in the sector were positive about the proposed 5-point scale, and their member survey indicated strong overall support for this approach.
Like other respondents, providers expressed concerns about the number of evaluation areas. Some organisations recommended that we reduce the total number to allow inspectors enough time to thoroughly cover everything in the toolkit.
We have set out:
Across Ofsted-led focus groups with early years professionals, and in many consultation responses, we heard broad encouragement around the clarity and relevance of the proposed toolkit.
Respondents to the consultation said that the toolkit provides transparent information about what inspectors are looking for and could be helpful for self-assessment.
In focus groups, early years providers generally welcomed the content and structure of the toolkit, as well as the focus on inclusion throughout.
Some responses, particularly from childminders, noted that the level of detail in the toolkit may need further adjustment to better reflect their specific context, for example the purposes of different provision types and the capacity of smaller settings to prepare for inspection.
There were broader questions and comments about the suitability of the toolkit for different early years providers. They wanted more detail about how inspectors would adapt it for different settings, and some suggested that it could be more closely aligned with the purpose and principles of their provision. Others recommended that the language in the toolkit should better reflect the language used by early years providers.
Some consultation responses suggested that there were too many evaluation areas for early years. There were some concerns that the breadth of the toolkit could have an impact on staff’s workload and well-being. Some were also concerned about the relationship between inspectors and early years providers. They highlighted the need for comprehensive training to support consistent and constructive inspections.
Across all education remits, on testing visits, inspectors, providers and respondents raised issues about the differentiation between the ‘secure’ (now ‘expected standard’) and ‘strong’ (now ‘strong standard’) grades across toolkits. This was also brought out strongly in responses from the schools sector, both through the consultation and direct feedback.
Many respondents felt that the toolkits did not do enough to acknowledge the challenges faced by schools in deprived areas or those with limited resources. They wanted us to revise the toolkit to better accommodate the context of schools and the realities they face, particularly in terms of resources and recruitment challenges, leadership structures, and the socioeconomic backgrounds of students. Some also raised concerns that schools may be judged unfairly in the achievement and attendance section.
Respondents encouraged us to make use of contextual data to improve inspectors’ understanding of the different degrees of challenge schools face. Another broad concern was the breadth of the overall reforms, such as the number of evaluation areas and associated toolkits, and the workload this may generate. However, the inclusion of staff well-being in the toolkits was welcomed.
We heard that the term ‘developing teaching’ was not clear and not everyone understood it. However, respondents welcomed our focus on professional development within this evaluation area.
We asked, in the consultation, whether the toolkit would work in practice for special schools and the alternative provision we inspect. Respondents noted the importance of adapting the toolkits for these settings and ensuring that inspectors who inspect them have the right level of expertise to make informed and consistent evaluations.
Respondents on behalf of independent schools broadly welcomed the alignment with state-funded schools.
Across all evaluation areas, independent school respondents stated that the toolkit should refer to existing professional standards and guidance, where these are available.
They also highlighted the need for inspectors to understand the context they work in, especially regarding small settings.
As with state-funded schools, independent school respondents stated that the toolkit would need to be adapted for the independent special schools and independent settings we inspect.
Feedback on the toolkits from professionals working in further education and skills reflected a wide range of views and valuable insights.
The consultation responses noted the complexity of inspection for providers that have multiple types of provision. Some respondents noted that the number of potential evaluations – up to 20 – could be difficult to manage. Several stakeholders recommended merging evaluation areas to simplify the toolkit. Some focus group participants recommended further refining the evaluation areas to align more closely with the schools toolkit, for example expanding ‘leadership’ to ‘leadership and governance’ and strengthening the emphasis on ‘personal development’, as seen in the education inspection framework.
The consultation also revealed mixed views on the toolkit’s suitability across different provision. Some respondents felt the toolkit was well suited to 16 to 18 college provision; others found it less applicable or harder to use for independent learning providers and apprenticeship programmes.
The initial teacher education (ITE) toolkit received some positive feedback through the consultation responses. Respondents welcomed the overall relevance of the toolkit to ITE. Some consultation respondents and focus group attendees noted that we could change some key terms in the toolkit to better align with the sector.
Several stakeholders felt that we needed to do further work to ensure that the standards in the toolkit were suitable and that the terminology clearly distinguished between, for example, trainee, teacher, leader and mentor.
Consultation respondents raised specific concerns about how inspectors would measure achievement; they wanted further details on the extent to which success would be measured against trainees’ outcomes or the provider’s performance.
Our focus on inclusion was widely welcomed across all education remits. Respondents felt that inclusion was reflected as a core priority across all toolkits. A number of education professionals agreed that providers needed to get it right for the most vulnerable and disadvantaged in order to get it right for everyone.
Although many education professionals welcomed our proposed definition of inclusion, some suggested that we should leave a formal definition to the Department for Education (DfE).
Education providers also wanted inspectors to recognise the context they were operating in. Across all remits, respondents raised the importance of recognising systemic challenges such as funding, parental responsibilities, availability of health services and availability of social care services. They cautioned that all of these have an impact on the extent to which they could be inclusive.
There were also some concerns about how the toolkit would be applied outside mainstream school settings. Consultation respondents noted that it would be challenging for small settings to evidence their inclusive practice during an inspection. Early years providers, in particular, questioned how inspectors will consider the impact a setting can have if it only cares for a child for a fraction of the week. Further education and skills and initial teacher education (ITE) respondents also cautioned that inclusion looks different for them, and that inclusive practice for adult learners has to be based on consent.
We have set out:
We heard thoughtful and constructive challenge on the ‘look and feel’ of inspection – our methodology and overall approach.
Consultation respondents liked the shift towards a more supportive, empathetic approach to inspections. They also appreciated the starting point of ‘expected standard’ (formerly ‘secure’) for schools, unless evidence suggested otherwise. Across all educational remits, professionals and inspectors who responded to our consultation or participated in testing visits stated that the new approach to inspection was more collaborative.
However, many education professionals in the online consultation were dissatisfied with the overall changes. They felt that the reforms did not go far enough in addressing their concerns. They were concerned that the increase in evaluation areas could lead to greater workload, more stress and negative impacts on well-being. Some were also concerned that the reforms would not adequately account for the unique contexts of individual providers, leading to unfair outcomes that lack nuance.
Respondents were also uncertain about what an inspection would entail without deep dives. There was a sense of heightened anxiety about the proposed new methodology, given respondents' familiarity with the current process.
Early years and schools professionals welcomed the changes to our methodology to make inspections feel more collaborative.
In focus groups, headteachers and leaders said that they welcomed the opportunity to explain the context of their provision during the planning call and the opportunity to demonstrate the work they are doing on inspection. They also generally liked the fact that inspections will be starting with ‘secure’ (now ‘expected standard’) in the methodology. Early years respondents to the online consultation also said that their workload might reduce as a result of this change. Some focus group attendees thought it was still unclear how the methodology and inspection process will work in practice and that the wording of the toolkits should be made clearer.
Consultation and focus group evidence suggested that school professionals particularly welcomed our removal of deep dives, as this would reduce the pressure on, and workload of, middle leaders. They also noted that this change would allow inspectors to spend more time with leaders and pupils.
There was also support for making senior leaders the focus of inspections, and the emphasis on professional dialogue and collaboration. Some school inspectors also welcomed the removal of deep dives; they felt that the methodology had become too narrow. Other inspectors wanted more clarity on what the renewed methodology would be like without deep dives.
There were mixed views about how the role of the nominee would work. Some thought it may help to give assurance on inspection; others thought it would be an unnecessary burden for smaller providers, especially childminders who work alone. Early years professionals and inspectors asked how the notification call would work in practice for childminders and other small settings.
Representatives of school group leaders welcomed our more nuanced approach to evaluating school performance; emphasis on inclusion; and commitment to identifying and sharing best practice to raise standards system-wide.
They also recommended ways to strengthen the validity and reliability of inspections, such as by simplifying the grading system, particularly around the use of ‘exemplary’; refining the wording of evaluation areas; merging the assessments of teaching and curriculum; and treating inclusion as an aggregated evaluation area. They also advised ensuring that the methodology supports consistent grades and reducing the volume of proposed monitoring activity.
Representative groups of school professionals, such as headteacher and teacher unions, were highly critical of the proposals in the consultation. They raised concerns about the perceived increase in pressure on school leaders, the feasibility and fairness of inspections, and the lack of sensitivity to different school contexts. They did, however, welcome the removal of deep dives and supported our commitment to continue to emphasise inspectors’ professionalism, courtesy, empathy and respect.
The alignment of independent school inspections to state-funded school inspections was broadly welcomed, to maintain a level playing field across school types.
Focus group participants and online consultation respondents had mixed views on the removal of deep dives in inspections. Some felt that the removal would help ease the pressure on middle leaders during inspection. Others felt that the focus on subjects through deep dives had helped to drive up standards.
FE and skills respondents were also concerned about whether the proposals would be appropriate in a range of provider types, and how inspectors would take account of the context of these providers.
In focus groups, providers spoke about their experiences of inspection. They said that they welcomed the focus on collaboration. Some mentioned their concerns about potential increases in workload.
We have set out:
The increase in monitoring requirements under the renewed framework raised some concerns from education professionals who responded to the online consultation. They were worried that the proposed frequency of monitoring inspections would increase their workload, that the visits would divert their focus from making improvements, and that this could negatively affect staff’s mental health and well-being.
Representatives of school group leaders were also concerned that the proposed frequency of monitoring inspections could add burden without a clear benefit. We heard from another representative body that the frequency would be disproportionate, and would not allow schools enough time to make meaningful improvements.
Respondents to the online consultation also noted the potential overlap of our proposed monitoring with the Department for Education’s (DfE’s) ‘regional improvement for standards and excellence’ (RISE) teams. They were concerned that if a school is subject to both Ofsted monitoring and support from the DfE’s RISE teams, this would increase the stress for staff and could lead to mixed messages and inefficiencies. They said schools might be at risk of receiving support from too many external sources, especially when advice might differ.
Monitoring also came up in responses to our question about what we could do to help reduce or manage any unintended consequences of the changes. Some respondents suggested removing grades and reducing the friction of inspection; others noted that increased monitoring inspections would encourage Ofsted to take more of an advisory and supportive role for schools. This showed that some in the sector see monitoring as a positive.
We have set out our approach to monitoring.
In response to the question on how we propose to identify schools causing concern, many agreed with the process and welcomed its clarity.
Some of the more negative respondents considered the language to be stigmatising; they felt that labelling a school as causing concern would erode the community’s trust and damage staff morale. We have since changed the terminology to ‘urgent improvement’.
These respondents also said that the renewed framework increases the number of potential points of failure and could increase the risk of schools being judged negatively. Some felt this may discourage leaders’ aspirations and hinder efforts to continuously improve.
We have set out our approach to identifying schools causing concern.
Inspectors who responded to the consultation had a range of views. They praised the nuance that report cards would bring to inspections and said that having 5 grades would allow them to distinguish between providers that currently just about reach a ‘good’ rating and those that are not quite ‘outstanding’. However, many noted that they would need to be trained to grade consistently across the 5-point scale.
Some inspectors stressed the importance of having a clear ‘expected standard’ grade that looks at whether providers are meeting their statutory and non-statutory responsibilities.
Others thought the number of evaluation areas and associated toolkits might increase their workload. Some raised questions about what the changes to the methodology would mean for inspection practice, such as the removal of deep dives.
Many thought the changes would lead to more supportive inspection practice.
We have set out our approach to grading and inspection methodology and toolkits.
We know professionals understand that our first duty is to the children and learners we are charged with protecting. It is our job to call out practice that undermines children’s safety or robs them of their one chance at an education that allows them to thrive.
Sadly, we too regularly uncover bad actors working in education and care. We must always be vigilant so that wrongdoing does not go undetected. But more often, unacceptably low standards are not the result of malign intent. Instead, well-intentioned professionals are struggling in difficult circumstances. In these cases, we must still act to protect children and learners while enabling professionals to receive the support they need.
This is why reducing workload and promoting well-being remain central to our approach. We will maintain our focus on raising standards and holding providers to account. But we also believe giving education professionals time and space to receive support allows them to do their best to raise standards and protect children.
We accept we have a challenge, and we know that we have to change. The government committed to this change in its manifesto, and parents and professionals called on us to change through the Big Listen. But, as the independent workload and well-being review we commissioned recognised, any change to the framework is likely to result in stress because of the workload that may come with adapting to a renewed framework. We are doing what we can to alleviate the pressures of that change on professionals, while being true to our duties to children and learners, and our responsibilities to parents and carers to offer them the nuanced information they have called for.
This section sets out what we are doing to address the workload and well-being implications of our inspection reforms through:
We have considered leaders’ workload and well-being from the outset of our reforms and in our methodology design. In the consultation, we stated:
This commitment remains embedded in our approach.
First and foremost, nothing in the standards set out in the toolkits should add to a provider’s workload. Our toolkits are built on the requirements, standards and expectations already placed on leaders and their provision. This includes statutory and non-statutory guidance and standards that professionals should be meeting. They are also based on the research and inspection evidence that suggests the most effective strategies in securing better outcomes for all children and learners.
We believe inspections will help providers focus on meeting those expectations more efficiently and effectively. We do not expect any provider to be doing more than it needs to just ‘for Ofsted’.
We have built on this through our inspection practices. Our revised grading is more nuanced, fair and informative, and we believe it better supports well-being than the previous model or the alternatives considered.
As set out earlier, we have designed the inspection methodology to be more collaborative, to minimise the stress of inspections. This consideration flows through to how we have structured the toolkits themselves. The toolkits group the 3 most common grades (‘needs attention’, ‘expected standard’ and ‘strong standard’) on one page, with the 2 extremes (‘urgent improvement’ and ‘exceptional’) shown separately. This will help to focus attention during an inspection on the areas where most providers sit, and make it clear that inspectors are not looking to ‘catch leaders out’, as some falsely fear.
We have revised the grading methodology so that it is fairer and more informative, while reducing unnecessary anxiety. We will keep leaders and nominees (where relevant) informed about likely grading outcomes through regular reflection meetings, which will be an opportunity to review emerging evidence (for more detail, see the operating guides). This will help to reduce the build-up of anxiety around revealing the inspection grades at the end of the process.
We are introducing a more detailed report card with a 5-point grading scale. This recognises providers’ strengths and areas for improvement. It offers a more nuanced form of reporting and replaces the previous ‘overall effectiveness’ grade that we know from the Big Listen caused much anxiety across the education workforce. We also believe that we can reduce anxiety by ensuring consistency in grading.
We are assuring leaders about how we will take account of the context of their provision, and how we will adapt our inspections to different settings. Our section on monitoring and reinspection sets out how we can ease the concerns of those leaders worried about the ‘needs attention’ and ‘urgent improvement’ grades, and how those grades can be improved within an inspection cycle.
We have set out everything we are doing to reduce the workload and support the well-being of leaders and education professionals. This explains how:
We stand by the steadfast commitment we made in the Big Listen to reset the relationship with, and consider the well-being of, those we inspect in any changes we make. This is why we have taken the concerns raised during the consultation about workload and well-being very seriously.
We have been determined to refine our approach through testing. Through March, April and May, we held a series of test visits across a range of providers in the different education remits we inspect. We wanted to understand the impact of our proposals on the ground.
This first round of test visits was based on the toolkits and methodology we consulted on in February.
After the test visits, we asked providers whether the proposed methodology was likely to reduce their inspection-related workload compared with how we currently inspect.
The findings were mixed. About half of all providers agreed it would reduce their workload, and half disagreed. The split was roughly 50/50 in schools; more early years providers agreed, while more FE and skills providers disagreed. As part of this feedback, leaders across all remits reported that they appreciated being more involved in the inspection event, even though it was demanding on their time.
After the consultation closed, we made significant changes ahead of further testing. These included:
After making these changes, we carried out a subsequent series of test visits through June and July. Those test visits allowed us to assess the impact of these changes.
Feedback from the test visits also gave us more evidence that we can build on. Providers remained divided on whether the reforms will reduce their inspection-related workload in the years between inspections. Most early years providers agreed that their workload would reduce during inspection days compared with under previous inspection frameworks. More schools and FE providers said that theirs would not reduce, as they felt that there is always the need to prepare for inspections.
We had positive feedback on other elements of the methodology. Almost all early years providers and schools involved in the test visits agreed that the planning call helped them to understand what to expect. Almost all early years providers and half of schools said that the proposed inspection methodology did not negatively affect their day-to-day running. All early years providers and almost all schools said that accompanying inspectors and talking with them on the visit helped to develop a shared understanding of their provision’s strengths and areas for development. All early years providers and most schools said that, overall, they were satisfied with the way evidence was gathered. Almost all early years providers and schools, and most providers in other remits, felt that conversations with inspectors about grading were collaborative.
These findings bolster our confidence that our approach will be more transparent, less intrusive, more supportive and more collaborative – which should combine to reduce anxiety and support the well-being of those we inspect.
In addition to this extensive testing of our approach, we commissioned Sinéad McBrearty, Chief Executive Officer at Education Support, to carry out an independent review of the impact of our inspection reforms on the workload and well-being of the education workforce.
The review took a ‘mental health impact assessment’ approach as the framework for the analysis. It provided us with recommendations on how to manage the initial stress that is an inevitable consequence of change and the potential workload consequences that may come with it.
The review was split into priority actions and secondary actions for Ofsted. We have responded accordingly.
Recommendation 1: Explore and implement changes to reduce the isolation and individual responsibility felt by headteachers and principals.
Strong leadership is vital to a school’s success, but it is never the responsibility of just one individual. A school is led by a group of leaders, including those who have statutory responsibilities for the well-being of the headteacher. To reflect this, we name headteachers and the chair of governors or trustees and the chief executive of the multi-academy trust (if applicable) on school report cards. This makes it clear that inspection outcomes are a collective responsibility.
We also recognise that leaders’ well-being and workload are influenced not only by reporting but by the whole process of inspection. As we have set out in the sections inspection methodology and workload and well-being, we are significantly improving this process through our reform. The inspection methodology is designed to ease leaders’ workload by tailoring inspection activity to each provider’s context, involving leaders throughout, and reducing the likelihood of unexpected findings through the sharing of emerging grades. Introducing an optional ‘nominee’ role for all remits should ease the inspection process and help reduce the demands placed on providers. This builds on changes we have already made to address headteacher isolation, including that all headteachers and teachers could have a colleague from their school or trust join discussions with inspectors.
In addition, the DfE’s revised accountability model, combined with our approach to monitoring inspections that can review and update any grade below ‘expected standard’, gives leaders a clear opportunity to make rapid improvements and to have these recognised in subsequent monitoring visits. If a provider improves, we will then update its report card. This will ensure that providers are fairly represented to parents and the public. This change has important implications for well-being, as progress can be recognised promptly.
Recommendation 2: Invest significantly in the well-being and professional development of the workforce.
We want to minimise stress and workload pressures for inspectors as well as providers, to ensure that they are at their best.
We will add an extra inspector to school inspection teams for the first day, to boost capacity and support inspection teams. By shortening inspection days, we will reduce inspectors’ workload, and by improving the opportunities for dialogue between inspectors and providers we will make the inspection experience more positive for everyone involved.
We have engaged with inspectors’ trades unions closely on our plans.
We have developed a comprehensive package of training for inspectors for the launch of the renewed framework. This training will help to refresh the core skills, knowledge and behaviours that inspectors need to carry out inspections effectively. It will also help prepare them to inspect with the professional, propositional, procedural and conditional knowledge they need to be at their best. The training will include refresher sessions on mental health and well-being, which build on the training that we rolled out on this topic following the Prevention of Future Deaths report in 2024.
Recommendation 3: Introduce an unequivocal mechanism for independence in the complaints process.
We have already made significant changes to how we handle complaints in response to concerns raised about the process. However, we are determined to go further to build trust in how we do this.
We are improving communication with complainants: investigating officers offer direct conversations to better understand their concerns. We have set up complaints panels with external sector representatives, who review whether complaints are handled fairly. These panels began in January 2025 and will be strengthened further by increased involvement from external representatives to enhance transparency and trust in the process.
We are continuing to work closely with the DfE on how we can introduce further independence into the complaints process. We also contract with the Independent Complaints Adjudication Service for Ofsted, which is run by an independent body and makes recommendations to us on how to improve our complaints handling.
Under our new Chair, we expect the Ofsted Board to take a significant role in developing our complaints policy. The Board will challenge us on the quality and independence of our processes and monitor this work.
Recommendation 4: Develop a clear protocol for responding to individuals in acute distress or at risk of suicide.
In response to the Prevention of Future Deaths report, we introduced measures to respond to individuals in distress. This included a policy allowing inspectors to pause an inspection if they have concerns about an individual’s well-being. We also embedded mental health awareness in all inspector training. We will update that training regularly in response to the latest research and guidance. When the British Standards Institution’s standard dedicated to suicide awareness has been finalised and published, we will review it and ensure that our training reflects it.
We have also launched a provider contact helpline and created an ‘inspection welfare, support and guidance hub’ to offer support and guidance to inspectors and providers during the inspection process. These steps are part of our wider commitment to ensuring that everyone we inspect is treated with professionalism, empathy, courtesy and respect. We have completed every action we committed to in our response to the Prevention of Future Deaths report.
We developed our inspection approach to accommodate concerns where we are responsible for addressing them. For example, we designed our pause policy to create space during an inspection to allow responsible bodies to support an individual experiencing distress. Ofsted is not the appropriate organisation to provide that support itself; we should not step in where others have responsibilities to do so.
Alongside the Prevention of Future Deaths report, we also commissioned Dame Christine Gilbert to lead a learning review of Ofsted’s response to the death of Ruth Perry. This looked at the actions we took in response to hearing about Ruth Perry’s death, our communication and engagement with stakeholders, information-sharing within Ofsted, and the support we offered internally to staff. After every Board meeting, we publish a report on our progress in completing the actions set out in the Big Listen, including in our responses to Dame Christine Gilbert’s review. From September, Dame Christine will become Chair of the Ofsted Board.
The DfE also responded to the Prevention of Future Deaths report, committing to improve communication with schools, review safeguarding guidance, and strengthen support for school and college leaders.
Recommendation 5: Monitor the unintended consequences of the revised framework highlighted in this report.
As part of preparing for our inspection reforms, we have carried out the workload impact assessment we set out above. This included testing the impact of our reforms ‘on the ground’ through test visits, as well as commissioning this independent review. We have taken on board the findings from those test visits and the recommendations of this review to inform our changes.
As we start inspections under the revised framework, we want to keep checking for any unintended consequences. In autumn, we will invite a random sample of providers to take part in ‘exit interviews’ with His Majesty’s Chief Inspector, the National Director for Education and other senior Ofsted officials. These interviews will supplement the standard post-inspection survey and give us deeper insight into the impact of the changes.
We will also start holding ’roundtable’ meetings with sector representatives to gather qualitative feedback on the impact of the reforms in real time. We will continue to listen to, reflect on and respond to any challenges. We have also commissioned an independent evaluation of the renewed framework. This will start with a baseline study in summer/autumn 2025, followed by in-depth qualitative research in spring 2026 and an ongoing post-inspection survey beginning in spring 2026 and continuing in summer and winter.
We will use these insights to help us respond to any emerging issues as fast as possible and to adjust the framework when needed. We do not want the framework to be ‘fixed’. We intend to amend it as necessary to take into account changes to government policy, experience of inspections on the ground, feedback from stakeholders and evidence from research and reviews.
Recommendation 6: Develop and monitor key performance indicators to track the progress of key actions identified in this report.
We are committed to being a transparent, learning organisation; we will continue to review all available evidence on the impact of the renewed framework to inform our future improvements.
As part of this, we will carry out a comprehensive evaluation programme to understand both the implementation and impact of the framework. This will include:
Our Strategy and Delivery Unit, set up in response to the Dame Christine Gilbert Review, will track the progress we make against each of the actions we have committed to in this response, and give regular updates to the Ofsted Board.
Recommendation 7: Carefully monitor and be prepared to revise the amount of inspector time that can be allocated to contested inspections.
We have made significant reforms to how we grade providers to make the process of inspection more collaborative, and improve the consistency of inspections. We believe these changes will lead to fewer contested inspections.
For more challenging inspections, our regional directors will be able to give inspectors more time to gather evidence to inform their grading.
If an inspection is contested through the complaints process, we will dedicate expertise from across Ofsted to review the inspection outcome and ensure that we give an accurate grade.
Recommendation 8: Develop a plan to address the particularly low level of trust in Ofsted among primary schools.
Rebuilding trust in the inspection system is a priority for Ofsted. We understand that trust must be earned through openness, fairness and a clear commitment to listening and responding.
The Big Listen was a key step in this effort. It gave all those we inspect, including leaders, staff, parents and carers, the opportunity to share their experiences and concerns. We heard the need for more transparency, greater empathy during inspections, and a system that better supports well-being while maintaining high standards for children and learners.
We believe that the reforms to the renewed framework will help to instil greater trust in inspections. Several measures in our renewed framework directly address the concerns that primary school professionals raised – such as removing deep dives, which were particularly difficult for small primary schools to manage. The toolkits also have specific sections explaining how inspectors should adapt inspection activity for smaller settings, such as primary schools. Schools will also have the option to have a ‘nominee’ who can liaise directly with the inspection team, which will support a more collaborative inspection experience.
We will train our inspectors to ensure that they are well equipped to understand the specific context and challenges of different providers, including primary schools.
All primary school inspections will be led by inspectors with expertise in the primary phase. In the rare cases where this is not possible, we will use additional quality assurance measures. We aim to improve the consistency and accuracy of inspection findings in primary settings.
We are confident that our new approach will promote stronger collaboration, greater consistency and renewed confidence in the inspection system. However, we recognise that our renewed approach, and the process of change itself, may create some additional workload for some providers. This is why we have taken the extensive steps set out above to alleviate these concerns.
The independent review focused mainly on the schools sector, in which workload and well-being concerns had been the subject of much attention and had been a major concern of sector representative organisations for some time.
This does not detract from our focus on the workload and well-being concerns of all the other sectors we inspect and regulate. For instance, as we increase the frequency of routine inspections for early years, we will review the workload and well-being implications of early years inspections and what we can do to mitigate them. In the long run, we believe that more frequent inspections will give greater assurance to providers, professionals and parents alike.
We recognise that inspections can be stressful. That is to some extent inevitable in an inspection system fundamentally aimed at ensuring that proper standards of education and safeguarding are in place, and that parents are fully informed on those matters. However, we are determined to minimise this stress where we can. We fully believe the changes we have made do this, and that they will lead to a more informative, transparent and fairer system of reporting that better serves children and learners, parents and carers, and professionals and providers.
We will start inspecting under the revised framework, using the operating guides and toolkits, from:
This will give providers at least a full 2 months to become familiar with the changes.
For state-funded school inspections, we will prioritise volunteers for full inspections in the weeks between 10 November and Christmas. These inspections will result in a report card, with a complete set of grades. We will return to the normal schedule for state-funded schools towards the end of the period and not before 1 December. If there are enough volunteers, we will continue to prioritise them after 1 December. We will not carry out inspections in the final week before Christmas.
Our Deputy Chief Inspector will review all requests for an inspection deferral to make sure each case is treated with the utmost sensitivity and consideration.
There will be a steady and consistent start to inspections. This month, we will use the end-to-end piloting process as an opportunity for as many His Majesty’s Inspectors (HMI) and early years regulatory inspectors as possible to experience and apply the new methodology. We expect all inspectors to take part in a pilot inspection.
We will carefully structure our schedule to ensure that our most senior inspectors lead the first inspections, with other inspectors on the inspection teams or shadowing these inspections. From November until the end of the year, all inspectors will go through the process of shadowing, teaming and learning. We want to ensure that they are all confident with the renewed framework.
To support a steady and assured start, our National Director for Education and Principal Inspector will quality assure the work of the lead inspectors after their pilot visits to providers in early autumn. This will ensure that they are confidently able to carry out inspections to the required standard.
We will also evaluate the implementation and impact of our reforms. The evaluation plan will include:
We will publish data alongside the report cards to illustrate the providers’ and learners’ contexts and, where available, the performance and attendance data that we will use to support inspection.
The data will be:
When we have carried out additional analysis on the published data, for example comparing a provider’s figure with the national average, we will include a clear explanation of the methodology we used.
Early years providers: provider context
Independent schools (non-association independent schools): school context
Independent schools (non-association independent schools): pupil context
State-funded schools: performance – key stage 4
State-funded schools: performance – 16 to 18*
State-funded schools: absence
In addition to the data above, the provider’s page on our Find an inspection report site (where the report card will be) will continue to contain an ‘About this school/setting/provider’ section. This section includes up-to-date information, such as its address and provision type.

