Multi-school compliance tracking is one of those academy trust problems that looks simple until you try to run it properly. On paper, it sounds like “keep a list of actions and chase people”. In reality, you are trying to maintain a live line of sight from risk, to control, to evidence, to governance decisions, across multiple schools, multiple roles, and multiple systems. When that line of sight breaks, the trust ends up spending a lot of time preparing for committee meetings while still feeling unsure about what is genuinely under control.
That uncertainty is not just annoying. It cuts directly across what the Academy Trust Handbook expects trustees and audit and risk committees to do. Trusts are required to maintain sound internal control and run a programme of internal scrutiny that provides independent assurance over financial and non-financial controls and risk management procedures. The programme needs to be planned, timely, spread appropriately across the year, and reported to the audit and risk committee, with an annual summary report submitted alongside audited accounts by 31 December. (GOV.UK)
This is where a platform approach becomes useful. MYAUDIT.school is designed to bring planning, evidence tracking, action management and closure assurance into one operating workflow, so you are not rebuilding the picture every term from separate spreadsheets, emails and local folders. Used well, it can help a MAT move from “we think we are on top of it” to “we can evidence it quickly, consistently, and with confidence”.
This guide sets out how to make that work in practice, and what to avoid along the way.
Why multi-school compliance tracking breaks down in the real world
Most breakdowns are not caused by a lack of effort. They happen because the trust has multiple mini-systems doing the same job in different ways. One school uses a local tracker, another relies on meeting minutes, central teams keep a separate log, and evidence sits in individual inboxes. By the time you reach audit and risk committee, somebody is manually stitching together an “update” and hoping the underlying story is accurate.
That is risky for three reasons.
First, it creates version control problems. If a tracker is updated in two places, you do not have a single source of truth; you have competing truths.
Second, it encourages soft closure. When evidence is awkward to find, people lean on “it’s done” statements, and the trust quietly loses the discipline of verified closure.
Third, it makes trend management almost impossible. Trustees and executives need to see patterns across schools and across time, not only this month’s headline list. The DfE internal scrutiny good practice guide is clear that audit and risk committees should maintain oversight of the risk management and internal control framework and assess its application through the internal scrutiny programme, with regular updates and year-end reporting. (GOV.UK) That is much harder to do when the underlying data is fragmented.
The fix is not “chase harder”. The fix is to standardise the way compliance activity is recorded, evidenced, escalated and closed.
What good looks like, and why the data model matters
If you want multi-school compliance tracking to be dependable, it has to behave like a system, not a collection of documents. That starts with your data model. The term sounds technical, but the idea is straightforward: define the minimum information you will record for every finding and action, across every school, in the same way.
A practical trust-wide compliance data model usually includes:
- Entity: trust-wide, school-level, or central function, so you can report properly by area.
- Theme: domain and review type, such as procurement, payroll, safeguarding, website compliance, governance.
- Finding: a clear statement of what failed, written so a trustee can understand it.
- Risk rating: consistent logic, otherwise comparisons are meaningless.
- Action ownership and due date: named owner, realistic timescale, and clear accountability.
- Closure evidence and verification: what proof is expected, where it is stored, and whether closure has been verified.
Once those fields are standard, reporting becomes easier, and so does governance challenge. You can aggregate open actions by school, by theme, by owner, and by risk level without rebuilding the analysis each term.
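As a rough illustration, the data model above can be sketched as a single record type with one generic grouping helper. All field names, values and the helper function here are illustrative assumptions for the sketch, not the MYAUDIT.school schema; map them onto your own configuration.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

# Illustrative field names only; not the platform's actual schema.
@dataclass
class ComplianceAction:
    entity: str        # trust-wide, school name, or central function
    theme: str         # e.g. procurement, payroll, website compliance
    finding: str       # clear statement of what failed
    risk_rating: str   # consistent scale, e.g. low / medium / high
    owner: str         # named individual accountable for delivery
    due_date: date
    status: str        # updated / completed / verified_closed
    evidence_ref: str  # where the closure evidence is stored

def open_actions_by(actions, field):
    """Count actions not yet verified closed, grouped by any standard field."""
    return Counter(
        getattr(a, field)
        for a in actions
        if a.status != "verified_closed"
    )

actions = [
    ComplianceAction("School A", "payroll", "Missing authorisation", "high",
                     "J. Smith", date(2026, 3, 1), "completed", "drive://pack-12"),
    ComplianceAction("School B", "website", "Policy page out of date", "medium",
                     "K. Patel", date(2026, 2, 1), "verified_closed", "drive://pack-9"),
    ComplianceAction("School A", "website", "Missing governance info", "medium",
                     "K. Patel", date(2026, 4, 1), "updated", ""),
]

print(open_actions_by(actions, "entity"))  # open actions grouped by school
print(open_actions_by(actions, "theme"))   # the same data, grouped by theme
```

Because every record carries the same fields, the same one-line helper answers "open actions by school", "by theme", "by owner" or "by risk level" without any term-by-term rebuilding.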
This also aligns well with the risk-led intent of internal scrutiny. The good practice guide describes internal scrutiny planning as a risk-based exercise informed by the trust’s risk register, with an iterative relationship where findings inform risk scores and updates. (GOV.UK) You cannot do that iteration properly if your findings and actions are not recorded in a way that supports pattern analysis.
From planning to closure: making the workflow do the work
A platform only helps if it reflects how your assurance lifecycle actually operates. In most MATs, the lifecycle has four moments that matter: planning, fieldwork and findings, action delivery, then closure verification.
Planning should feel calm and structured. The audit and risk committee is expected to review and approve a risk-based programme each year, receive updates, and report regularly and at year-end. (GOV.UK) In practical terms, that means your plan needs to show what will be reviewed, why it matters, when it will happen, and what reporting points sit alongside committee meetings.
MYAUDIT.school can support this by acting as the place where the annual plan lives, rather than a document that disappears into a shared drive once approved. The value is not simply having the plan stored. The value is having the plan linked directly to the work delivered and the actions raised, so you can report progress without manual reconstruction.
Findings capture needs discipline. Weak tracking often starts at the point a finding is recorded. If one reviewer writes a crisp finding with a clear control failure and another writes a vague paragraph, your action owners and your trustees will struggle. It is worth standardising finding language so it is specific, evidence-based, and tied to a control expectation.
Action management should be more than reminders. Good action tracking is about clarity. Who owns it, what exactly will change, by when, and what proof will show it is working. Where responsibilities cross central and school teams, the platform should make shared ownership visible, not hidden in email chains.
Closure is where most systems fail. Trusts often track action status changes well, but do not separate completion from verified closure. That matters because trustees are relying on the internal scrutiny programme as independent assurance, and the Handbook expects reporting and annual summary outputs to support governance assurance and transparency. (GOV.UK)
A practical closure workflow is built around three states, used consistently:
- Updated: the owner has changed the status, but no claim of completion yet.
- Completed: the owner says the action has been delivered, with evidence attached.
- Verified closed: someone independent confirms the evidence is sufficient and the control is operating as intended.
For medium and high-risk actions, “verified closed” should be the standard for closure reporting, even if the trust still tracks the earlier states for operational management.
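The three states and the verification rule can be expressed as a small state machine. This is a minimal sketch of the workflow described above, assuming the status names and rules stated in this guide; the function and its parameters are hypothetical, not a platform API.

```python
# Hypothetical sketch of the three-state closure model:
# updated -> completed -> verified_closed, with completion allowed
# to bounce back to updated if verification finds the evidence insufficient.
ALLOWED_TRANSITIONS = {
    "updated": {"completed"},
    "completed": {"verified_closed", "updated"},
    "verified_closed": set(),
}

def advance(current, target, risk_rating, has_evidence, verified_by=None):
    """Apply the closure rules: evidence before completion, and independent
    verification before closure of medium and high-risk actions."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current} to {target}")
    if target == "completed" and not has_evidence:
        raise ValueError("completion requires attached evidence")
    if target == "verified_closed" and risk_rating in ("medium", "high") \
            and not verified_by:
        raise ValueError("medium/high risk closure needs an independent verifier")
    return target

# An owner claims completion, then an independent reviewer verifies it.
status = advance("updated", "completed", "high", has_evidence=True)
status = advance(status, "verified_closed", "high", has_evidence=True,
                 verified_by="internal scrutiny reviewer")
print(status)
```

Encoding the rule this way is what stops soft closure: a high-risk action simply cannot reach "verified closed" without a named independent verifier, however the status field is edited.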
This distinction does not have to create bureaucracy. It often reduces it, because it stops the cycle of “we closed it, now it is back again”.
Evidence tracking that feels realistic for schools
Schools do not need a compliance system that feels like an extra job. They need one that fits how they work. Evidence tracking succeeds when the platform encourages small, repeatable habits.
A few examples that tend to work well:
A procurement evidence pack that is always stored and linked the same way for higher value spend, so reviewers can sample quickly without chasing.
A payroll change workflow where authorisation and reconciliation evidence is attached monthly, so payroll assurance does not rely on memory.
A website compliance check log that links directly to the page tested and captures the review date, because website compliance is about what the public can see on that date, not what you meant to publish.
On that last point, the DfE website publication guidance for academies and academy trusts sets out what must or should be published online, why it must be published, and the range of sections expected on trust and academy websites. (GOV.UK) If a trust treats that as a once-a-year task, it will keep finding the same gaps. If it treats it as a control and tracks it consistently, compliance becomes more stable.
The practical aim is simple: anyone preparing a committee pack should be able to pull evidence-linked updates without ringing schools the week before the meeting.
Dashboards that support governance decisions, not performance theatre
Committee dashboards can be genuinely helpful, or they can become decorative noise. The difference is whether they are tied to decision-making.
A useful audit and risk committee view is usually quite small:
- open findings by risk rating
- overdue actions, by owner and by school
- verified closure rate over time
- recurring themes across terms
- high-risk escalations needing committee direction
Those views map neatly onto what committees are expected to do: oversee the internal control framework, monitor the programme, challenge where weaknesses persist, and ensure risks are addressed appropriately. (GOV.UK)
The best dashboards also support good behaviour. When people can see that “closed” is not the same as “verified closed”, it changes the culture of assurance. It becomes normal to attach evidence and expect verification, rather than treating closure as a status update.
Integrating non-financial compliance without creating silos
A common mistake is to run separate trackers for “finance”, “governance”, “safeguarding”, “website”, “GDPR”, and so on. That feels tidy, but it fragments oversight. The internal scrutiny good practice guide is clear that internal scrutiny can cover a wide range of financial and non-financial areas, including IT and cyber, health and safety, estates, safeguarding, HR, culture and management information. (GOV.UK)
A single platform model works best when it can accommodate that breadth. You do not need every theme live from day one, but you do need a structure that allows the trust to bring additional domains into the same governance view over time.
Website compliance is a good example because it sits at the intersection of transparency, governance discipline and public accountability. The DfE guidance sets out specific publication expectations for academies and trusts, and it is frequently used as a quick indicator of how well basic controls are maintained. (GOV.UK) If website compliance findings sit in a separate spreadsheet, trustees are less likely to see patterns or recurring weaknesses across schools.
A rollout approach that avoids the “big launch” problem
The trusts that succeed with platform-based compliance tracking usually roll it out in phases. Not because they love project management, but because data quality matters, and bad data spreads quickly.
A sensible approach is:
Weeks 1 to 4: configure the trust structure, agree status definitions, and load one or two live review themes. Focus on getting ownership and evidence habits right.
Weeks 5 to 8: pilot with a small group of schools and one theme that produces regular actions, such as website compliance, procurement, or safeguarding training evidence. Use the pilot to refine what “good evidence” looks like and how verification is recorded.
Weeks 9 to 12: expand to all schools and bring in additional themes that align to the trust’s risk register and annual scrutiny plan.
Ongoing: refine dashboards, align reporting with audit and risk committee meetings, and introduce routine data quality checks.
A simple data quality gate at the end of each phase is worth the time. Check for missing owners, duplicate records, inconsistent risk ratings, actions marked complete without evidence, and overdue items that are not flagged. This gives governance leads confidence that the first dashboards are telling the truth.
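The gate itself can be a short script run against an export at each phase boundary. This is a sketch of the checks listed above under assumed field names; adapt the names and rating scale to whatever your own records use.

```python
from collections import Counter
from datetime import date

VALID_RATINGS = {"low", "medium", "high"}  # assumed trust-wide scale

def data_quality_issues(records, today):
    """Return (record_index, issue) pairs for the phase-gate checks:
    missing owners, duplicates, inconsistent ratings, completion without
    evidence, and overdue items that are not flagged."""
    issues = []
    seen = Counter((r.get("entity"), r.get("finding")) for r in records)
    for i, r in enumerate(records):
        if not r.get("owner"):
            issues.append((i, "missing owner"))
        if r.get("risk_rating") not in VALID_RATINGS:
            issues.append((i, "inconsistent risk rating"))
        if r.get("status") == "completed" and not r.get("evidence_ref"):
            issues.append((i, "completed without evidence"))
        if (r.get("status") != "verified_closed" and r.get("due_date")
                and r["due_date"] < today and not r.get("overdue_flag")):
            issues.append((i, "overdue but not flagged"))
        if seen[(r.get("entity"), r.get("finding"))] > 1:
            issues.append((i, "possible duplicate record"))
    return issues

# One deliberately bad record trips four of the five checks.
records = [
    {"entity": "School A", "finding": "No PO raised", "owner": "",
     "risk_rating": "urgent", "status": "completed", "evidence_ref": "",
     "due_date": date(2026, 1, 1), "overdue_flag": False},
]
for index, issue in data_quality_issues(records, today=date(2026, 2, 1)):
    print(index, issue)
```

Running a check like this before each dashboard refresh is cheap, and it is exactly what gives governance leads confidence that the first dashboards are telling the truth.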
What changes for trustees and executives when tracking improves
The biggest governance gain is not time saving, although that usually happens. The real gain is clarity.
Trustees can see whether risk is reducing, rather than relying on narrative reassurance.
Audit and risk committees can challenge based on evidence-linked trends, not a single meeting’s snapshot.
Executive teams can spot bottlenecks, such as recurring delays in a particular function or school, and address capacity or process design issues early.
Annual reporting becomes easier to defend. The Handbook expects trusts to submit the internal scrutiny summary report by 31 December and confirm in the governance statement which internal scrutiny option has been applied and why. (GOV.UK) A platform that links work delivered, findings raised, actions taken and verified closure provides a much stronger evidence base for that narrative.
If you are trying to build an assurance culture across a growing MAT, this is one of the most practical steps you can take. It helps good people prove good work, and it helps the board focus on risk outcomes rather than document chasing.
How internalscrutiny.co.uk can help
internalscrutiny.co.uk supports trusts using MYAUDIT.school to combine practical scrutiny delivery with strong compliance tracking discipline. We help teams set up a workable model quickly, align workflows with audit and risk committee expectations, and build reporting that highlights verified closure and recurring themes clearly.
You can explore the platform through our MYAUDIT.school service page, register directly via the Register page, or discuss trust-specific implementation via the Book Audit page.
Sources
Checked on 24 February 2026.
- GOV.UK, Academy trust handbook 2025: effective from 1 September 2025