
From daunting to demystified: The evolution of Service Assessments in DBT


Emily Agnew

Matt Sellars

What if a service assessment was not something to fear, but – dare I say it – something to embrace, with confidence?

Over the past year, we have been reshaping how service assessments work in the Department for Business and Trade (DBT). What started as a formal requirement has evolved into a more collaborative, human, and forward-thinking process. It is about helping teams build better services, not just pass a test.

In this blog, we will share how we have matured the process, and what we have learned. We will also talk about why assessments are now less about ticking boxes, and more about building confidence, capability, and continuous improvement.

Where we started

When service assessments were first introduced in DBT (formerly the Department for International Trade) in December 2021, they were viewed as a formal requirement – something to get through, rather than something to learn from. The process was new, the expectations were unclear, and for many teams, the experience felt intimidating. There was a real fear of “failing” in front of senior stakeholders, especially for complex, policy-led services. Assessments were one-size-fits-all events and perceived as high-stakes.

When we first picked up the reins of the service assessment process last year, it felt like we were assembling a plane mid-flight. Teams were unsure how to prepare, what to expect, or even whether the assessment would reflect the realities of their work. Engagement was low because the process was reactive and treated more like a compliance hurdle than a meaningful checkpoint. Assessments were booked late in delivery and teams scrambled to prepare slide decks that tried to tick every box. It was not uncommon for teams to feel like they were being judged, rather than supported.

In short, the early days of service assessments were marked by uncertainty and anxiety. There was a sense that the process was something to survive, rather than something to embrace.

Culture of continuous improvement

Fast forward to today, and service assessments feel like a completely different experience. What was once a daunting hurdle has evolved into an adaptive process that supports teams to improve, learn and grow. We have built a culture of continuous improvement that embraces progression over perfection.

The new ratings system reflects this shift. Services are no longer rated ‘met’, ‘not met’ or ‘met with conditions’, but Green (met), Amber (not met, not critical) or Red (not met, critical). The Amber rating allows services to continue delivery while addressing the areas that need improvement. It keeps momentum without compromising standards, showing that assessments are no longer just a compliance tick box, but an opportunity to support the direction of travel.

This has encouraged a more open, constructive and conversational approach, reframing assessments as opportunities for learning rather than judgement. Assessors are now encouraged to take a more human approach: bringing empathy, curiosity and a genuine desire to support teams. This allows assessments to feel less like a tribunal and more like a conversation with a critical friend. It focuses on helping services move forward, not catching them out.

Early engagement has also been key. By working with service teams right from the outset, we have supported assessment preparation through embedded feedback loops. We have introduced assessor focus groups, participated in retros and even added a show and tell to our pre-meet calls. All these came from listening and adapting to user needs, reflecting our commitment to improving the process for everyone.

We have strengthened our relationship with the Government Digital Service (GDS), allowing us to trial new ideas which inform wider practice across government. Collaborating on training materials has helped grow our assessor community, with quarterly training for colleagues in both DBT and other departments. This ensures training is streamlined and consistent with GDS’s vision, increasing assessor confidence and building a better-connected community of practice.

The Service Standard is now a guiding light from day one, not just a checklist at the end. The results speak for themselves: since September 2024, compliance has increased from 50% to 76%. Our biggest increase is around accessibility (Service Standard 5), where compliance has grown from 14% to 78%. Some Standards, such as 9 and 11, are now met 100% of the time – an achievement that felt out of reach a year ago. This is all thanks to the culture we have embedded, sustained by a growing community committed to doing things better, together.

Proportionate assurance

Not every service needs the same level of scrutiny, and that’s okay: assessments are not one-size-fits-all. We recognise this and take a proportionate approach to assurance, offering flexible options that fit the context. Whether it is a non-transactional service or a complex service with multiple user journeys, proportionate assurance enables early intervention and course correction without blocking delivery. This helps teams stay on track even under severe time constraints or stakeholder pressure.

Our aim is for a mindset that is less of a blocker and more of an enabler. We support teams to set up for success by surfacing risks early, including encouraging Alpha reviews within the first 8 weeks. Even when teams are still shaping their direction, these early assessments can flag issues to senior stakeholders (such as under-resourcing). This enables timely course correction before code is built and change becomes more complex. We also offer pragmatic solutions such as regular assurance reviews. These are used to maintain alignment with the Standards across multiple structurally similar user journeys, without the burden of repeating full assessments.

This approach helps us strike a balance between rigour and flexibility, while supporting teams to meet the Standard.

What's next?

The journey from formal requirement to collaborative practice has been transformative, but we’re not done yet.

We are continuing to iterate the process based on what teams and assessors tell us they need. That means refining our guidance, learning where to focus support, and evolving our communications to make assessments clearer, more consistent, and more valuable. Their feedback is vital in shaping the future of assessments and strengthening the culture of continuous improvement.

As we deepen our analysis of how services are meeting the Service Standard, we’re working towards a future where everyone embraces service assessments, with confidence.  

If you are interested in how we adapted our service assessment approach, please contact DDaTAssurance@businessandtrade.gov.uk to learn more.

Check out our recent blog from one of our product teams who found a Service Assessment to be a positive learning experience.
