Navigating the GDS Jungle

by Alex Malloy 

How do you get your government website through a GDS (Government Digital Service) assessment? With so many rules surrounding the GDS assessment criteria, it can be hard to know where to start. Here’s how you can navigate the key tenets to get it right from the outset.

“It’s like a jungle sometimes, it makes me wonder how I keep from going under.”

Grandmaster Flash

Ah! The wonderful wisdom of Grandmaster Flash. While it’s highly unlikely that he was referring to the pitfalls and perils of getting your government website through a Government Digital Service (GDS) assessment, the sentiment more than applies to this often complicated and confusing process.

GOV.UK lists almost 300 UK government websites that provide essential services to the public, to businesses, or to other parts of government. These belong to ministerial and non-ministerial departments, agencies and public bodies. The vast majority of these websites went through a rigorous vetting process before the great British public were allowed anywhere near them.

These websites were subjected to a set of far-reaching guidelines known as the Digital by Default Service Standard. The Standard is a huge repository of information and technical resources for teams working on government websites, covering everything from initial user research through to having the site tested by the responsible government minister. Ultimately, it helps digital teams deliver world-class products.

Every government website is assessed against a set of 18 GDS criteria. With so many rules and restrictions, it can be daunting to know where to start. There are several key tenets of the Standard that you MUST get right from the outset, as you simply won’t be able to backtrack and retrofit them later.

It’s in the way that you use it

At its core, the Standard is designed to ensure that taxpayer-funded websites meet users’ actual needs, rather than simply matching the website administrator’s perception of those needs, or the government’s need to impart information to users. To that end, you should:

1. Conduct a Discovery phase in which the users’ needs are analysed and properly understood. This usually involves workshops with user-facing stakeholders and, crucially, the users themselves. It also involves assessing the different types of users and the services already being provided to them. The outputs of the Discovery phase will determine the scope of the project as you begin to build your Alpha service.

Discovery outputs should include:

  • A problem definition
  • An understanding of users
  • An understanding of constraints
  • A list of ideas and the team required for the Alpha phase
  • Measures of success

2. Conduct user research throughout the project, not just at the start. GDS will demand evidence, gathered through continued testing, that your service has remained true to its users’ needs. One of the most fascinating types of testing is lab-based user testing, in which typical users are asked to perform a set of tasks on the work-in-progress website, often with unexpected results and frustrations. This type of testing never fails to highlight problems with your user experience (UX) that can then be quickly addressed. Other types of user research include focus groups, one-to-one interviews and surveys.

3. Produce clear deliverables in your Alpha and Beta phases.

Alpha outputs should include:

  • Lo-fi prototypes, and possibly hi-fi prototypes
  • An understanding of accessibility needs and principles
  • Evidence of initial user testing and of the service’s feasibility
  • An understanding of existing and legacy technology relevant to the service
  • An idea of the objectives and team for the Beta phase

Beta outputs should include:

  • Demonstration of key user journeys
  • Demonstration of metrics and of the expected scale of the service
  • Evidence of overcoming the constraints of a Live service
  • Privacy considerations
  • A dashboard for measurement of service KPIs
  • A number of hi-fi prototypes or a scaled down Beta service

Agile, iterative and user-centred methods

The user research mentioned above is channelled into a number of user stories that form the product backlog. The delivery team works in dedicated “sprints”: periods of uninterrupted work, usually lasting two weeks, during which a set of stories from the backlog is developed. At the end of each sprint there is a new iteration of the service that can be released and, crucially, used for testing. Users’ feedback is then channelled back into the product backlog before the next sprint begins.

Like BrightLemon and the occasional gazelle, the GDS team loves the agile way of working. Here is a video from GDS explaining their take on agile, how it compares with older forms of project management, and why feedback from real users is crucial.

Useful links

BrightLemon