The five-point guide to passing a GDS alpha assessment
Clearly understanding your MVP, managing ambiguity, and involving assessors in your project are among the top tips offered by Christina Ehlers of BJSS SPARCK
- Set up for success
Starting a project that will go through a GDS alpha assessment can be daunting. Often there's a new team with little first-hand knowledge of the insights gained during the discovery phase. Previous user research can be time-consuming for the team to wade through – and at a point when everyone is keen to get the development team coding.
One way to get everyone up-to-speed quickly and focused on understanding user needs is by conducting an 'inception workshop.' Using a more playful approach and including role play, user stories and oversized artefacts helps bring key project information to life and energise the team.
Another way is to get the team out of the building, meeting end users and experiencing the 'as-is' journey for themselves. Seeing users' pain points first-hand leads to a much greater level of empathy, and the team gains a real sense of purpose when they play back their observations, helping to connect them with the people they're creating for.
- Clearly define the MVP
By the end of alpha, you should have a clear definition of what the MVP will be. This means thinking about it upfront: at the start of alpha, it is crucial to ensure that the team and the main project stakeholders understand and buy into the true definition of a minimum viable product or service.
Developing an understanding and agreement early in alpha as to what truly constitutes an MVP will save time and ensure expectations can be set and met – including those of your GDS assessors. During alpha, you will then explore what minimum scope makes sense for users. By doing this from a user's perspective and keeping accountability for decision-making with the product owner, you can avoid unnecessary scope creep.
During alpha, there are a number of avenues to be explored. However, this often ends up expanding the MVP scope as new ideas surface, excitement grows and more and more "critical" functions are defined. The result is what is sometimes referred to as the "minimum acceptable service" to the business, including features and functions that those involved in the decision-making are too afraid to leave out. While a degree of flexibility is required as the product is shaped by your user research results, some boundaries should be agreed in order to protect the scope.
You don't have to have prototyped every one of your ideas by the end of alpha, but you should have a clear view of which are worth taking forward into beta.
Don’t forget: the sooner you get something out there, the sooner you can learn from real users in a real environment. This doesn’t have to be what the business considers the full service, doesn’t have to serve all users, and doesn’t even have to all be completed online – it just has to be enough to ensure users can reach their goal, and you can then learn from the experience.
An MVP exists to help you learn – it is not just about getting a lean service out into the world. Don't fall into that trap.
- Understand how much user research is enough
Empathising with users and identifying their needs and pain points is key to user-centred design, but it can be challenging to decide when you've done enough research. There are no defined rules that say when research can stop – it's all about confidence. Confirming insights from a previous phase, for example, or seeing recurring patterns, are signs that you have enough evidence to make informed decisions. Ask yourself how you would justify your decisions: why did you go with that design? Do you have a plan to further increase that level of confidence?
It is important that your research panel is as inclusive as possible. You'll need to include candidates with varying needs and comfort levels when it comes to using technology. The service should be accessible to users with cognitive, sight, hearing and physical impairments, and needs to be tested with assistive technologies. Specialist companies can help find the right candidates and carry out an accessibility audit of your service.
- Manage ambiguity
Sometimes GDS guidelines don’t have the pattern for what you need to build, and your designers need to fill in the gaps.
For example, BJSS was asked to develop an app for a GDS-assessed service. The existing style guide was incomplete and did not cover app design at all. Our solution involved researching other design guidelines that could be adapted to a GDS look and feel. Working collaboratively with the GDS team ensured future consistency and helped us pass the GDS assessment, since we had evidence and prior engagement for all of the design decisions we took during the project.
- Involve the assessors
It's important to ensure stakeholders and project teams understand what GDS is and why it's important, and that everyone is encouraged to take ownership of passing the assessment.
One way to do this is through regular 'GDS health check' workshops to assess the team's confidence levels against each point of the Service Standard. This helps the team to identify issues early and put actions in place so that they can be rectified well ahead of your assessment date.
Also: engage with your assessment team early, and involve them in your journey – they want you to pass.
Invite them to meet the team, talk them through your approach, tell them about your challenges and most importantly ask for feedback.
After all, GDS is a team sport.
This article is part of PublicTechnology's How to Design a Government Service project, in association with BJSS, featuring exclusive interview, feature, and analysis content dedicated to the art of delivering digital services for citizens and public sector professionals – from the earliest stages of discovery, right through to maintaining live services in use by millions of people.