Apply for a Budgeting Loan - Beta Assessment

The report from the beta assessment for DWP's Apply for a Budgeting Loan service on 3 October 2016.

Service provider DWP
Stage Beta
Result Met

The service met the Standard because:

  • The team have built a digital apply product that is part of a wider loan service. It is well researched, has been iterated in an agile way to meet user needs and has been consistently improved, including significant enhancements since the alpha phase.

  • A strong digital team is in place that is working in an agile way.

  • The team have learnt from the private beta, with over two thousand successful applications completed. The time from completing an application through to processing and decision making has been reduced significantly, and the service is delivering clear improvements both for users and business operations.

  • Policy, operations and key stakeholders are part of the product development and the team, resulting in notable successes such as the removal of a number of unnecessary questions and data items.

About the service

Service Manager: Zoe Gould

The service ‘Apply for a Budgeting Loan’ enables eligible customers to apply for a loan, and this new part of the service allows users to do this online. A Budgeting Loan is an interest-free loan that is usually paid back from weekly benefit, over a period of up to 104 weeks.

Detail of the assessment

Lead Assessor: Julian Dos Remedios

Based on the existing departmental structure, and for operational reasons, the wider paper application service was not in the scope of the assessment.

User needs

The private beta has operated since February and has expanded to a number of locations. The team demonstrated a good rhythm of user testing and there was strong evidence of the customer journey being significantly improved through iterations. A prototype for the next phase is well developed and well researched to ensure the service continues to develop.

The team have developed support for those who may need help to use the digital service, including ensuring assistance will be provided by operational staff at over 700 locations and by a third party (Citizens Advice Bureau). They will use a separate URL for those receiving assistance, both to obtain further data and to assist research.

Whilst the team have some new ideas for further development, both for the product and the wider service, the panel felt that any new products such as webchat should be well researched before implementation, in line with best practice.

Similarly, the panel felt that the team’s ideas for improving the service must also follow GDS standards and guidance. Any work to change the wider service should be assessed with all elements of the service being brought together under one service manager and team.

Team

The panel was very impressed with the level of knowledge and commitment shown in delivering the best service possible. A strong agile team is in place, notably with little reliance on external resources, and the team are developing and nurturing internal capabilities which will make a valuable contribution to the wider digital strategy of the department.

Key roles are well established, with the product owner, service manager and interaction designer demonstrating a clear vision for the service. User research is well evidenced and the team are performing strongly even though they are split across four different locations.

The team have worked extensively with a range of stakeholders, ensuring that operational colleagues are part of the development and can see the benefits of a user-focused design. Key achievements, such as removing the need for a user to input their National Insurance number, demonstrate the benefits of this approach in prioritising user needs over internal assumptions.

The panel was impressed with the team, their knowledge and focus on delivering a quality service for users and broader understanding of the business context and operational priorities.

Technology

The team have taken a pragmatic approach to technology, building a front-end system for passing applications to the existing back-end system for processing by operations staff, reusing a common DWP approach for hosting and a centrally supported gateway for passing data across security boundaries.

The team have made great strides in reducing the amount of information a user needs to provide to apply for a loan, with minimal data at rest in queues. However, this places an additional burden on operations staff, in particular when matching an application to existing user data. The panel noted the high success rates seen by the team in matching data during the private beta, but is concerned that the matching of addresses and the speed of manual processing for every application may not scale as volume grows during the public beta.

The team have drawn on expertise within DWP for security, and should continue to work with security experts, in particular on threat analysis for fraud, given that the system makes lump-sum payments (though to known users of DWP services, with an out-of-band process for accepting the loan, and no handling of changes of circumstance).

The panel understands why aspects of the back-end system, in particular modules using DWP libraries and APIs for interfacing with legacy systems, have only been made available in code repositories internal to DWP. The team have made the front-end code available and will continue to make sure it is maintained and publicly available. This should be extended beyond the front end unless there are strong reasons for not releasing all code.

Design

The team have shown that most users can get through the service easily. We feel more research should be done with users who are not familiar with DWP terms, for example people who are new to benefits. The language used throughout the service is quite DWP-specific, so testing with this user group will give a better understanding of whether the service is easy for everyone to use. The team should also continue to make sure users understand some of the vaguer questions, such as ‘Do you or your partner owe any money to your landlord?’, and to push back on the need for all the current questions; it was mentioned during the assessment that some of them could potentially be dropped.

The team have done a good job of following the GOV.UK frontend style. They have also held workshops with the accessibility teams from GDS and DWP, and are aware of the need to further improve accessibility.

The team mentioned some features they are introducing based on user needs, such as ‘check your answers’. It’s important that they introduce this carefully and understand the implications for the user journey: if a user needs to amend an answer midway through the service, how do they then get back to the review screen? The team also mentioned implementing webchat. It’s important that they base these decisions on user needs, and not just on what can be implemented.
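To make the ‘check your answers’ concern concrete, a minimal sketch of the return journey is shown below. This is purely illustrative, not the team’s implementation; the question names and the dependency map are hypothetical. The idea is that after amending an answer the user goes straight back to the review screen, unless the change means a dependent question now needs answering first.

```python
# Hypothetical question dependencies: amending 'has_partner' may force the
# partner question to be re-asked before returning to the review screen.
DEPENDENCIES = {
    "has_partner": ["partner_owes_landlord"],
}

def next_page(changed_question, answers):
    """Decide where to send the user after amending an answer mid-journey."""
    for dependent in DEPENDENCIES.get(changed_question, []):
        if dependent not in answers:
            return dependent          # re-ask the now-required question first
    return "check-your-answers"       # otherwise go straight back to review

# A user who now reports a partner must answer the partner question first.
print(next_page("has_partner", {"has_partner": "yes"}))
# → partner_owes_landlord
```

The design choice sketched here keeps the amend journey short by default, which is the behaviour users typically expect from a review screen.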

Although the service team are best placed to understand and manage the risks of a dramatic increase in users in public beta, the relatively low numbers in the private beta, which benefited from controlled locations and local operational support, may not fully demonstrate the complexities and challenges of a full rollout at hundreds of sites.

The team acknowledged these risks and had options to manage them, but also recognised that interest in the service could spread quickly by word of mouth and not be easily controlled. Combined with a seasonal spike and an insufficient sample size to have strong confidence in key processes such as matching, the likelihood and impact of some risks may need close management.

There are some design tweaks that should be considered, which will be sent in a separate email by the design assessor.

Analytics

The Performance Platform and KPIs are in place, and the team have made good use of Google Analytics. The team are also developing more qualitative data, including the use of a separate URL for users receiving assisted digital support to give additional insight, and are obtaining permission from users to conduct further research to ensure wider service outcomes are met.
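The separate assisted digital URL lets completions be segmented without any extra tracking fields. As a hedged illustration only (the paths and event shape below are hypothetical, not from the service), segmentation by entry path might look like:

```python
from collections import Counter

# Hypothetical entry paths: the assisted digital journey uses its own URL,
# so analytics events can be segmented purely on the path.
SELF_SERVE_PATH = "/apply"
ASSISTED_PATH = "/apply-with-help"

def segment_completions(events):
    """Count completed applications by journey type, keyed on entry path."""
    counts = Counter()
    for event in events:
        # Check the assisted path first: it shares the "/apply" prefix.
        if event["path"].startswith(ASSISTED_PATH):
            counts["assisted"] += 1
        elif event["path"].startswith(SELF_SERVE_PATH):
            counts["self_serve"] += 1
    return counts

events = [
    {"path": "/apply/confirmation"},
    {"path": "/apply-with-help/confirmation"},
    {"path": "/apply/confirmation"},
]
print(segment_completions(events))  # Counter({'self_serve': 2, 'assisted': 1})
```

Keeping the segmentation in the URL, rather than in a per-user flag, means standard analytics tooling can report on each journey with no changes to the data collected.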

Recommendations

To pass the next assessment, the service team must:

  • As a condition of point 8, and linked to a wider departmental approach, ensure that code is made publicly available under a suitable licence by default.

The service team should also consider the following recommendations:

  • Continue to scale the service up in an incremental way to ensure risks are being effectively managed.
  • Longer term, combine the paper and digital services; the back-end processes should be improved and automated to deliver additional benefits.

  • The minimal approach to the technical team has worked well for the minimal application service assessed for this public beta. A fuller technical team, with a different architecture and support structures, will probably be needed if the service is expanded to include some of the features discussed during the assessment, not least accessing information about the loan from the back-end system.

  • The front end depends on JavaScript and uses non-persistent URLs; the team reassured the panel that the service is evolving to use progressive enhancement.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 22 December 2016