Moving to Alpha with the Blaenau Gwent Waste and Recycling project
Written by: Cory Hughes

Last week we were excited to share some of the progress we’d been making around new ways of working, digital service design, and how we supported Blaenau Gwent Council in recruiting their first digital apprentice, Luke.
When Luke joined the Council, we’d already established good working relationships with the teams that would be involved in this work. We’d started looking at their business data and working out what we could use to help improve the service.
In this blog, we’ll delve a little deeper into how we worked with data and the residents of Blaenau Gwent to really understand where we could make improvements.
Speaking to users generates ideas
When we first set out to make things better for users, we knew we needed to really understand the types of problems they were experiencing. The first step was defining the “user” – not just the end user (the customer), but users at any point in the service, including those on the front line and those in the back office.
There is no substitute for real conversation – so we quickly developed Welsh and English social media ads to recruit people for interviews about what they felt was most important to improve. In parallel, we took time out to explore the existing processes. We started by developing service blueprints, which got us about 80% of the way to understanding the processes – but they also led us to identify gaps: gaps that could be causing users problems or falling short of their expectations.
So we took the time to go out, face to face, with front line employees to experience first-hand how new equipment is ordered and delivered. This learning helped us ‘fill in the blanks’ in the service blueprints and gave us some real-world context for the changes we could make to improve the service.
Survey bots are a real pain in the…
When we first ran our recruitment form via social media, we were delighted to see the number of applicants soar – until we looked at the quality of the responses and found that some of the 900 or so responses – okay, 90% of them – were from bots trying to claim the Amazon vouchers we were offering to participants.
We learned, however, that this can easily be overcome by creating a landing page on our own site and embedding the form on that page. Adding this one simple step let us give further details about the interview and confused the bots that were actively seeking out social media posts with direct links to survey platforms. The result: zero bots answering our next recruitment campaign, letting us focus on what we really love – real people.
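To make that idea concrete, here’s a minimal sketch of the landing-page approach in Python using Flask. The route, the page copy and the survey URL are all illustrative assumptions rather than our actual setup – the point is simply that the social media post links to a page on your own site, and the survey platform’s form is only ever embedded within it.

```python
# A minimal sketch (using Flask) of the landing-page idea described above.
# The route name, page copy and survey URL are illustrative assumptions,
# not the real Blaenau Gwent setup.
from flask import Flask

app = Flask(__name__)

LANDING_PAGE = """
<h1>Help us improve waste and recycling services</h1>
<p>Interviews take around 45 minutes and participants receive a voucher
   as a thank you. Read the details below, then fill in the form.</p>
<!-- The survey form is embedded here rather than linked to directly
     from social media, so bots scraping posts for survey links
     never see the survey platform's URL. -->
<iframe src="https://example-survey-platform.com/embed/recruitment-form"
        width="100%" height="600" title="Recruitment form"></iframe>
"""

@app.route("/waste-research")
def landing_page():
    # Serve the context page with the embedded recruitment form.
    return LANDING_PAGE

if __name__ == "__main__":
    app.run()
```

The social media post then points at the page on your own site (here, the hypothetical /waste-research) rather than at the survey platform directly.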
Covid and social distancing weren’t a barrier to user research
We adapted our approach to conduct user research with real users in real time, building on the skills we’ve all picked up since the pandemic began. With clear guidance and a little hand-holding, we were able to gather some really valuable insights from users about the existing processes. With a bit of imagination and a little tech, we were also able to share the interviews live with stakeholders in the business, exactly as we would have done using an ‘observation room’ if we weren’t in a pandemic.
It worked so well that we realised it probably adds more value than doing it in person. The user is more comfortable and natural in their own environment, on their own sofa – it’s probably more ‘realistic’ in that respect. Observation is easier and less intrusive, allowing others to draw their own conclusions and make their own points. And because it’s recorded, we could spend less time taking notes and more time experiencing the customer journey as users carried out transactions on the Blaenau Gwent website.
Data and user research need to work together to make a whole story
Listening to users is great, and it gets even better when you can handle business and operational data in a way that helps tell a story. During the discovery phase we learnt that collating business data into a database with consistent formatting makes it far easier to look for trends and to check parts of what we’d heard in the customer interviews. It seems an obvious thing to say – but how many people reading this have masses of data kept in disparate spreadsheets, with semi-functional macros that don’t quite work, written by someone who’s since left the business? Sound familiar?
We used PostgreSQL to import and analyse data covering most of the last four years. It showed us the shift between phone and web ordering over time, the numbers of duplicate orders, excessive orders, and delivered versus non-delivered orders – and, fundamentally, combined with the user research it gave us a clear problem to solve in the next delivery stages.
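To give a flavour of what that looks like in practice, here’s a small sketch of the kind of queries we mean, run from Python against PostgreSQL. The schema (an orders table with order_date, channel, household_id and item columns) is an illustrative assumption, not the actual Blaenau Gwent data model.

```python
# A sketch of the kind of analysis a single, consistently formatted
# PostgreSQL table makes easy. The "orders" table and its columns are
# assumptions for illustration only.
import psycopg2

# Orders per year split by channel, to see the shift from phone to web.
CHANNEL_SHIFT_SQL = """
    SELECT extract(year FROM order_date) AS year,
           channel,
           count(*) AS orders
    FROM orders
    GROUP BY 1, 2
    ORDER BY 1, 2;
"""

# Households that ordered the same item more than once over the period:
# a rough proxy for duplicate or excessive orders worth investigating.
REPEAT_ORDERS_SQL = """
    SELECT household_id, item, count(*) AS times_ordered
    FROM orders
    GROUP BY household_id, item
    HAVING count(*) > 1;
"""

with psycopg2.connect("dbname=waste_service") as conn:
    with conn.cursor() as cur:
        cur.execute(CHANNEL_SHIFT_SQL)
        for year, channel, orders in cur.fetchall():
            print(int(year), channel, orders)

        cur.execute(REPEAT_ORDERS_SQL)
        print(f"{cur.rowcount} household/item combinations ordered more than once")
```

Even simple aggregates like these were enough to check what we were hearing in the interviews against what the operational data actually showed.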
So what next?
We’re lucky to have been joined by a new delivery lead for the project – James. A fresh pair of eyes on the project, plus the whole team answering James’s questions as he gets up to speed, helped us take stock. We know we could always have done more research and looked into even more data – but in terms of adding value early, as agile delivery aims to do, we’re in a great place to move forward and develop prototypes for improved services with a new look and feel.
We’ve already decided on a set of four solutions to progress into prototype, design and build, which we’ll explore in more detail in our next post.