Reed Exhibitions
Lead UX Designer
2012-2016
Responsible for:
- Customer-centric design
- Coaching and hiring designers
Outputs:
- Global customer panel
- Set up and chaired a global best practice forum
- Prototyping and usability testing
Reed Exhibitions partnered with ThoughtWorks, a consultancy specialising in agile software delivery, to help deliver this part of the wider programme.
I was responsible for helping bring customers into our development process. I proposed a two-arc insights approach: Foundation, which is essentially the business-as-usual work on things we know are working and need refinement, and Extension, which draws on the more successful experiments from our longer-term visionary research into innovation and new market exploration.
What I inherited
Reed Exhibitions was a product-centric organisation, which is to say the strategy was to build the best product by introducing new features, with success measured by delivery. We had a product with incredible capability, only a fraction of which was used or understood by the event teams. We decided to move towards a customer-centric approach, a strategy built on meeting customer needs and solving their problems, measured by NPS.
Gather insights
Our approach to insights was first to identify the questions we still needed answered: what can we learn from customers at the event itself, what can we learn from customers remotely so that we reflect our global presence, and how can we use a customer panel as a pool of potential design partners?
MVP Research
We prioritised the questions or tasks we were least sure about, whether in a User Story or an Epic. We would then check whether these could be answered from within the business; if not, we would formulate a research plan to get those answers as quickly and easily as we could and feed them back into the MVP delivery plan.
Setting up Testing Thursdays – we did this to ensure our development process included the customer throughout. As issues were captured, we would iterate based on technical and business feasibility and re-test.
Customer journeys have many touchpoints, so story-level interactions or concepts are often presented to the customer within the context of a task journey. As a result, we might discover issues or opportunities in areas we hadn’t considered in the delivery plan. To handle this, we came up with a prioritisation framework that looked at impact and size to decide whether to respond in the current sprint, pick it up in the next one, or add it to the backlog.
Prioritise feedback during a sprint
What gets developed versus what gets put into the backlog during the sprints
Once an issue is captured, it is impact scored against:
- Risk – the probability of critical errors
- % of users impacted
- Business value
- Customer value
Along with delivery size:
- Small – it can be easily absorbed into the sprint and the affected story
- Medium – it will require another story and will affect other stories
- Large – it will require another story and will affect many planned stories
Based on this, the product managers could prioritise its inclusion.
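As a rough sketch of how that framework works in practice, the snippet below scores a hypothetical issue and suggests where it should land. The field names, weights and thresholds are illustrative assumptions rather than the values we used; in reality the scoring was done by product managers during sprint planning, not by a script.

```python
from dataclasses import dataclass

@dataclass
class Issue:
    risk: float            # probability of critical errors, 0.0-1.0
    users_impacted: float  # share of users affected, 0.0-1.0
    business_value: int    # 1 (low) to 5 (high) - illustrative scale
    customer_value: int    # 1 (low) to 5 (high) - illustrative scale
    size: str              # delivery size: "small", "medium" or "large"

def prioritise(issue: Issue) -> str:
    """Suggest where an impact-scored issue lands: current sprint, next sprint or backlog."""
    # Fold the four impact dimensions into one score (weights are made up).
    impact = (issue.risk * 5 + issue.users_impacted * 5
              + issue.business_value + issue.customer_value)
    if issue.size == "small" and impact >= 8:
        return "current sprint"   # easily absorbed into the affected story
    if issue.size in ("small", "medium") and impact >= 5:
        return "next sprint"      # needs another story, so plan it in
    return "backlog"              # large or low-impact items wait

# A small, high-impact usability issue captured on a Testing Thursday
print(prioritise(Issue(risk=0.6, users_impacted=0.4,
                       business_value=4, customer_value=5, size="small")))
```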
Onsite – At an event
We attended shows to capture general jobs, pains and gains, and to demo prototypes for feedback. Rather than starting with our solution, which might steer customers towards what we think is important, we opened with general, show-agnostic questions: what was their agenda for the day, how did they prepare for it, and what challenges did they face along the way. This allowed us to capture what matters to each of our customer segments before showing them prototypes that may or may not address some of those issues.
Research booth at International Jewellery London 2016
Remote research
These were mainly task-based usability studies to help us reach wider segments, using a mix of pre-recruited participants who were not Reed exhibitors. We mirrored the structure of our face-to-face interviews: general warm-up questions about their objectives when accessing the service, followed by general show questions on how they prepared for the show, what challenges they faced, and whether there was anything they wished they could do.
Remote unmoderated feedback using UserZoom
Developing a customer panel
This was to let us bring customers on as design partners and extend our research capabilities. I wrote the business case for developing the panel and shortlisted from five possible agencies. We recruited a panel specialist to manage the process, and I worked closely with them to ensure UX research could be covered in a timely and cost-effective way.
We included an opt-in function on research programmes already planned by the various business units, allowing visitors and exhibitors to become our design partners for this central programme.
Exercises planned included…
- Card sorting and tree testing – helps identify taxonomy around site navigation and categories (quant.)
- Ethnographic field studies – participants opt in to be either passively tracked as they move around a show or record aspects of their tasks before, during and after a show (qual.)
- Focus groups – facilitated discussion for customers to explore new concepts or general jobs, pains and gains (qual.)
- Usability studies – provide tasks for the customers and see if they can complete them – tests the UI, layout, navigation and language (qual.)
- Preference studies – participants are shown different designs and asked to associate a set of attributes with each design (can be qual. or quant.)
- A/B testing – assigning groups to different designs and monitoring which is more effective on a given task or conversion (quant.; a sketch of the analysis follows this list)
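To give a feel for the quantitative side of an A/B test, here is a minimal sketch that compares conversion rates between two design variants with a two-proportion z-test. The numbers and variant split are invented, and in practice a platform such as UserZoom or an analytics tool would report significance for you.

```python
from math import sqrt
from statistics import NormalDist

def ab_test(conversions_a: int, visitors_a: int,
            conversions_b: int, visitors_b: int) -> float:
    """Two-proportion z-test: two-sided p-value for
    'variant B converts differently from variant A'."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Invented example: 120 of 1,000 visitors convert on design A, 150 of 1,000 on design B
p_value = ab_test(120, 1000, 150, 1000)
print(f"p-value: {p_value:.3f}")  # below 0.05 would suggest a real difference
```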
What I learned
Time was always against us. Working in lean teams, we had to interpret existing research and insights, translate them into hypotheses and visualise the solutions. On top of this, we also had to work out how to learn from and validate our ideas: prepare the artefacts (in some cases prototypes), design the test script and discussion guide, conduct the research, and analyse the results.
Having Testing Thursdays in place put pressure on this process, but it also pushed us to find the quickest ways to validate. The challenge that hit us hardest, however, was getting hold of customers. As part of the business case for a customer panel, we experimented with recruiting small groups via UserZoom's recruitment service, which worked well and taught us how often we would use a panel and what we would do with one once it was in place.