Recently, we blogged about our research on how users feel about webchat, and the way we used roleplay to test human interactions in a webchat.

Our third piece of research took us out of the lab.

This post looks at our approach to user research with government and commercial contact centres across the country.

Finding our users

Once we’d defined the scope of our alpha, we identified three broad user groups across government who use webchat:

  • contact centre staff, for example webchat advisers, webchat managers and webchat implementation teams
  • service teams
  • citizens

We started our research by speaking to contact centres and service teams in departments that were already using (or thinking of using) webchat. We then travelled up and down the country visiting these contact centres.

To get a better view of the market, we also spoke to a commercial contact centre.

The human side of webchat

We observed webchat advisers handling live chats, and webchat team managers supporting and guiding their work. We saw first-hand the range of webchats advisers and managers deal with on a daily basis. This gave us valuable insights into the types of challenges faced by our users.

[Image: a webchat adviser sits at their desk, reading from a computer monitor]
A webchat adviser

We mirrored the position and behaviour of the advisers during the observations. We did this to make the advisers feel at ease with us being there (we didn’t want to interrupt them during live chats). For example, if advisers were working while sitting down at their desks, we also sat down at their desks with them. If advisers were working at standing desks, we stood with the advisers at their desks.

We wanted to make ourselves inconspicuous and we didn’t want advisers to feel like we were watching them over their shoulder. So we kept our distance, but remained close enough to see what was happening on screen and how chats were handled.

We resisted asking questions while we were observing – we didn’t want to interrupt them or stop them from doing their job.

We learned, we returned

We followed our observations up with a second site visit, during which we held one-to-one interviews with webchat advisers and managers. We wanted to know not only how chats were handled, but also how the webchat team was managed and how the contact centre operated.

Before each site visit, we created discussion guides for the one-to-one interviews with webchat advisers and managers.

We based these guides on what we’d learned about each user group during the webchat discovery phase.

Asking the right questions

Webchat managers and webchat advisers have different roles, so we asked them different questions. We asked about how webchat was used and managed, and asked both groups how they felt about working on this type of support channel.

Here are some of the questions we asked webchat advisers:

  • how many webchats can you usually handle at once?
  • what information do you get about the user?
  • what happens if you can’t answer a question?
  • overall, what concerns do you have about using webchat?

Here are some of the questions we asked webchat managers:

  • how are teams and advisers organised?
  • how do you analyse why people are getting in contact?
  • how do you think webchat is measured in terms of success?
  • how do you think the contact centre has changed to support webchat?

Each interview was conducted by a user researcher. Another member of the team was also present to observe and transcribe the interview.

What did we learn and how?

After each interview session, we wrote a summary of the most interesting points. We tried to keep summaries to no more than two thirds of a page, but this sometimes proved tricky as we were finding so many rich and valuable insights.

We analysed all the interview summaries and grouped the data into digestible chunks. By doing so we were able to see common needs and themes in the data gathered.

Useful insights

Here are some of our most common findings:

  • advisers tend to handle fewer chats concurrently when webchats are more complex
  • using legacy IT systems can make security cumbersome and slow webchat down
  • seeing what users are typing before they submit helps webchat advisers communicate faster
  • webchat data is more readily available and tends to be more thorough than telephony data

Research gaps

Although we’ve finished the webchat alpha, there’s still more research that could be done. Costs and charging models, in particular, need further research.

We suggest service teams explore the best way to gather useful metrics to assess how efficient webchat is for their service. Don’t assume this support channel saves money.

Finally, it’s important to note how useful webchat is for shining a light on users’ needs. These valuable insights, captured within webchats, should be used to improve services.

If you’re keen to know more you can read Chris Heathcote’s blog about how the different groups of webchat users feel about this support channel.

And if you would like to see more detailed webchat alpha findings, please register your interest.

Keep in touch. Sign up to email updates from this blog. Follow Emma on Twitter.
