The GP Provider Information Collection (PIC) service enables GP practices to share up-to-date information with CQC and other regulators, describing the practice's view of the quality of care it provides against the 5 key questions as part of the annual reporting process. Specifically, it allows practices to detail what has changed over the year, their plans for improvement and examples of good practice. This contextual information is then shared with inspectors, enabling them to make more informed monitoring decisions and to better target resources where needed.

The current process for collecting this information is for CQC to send an information request via email to GP practices in advance of an inspection. There is no consistent template for completion, which results in inconsistent information and places an additional burden on GP practices.

The GP PIC is designed for monitoring rather than pre-inspection, so it is not replacing any existing data collection. Nor is it information that is held or collected elsewhere.

Department / Agency:
Care Quality Commission (CQC)

Date of Assessment:
10th April 2018

Assessment Stage:
Alpha

Result of Assessment:
Not met

Lead Assessor:
B. Showers

Service Manager:
Uma Datta

Digital Leader:
Iain O’Neil


Assessment report

Outcome of service assessment

After careful consideration, the panel have concluded that the service did not meet the standard because:

  • The expected outcomes of the alpha phase have not been completed. There is still work to do, and a number of gaps would need to be closed before the service can progress to beta – including completing the design and iteration of the questions, adding these updated questions to the alpha prototype, and testing them with users.
  • The panel felt that the PIC had been built as a standalone means of collecting data rather than as part of an integrated end-to-end service for GP performance review. From the evidence provided, it was not evident that the team had thought through how this information would be fed back to the relevant CQC inspector. Without this joined-up thinking it is not possible to see clearly how this service would add value, rather than simply additional burden, for the CQC internal team, inspectors and GPs alike.

Conditions for passing Alpha

In order to progress to Beta, the service team will need to provide the panel with a show and tell that addresses the following areas:

  1. Demonstrate the alpha prototype with all of the iterated questions included
  2. Show how user research has informed the alpha prototype and the questions being asked
  3. Describe how you are exploring the end-to-end journey of the service, and in particular the findings from research with inspectors and the information needs identified. The panel will require a clear narrative articulating how the user journey progresses end-to-end
  4. Articulate the types of needs and journeys that assisted digital users will have, and how those needs will be met at scale

Service assessment

User needs

During Discovery and Alpha the team have clearly developed good insight into their users, their needs, how they use the service and their pain points. However, it was unclear how much of this came from research done in the discovery phase, as opposed to a more nuanced understanding of the users developed as the team progressed into alpha.

All research undertaken in the alpha phase had been effectively communicated, and had involved team members either through observation or by listening to recordings at a later point. Research was undertaken with regard to participants' data privacy, using appropriate data handling and storage.

The team referred to having two primary user types – practice managers and inspectors. However, only individuals who would have a direct interface with the service should be considered users. The team explained that inspectors would be in receipt of a report (or similar) which is produced following the practice manager submitting answers to the service – although inspectors benefit from the service, they would not interact with the service directly. The team have demonstrated an understanding of the needs of the inspectors with respect to assessing the quality of care, which is very important. Going forward, the team must concentrate on the needs of the service users while continuing to check that the answers delivered through the service allow the inspectors to grade the quality of care.

While a fair amount of user research has been undertaken in the four-month alpha phase, the team's research so far has been quite limited in scope, focusing primarily on the content design of the questions proposed for the service. However, the team has not yet finished this process – many of the questions in the prototype have not undergone the content refinement process that the team described at the assessment.

No prototype testing has been undertaken on the service, which leaves the team without a good basis on which to set hypotheses for beta and proceed to further development. The panel would have liked to see evidence of contextual research alongside the content testing undertaken – it is unclear how much new contextual work was done. The panel were pleased to note that the team were continuing to use the research findings uncovered at discovery, but these findings should be built upon at alpha by continuing to test assumptions; it was not clear whether this was the case. The team do not appear to have done any hypothesis testing on the service, other than in relation to the format of questions.

The team explained that they believe there will not be many service users who have low digital confidence/skill, and as such they do not expect to have to develop a support model for Assisted Digital (AD) users. This comment was of concern to the panel – given the policy on digital competence in GP practices, it is unlikely that users will identify themselves to the team as requiring assisted digital support, meaning the team will need to put extra effort into finding them. The team acknowledged that they had already spoken to at least one potential AD user (a user who had expressed an unwillingness to interact with an online service, preferring to interact by email).

The team have had difficulty finding users with accessibility needs or assisted digital needs – but have put in place a plan which they hope will mitigate this in beta. In the event that they are unable to find users with accessibility needs, the team plan to put the service through an accessibility audit (for instance with the Digital Accessibility Centre). This is a reasonable alternative, but the team should be mindful that such audits are usually completed by expert usability testers and as such are not a complete substitute for understanding the user experience. In addition to testing with AD and accessibility users, the panel were pleased to hear that the team had put thought into the next steps for user research in the beta phase, including their aims and the potential methods which would be appropriate.

Team

The panel were impressed by the knowledge and commitment of the team and their ability to articulate a clear vision for the service.

The panel did have concerns about the changes to suppliers and personnel within both Discovery and Alpha. However, it was clear the Alpha team had been able to weather these changes in both suppliers and CQC staff, and had been able to continue working successfully toward their vision. They had also sought to mitigate the impact of these changes through the effective use of tools to ensure outputs and information were captured and shared appropriately.

The Alpha team included the following roles:

  • Service manager
  • Product manager
  • Delivery manager
  • User researcher
  • Technical architect
  • Content designer
  • Business analyst
  • Developers
  • Web operations

The team was distributed, but had been able to use tools such as G Suite, Slack and Jira to communicate effectively and work in an agile way.

The team had also been able to communicate with more senior members of the organisation through show and tells, as they worked in an open and collaborative way.

The service team had also clearly thought about the requirements for the future development of the service and had plans for the roles in Beta.

Technology

The panel were impressed by the team’s commitment to re-using technology from other CQC services. This minimises the amount of effort required for internal CQC developers to scale the service. It has also afforded the team the opportunity to learn from other services, lessening the likelihood of repeated errors and improving the efficiency of development. A particularly clear example is that the team were able to apply the security fixes identified by the penetration test completed on the social care PIC before this service has even gone into beta. This learning was also evident in the fact that there is already an incident management plan in place for critical, major, significant and minor failures.

Also reassuring to the panel was the extent to which the technical architecture of the service allows for flexibility and portability. This has very clearly been thought through to mitigate the risk of vendor lock-in and to create a sustainable service that will withstand future changes to the process and the information required.

In addition, the panel would like to commend the team for their willingness to embrace an open-source approach to development. The team are making excellent use of open-source software such as Drupal, alongside GitHub, and were able to give several clear examples of how they have given back to the open-source community throughout the lifecycle of other PIC projects – a good indication of their intention to do the same during the development of the GP PIC.

This commitment to open source and flexibility reassured the panel that there was a clear thought process behind the earlier selection of technology, even if in this instance the team's choice was constrained by the need to build on the existing architecture.

Finally, it was not immediately apparent to the panel why the team were building separate services for each type of provider that might require an information return, rather than creating a single service with multiple pathways. However, after further questioning it became clear that there was a defensible rationale for this, and that the team had properly considered a single service as a possibility.

Design

The team has reused a successful design from another service in development by CQC (the Adult Social Care Provider Information Collection), which is a good way of benefiting from a tried-and-tested design without expending unneeded effort. However, the prototype which has been developed has not yet been put in front of any users. Effectively, the team do not have a working prototype – nor have any paper prototypes been tested with users. The existing prototype is also unfinished from a content perspective, as not all of the questions due to be included had been appropriately iterated prior to the assessment. Because the service has not been shown to any users, the team have had no way to assess objectively whether their proposed course of action is effective or could be improved upon.

The content design process is good – the team have worked well with the service’s primary stakeholders (inspectors) to understand what information the questions need to elicit from users, and have applied good content practice and the results from user testing to ensure that the questions used in the service are as straightforward as they can be. However, this work is unfinished – only about half of the questions have been through this process.

The service would benefit from being considered from a more end-to-end perspective – taking into account both the users’ pain points and opportunities to simplify processes.

Analytics

The panel were pleased with the amount of thought that the team had given to analytics even at this early Alpha stage. The team were able to clearly articulate the KPIs that will be used to manage the service:

  • Reducing the amount of time taken to fill in the form
  • Reducing the number of calls to the call centre
  • Reducing the number of follow-up calls required from inspectors
  • Ability to receive 600-800 submissions a day and send the same number of notifications

The commitment to these KPIs was evident from the fact that the team are already registered with the GOV.UK Performance Platform. In addition, the panel were reassured to hear that the team had identified Google Analytics as a skills gap and were planning to recruit someone with these skills for beta.

The panel would, however, encourage the team to take a bigger-picture approach to performance analytics and also consider how the PIC can improve other areas of the service: for example, if it reduces the amount of time inspectors spend on follow-up calls, what are the ramifications further down the chain of work?

Recommendations

Below are further recommendations for the project team as they progress toward Beta. These are not conditions on passing Alpha, but should be considered as part of Alpha remediation and Beta planning:

  • Work with inspectors to understand and clearly document the needs they have from data/information collected. Document what outputs are required and the ways these outputs can be obtained (i.e., some of these may not necessarily be through new questions)
  • Identify end-to-end user journeys so they can be tested through the site and measured for successful outcomes. This may include meeting with users in their own environment to understand where users are running into problems with their journeys
  • Test the site further for accessibility. For example, the colour patterns and contrasts might not work for a person with a visual impairment, dyslexia, or a neurological condition
  • Identify assisted digital users and do contextual research and usability testing with them
  • Have a plan in place for ongoing user research
  • Have in place a finalised and stable Beta team that will be able to see the project through to Live
  • Bring the marking of mandatory and optional questions in line with GOV.UK standards

Original source – Stephen Hale
