
Stephanie Tierney, University of Oxford

Ruth Abrams, University College London

 

In this blog, we reflect on our attempts to seek information from Clinical Commissioning Groups (CCGs) in England, using freedom of information (FOI) requests, to inform two realist reviews we are working on.

What is a realist review?

Realist reviews have emerged in recent decades as a way of using multiple sources of data on specific programmes to explain what works, for whom, in what circumstances and why. They are theory driven, seeking to identify and elucidate causal factors through: a) the iterative development of a programme theory (mapping how an intervention operates and brings about change), and b) context-mechanism-outcome configurations (CMOCs) (statements about how context can activate certain mechanisms that result in specific outcomes). A broad range of data is often included in realist reviews; quantitative projects may show patterns that can be explored and explained using data from qualitative research. Realist reviews can also include items such as commentaries, editorials and evaluations, if they contain data that offer an insight into how an intervention is proposed to work or further understanding of parts of a CMOC.

Where do FOI requests fit in?

CCGs are a potentially ripe avenue for data that can be used in a realist review. However, obtaining such information has been notoriously challenging, unless it is published online. Often an email request goes to a generic inbox, staffed by people who may be unclear about where to redirect it. Such requests have become slightly easier to make in light of the Freedom of Information Act. The responsibility now lies with the researcher making the request to be clear about the data they are seeking. We tried contacting CCGs as a route to finding information for a couple of realist reviews we are working on, because searches of traditional bibliographic databases had returned a lack of relevant data.

The reviews

We are part of a team producing a series of realist reviews looking at service redesign in primary care. One focuses on early visiting services; these entail delegating traditionally GP-led home visits to other professionals (e.g. advanced nurse practitioners, paramedics, emergency care practitioners or locum GPs). CCGs were asked whether and how this service was operationalised in their area, including implementation processes such as management of patient access, workforce recruitment and cost implications. Service evaluations were also requested.

The other review is looking at care navigation, whereby people are supported to locate and access services, often in the voluntary and community sector, which can help address non-medical problems (e.g. loneliness, bereavement, housing problems) that have an impact on health and wellbeing. We invited CCGs to let us know whether this type of service was being provided in their area and, if so, who was providing it, for which patients and whether such services had been evaluated.

Procedural issues

We had 195 CCGs to contact. Often one central email address serves multiple CCGs within a region, which is not always made clear online. It can be helpful, therefore, to group CCGs by region, rather than alphabetically, to cross-check points of contact. For example, there is one central email for all CCGs based in North London. By identifying an email address covering several CCGs, we could send one request, outlining precisely where and from whom the information was required. Not all CCGs had a designated FOI email address (e.g. a few had online forms to be completed with the requested information). We recorded when this was the case, to be transparent in how information was accessed.
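For readers who keep their contact list in a spreadsheet or script, the grouping step above can be sketched in a few lines. This is a minimal illustration only: the CCG names and email addresses below are invented placeholders, not real FOI contacts.

```python
from collections import defaultdict

# Hypothetical CCG-to-contact mapping; real addresses would be gathered
# from each CCG's website (these are invented examples).
ccg_contacts = {
    "Barnet CCG": "foi@northlondon.example.nhs.uk",
    "Camden CCG": "foi@northlondon.example.nhs.uk",
    "Oxfordshire CCG": "foi@oxon.example.nhs.uk",
}

# Group CCGs that share one central address, so a single request can
# cover all of them while naming each CCG it applies to.
by_address = defaultdict(list)
for ccg, address in ccg_contacts.items():
    by_address[address].append(ccg)

for address, covered in by_address.items():
    print(address, "covers:", ", ".join(covered))
```

Grouping by address rather than by CCG name makes the shared inboxes visible immediately, which is what the regional cross-check achieves manually.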

We found it necessary to stagger requests in batches (of 30-40) every couple of days, to stop our email inbox from becoming unwieldy with responses. We learnt that when a large volume of requests is sent at the same time, alongside other tasks, it can be difficult to track responses (e.g. in an Excel spreadsheet). Staggering requests enabled us to log FOI reference numbers against the relevant CCG, which both aided the tracking of replies and ensured the relevant information was to hand when no reply was forthcoming.
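The batching-and-logging routine above could equally be scripted. The sketch below, under assumed parameters (batches of 35 every two days, an invented start date, and placeholder CCG names), builds a simple log with slots for each CCG's FOI reference number and reply; it is an illustration of the bookkeeping, not the spreadsheet we actually used.

```python
from datetime import date, timedelta

def batches(items, size):
    """Split a list into consecutive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Placeholder names standing in for the 195 CCGs we contacted.
ccgs = [f"CCG {n}" for n in range(1, 196)]

start = date(2019, 1, 7)  # illustrative start date
log = {}
for n, batch in enumerate(batches(ccgs, 35)):
    send_on = start + timedelta(days=2 * n)  # one batch every two days
    for ccg in batch:
        # foi_ref and reply are filled in as responses arrive.
        log[ccg] = {"sent": send_on.isoformat(), "foi_ref": None, "reply": None}

print(len(log))  # every CCG has a log entry from the outset
```

Creating a log entry at send time, rather than on reply, is what makes non-responders easy to spot later: any entry whose `reply` field is still empty is a candidate for a reminder.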

It was important within requests to outline clearly the nature of the service of interest. This can include the different names it might be labelled as and whether it tends to be operated within or outside of typical working hours. We included, in our requests, a brief paragraph defining the type of service we wished to learn about from each CCG.

What information did we get back?

For the early visiting services review, we received responses from 184 CCGs. Many of these responses were brief, stating that the CCG did not commission the service we were requesting information on. Forty-six CCGs provided some detail on the service. Given the newness of this intervention, many providers had not yet undertaken formal evaluations. Four additional documents were provided, one being a formal evaluation; this information fell under copyright protection and would require permission to be published in digital form. Hence, in this instance, our FOI requests did not yield much additional data of use to the review. However, they did allow some meaningful connections to be made with commissioners; several individuals at director level expressed an interest in our conclusions and may, in the future, be able to act as avenues for dissemination.

For the care navigation review, we received responses from all but 2 CCGs. Of those that replied, 29 stated they did not hold the requested information and 2 returned unclear replies. This left usable data from 162 CCGs, which has provided a perspective on the different ways in which care navigation is being implemented across England. Furthermore, as part of the responses, we received 31 evaluations that we had not identified from our previous online searches. We are in the process of working with these data, which we envisage will help us refine and strengthen CMOCs. Hence, sending out a FOI request, in this case, was worthwhile in deriving new data for the review.

Conclusion

Realist reviews incorporate broad searches for literature. Relevant documents may not always be listed on bibliographic databases, especially in an area such as service redesign involving new interventions. We wondered how useful it would be to make a FOI request to CCGs as a means of finding material that could inform our understanding and development of CMOCs. The answer seems to be that it depends on the area being explored (e.g. a new versus an established programme), how far CCGs see it as part of their remit, and their ability to provide information that is not commercially sensitive. The decision to make a FOI request may have to be balanced against the time this work takes to administer (e.g. devising and piloting the questions to ask, finding email addresses and sending out requests, tracking returns using Excel and dispatching reminders when required). However, it may also produce unintended benefits, such as meaningful connections that extend a researcher's network for stakeholder engagement and dissemination.

Stephanie Tierney (ST) is a Researcher in Evidence Synthesis at the Centre for Evidence Based Medicine, University of Oxford. Ruth Abrams (RA) is a Research Assistant at Primary Care and Population Health, University College London. ST’s and RA’s posts are supported by The Evidence Synthesis Working Group of the National Institute for Health Research School for Primary Care Research (NIHR SPCR) [Project Number 390].