Why should we co-produce research?
- Understanding the context
- How did the co-production journey begin?
- So, what was the model?
- What are the benefits?
- Understanding research in a local authority
- Sharing co-production evaluation learning and publication
- What are the key lessons learnt?
- Embedding R&D objectives into practitioner work plans
- Timeline for intervention
- Do not over commit
Claire Sullivan and Gill O’Neill
Understanding the context
Understanding the context in which a co-production project is to be conducted is key. That context can include the need to prioritise limited resources to meet health needs, a particular challenge in areas like the North East of England, where health outcomes remain stubbornly below the England average and where there is substantial (perceived or actual) unmet health need. There is also much to consider with regard to how interventions are evaluated to demonstrate added value to the local system without requiring a randomised controlled trial.
There is prolific delivery in local public health teams, where the evidence base has been interpreted and applied within the local context, but unless robust evaluation is wrapped around an intervention we are often left unsure whether it is making a difference to local people and other recipients.
Additionally, there is perhaps an assumption that commissioning decisions are linear: subject to strict funding regimes, bound up in political tensions (national and local), following a strict timeline, and led by one individual whose focus is solely on that commission.
Considering all of the above, the importance of context is clear. A good understanding by researchers of its potential impact is therefore clearly beneficial and can be supported through the co-production process, which is likely to provide deeper learning and robust research findings that are more applicable to the environment a practitioner works within.
Translational research and translational practice
Having worked in public health practice for some time, we have also argued that 'translational research', or knowledge exchange, is often treated as one-directional: how to get research evidence into practice. There is often a very long time lag in getting published research into practice, with estimates of as long as 17 years (Morris et al. 2011). Yet we rarely hear how practice can inform research. The more recent focus among research funders on 'natural experiments' is promising, but the approach remains underutilised (Petticrew et al. 2005).
Case studies are used to share 'good' or 'promising' practice, but who said it is good, and will it translate here? Case studies are therefore read with interest but often not replicated, as the local context remains pivotal to successful implementation.
Implementation science is commonly defined as the study of methods and strategies to promote the uptake into routine practice of interventions that have proven effective, with the aim of improving population health. Implementation science therefore examines what works, for whom and under what circumstances, and how interventions can be adapted and scaled up in ways that are accessible and equitable (Michie et al. 2009). The field was born of recognition of the gap between research and practice (Glasgow et al. 2003). This gap has driven the development of numerous theoretical constructs that aim to enhance the implementation process, identify barriers and facilitators, and act as valuable tools in evaluating implementation (May and Finch 2009). For public health practitioners endeavouring to implement evidence-based practice, understanding the barriers and enablers to practical implementation is critical. Undertaking local-level evaluation that draws on implementation science domains is a step towards understanding the context in which to apply evidence-based research.
One of the major barriers to engaging academics in co-production work within certain public health practitioner settings is the constraint on the type of study that can be conducted. It would be unusual, for example, to be able to carry out a randomised controlled trial (RCT) in a local authority setting. While some methods would allow controlled evaluation of new initiatives as they are rolled out, such as the stepped wedge cluster randomised trial (Hemming et al. 2015), the majority of projects are likely to be observational or implementation studies. Given the hierarchy of evidence and the academic drive to publish in high-impact journals, these settings may initially seem less attractive to academics (Newbury-Birch et al. 2016). However, through engaging in such co-production projects academics gain access to new types of data and study subjects, and such projects may, for example, provide an excellent opportunity to generate important qualitative research (Newbury-Birch et al. 2016). Furthermore, they provide a welcome opportunity for academics to see their research put into practice. To create robust partnerships, these benefits should be strongly communicated.
How did the co-production journey begin?
Research is expensive
Whilst working at a former Primary Care Trust, we were keen to understand the emotional and financial impact on patients and their families living with cancer. The 'intervention' was the provision of dedicated welfare rights support officers in the County to provide advice during this difficult time for individuals and their families. We were keen to evaluate the impact of such a scheme, yet the research was going to cost twice as much as the intervention because of the full economic cost (FEC) applied by academic institutions, a cost simply unpalatable to local government colleagues. The solution in the end was to use a third party, a charity to which the FEC did not apply. At the same time, some commissioners were engaging independent contractors to help with process evaluations, at a cost of approximately ten thousand pounds each, using underspend funding. Some of these evaluations were poor in quality and lacked methodological rigour because they were not led by academic institutions. The risk here is that glossy published reports are produced that look impressive to commissioners but have substantial flaws in their recommendations, rendering them less than useful.
So, what was the model?
As a result of the difficulties and barriers described above, we agreed it would be more effective and cost effective to employ a researcher for a year rather than spend the small sums of funding on poor quality discrete pieces. The postholder was employed by a university, worked for three days a week in the Institute and two days a week in the practice location. Thus, the embedded researcher became straddled across two worlds - the researcher in the middle then benefitted from understanding the context of place, priorities and politics (Newbury-Birch et al. 2016).
A steering group was established to agree the areas of practice that would benefit from rapid evaluations, and the practitioners leading these workstreams became aligned with the researcher to drive forward the research programmes.
What are the benefits?
Increasing research capacity of practitioners
To ensure the embedded researcher wasn't simply undertaking traditional academic research from a different office base, the public health practitioners leading on the areas of work being evaluated were given the opportunity to work alongside the researcher, and a co-production model was created.
To enable this co-production model to be developed there were some practical barriers to overcome, such as the majority of public health practitioners not having a university Athens account and therefore lacking access to up-to-date evidence and journals. The model applied in County Durham, at a time when Athens was not available to the practitioners, was to ensure they had access to the university library and could step outside their own world and institution into another. This step to improve access to evidence, to enable practitioners to review it critically through a practitioner lens, and to offer the workforce development opportunity of undertaking research as part of daily practice, was also a key driver for the model.
Understanding research in a local authority
Whilst the co-production evaluation programme was established within the PCT, the work became operational after the transfer into the local authority in 2013. Within the NHS there are very clear protocols and guidance for research and development, but within local government this is less clear cut. It was agreed with the R&D lead within local government that university ethics approval would be accepted as the approved process for the work, and that the standard template for logging R&D in local government would also be completed and recorded.
Sharing co-production evaluation learning and publication
One of the key aspirations of the co-production evaluation programme has always been publication in peer-reviewed journals. The opportunity for public health practitioners to develop the skills and competence to undertake quality-assured evaluations, and subsequently to participate in writing a peer-reviewed journal article, is not something often available at practitioner level.
During the course of the co-production evaluation journey there have been many opportunities to submit poster presentations to conferences such as the Public Health England conference. These conferences are expensive to attend, and the cost often renders them prohibitive for public health practitioners. Participating in co-produced evaluations with the embedded researcher has opened doors for practitioners to attend these conferences and share their learning with broader audiences.
What are the key lessons learnt?
Finding the right academic
This process is not for all researchers. It is a very pragmatic way to test the evidence base and determine how well it is being implemented in the field. With a small budget allocation, the embedded researcher is often relatively junior in their academic career, so it is essential that a professor provides robust oversight and scrutiny of the work. The passion to bring academic skills into the front-line work of public health, and the opportunity to publish small-scale evaluations that add to the richness and diversity of the evidence base, requires a true belief in translational research.
Screening for eligible projects (ensure there is baseline data)
The programmes/interventions put forward for evaluation must meet set criteria for the evaluation to be viable. All public health practitioners receive the following guidance when submitting a request to participate in a co-production evaluation:
It is important to note that these are co-production pieces of work with researchers from Teesside University. By completing these forms you are agreeing to be involved in all areas of the research and the work to be divided between yourself and the researcher. This is worked out at the beginning of the individual projects and will need to be agreed by line managers. Please complete the relevant form below relating to the methodological area. The forms will be shortlisted by DCC and then Professor Newbury-Birch will identify which proposals are the most academically sound. At this stage we will have a meeting with yourself and if an implementation or evaluation study we will meet with the providers also to ensure the extent of the data that can be collected in order to carry out the study and work out the correct methodological framework that is needed in order to do the study.
Clarity of roles and responsibilities
To ensure the whole public health senior management team had a full appreciation of the commitment involved in undertaking a co-produced evaluation, the following roles and responsibilities agreement was constructed (Table 1.1).
Embedding R&D objectives into practitioner work plans
To fully reflect the work to be undertaken the co-production evaluation must be fully recognised within the job plan of the practitioner. This demonstrates corporate commitment to the time needed to complete the work and learn from the process.
Timeline for intervention
Timelines inevitably waver slightly due to ethics or the coordination of focus groups or a myriad of other legitimate reasons. However, wherever possible the evaluations are completed within a 12-month period.
Table 1.1 Role clarity

Getting started
- Both parties: spend one half day each shadowing the other in their day job (time permitting) to get to know each other better and to understand their roles.

Steering groups
- University responsibilities: note taking for the steering group.
- Durham County Council responsibilities: identification of the steering group; chairing of the steering group.
- Joint: the arranging of dates and times for steering group meetings and the setting of the agenda. Roles and responsibilities of the wider steering group will be discussed and decided upon depending on the needs of the evaluation. Additionally, a steering group is held once a quarter to discuss co-production work and share learning amongst the public health leads and university staff.

Literature review
- University: conduct a basic literature review outlining the evidence for teen parent programmes and their impact on wellbeing and future employment.
- Durham County Council: provide expertise on national and local policy relating to their portfolio.

Evaluation protocol
- University: design the initial protocol, including the combined literature review and proposed methods for data collection and analysis.
- Durham County Council: provide feedback on the initial protocol for the final draft.
- Joint: the evaluation protocol to be shared with the steering group once finalised for comments.

Ethics and data sharing
- University: take the lead on the university ethics application, to be completed prior to any data being shared with the university.
- Durham County Council: take the lead on the Durham research application pack and any Caldicott agreements needed to share data with Teesside University; liaise with the necessary people at the Council.

Data collection and analysis
- To be completed for each project.

Final report
- Joint: the background section of the report to be shared between the two parties. The University will take the lead on writing up the literature search; Durham County Council will take the lead on writing up current national and local policy relevant to their portfolio.
- University: take the lead on writing up the methodology section of the report; provide feedback on sections of the methodology written by Durham County Council.
- Durham County Council: take the lead on the participant recruitment section of the methodology; provide feedback on the methods section written by the University.
Do not over commit
In the first wave there were five evaluations undertaken. This proved unmanageable for the researcher, and where a practitioner was participating in more than one evaluation project it put pressure on their capacity to deliver other work objectives. After the first wave of evaluations it was agreed that a researcher would run no more than three evaluations and that any single practitioner would commit to only one.
Understanding the political landscape and changing/emerging priorities
At a time of significant national austerity and political uncertainty around the public health grant, many public health interventions are at risk of being decommissioned due to the need to prioritise programme areas. A couple of the co-produced evaluations helped inform commissioning decisions about the future delivery of the programmes and the opportunity cost of investing the money elsewhere. During the process this may have felt complex and uncertain for the embedded researcher, who was not familiar with working in this political environment, but again this comes back to research being completed in the local context. In the real world of local authority work, programmes are volatile, uncertain, complex and ambiguous (VUCA).
Contributing to the evidence base - local priorities and local relevance
The work undertaken over the last few years has resulted in poster and oral presentations at regional, national and international conferences and, to date, seven co-produced publications (McGeechan et al. 2016a; McGeechan et al. 2016b; McGeechan et al. 2016c; McGeechan et al. 2018a; McGeechan et al. 2018b; McGeechan et al. 2018c; McGeechan et al. in press). This has enabled the co-production evaluation programme to raise its profile and demonstrate the added value of small-scale evaluations conducted in the context of local authority public health teams, with relevance to local priorities.
Glasgow, R. E., E. Lichtenstein and A. C. Marcus (2003). "Why don't we see more translation of health promotion research to practice?" American Journal of Public Health 93(8): 1261-1267.
Hemming, K., T. P. Haines, P. J. Chilton, A. J. Girling and R. J. Lilford (2015). “The stepped wedge cluster randomised trial: rationale, design, analysis, and reporting.” BMJ: British Medical Journal 350: h391.
May, C. and T. Finch (2009). “Implementation, embedding, and integration: an outline of Normalization Process Theory.” Sociology 43(3): 535-554.
McGeechan, G., C. Richardson, L. Wilson, G. O'Neill and D. Newbury-Birch (2016a). "Exploring men's perceptions of a community based men's shed programme in England." Journal of Public Health 39(4): e251-e256.
McGeechan, G., K. Wilkinson, N. Martin, L. Wilson, G. O'Neill and D. Newbury-Birch (2016b). "A mixed method outcome evaluation of a specialist Alcohol Hospital Liaison Team." Perspectives in Public Health 136(6): 361-367.
McGeechan, G., D. Woodhall, L. Anderson, L. Wilson, G. O'Neill and D. Newbury-Birch (2016c). "A coproduction community based approach to reducing smoking prevalence in a local community setting." Journal of Environmental and Public Health: 538-653.
McGeechan, G. J., C. Richardson, K. Weir, L. Wilson, G. O'Neill and D. Newbury-Birch (2018a). "Evaluation of a police led suicide early alert surveillance strategy in the United Kingdom." Injury Prevention 24: 267-271.
McGeechan, G., D. Phillips, L. Wilson, V. J. Whittaker, G. O'Neill and D. Newbury-Birch (2018b). "Service evaluation of an exercise on referral scheme for adults with existing health conditions in the United Kingdom." International Journal of Behavioral Medicine 25: 304-311.
McGeechan, G. J., M. Baldwin, K. Allan, G. O'Neill and D. Newbury-Birch (2018c). "Exploring young women's perspectives of a targeted support programme for teenage parents." BMJ Sexual & Reproductive Health 44(4): 272-277.
McGeechan, G. J., C. Richardson, L. Wilson, K. Allan and D. Newbury-Birch (in press). "A qualitative exploration of a school based mindfulness course for young people." Child and Adolescent Mental Health.
Michie, S., D. Fixsen, J. M. Grimshaw and M. P. Eccles (2009). “Specifying and reporting complex behaviour change interventions: the need for a scientific method.” Implementation Science 4(1): 40.
Morris, Z., S. Wooding and J. Grant (2011). "The answer is 17 years, what is the question: understanding time lags in translational research." Journal of the Royal Society of Medicine 104(12): 510-520.
Newbury-Birch, D., G. McGeechan and A. Holloway (2016). "Climbing down the steps from the ivory tower: how UK academics and criminal justice practitioners need to work together on alcohol studies." International Journal of Prisoner Health 12(3): 129-134.
Petticrew, M., S. Cummins, C. Ferrell, A. Findlay, C. Higgins, C. Hoy, A. Kearns and L. Sparks (2005). "Natural experiments: an underused tool for public health?" Public Health 119(9): 751-757.