Monday, February 24, 2014

Evaluation of impact of arts and cultural activities on looked after children


OPM were commissioned by Creativity Culture and Education (CCE) to evaluate the impact of a CCE-funded programme of arts and cultural activities on three groups of looked after children. The project, which was proposed and managed by the National Children’s Bureau (NCB), focused on activities delivered by three cultural organisations located in different parts of the UK:

The objective of the evaluation was to understand the impact and effectiveness of the various arts and cultural activities for the looked after children who participated. As well as identifying the impacts each project was responsible for, the evaluation also examined how each project created those impacts. Understanding how different elements of the projects contributed to the impacts identified is key to the sustainability of this line of work and will support the replication of projects beyond the three current settings.

What we did

Our evaluation team adopted a theory of change approach which built a clear understanding of how the programme worked, its intended outcomes and the extent to which its inputs, outputs and activities contribute towards these outcomes. The model hypothesised that four elements of the project – skilled artists, positive arts activities, involvement of foster carers and siblings in activities, and looked after children and foster carer focused planning and design – would, through a series of change mechanisms, result in the following outcomes:


In general, the evaluation indicates that the arts and cultural activities had a positive impact on the looked after children who participated at each site. In particular, the project resulted in a marked improvement in the self-efficacy and empowerment of many of the children involved, as well as an increase in their confidence and self-esteem.

This positive impact was largely a result of the safe space created by the skilled artists, the use of positive arts activities and the involvement of foster carers in the activities. There were inevitably variations in the extent of the impact across sites and between children.

The evaluation details those factors which contributed to the success of the project, identifies the challenges faced which varied significantly across sites, and makes a number of recommendations as to the design and delivery of similar projects aimed at looked after children.

For these reasons, the evaluation is an invaluable resource for those planning to deliver similar programmes for looked after children in the future.

Friday, February 21, 2014

Tackling the root causes of health inequalities


The Department of Health commissioned the NHS Institute to oversee support for local partnerships, to help them accelerate their progress in addressing health inequalities.

The programme was intended to tackle not only the ‘wicked problems’ – entrenched health issues affecting communities – but also the wider social determinants of ill health – the ‘causes of the causes’.

Each HPHL partnership selected its own wicked problem to address, and HPHL provided external expert support to each of the partnerships. Partnerships were supported by webinars, learning events, peer exchanges, a community web forum and access to Institute resources.

The Institute wanted to understand what works well in tackling health inequalities at a local level. Because of OPM’s extensive experience in conducting complex evaluations in the healthcare sector, the Institute commissioned us to identify the success factors, challenges and impacts of the individual projects and of the HPHL programme as a whole.

What we did

We used a range of evaluation methods, including literature reviews, an online survey (covering local authority, PCT and other stakeholders from the partnerships), site visits, case study development and information mapping. We wanted to hear from stakeholders and learn from different approaches.

HPHL partnerships experienced success in different ways; for some, the project added a new, but complementary, health inequality strand to their work. These partnerships were characterised by strong senior leadership and day-to-day project management, and had existing governance arrangements and previous experience of addressing health inequalities. Others needed support to overcome the barriers to achieving the HPHL objectives.


The evaluation highlighted that different approaches are needed to really have an impact, with a shift from tackling symptoms to fully understanding what lies behind health inequalities. The programme acted as a catalyst, sharpening partners’ focus on the wicked problems and adding momentum by bringing partners together and providing support.

Through the collaborative, formative approach we adopted, the Institute and partnerships were able to incorporate learning as it emerged. The Institute has used the evaluation findings to help shape the second round of HPHL, for example, refining the support for partnerships, and continuing to enhance the webinars with a range of expert speakers, which the evaluation identified as useful to the partnerships. Several partnerships are sustaining their programmes, particularly where they were able to integrate HPHL activities with other workstreams.


Wednesday, February 19, 2014

Evaluating the Impact of Islington Giving, Cripplegate Foundation


Islington Giving is a campaign of local and national philanthropic organisations that work together to tackle the most pressing local issues facing the poorest residents in Islington. The campaign formally launched in 2010 after a preparatory period during which it identified its key areas of focus, core activities, structure and constitution. It is working to meet ambitious targets relating to fundraising, volunteering and grant-giving over a five-year timescale.

This evaluation took place just after the midpoint of the campaign’s delivery and was designed with the following aims:

What we did

Our first exercise was to identify Islington Giving’s key objectives during an initial scoping period. Through this process we identified three hypotheses to explore through the evaluation. These are that Islington Giving is:

These objectives formed the basis of the evaluation framework and helped us identify who we wanted to speak to and the questions we wanted to ask. The methodology was purely qualitative: we asked participants to describe the ways in which they had engaged with Islington Giving, the added value offered through this engagement and the impact this had had on them and their wider community.

We spoke with 46 people including board members, strategic stakeholders, volunteers and donors, grant recipient staff members and grant recipient beneficiaries themselves – the people that attend Islington Giving-funded projects.


Islington Giving achieved a great deal in the eighteen months between its launch and this evaluation. It has brought together a range of local funders to work as strategic partners, whilst also delivering effective activity to tackle the specific local needs facing some of Islington’s poorest residents.

This innovative approach attracted attention from funders and councils in other areas who are keen to achieve similar goals. OPM worked with Islington Giving to identify ‘top tips’ for those looking to replicate a place-based, multi-funder model. In summary, these are:

Tuesday, February 11, 2014

Evaluation of the NHS Patient Feedback Challenge


The Department of Health funded the £1 million NHS Patient Feedback Challenge to support the spread and adoption of innovative approaches to involving patients and carers, capturing their feedback and using this to inform service improvements. NHS organisations submitted applications to take part in the programme, and nine projects were funded across England, all adopting different approaches and forming partnerships with other NHS, social care and commercial organisations.

The NHS Institute wanted to learn about the various approaches to adoption and spread employed by the nine projects – what worked, in what circumstances, and for whom? They also wanted to capture evidence of the impacts emerging, including those generated within programme timescales as well as longer term anticipated outcomes, and assess the extent to which the programme offered value for money.

What we did


As a result of emerging insights from the evaluation, the NHS Institute commissioned a series of evaluation webinars, to support project leads to develop their own plans for monitoring the impacts arising. This included an introduction to economic assessment, helping project leads to assess the financial impact of their initiatives.

The NHS Institute was able to use the evaluation report to make recommendations for future work in this area.

The case studies and brochure will be used to encourage others to adopt the approaches embedded in the project sites, and to foster further linkages between NHS organisations and others with an interest in health and social care.


Tuesday, February 11, 2014

Measuring the economic impact of care homes wellbeing programme


In spring 2012 OPM carried out research into the care homes sector in the UK, on behalf of the NHS Institute. This research informed the development and marketing of the NHS Institute’s Care Homes programme, a service improvement package aimed at residential and nursing care home providers, owners and managers.

The NHS Institute worked with several care homes across England to test and refine the programme’s service improvement tools. The NHS Institute wanted to know about the effectiveness of using the tools in care home settings, the critical factors underpinning successful implementation, and the challenges encountered by staff and managers.

The evaluation was commissioned to generate evidence of impacts emerging – including impacts on residents, staff wellbeing and retention, safety and efficiency. The NHS Institute also wanted evidence of the financial savings arising, to enable potential purchasers to assess the programme’s value for money.

What we did

We carried out two strands of work:

  1. We carried out fieldwork with three homes that tested some of the tools in the Care Homes Wellbeing programme. This included visiting homes to see the changes made, interviewing staff and managers about the processes they went through, and capturing evidence of outcomes achieved. Following this fieldwork, we developed three in-depth case studies outlining the test site experiences and impacts arising as a result of the programme. The case studies explored the impact of the tools on reducing falls and accidents within the home, saving staff time by re-organising systems and tidying shared spaces, and introducing ‘resident summary at a glance’ boards to enable all staff to quickly see the status of all home residents.
  2. We developed a ‘ready reckoner’ economic assessment tool, to enable care home owners and managers to assess the potential financial benefits and costs associated with the Care Homes Wellbeing programme. We reviewed programme documentation, cost data captured from case study sites (including staff time spent on implementing the tools) and secondary data from other sources in order to populate the ready reckoner.
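At its core, a ready reckoner of this kind boils down to simple cost-benefit arithmetic: comparing the costs of implementing the programme against the financial benefits arising. The sketch below is purely illustrative, with entirely hypothetical figures and function names; the real tool was populated with programme documentation, cost data from the case study sites and secondary data.

```python
# Illustrative sketch of a "ready reckoner" style calculation.
# All figures and names here are hypothetical, not taken from the actual tool.

def ready_reckoner(implementation_cost, annual_staff_time_cost, annual_savings):
    """Return the net annual benefit and the payback period in years."""
    annual_net = annual_savings - annual_staff_time_cost
    # Payback period is only meaningful when the programme saves money overall.
    payback_years = implementation_cost / annual_net if annual_net > 0 else None
    return annual_net, payback_years

# Hypothetical example: £2,000 to implement, £500/year of ongoing staff time,
# £3,000/year saved (e.g. through fewer falls and less time spent on searches).
net, payback = ready_reckoner(2000, 500, 3000)
print(f"Net annual benefit: £{net}, payback period: {payback:.1f} years")
```

A tool like this lets a care home manager plug in their own staffing costs and expected savings to see how quickly the programme might pay for itself in their particular setting.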

Our understanding of the care homes sector, and the challenges it faces in realising service improvements, enabled us to explore the crucial issues in depth with care home staff and managers. The work also built on our experience of producing economic assessment tools for the NHS Institute’s High Impact Actions in Nursing and Midwifery – The Essential Collection, enabling us to employ a similar methodology in developing the ready reckoner tools.


The project enabled the NHS Institute to assess the effectiveness of the tools when employed in care home settings. We provided evidence of impact as well as insights regarding some of the challenges experienced and critical success factors.

The ready reckoner tools enable care home owners and managers to assess the value for money of the Care Homes Wellbeing programme before taking purchasing decisions, and provide a resource to assess the impact of the tools in different contexts (e.g. homes with different staffing structures or with residents with particular needs).



Tuesday, February 11, 2014

Evaluation of the Health Foundation’s MAGIC programme


There is a growing body of research that highlights the benefits of shared decision making, including patients being more comfortable with decisions, having improved confidence and coping skills and making more appropriate use of services. But putting it into practice in the day-to-day reality of hard-pressed health services remains a challenge.

The MAGIC programme aimed to take on this challenge by bringing together a small group of passionate frontline staff, managers and academics in NHS sites in Newcastle and Cardiff to implement and embed shared decision making (SDM) at individual, team and organisation level.

The programme worked to embed SDM in a range of primary and secondary care settings and involved developing and testing practical solutions that support patients and healthcare professionals to work together to make decisions about treatment and care.

The programme combined a range of activities and forms of support to embed SDM. These included running skills development workshops for participating clinicians, working closely with a number of clinical teams to develop and implement decision support tools for use in consultations, and social marketing campaigns to increase organisational and patient awareness of SDM. The programme also provided regular facilitation and peer support to the participating clinical teams and sought regular feedback and input from a patient and public panel.

What we did

The aim of the evaluation was to assess how, and to what extent, the MAGIC programme was able to embed SDM within clinical settings. The primary focus was on understanding and exploring the ‘process’ through which SDM was implemented. The evaluation used largely qualitative methods to elicit insights about what worked well, what worked less well, and in what circumstances, rather than establishing the impact of the programme on measurable outcomes.

The evaluation findings are based on a range of data, including the development of a programme logic model, observations at MAGIC local design team (LDT) and core design team (CDT) meetings, in-depth interviews with participants and stakeholders, and interviews with patient representatives. In-depth interviews with key staff and a small sample of patients were carried out in seven clinical settings across the two sites. Finally, an online survey was conducted to capture the views and experiences of all staff in the clinical teams that took part in the programme.


The evaluation helps to fulfil the core aim of the MAGIC programme, which is to build practical and transferable knowledge about how SDM can become a core characteristic of routine clinical care, how this can be achieved and what the conditions for success are.

Insights and learning have been captured through:

Friday, February 7, 2014

Building the capacity of the voluntary and community sector


TreeHouse (renamed Ambitious about Autism) was delivering the Parent Support Project (PSP), funded by the then Department for Children, Schools and Families (DCSF), and needed an evaluation to assess its impact. PSP aimed to encourage professionals, parents and local government to work together to develop appropriate services for children with autism.

PSP exemplified two key planks of the Government’s support for the third sector: supporting community action and campaigning, and improving local partnerships.

TreeHouse wanted to evaluate the PSP not only to provide evidence of impact to the DCSF but also to learn from the experience, so they could apply the learning elsewhere.

TreeHouse approached OPM as they were attracted to OPM’s values-driven approach to evaluation, and by OPM’s deep understanding of the public service commissioner–provider relationship.

What we did

OPM and TreeHouse were committed to conducting the evaluation in a way that did not merely satisfy the needs of the funder, but also built the capacity of TreeHouse staff to sustain improvements over the longer term.

The approach adopted for the evaluation of PSP recognised that meaningful engagement and the nurturing of ongoing learning and reflection could be a means of building capacity within TreeHouse to achieve ambitious longer-term outcomes. OPM and TreeHouse colleagues worked together to co-design the evaluation and a number of research instruments. They also identified specific activities that TreeHouse staff could do, with training and support from OPM, in a way that would not threaten the integrity of the evaluation. We then distilled learning from the evaluation to refine practice.


The evaluation helped TreeHouse demonstrate value and accountability to the funder and stakeholders, and communicate learning about the PSP. Co-producing the evaluation meant that TreeHouse staff involved began encouraging others in the organisation to use evaluation techniques, which contributed to a culture change.

Building on the PSP evaluation, TreeHouse secured new funding to deliver a Parent Participation Project (PPP), which benefited from PSP learning. For example, a key finding from PSP was that TreeHouse needed to adopt a ‘doing-with’, rather than a ‘doing-to’, approach when working with stakeholders, and to treat them as equal partners. So TreeHouse involved stakeholders in planning the PPP from the start. One stakeholder described this approach as ‘inspiring’, saying it ‘generated increased momentum to work together to ensure that participation is embedded in service delivery’ (stakeholder comment after a workshop).