Tuesday, August 9, 2016
Our work with children and young people is informing best practice
We are delighted that our evaluation of the vInspired 24/24 programme has been included in the systematic review of the Early Intervention Foundation (EIF) Social and Emotional Learning Programme Assessment, contributing to the evidence base of best practice in this area.
The EIF has been funded by the Joseph Rowntree Foundation to undertake a programme of work to understand what works in developing the relational, social and emotional capabilities and assets of children and families either in or at risk of poverty. This work builds upon and extends an existing review carried out by EIF in 2015: ‘Social and Emotional Learning: Skills for life and work’.
The 24/24 programme, managed by vInspired and funded by the Department for Education (DfE) and the Jack Petchey Foundation, was a structured volunteering and social action programme designed to help young people facing challenging circumstances to improve their life choices. The evaluation was designed to measure the progress of individuals during the programme and to evidence its impact on young people’s outcomes and against key performance indicators. It also identified the critical success factors and limitations of the delivery model and made recommendations for future delivery.
The evaluation report can be accessed here.
Wednesday, July 27, 2016
New report on Motor Neurone Disease models of care
The Motor Neurone Disease Association (MND Association) has published our report exploring the different ways MND care is organised across the UK. It examines the practical arrangement of MND care, such as how multi-disciplinary teams are constituted and the relationship between hospital and community services. The report provides advice for the NHS and social services on how to organise MND care effectively.
Thursday, July 21, 2016
New research explores the experience of Black-African pharmacy candidates
The General Pharmaceutical Council (GPhC) has just published our report into the experiences and performance of Black-African candidates in the Registration Assessment. For this work we conducted in-depth qualitative interviews and focus groups with Black-African trainees and recently registered pharmacists, and interviews with people involved in pharmacist education and training. The sample covered a diversity of education and training experiences and demographic characteristics, but included only those who had completed an MPharm degree.
Exploring the experiences of a wide range of students, the research points to a complex interplay of factors that influence the experience and performance of Black-African students during their pharmacist education and training pathway. While it is important to note that many Black-African students are very successful in their education and training, the research sheds light on some issues that can be specific to students from Black-African backgrounds. These included: the challenges of undertaking the training and assessment as a mature student or as an overseas student; and the challenges of being part of a small minority in one’s school of pharmacy and consequently struggling to form study groups and supportive peer networks. The research also unearthed some examples of explicit prejudice towards, and perceptions of implicit bias against, Black-African students, particularly where they had undertaken their secondary education overseas. Related to this was a perceived lack of Black-African role models within the pharmacist education and training pathway to guide, inspire and motivate students of a similar background.
The research concludes by suggesting some actions which may help improve the experience and performance of some Black-African trainees and at the same time offer benefits to all pharmacy students and trainees, and ultimately to the pharmacy profession.
The GPhC will hold a seminar in October 2016 to bring together those with a role in pharmacy education and training to discuss its findings, and to consider the actions that may be needed to address the issues raised in the report. A copy of the report can be found here.
Thursday, June 16, 2016
A review of public and personal attitudes to the confidentiality of healthcare data, General Medical Council
The General Medical Council (GMC) is the independent regulator for doctors in the UK. It publishes guidance setting out the ethical principles and professional standards that underpin good practice. In 2015 the GMC commissioned OPM to conduct a literature review to inform the updating of its guidance on the confidentiality of healthcare data.
What we did
The literature review was based on detailed searches of eight academic databases. The searches were supplemented by online searches and interviews with relevant experts to ensure that key sources of information were not missed.
An initial sift by title and abstract identified 184 articles of possible relevance. Following full-text review, 65 articles were selected for inclusion and reviewed in detail. Relevant data from each item was extracted and recorded in a database against the research question(s) to which it related. The reviewed material was then subjected to broad content analysis, with key themes and associations drawn out.
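The staged sift described above is, in effect, a simple filtering pipeline: a broad title-and-abstract screen followed by a stricter full-text review. A minimal sketch of the idea (the records, field names and flags below are invented for illustration, not the actual review database):

```python
# Illustrative two-stage screening pipeline for a literature review.
# Records and their flags are invented; a real review would hold richer
# metadata and record extracted data against each research question.

records = [
    {"title": "Patient views on data sharing",  "relevant_abstract": True,  "passes_full_text": True},
    {"title": "Hospital catering logistics",    "relevant_abstract": False, "passes_full_text": False},
    {"title": "Trust in electronic records",    "relevant_abstract": True,  "passes_full_text": False},
]

# Stage 1: sift by title and abstract.
possibly_relevant = [r for r in records if r["relevant_abstract"]]

# Stage 2: full-text review of the remaining items.
included = [r for r in possibly_relevant if r["passes_full_text"]]

print(len(possibly_relevant), len(included))  # 2 1
```

In the review itself the equivalent numbers were 184 articles after the first stage and 65 after the second.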
The key findings of the review included the following:
- Professionals are typically more open to the sharing of patient data than members of the public.
- Patients and professionals are enthusiastic about the possible benefits of electronic medical records, but concerned about data security.
- Members of the public have relatively high levels of trust in the NHS’s use of their data, but low trust in private companies’ use of it.
- Members of the public often have poor awareness of the ways in which patient information is currently used and who it is available to.
Wednesday, January 27, 2016
A cautionary tale of survey sampling
It’s rare that survey sampling gets much attention outside of methodological textbooks. But recently it has been getting just this attention, in the aftermath of the surprise result of the 2015 general election, which the major polling organisations failed to predict.
It’s also rare that survey sampling is considered to sway history on a particularly grand scale. But look at how hypothetical scenarios of ‘what if the polls had been different’ are cascading out, now that the Polling Inquiry into what went wrong for the pollsters in the summer of 2015 has shared its first findings.
Although the Polling Inquiry won’t report its final findings and conclusions until March, it’s already worth reflecting on what this means for research practice.
At OPM we run research surveys both with large, nationally representative samples and, very differently, on a much smaller scale with specific groups of respondents. These could be users of a pilot service, frontline professionals delivering particular interventions, or stakeholders in a particular project or area. Often this involves sending surveys only to limited sets of specific people.
It may seem that what’s going on in the world of large-scale nationally representative polling has little bearing on these smaller-scale, targeted survey projects. But I’d argue that there is a cautionary tale here for practitioners of both types of surveys, which could be summed up as: ‘know your sample!’
This simple maxim actually has quite wide-ranging implications that affect all stages of the survey process. And if sampling isn’t carefully considered at each of these stages, quality can suffer. Below are some very basic pointers, which may seem like no-brainers but are worth considering every time you put together a survey.
- Know who your sample is, and make sure you are asking the right people. If not, the results will be inaccurate or irrelevant; essentially this is what pollsters got wrong in 2015 by under-representing Conservative voters in their polls in the run-up to the election.
- Know your sample’s perspective – so you can design questions that make sense to the people you want to answer. This could be in terms of terminology used, assumptions about the level of knowledge respondents may have already, and what questions they can constructively respond to. It’s worth testing this out with a smaller group before going out with the full survey where possible.
- Know your sample’s behaviour and preferences when it comes to filling in responses, so you can use the most appropriate survey format, timing, and method.
- Know the differences within your sample, so you can distinguish relevant subgroups. Overall results can hide very significant variations in opinion between different types of respondent.
- Know your sample’s context when analysing and reporting on findings. Some responses only make sense in light of what we know about respondents’ involvement in a service, organisation or field.
- Know when your sample is biased. Sometimes this can be adjusted for; but for very targeted surveys, a degree of bias may be unavoidable. However, being aware of this upfront is vital in order to analyse results with the appropriate pinch of salt.
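The last point – adjusting for a known bias – is often handled by weighting respondents so that the sample matches known population proportions (post-stratification). A minimal sketch of that idea, using invented age bands, population shares and responses rather than any real survey data:

```python
# Minimal post-stratification weighting sketch.
# Age bands, population shares and responses are all invented for illustration.

population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Each respondent: (age band, answered "satisfied"?). Younger respondents are
# deliberately under-represented here to show the effect of weighting.
sample = [("18-34", True)] * 10 + [("35-54", True)] * 20 + [("55+", False)] * 20

n = len(sample)
sample_share = {g: sum(1 for s, _ in sample if s == g) / n for g in population_share}

# Weight each respondent by (population share / sample share) for their group.
weights = [population_share[g] / sample_share[g] for g, _ in sample]

unweighted = sum(ans for _, ans in sample) / n
weighted = sum(w * ans for w, (_, ans) in zip(weights, sample)) / sum(weights)

print(round(unweighted, 3), round(weighted, 3))  # 0.6 0.65
```

Note the caveat in the bullet above still applies: weighting can only correct for biases you know about and can measure; for very targeted surveys some bias may simply have to be acknowledged at the analysis stage.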
Particularly with the advent of online surveys, it may seem that the rulebook for survey practice needs to be rewritten. But I’d argue that this maxim is important no matter what the method, sample, or scale. And it’s always better to check twice beforehand than to notice your mistake afterwards – just ask the pollsters from the summer of 2015.
Tuesday, August 11, 2015
OPM research into SEND pathfinder programme published by the DfE
The final report from the evaluation of the special educational needs and disabilities (SEND) pathfinder programme, featuring qualitative research conducted by OPM, has been published by the Department for Education.
The DfE commissioned a consortium led by SQW to undertake the evaluation. The team drew together a wide range of complementary experience, as shown in the diagram below.
The evaluation has been ongoing since 2011 and has described and analysed the work done to develop new approaches to delivering Education, Health and Care (EHC) plans across 31 local authority areas, and the resulting impact on families.
The report considers:
- Families’ experiences of the new system
- The impact that the new system has had on perceptions of satisfaction, fairness and outcomes
- The cost effectiveness of the new approach.
The report contains data gathered through:
- A survey of 698 Pathfinder families who had received a completed EHC plan between August 2013 and April 2014, and a comparison group made up of 1,000 families that were in receipt of either an SEN Statement or the post-16 equivalent and had not yet received an EHC plan
- Feedback from the initial and follow-up qualitative interviews conducted with families from a sub-set of 13 Pathfinder areas
- Detailed thematic case study work by SQW in a selection of locations, including an assessment of the costs of the old and new systems.
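Survey designs like the one above – a group that received the new EHC plans set against a comparison group still on the old system – typically come down to testing the difference between two proportions. The sketch below uses the report’s sample sizes (698 and 1,000), but the satisfaction counts are entirely invented, purely to illustrate the calculation:

```python
# Sketch of a two-proportion comparison between survey groups.
# Sample sizes echo the report (698 pathfinder families, 1,000 comparison
# families); the satisfaction counts are invented for illustration.
import math

n1, x1 = 698, 430    # pathfinder families reporting satisfaction (invented)
n2, x2 = 1000, 550   # comparison families reporting satisfaction (invented)

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)                       # pooled proportion
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se                                   # two-proportion z-statistic

print(round(p1 - p2, 3), round(z, 2))
```

A |z| above roughly 1.96 would indicate a difference unlikely to arise from sampling variation alone at the conventional 95% level.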
Friday, July 10, 2015
Child Sexual Exploitation: A study of international comparisons
Funded by the Department for Education and commissioned by the Virtual Staff College, OPM undertook an evidence collection and desk review of international comparisons of Child Sexual Exploitation (CSE).
CSE has increasingly become the focus of intense discussion, debate and intervention in the UK. At a summit in March 2015, the Prime Minister described CSE as a ‘national threat’ and announced that it would be given the same priority by the police as serious and organised crime.
The report aims to explore:
- How is Child Sexual Exploitation defined in selected countries?
- To what extent is there consistency in the response of public agencies around the world?
- What can the UK learn from experience elsewhere?
Friday, May 22, 2015
Communicate the method as well as the madness when measuring the political pulse
If a week is considered a long time in politics, then the previous two will have felt like an eternity for some in the polling industry.
The pollsters will be relieved to see the back of the last fortnight. Traumatised by the release of the now infamous exit poll that sent shock waves around the country at 10pm on election night, they found the following 48 hours no less nightmarish, with the British Polling Council announcing an independent inquiry into their performance. Many have spent the days since desperately scouring raw data for clues and trying to explain to clients, the public and angry politicians how the prevailing wisdom of the campaign failed to sense a Conservative majority. Even some of the big players in the political narrative are now calling for regulation of the polls in the final stages of election campaigns.
Ultimately the polling industry finds itself in such an uncomfortable position because it has a lot to answer for. Never had its services been in such high demand as in the run-up to this month’s election, dominating the day-to-day framing of debate in the media by analysts and commentators. The parties themselves danced to the tune of ‘neck and neck’ and a hung parliament, shaping their strategies accordingly. Newspapers succumbed to the apparent deadlock by commissioning their own daily measurements of the nation’s political pulse – and in doing so dedicated discussion not to policy or manifestos but to electoral and Parliamentary arithmetic. Yet in practice this apparent dead heat bore no resemblance to reality and was never the prospect it appeared. The influence of the polling industry over political opinion during this period – and, it could be argued, over the eventual outcome – was unprecedented. What proportion of the electorate voted tactically out of fear that an opposition party was within touching distance of victory?
Yet at its most influential the polling industry was also at its most vulnerable – there are more pitfalls than ever to avoid in the exercise of polling itself. This explosion of political polling comes at a time when people lead busier lives than ever and much of the public is increasingly reluctant to answer any kind of survey – not least one seeking political opinions when engagement is supposedly at an all-time low – and it is further complicated by the unusual fluidity of the electorate amid a new, multi-party landscape. Why the pollsters got it so wrong, and how wrong each individual organisation got it, is under investigation by the psephologists. My principal concern is with the communication of their research results and methods to the electorate, as I think they got that wrong too.
The numbers themselves may be subject to critical scrutiny by the British Polling Council, but the messages that accompany the numbers clearly were not. It should have been more important than ever to communicate to the public, as convincingly as the poll results themselves, what polls actually are – samples, not forecasts – and how conclusions are drawn from them, by extrapolating from the people who can be reached to the people who can’t. At OPM we’re acutely aware of the importance of effectively communicating how our research findings have been reached, and of ensuring that the public dialogues we run – sometimes exploring complex scientific concepts – are accessible, allowing meaningful participation in debate. These same principles should be adopted in the communication of election poll results, helping to shape a more politically literate public.
Health warnings and disclaimers – in other words, providing a context – may reduce the impact of headlines, but this will be far outweighed by the damage to the industry’s reputation if voting behaviour continues to be misrepresented on the scale it has been this month. It’s just not realistic to expect members of the public to appreciate how each polling organisation weights or models its raw data to arrive at an estimate of voting intentions by glancing at the latest headlines generated by the day’s polls. What the questions were (and how they were worded), how they were answered (typically by phone or via the internet), and who answered them – in other words, the methodology – must feature as boldly as the headlines.
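One piece of that missing context is basic sampling error. A minimal sketch of the standard 95% margin-of-error calculation for a single poll percentage (the sample size and vote share below are invented for illustration, not taken from any actual poll):

```python
# Approximate 95% margin of error for one reported poll percentage,
# assuming a simple random sample. n and p are invented illustrative figures.
import math

n = 1000   # poll sample size
p = 0.34   # estimated vote share for one party

moe = 1.96 * math.sqrt(p * (1 - p) / n)  # half-width of the 95% interval

print(round(100 * moe, 1))  # margin of error in percentage points: 2.9
```

A roughly ±3-point margin on a single share means a reported one- or two-point gap between two parties is well within the noise – which is precisely the kind of context that rarely accompanied the daily ‘neck and neck’ headlines.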
I’m not suggesting that the accurate representation of forecasts is the sole responsibility of the polling organisations – it’s a matter for collective consideration. But polls themselves will no longer be taken on trust alone. The pollsters have no choice but to seek to remedy this confusion over how results are reported and interpreted if they are to recover from what was widely considered to be their worst night for 23 years. The last two weeks might otherwise be only the beginning.
Friday, November 7, 2014
Warm Home Discount – Energy Advice: Consumer Experiences
The Energy Act 2010 provided the Secretary of State with powers to introduce support schemes for the purpose of reducing fuel poverty. These powers have been exercised through the Warm Home Discount Regulations 2011 to establish the Warm Home Discount (WHD) scheme, a four-year initiative running from April 2011 to March 2015. The scheme requires suppliers to provide support to customers in or at risk of fuel poverty, either directly, through rebates to eligible customers, or indirectly, through industry initiatives that provide assistance to those customers.
Ofgem’s role is to administer the WHD Scheme and ensure that energy suppliers meet their obligations as set out in the WHD Regulations. As part of its administrative role, Ofgem wanted to gather evidence about the effectiveness of energy advice provided through the WHD industry initiatives.
The purpose of this qualitative study was to explore the benefits of advice provided to consumers as part of the industry initiatives element of the Warm Home Discount Schemes. The research was carried out by OPM on behalf of Ofgem and sought to:
- Explore the perceived benefits of the energy advice initiatives to consumers;
- Unearth the factors in the consumer experience and circumstances that underpin these benefits;
- Understand more about how behaviour change happens for different types of consumers; and
- Highlight what might prevent advice being acted upon.
The key findings included:
- Consumers can benefit in a range of ways from the schemes
- There is a role for third party intermediaries in the provision of advice to vulnerable consumers
- Consumer benefit can be enhanced when provision closely matches need
- The evidence supports a mixed economy of provision in advice services
- Behaviour change is conditional on a range of competing factors
- Mode of delivery may have an impact on behaviour change
- Support can be key to more vulnerable consumers adopting advice
- Wider initiatives are also valued by consumers