Reflect, adjust and share learning: how to measure soft skill development in children and young people

Monday 22 August 2016

So far in this blog series, we’ve talked about how to manage expectations and understand findings from baseline and exit surveys with children and young people. We’ll now continue this conversation with a focus on the challenges of measuring soft skill development, and share some useful tools and ideas from our recent work.

The challenges of measuring soft skills and getting honest and accurate answers when working with children and young people are well documented. Common reasons for this include:

  • Developing self-awareness – assessing your own soft skills relies on understanding what each skill entails and knowing where you sit in its development. Many young people do not yet have the experience to judge this, and are unsure where they are in the process or what lies ahead for them to develop
  • Soft skills are notoriously hard to define – take ‘confidence’, for example: a question like ‘how confident are you?’ can mean very different things to different participants in different situations
  • Giving answers we want to hear – children and young people tend to be particularly susceptible to giving answers they think adults want to hear. This can result in participants ranking themselves as highly skilled at the start of an intervention because they think it will be ‘pleasing’ to the programme leaders or researchers

This blog explores how we adjusted our approach to overcome these issues when evaluating the Reading Agency’s Reading Hack programme and the Greater London Authority’s (GLA) Stepping Stones programme.

Lessons from Reading Hack evaluation

Reading Hack is an innovative national programme that brings together volunteering, reading advocacy and reading-inspired activities for children and young people. Our three-year evaluation was designed to measure impact and generate programme learning, while contributing to the evidence base on ‘what works’ in getting young people to read more.

The first year of research involved surveys, observation, focus groups, and in-depth interviews with young people and staff. Our fieldwork focused on four case study library authorities, while our baseline and exit surveys were co-produced with the Reading Agency and rolled out to participants across the country.

When the survey results came back to us, we found they told a very different story from the qualitative fieldwork. We knew from interviews, and from reflective questions included in the exit survey, that participants felt the programme was having a positive impact on them in lots of ways, yet the comparison of baseline and exit survey results suggested that little change had taken place for young people over the course of the year. Although young people had generally ranked their skill development highly in the exit survey, many had also ranked themselves so highly in the baseline survey that there was little room for measurable change – a ceiling effect.

The second issue was that the baseline and exit surveys were not linked, due to concerns about collecting personal data from this age group, and Reading Hack is a rolling programme with young people joining and leaving at different times across the year. This meant there was no way to know for sure whether the same young people who completed the survey at the start of the programme were those who completed it at the end.
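To make these two problems concrete, here is a minimal sketch, assuming a 1-to-5 self-rating scale; the numbers are invented for illustration and are not Reading Hack data. It shows why unlinked surveys only support a group-level comparison, and how high baseline ratings leave little headroom for recorded change:

```python
from statistics import mean

# Hypothetical 1-5 self-ratings, invented for illustration only,
# not real Reading Hack data. The two lists are unlinked: they may
# not contain the same respondents, or even the same number of them.
baseline_ratings = [5, 4, 5, 5, 4]
exit_ratings = [5, 5, 4, 5, 5]

# Unlinked surveys only support a group-level comparison of averages.
print(f"Group mean change: {mean(exit_ratings) - mean(baseline_ratings):+.2f}")

# Most baseline ratings already sit at the top of the scale, so even
# genuine progress registers as a small difference: a ceiling effect.
```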

To address these challenges, we looked at what had worked well in understanding impact so far in this evaluation. Building on these strengths, and together with the Reading Agency, we adjusted our approach to the survey in Year Two. The traditional baseline and exit survey was replaced with a single reflective survey, completed at the end of the participant’s journey. This type of survey encourages self-reflection on the ‘distance travelled’ and acknowledges that participants’ awareness of the skills in question will be more developed at this stage: young people are better able to reflect on where they think they were at the start and how that compares with where they feel they are now. A reflective survey also captures a young person’s complete journey in one instrument, solving our challenge of the unlinked baseline and exit surveys. We anticipate that this adapted approach will better capture the extent of the changes we heard about through our qualitative research.
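As a rough illustration of how ‘distance travelled’ can be scored from such a survey, here is a minimal sketch; the field names and values are hypothetical, not the actual Reading Hack instrument. Because the ‘then’ and ‘now’ ratings come from the same respondent on the same form, change can be calculated per person rather than only at group level:

```python
from statistics import mean

# Minimal sketch of 'distance travelled' scoring from a reflective
# (retrospective pre/post) survey. Field names and values here are
# hypothetical, not the actual Reading Hack survey instrument.
responses = [
    {"then": 2, "now": 4},  # one respondent's 'where I was' / 'where I am'
    {"then": 3, "now": 5},
    {"then": 1, "now": 3},
]

# Both ratings come from the same person on the same form, so change
# can be calculated per respondent, not just at group level.
changes = [r["now"] - r["then"] for r in responses]

print(f"Mean distance travelled: {mean(changes):+.2f}")
print(f"Respondents reporting improvement: {sum(c > 0 for c in changes)} of {len(responses)}")
```

The trade-off is that retrospective ratings rely on recall, which is one reason we continue to triangulate survey findings with qualitative methods.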

Lessons from Stepping Stones evaluation

GLA’s Stepping Stones programme is another example of how we have recently addressed the challenges of evaluating soft skill development with children and young people. Here we are combining formative, summative and economic elements to understand whether the GLA’s pilot model of peer mentoring and life skills lessons does indeed support at-risk children to make smoother, more successful transitions from primary to secondary school.

In order to set an honest and accurate baseline, we are working with the GLA’s Peer Outreach Workers and using their ‘virtual young Londoner’ tool. This tool creates a fictional character and invites young people to reflect on the needs, successes and challenges that character faces. Through projection, participants are able to speak more openly about their aspirations, behaviour and attitudes without feeling overly vulnerable or exposed, especially when speaking in peer groups. The method also removes any sense of being judged on where participants set their baseline and exit responses: instead of asking young people to rank themselves as ‘good’ or ‘bad’, it asks for examples of how their character might feel, react or change in different situations.

We are in the early stages of the Stepping Stones evaluation but we are taking all this learning on board as we design the next steps. As in the Reading Hack evaluation, we are also taking a multi-method approach so that we can compare findings to uncover and explore patterns and discrepancies as they arise.

The importance of sharing learning and reflection

Although the challenges discussed in this blog are not new, our aim is that through reflection and sharing our learning, tools and methods we can work together to keep uncovering what works, why, and how. This blog series is our contribution to accelerating that conversation, and strives to share approaches for capturing more meaningful findings to support work with children and young people.

Be sure to check out the next blog in this series to learn more about flexible approaches to qualitative research with children and young people that can help in this endeavour.