
Customer Services: the challenge of demonstrating our value and impact - a review of the CGSUK annual conference

The Conference this year, with its broad theme of “Customer Services: the challenge of demonstrating our value and impact”, had a variety of speakers at different levels and from different sectors. Alex Bols, Deputy Chief Executive of GuildHE, gave the opening speech, which was strategic in tone and set out some background on how we got to where we are today in HE and on the consumer attitude to education. Alex also remarked, in answer to a question from the audience, that “the institutional narrative is a powerful addition to data, don’t lose it by concentrating solely on data”.

This was followed by Lucy Chambers (of Tower Hamlets Schools Library Service), who outlined the education agenda for schools and how matching library projects to this agenda, using an impact evaluation report template, demonstrated the value of libraries to pupils. I was shocked to see the statistic that England ranks 23rd out of 23 OECD nations for teenage literacy, but encouraged to see that “school libraries ‘play a vital role in contributing to pupils’ success’” (NLT School Libraries literature review 2017). Lucy works in this area despite the fact that there are no official figures on school libraries in the UK, there is no statutory requirement to have a school library, and school libraries are not a core component of the Ofsted inspection framework.

Her template included the following headings:

  • Objective (where from)

  • Starting point

  • Activities

  • Purpose

  • Date/Term/Who

  • Costs

  • Success Measure

  • Stats kept

  • Measured impact

  • Comments/What next?

Katie Edwards’ talk on “improving the impact of our social media engagement” (within NHS Education for Scotland) introduced me to Tweetreach, Followerwonk and Crowdfire for the first time – tools whose results, when combined, build a rich picture of social media engagement.

The University of Sheffield team’s honestly titled talk (Virtually There? Providing a digital help service to library customers and measuring our impact) gave an overview of using LibAnswers and investigating the statistics from that product. I did like the idea of “Enquiry Cafés” - enabling staff across the library teams to discuss student enquiries, both to enable service development and to enhance student engagement.

The final talk, by Alison Brettle (Salford University) and Clare Edwards (Health Education England), showed us a practical evaluation framework used in NHS libraries. The methodology identifies three stages: agreeing the focus by setting impact objectives; identifying the indicators that illustrate what change has occurred; and planning evidence collection by identifying the most appropriate impact evidence.

Once metrics have been created, it’s important to ensure they meet the following criteria:

  • Meaningful - does the metric relate to the goals of the organisation and the needs of the users, and is it re-examined over time for continuing appropriateness? Do other people care about it? Combining two facets, for example usage by a particular staff group, can strengthen a metric.

  • Actionable - is the metric in areas that the [UL] can influence? Does it drive a change in behaviour? The reasons for changes to a metric should be investigated not assumed. Beware self-imposed targets - are they meaningful to stakeholders?

  • Reproducible - the metric is a piece of research, so it should be clearly defined in advance of use and transparent. It should be replicable over time and constructed with the most robust data available. Collection of data for the metric should not be burdensome, so that it can be repeated when required.

  • Comparable - the metric can be used to see change in the [UL] over time. Be cautious if trying to benchmark externally: the diversity of services must be respected, and no one metric fits all.

After the talk we were given pro-formas to discuss and fill in for a chosen critical incident, where we provided details of a specific instance of use of a service and then answered questions relating to that instance. The reason for choosing an incident was that the impact it reveals is both specific and generalisable.

CONCLUSION

I found the Conference challenging, seeing different sectors’ approaches to impact and value, and came away encouraged that small steps and methods can work. I was also pleased that there was no single answer or panacea for this very complex but pertinent issue. The other message I took away was that many small pieces of evaluation give us a bigger story to tell.

Norman Boyd, User Experience & Quality Coordinator, Anglia Ruskin University

