Blog by Eastside People Consultant Toby Williamson.
‘What Difference Does It Make?’, apart from being a 1980s hit for The Smiths, cuts to the chase in describing a growing challenge for charities and other voluntary, community and social enterprise (VCSE) organisations – the importance of good quality evaluation to measure social impact and clearly demonstrate the results of their work. Many VCSE organisations now recognise the importance of impact evaluation, but applying it to often complex, values-driven, people-focused activities with multiple outcomes can create tensions and difficulties that need handling with expertise and sensitivity.
I’ve been evaluating VCSE-led programmes and projects for over 20 years, as both an employee and an independent consultant. This has included evaluations of projects for charities and not-for-profits that support adults with mental health problems, dementia and other long-term health conditions, programmes focused on minority ethnic communities, grassroots community empowerment programmes, arts, heritage, health and wellbeing programmes including social prescribing, and designing an impact measurement and evaluation framework for an anti-gambling charity.
All of these evaluations were completed successfully and provide valuable insights into how evaluation can be incorporated into the work of VCSE organisations.
Historically speaking, the VCSE sector and impact evaluation have not been close allies. People who work in or use the services of VCSE organisations can view evaluation with a lack of interest or awareness, or even with some suspicion. ‘Why do we need it?’ and ‘It will disrupt the project and put people off from coming’ are some of the reactions I’ve had.
Passionate articulation of an organisation’s values, or simple descriptions of its activities and services, have often been seen as sufficient proof of impact, rather than evidence gathered about outcomes and effects. Evaluation is often seen as research, and therefore the preserve of academic bodies rather than VCSE organisations.
My experience of evaluation in the VCSE sector has been different. Thirty years ago I was co-ordinator for an innovative mental health outreach team based at a local association of the charity Mind, where we willingly accepted being part of a research project, together with a separate sociological evaluation, as well as collecting our own detailed monitoring data. Knowledge is power; we appreciated the information that was gathered, and our funders liked it. But that example is not representative of all VCSE organisations.
Addressing needs and strong values are hallmarks of VCSE organisations. Relationships with the people who use their services are of fundamental importance (environmental and animal charities are a bit different). But so too is the need to demonstrate a positive impact to secure funding for future activities in a competitive environment. In my experience, organisations often want evaluations to prove impact, without interfering with relationships.
Here are some recommendations and suggestions for managing charitable impact evaluation based on things I have learnt over the past 30 years:
- Independence: I often have to remind VCSE organisations that the independence of impact evaluation is important to give it credibility: the evaluation needs to be neutral and impartial in its assessment of an activity or service. This may mean identifying things that didn’t work so well or had unintended consequences.
These are rarely signs of failure (nothing I’ve evaluated has completely failed), but there is always the possibility that something won’t work when testing out new approaches; provided this is reported fairly and diplomatically, the evaluation becomes an opportunity for learning and improvement. If an evaluation looks not only at outcomes but also at the process of developing and delivering an activity, it will provide further insight into why things worked or didn’t go according to plan.
- Credibility: Organisations also sometimes don’t appreciate how difficult it is for an evaluation to show incontrovertible proof of success. To give an evaluation credibility it’s best to use established ways of measuring impact, though these may need to be used alongside the particular questions the charity or non-profit wants answered.
But an increase in a wellbeing score for someone attending a social prescribing programme once a week, for example, doesn’t prove the programme was the cause. Something else might have happened in the person’s life to raise the score. If everybody who attended the programme shows increased wellbeing scores, that suggests a correlation, but without a similar group who didn’t attend the programme (a ‘control group’) one still can’t be sure the programme itself was responsible.
Control groups are extremely rare in evaluations (I’ve never heard of one being used), for various reasons including cost. So, important as evaluations are, it is also important to understand their limitations.
Some VCSE organisations also want concepts such as theory of change or social return on investment (SROI) built into an evaluation. That’s good, but doing these properly requires considerable expertise, specific processes and dedicated data collection, which needs proper resourcing (and still might not produce the results that were hoped for). SROI, for example, expresses the social value an activity creates relative to the money invested in it – a ratio of value generated for every pound spent – which means assigning credible financial values to outcomes.
- Data: I have had to remind some organisations that evaluations can’t be based solely on describing project or programme activities, on outputs such as showcase events or presentations, or on a literature review comparing the work with similar projects elsewhere.
To demonstrate impact on people’s lives, some form of data needs to be collected from participants, whether through surveys, interviews, focus groups (face to face or online), personal diaries, images, a voice notes app – you name it (evaluations rightly use many of the methods used in research projects). There’s no avoiding it: this requires some degree of interaction with programme participants or service users. But people unfamiliar with evaluations can be wary of, or resistant to, asking for or sharing personal information.
Evaluators need to understand the VCSE organisations they are working with, build positive, trusting relationships, and provide clear information about the evaluation methods and why they are important.
- Work in Partnership: The best advocates for encouraging programme or project participants to get involved are usually staff and volunteers, so they will be vital partners, provided they understand the purpose of the evaluation. Evaluators need to be available to support this role, as well as being as flexible as possible in the way they engage with and collect data from participants, taking on board advice from the organisation.
I’ve provided free evaluation workshops for organisations where these issues can be discussed. These workshops can also be used to explain self-evaluation activities that the organisation can lead on. As well as potentially demonstrating impact, it can be pointed out that evaluation findings can be used in the organisation’s publicity material and to support future funding bids.
- Data collection: Even collecting basic monitoring data can be tricky. Asking people about their age, ethnicity or sexuality can be seen as overly intrusive, and I’ve sometimes encountered resistance when the topic first comes up. But once people understand why collecting data like this matters (how else will you know if your project is inclusive and accessible?), and that strict requirements around confidentiality and data privacy apply, those who were most resistant have sometimes turned out to be the best at collecting it.
- Design: A good impact evaluation ideally also needs to be co-designed by the evaluators with the charity or not-for-profit organisation, usually with the intended outcomes of the programme or activity as its focus. This co-design should happen at the start, not as an add-on when the activity to be evaluated is coming to an end and there’s little scope for measuring its impact over time (though I still find myself being asked to do this). By embedding the evaluation at the beginning, I’ve also been able to share interim findings that an organisation could use to make adaptations during the course of the project or programme.
I’ve involved programme participants in co-designing as well, and in one recent evaluation we recruited, trained and supported community researchers, who were also programme participants, to do evaluation interviews. The community researchers were able to use and share their local knowledge, connections and insights to enhance the evaluation, and in terms of both quality and quantity the interviews were excellent, so the benefits easily outweighed the (small) additional costs.
- The Evaluation Report: Finally, if an evaluation goes according to plan, a VCSE organisation can expect a clear, accessible impact and evaluation report that can be used for reflection, learning, and shaping the organisation’s future activities. I was recently thanked for the “honesty” of an evaluation report I had written. I was rather taken aback by this; if you don’t trust the integrity of your evaluation, what’s the point in commissioning it? To borrow another 1980s song title, this time from Eurythmics: ‘Would I Lie to You?’
Find out more about Eastside People’s Impact Measurement and Evaluation Services