Is Social Impact maturing into something more sensible now?

Once, in my early twenties, I went swimming off the coast at Whitley Bay in May, and ended up in a first aid hut wrapped in a hypothermia blanket. I’m a bit more sensible now.

Twenty years ago, someone had the bright, incisive idea that heart-warming stories coming out of charities and social enterprises were simply not enough information to inform any judgement on whether an organisation is any good at delivering its mission.

All charities and social enterprises claim to be able to turn money into social change.

But some are not as good as they say they are. Some have a misplaced focus, in that they are inadvertently providing something else entirely. Some provide nothing of value at all, other than salaries for their staff.

Why would an investor or funder, local authority or government department invest in the services of a social enterprise that couldn’t demonstrate its ability to use its money effectively to create change in people’s lives, to make a difference?

Social outcomes measurement was born, and in the last few decades has become an ever-growing and changing field of endeavour, straddling the academic and practitioner worlds.

At first, academics grabbed the concept of measuring social impact and subjected charities to large-scale, expensive research projects that attempted to have control groups, verifiable and exhaustive data collection, impeccable data analysis and lengthy, 100-page reports.

An early contributor to this field was the Joseph Rowntree Foundation (JRF), and many social science departments of universities joined in with gusto. Brilliant research, greatly needed, but totally outside the capability of ordinary organisations to emulate. In addition, even the best and most costly research was out of date within months.

As Voltaire may have once said (I can’t find the quote, even though I live by it) ‘perfect is the enemy of good’.

How can a real social enterprise demonstrate the change it is making?

A real social enterprise has resource constraints such as not enough money to spend on outcomes collection and reporting, not enough skilled staff, and not enough hours in the day.

Data collection is a particular challenge, as the people in receipt of services may be vulnerable or have conditions that make it difficult for them to understand or respond to questions. When developing an outcomes measurement process for MIND, everything was straightforward except for how to get people to assess their own improvement in their mental health.

Luckily the academics at Warwick and Edinburgh universities had developed a useful measurement scale for wellbeing, the Warwick-Edinburgh Mental Wellbeing Scale (WEMWBS), but the problems of data collection remained. Staff had to sit down with clients at the start of their course of interventions to go through a set of questions, marking their answers on a chart. Then repeat the process at milestone points and at the end.
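For the mechanically minded, the scoring side of a questionnaire like this is the easy part. The published Warwick-Edinburgh scale has 14 positively worded items, each rated 1 to 5, with the total score being a simple sum (so it ranges from 14 to 70); everything else below – the function name, the validation rules, the before-and-after comparison – is an illustrative sketch, not MIND’s actual process or the scale’s official tooling.

```python
# Illustrative sketch of scoring a WEMWBS-style questionnaire.
# Assumption flagged above: 14 items, each answered on a 1-5
# Likert scale, totalled by simple summation (range 14-70).
# The item wordings themselves are licensed by the scale's
# authors and are deliberately not reproduced here.

def score_wemwbs(responses: list[int]) -> int:
    """Sum 14 Likert responses (each 1-5) into a total score."""
    if len(responses) != 14:
        raise ValueError("expected exactly 14 responses")
    if any(r not in range(1, 6) for r in responses):
        raise ValueError("each response must be an integer from 1 to 5")
    return sum(responses)

# Comparing a baseline against a later milestone gives the
# change figure that outcomes reporting is built on.
baseline = score_wemwbs([3] * 14)   # 42
follow_up = score_wemwbs([4] * 14)  # 56
change = follow_up - baseline       # +14
```

The hard part, as the article goes on to say, is not the arithmetic but getting those responses collected at all.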

Staff ‘forgot’ to do it. They complained that the process took up valuable client time, and that the client’s interests had to come first. They were right. Data collection cannot be a burden on a client-centred organisation. It has to be proportionate to the size and capability of the organisation. It has to be in the interests of clients, easy to collect, and pleasurable to do, or it won’t get collected.


Data collection is the key area to get right

Social ventures and charities should focus first on their clients, beneficiaries or end users, and think about how they like to communicate. How do they naturally talk? Are they comfortable with some concepts and not others? What sort of age are they, and what words do they understand? Would pictures work better? What about putting coloured counters into jars labelled with emoticons, or just one printed word? Are they tech-savvy or quite the opposite?

The questions themselves are massively important too

If a social venture has thought through its ‘theory of change’ – ie its understanding of how it creates change in individuals and in society, and what those changes look like – it will know what the desired outcomes of its activity or intervention are.

Survey questions should relate directly to outcomes, and should be laser-like in finding out whether or not a particular outcome has in fact manifested in a participant. The participant should then be able to report easily on the extent to which they have experienced that outcome.

How many questions to ask?

Just four laser-like questions are enough for younger participants; if you have asked the right questions, you don’t need to keep on asking more and more.

For older groups, anything up to ten questions is probably going to be enough. Any more than ten and you are straying into academic overkill – a sign that you haven’t selected your questions carefully enough. People will not answer numerous, ill-defined questions. They shouldn’t have to.

How social enterprises are making sure they are delivering what matters to their participants

The importance of listening to clients seems obvious. The people for whom the organisation exists, the recipients and beneficiaries, surely these are the people whose views and experiences drive the organisation? Ummm, yeah of course they are…

The locus of power should be firmly with the beneficiaries. They should help design the questions and processes, and get the results of the analysis, in a form that they understand. They should have the chance to respond to findings. Too often in the past, beneficiaries have given their data for everyone else to pore over. This has to stop. Beneficiaries should have the power to hold the organisation to account when outcomes are not delivered, change not created.

What does a mature approach look like?

Real change comes through building a learning culture, whereby organisations design and use their data to manage themselves so that more and better outcomes occur in the lives of their participants.

There are multiple ways to demonstrate impact in different contexts, not all of it granular like the questions in a survey. Qualitative methods, such as films, narratives, and stories, can be used to back up the systematic collection of outcomes data.

A mature approach takes heed of the real-life, on the ground challenges to data collection. It is person-centred, light touch, enjoyable, and generates a picture of the impact created.

It doesn’t plunge headfirst into a freezing sea on a grey day, and expect that to be a good experience.
