This is the fourth of six key pillar posts that will take you through our approach. We recommend reading each of these posts and following the links we’ve made to other useful tips and tricks. If you’re working on evaluation and outcome-monitoring, you might like to come back to these posts as and when you need support. We’re here to help! So please do get in touch if we can help you embed an outcome-focused approach in your work.
- An overview of the Matter of Focus approach
- Understand the outcomes and impacts that matter
- Understand the unique context of your work
- Get the data and evidence you need to track your progress (you are here)
- Track and visualise the difference you make
- Report in an outcome-focused way
There are several ways of framing this approach to evidencing the difference your initiative is making. Like others working with complex systems, we believe in investigating not just what works, but how things work, for whom, where, when and why. Depending on the stage of your work, the question you need to answer will look a little different:
- Can this work? For the early stages of an initiative, setting out your intended outcomes and impacts, thinking about and collecting early feedback, and building up a picture of what is important, what is going well and where change is needed.
- Does this work? For tracking ongoing work – checking whether your initial assumptions are correct, or whether the approach holds when rolling out to new people, geographies or sectors.
- Is this working? More light-touch monitoring for tried and well-evidenced approaches, where it is important to check that things are going as expected.
- How did this work? For sharing what difference your initiative made, or for evidencing to funders and stakeholders how you made a difference.
- To what extent is it working? Once you know whether and how your initiative works, you can start to build up more quantitative data to answer this question. If you try to answer it too soon, you may end up measuring the wrong things. (For example, see our post Why the data on children’s wellbeing is misleading).
The Matter of Focus approach to working with outcomes and impact uses an evaluation method called contribution analysis. It helps you define what matters to a project, programme or organisation and set that out in an outcome map showing the difference you make. The next job is to assemble evidence against that map.
What data, evidence and feedback do you need?
Our approach, and our software OutNav, encourages the collection and analysis of evidence, data and feedback, combining qualitative and quantitative data. Breaking the evidence down into manageable chunks helps you think about what it is telling you; those chunks can then be combined to tell a coherent impact story.
Initially, we advise looking at all the information you already have and seeing where it will help evidence your story. This might include administrative data, outcome tracking systems, feedback from the people you work with and any relevant research you are using. We go through a data audit process with our clients to think this through and assess what is useful.
Once existing data is assembled against an outcome map, it is easy to get a picture of where the gaps are. For example, it’s quite common for teams and organisations to have plenty of data about the work they do and who they reach, but gaps around how people feel, what they learn and how this helps work towards the outcomes or impacts that are important. We talk more about this in our next key post, Track and visualise the difference you make.
We then help you improve your data system to get just the right amount of information for the work your initiative is doing. The right amount means having data that is ‘good enough’ to tell a robust story.
Tips for ensuring your data is ‘good enough’
- Collect data, evidence and feedback in a thoughtful and purposeful way.
- Be clear about what this data will be used for.
- Plan data collection according to your capacity to actually make use of this evidence – don’t collect more than you need.
- Be mindful of the reporting requirements for funders or other stakeholders.
The aim is to gather data, evidence and feedback that are ‘good enough’ for the specific context and requirements of your initiative.
Quantitative data is good for:
- providing numerical snapshots
- measuring change along an established scale
- examining reducible phenomena
- ‘proving’ relationships.
Qualitative data is good for:
- understanding and making sense of relationships
- being inclusive of different perspectives and experiences
- examining complex social systems
- providing real-life examples.
We believe that the exact data, evidence and feedback needed will be specific to your initiative and the context in which you find yourselves.
Examples of the sources of quantitative and qualitative data we have used with teams and organisations include:
- Routine data
- Standard assessment data
- Feedback about experiences of services or programmes
- Notes from staff meetings about key risks and assumptions
- Reflective logs
- Focus groups
- Online surveys
- Case notes
- Creative feedback
- Videos/photos
Making data use manageable
We built our software OutNav to help organisations with a vision for social change use data and evidence efficiently and effectively themselves.
In OutNav you can attach or link your data and evidence to the relevant outcomes in your map, so everything is accessible from a single place (you can see how this looks in OutNav here).
Knowing where your data and evidence are, and being able to access them quickly, makes your life easier for the next step of our approach…
As soon as you have some data and information, you can start assessing your progress towards outcomes alongside how confident you are in your evidence.
Read more about this in our next key post – Track and visualise the difference you make.