Improving impact starts with using your data

The ‘use to improve’ principle is simple. It says: the best way to improve your data is to use the data you already have.

Published: April 3, 2025
Published by: ImpactLab

The Use to Improve Principle 

The ‘use to improve’ principle is simple. It says: the best way to improve your data is to use the data you already have.

Step one: Use the data you already have


House of Science is a charitable trust based in Tauranga that works to enhance science education in primary and secondary schools around Aotearoa through teacher professional development and science resource kits. It aims to empower teachers to foster the understanding, curiosity, and critical thinking in their students that can pave the way towards future careers in science.

House of Science got in touch with ImpactLab because they wanted to make better use of their data.

One simple, important question they had was: How many students are we reaching each year?

To help answer this, they used results from their regular teacher survey, which asked: how many students in your class used this science kit? Teachers answered by selecting a range, for example, 10-20 students or 21-30 students.

Looking at the survey results together, we realised that, depending on how many kits each child used per year, House of Science could be working with anywhere between 7,000 and 50,000 students, with roughly 20,000 as a median guess.
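To see why ranged answers produce such wide bounds, here is a minimal sketch (the responses below are invented for illustration, not House of Science's actual data): summing the bottom and top of each reported range gives lower and upper bounds on students per kit, and those bounds widen further once you scale up to unreported kits and divide by an unknown number of kits per student.

```python
# Hypothetical ranged survey answers: students per kit, as reported by teachers
responses = ["10-20", "21-30", "10-20", "31-40"]

# Sum the bottom and top of each range for lower/upper bounds on students covered
low = sum(int(r.split("-")[0]) for r in responses)
high = sum(int(r.split("-")[1]) for r in responses)

print(f"Between {low} and {high} students across {len(responses)} kits")
# → Between 72 and 110 students across 4 kits
```

With only a fifth of teachers replying, the unreported kits multiply this uncertainty several times over.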

In addition, the team noticed many teachers weren’t responding to the survey. As Business Development Manager Sandra Kirikiri described it:

“Our reply rate from teachers was ridiculously low – maybe around 20 percent – and with an output of approximately 400 kits per week, you’re left with a large variance in the number of students that you’ve reached.”

Step two: Improve Your Data


Realising the limitations of the existing data collection, House of Science changed their process: teachers must now provide the exact number of students who will use a kit before they can place their order.

The improved method achieved a 100 percent response rate and showed that 63,725 students engaged with at least two of their kits during the year (allowing for some assumptions about absences).

By tweaking their data collection process, House of Science found they were reaching roughly three times as many students as the previous year’s best estimate suggested.

This means House of Science have both a more powerful and a more accurate story to share with funders about the reach and impact of the programme.

As the charity adds more partner schools, it can now more accurately track how student reach and engagement change as it scales.

For the policymakers:

A common piece of feedback we hear from community organisations receiving government funding is that they spend a lot of time reporting to their funders, but the reports don’t get read.

The first step to improving the quality of the ‘data feedback loop’ between funders and providers is for funders to use the data they are already getting. Identifying one key outcome to improve, and starting there, enhances the quality and usefulness of the data.

This process needs to be iterative, and policymakers should expect to invest in the capacity of funded organisations to improve their systems over time.

For the analysts:

Particularly for larger programmes, a better understanding of scale and reach is often the ‘low-hanging fruit’ of data improvement. This story highlights two common issues we see with counting people:

  • Counting the number of outputs (in this case, kits) rather than the number of unique people engaging with those outputs.
  • Low response rates, which make counts highly unreliable.
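The first issue can be sketched in a few lines (the records below are invented for illustration): summing student-kit uses double-counts students who engage with more than one kit, while a set of identifiers counts each person once.

```python
# Invented delivery records: which (anonymised) students used each kit
deliveries = [
    {"kit": "Water Science",   "students": ["s1", "s2", "s3"]},
    {"kit": "Forces",          "students": ["s2", "s3"]},
    {"kit": "Micro-organisms", "students": ["s1", "s4"]},
]

uses = sum(len(d["students"]) for d in deliveries)            # 7 student-kit uses
unique = len({s for d in deliveries for s in d["students"]})  # 4 unique students

print(uses, unique)  # → 7 4
```

Reporting "7" here would nearly double the programme's apparent reach; tracking unique identifiers (or, as House of Science did, exact counts at the point of ordering) avoids the inflation.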

Addressing these issues gives a much clearer view of a programme’s reach and impact, and simply being aware of them helps in interpreting impact data appropriately.

For the frontline workers:

House of Science could have made many changes to their data system, but by using the data they already had they discovered a low-cost, high-value change they could make to get a much clearer view of their impact.

For frontline teams with limited time and resource to invest in data systems, using the data you already collect to try and answer a specific, important question is a good way to figure out where to start with improvements.


