This post is written by Paige Kirby at Development Gateway, which last week launched the Results Data Initiative (RDI).
As a development community, we’re poised to invest billions of dollars in gathering data on results indicators in pursuit of the Sustainable Development Goals (SDGs). Ostensibly, these data will improve project efficiency and effectiveness.
But are investments paying off? And what should the future of results management look like?
Through the Results Data Initiative (RDI), supported by the Bill & Melinda Gates Foundation, we have spent the past year teasing out how we can better collect, share, and use results data. Focusing our research on Ghana, Tanzania, and Sri Lanka, and on the health and agriculture sectors, we tackled this challenge from two angles:
- Qualitatively, from the bottom-up, with a particular focus on the voices of local governments and organisations; and
- Quantitatively, from the top-down, looking at output and outcome data published by major development agencies.
From the bottom-up…
Full qualitative findings can be found in each country report, but across contexts we identified four overarching themes that are critical for the open data community:
- We are collecting too much data. Front-line service providers are spending too much time collecting data that are neither useful nor used for decision-making. We strongly urge our colleagues to critically evaluate which indicators they collect, which they use, and how much those indicators cost.
- We are not yet providing the right tools for local actors to meaningfully use data in project planning and management. As a nonprofit that seeks to provide this type of technology, we’re taking this to heart and critically evaluating how we can better empower local-level analysis. As it stands, most data tools are a long way from achieving this goal.
- Often acknowledged but infrequently discussed, incentives are critical to promoting data use. Existing structures do not necessarily reward those who use data to achieve results, whether through promotions, recognition, or other incentives. We did find some ‘power’ data users, and some government ministries where benchmarking performance, recognising exceptional data users, and leadership rhetoric helped galvanise data uptake – but we can and should do more.
- Finally, there’s a disconnect between data collectors and data users – which takes a toll on data quality and uptake. Lower-level officials are often tasked with collecting data, but are rarely told what the data could and should be used for, leading to low ownership and use. We recommend that senior officials give local actors more proactive feedback on the value and use cases of the data they collect.
From the top-down…
From a quantitative perspective, we wanted to see whether we could build a big-picture view of results across major development partners. Like many organisations, we believe a bird’s-eye view of development activities can help all actors invest more efficiently and effectively.
Our team ‘joined up’ a decade’s worth of output and outcome data from 17 major donors. Linking these indicators took hundreds of hours, and we don’t see a real business case for replicating this work in the future.
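To give a rough sense of what ‘joining up’ entails, here is a minimal, hypothetical sketch in Python (using pandas). The donor names, indicator labels, and crosswalk below are illustrative assumptions, not the actual RDI data or methodology; the point is simply that most of the effort sits in building and maintaining the mapping between donors’ differently named indicators.

```python
import pandas as pd

# Hypothetical extracts of published results data from two donors.
# Real donor reporting differs far more in structure, units, and naming.
donor_a = pd.DataFrame({
    "indicator": ["Children immunised (DPT3)", "Maize yield (t/ha)"],
    "year": [2014, 2014],
    "value": [120000, 1.8],
    "donor": "Donor A",
})
donor_b = pd.DataFrame({
    "indicator": ["children immunized - DPT3", "maize yield, tonnes per hectare"],
    "year": [2014, 2014],
    "value": [95000, 2.1],
    "donor": "Donor B",
})

def normalise(label: str) -> str:
    """Crudely normalise an indicator label: lowercase, drop punctuation,
    and collapse whitespace, so near-identical labels can be compared."""
    cleaned = "".join(ch for ch in label.lower() if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())

# A hand-built crosswalk from normalised donor labels to common indicators.
# Building and checking this mapping is where the hours go; it cannot be
# fully automated.
crosswalk = {
    "children immunised dpt3": "DPT3 immunisations",
    "children immunized dpt3": "DPT3 immunisations",
    "maize yield tha": "Maize yield",
    "maize yield tonnes per hectare": "Maize yield",
}

combined = pd.concat([donor_a, donor_b], ignore_index=True)
combined["common_indicator"] = combined["indicator"].map(
    lambda label: crosswalk.get(normalise(label))
)

# With a common indicator column, cross-donor comparison becomes possible.
print(combined[["donor", "common_indicator", "year", "value"]])
```

Even in this toy example the mapping has to be curated by hand; at the scale of 17 donors and a decade of reporting, that curation is where the cost lies.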
However, there are some basic things that development agencies can do to make their results information more accessible and comparable.
Specifically, we noted five priority areas that could make results information more accurate, representative and informative. We are currently holding one-on-one consultations with development partners and others interested in learning more about our methodology and in improving results reporting.
So, what’s next?
As we seek smarter, more sustainable development solutions, collecting and using results data to greater effect only becomes more important. In the short term, any feedback on the RDI findings – questions, whether this information is helpful, or anecdotes from your own experience – would be much appreciated. Over the longer term, please join us as we evaluate ways that we, as a development actor, can better incentivise data use.