Aidspan is an independent observer of the Global Fund. We spoke to Aidspan’s Senior Programme Officer, Angela Kageni, about their work with the Fund and her experiences with open data.
Can you tell me about the work that Aidspan does?
We are a hybrid organisation: a watchdog and think-tank that does research, data analytics and community-level watchdogging. We publish our analyses through various media, but mostly via the Global Fund Observer (GFO), a free e-newsletter that provides news, commentary and analysis.
We focus on the Global Fund and its grant implementers. Since 2002, when we were founded, we have been critiquing Global Fund policy, structures and guidelines and doing in-depth analysis of grant data, primarily through a global-level lens. We opened an office in Kenya in 2007 to ensure we could reach implementing countries far more efficiently, and after 2010 we increased our country-level work. Sub-Saharan Africa receives more than 60% of the Fund’s money, hence our current base.
We work on the premise that the Fund is open, based on its principles of accountability and transparency; we therefore expect its grantees to subscribe to these principles. We assess how far that holds in reality throughout the structure (i.e. at global and country levels). We also explore how much we can learn about the Fund, its investments, systems, results and usefulness from publicly accessible data. Where we can’t get critical data or information, we reach out to the relevant units within the Fund and advocate for it to be made available. If we uncover any errors, we alert the relevant units at the Fund and follow up to ensure they are corrected.
So as well as being analysts you also do quite a lot of advocacy to open up more data from the Global Fund?
We do forms of advocacy, but mostly we provide evidence for classic advocates to use. We wade through the data to understand the financial and programmatic picture and find ways of getting this to the relevant actors. We also comment on focal issues, trends (positive and negative), gaps, challenges, and points of confusion, conflict or uncertainty: anything we feel is reducing the impact of Global Fund investments, because we want the Fund to succeed. We use our e-newsletter, GFO, to provide a platform for people to give feedback. We also play a strong role as “explainers”, helping those at country level who are struggling to understand the Fund, what its money is doing in their countries and what that means for them in the long term.
What kinds of groups are asking these questions? Do you see any cross-cutting themes in the kinds of data they’re asking for?
We work with a wide range of partners: a mix of government, civil society, technical agencies, bilateral and multilateral partners, some research and academia, and faith-based organisations.
There are about five key data sets people want. The first two focus on money:

1. What’s committed by the Fund’s donors (see Pledges & Contributions).
2. What’s going to countries, to which entities, for what, and how it’s being apportioned per country, region and implementing partner (i.e. agreements, disbursements).
3. The flow of money (expenditure), seeking also to understand where there are blocks or delays. For example, when you notice that disbursements in an active grant have slowed down and a country has not received any money for the past 12 months, or has had no expenditure of money received, that’s a red flag for an interested party. We help visualise the data online – see Country Pages – providing a chart that highlights problems and allowing people to drill down to specific grants for a snapshot of red flags.
4. Performance data. The Fund uses a performance-based system, disbursing only tranches of money per implementing period, and it rates performance for that period. We aggregate the different ratings given per grant and provide an average – useful because poorly performing grants face a risk of closure (see the Grant performance analysis tool).
5. Programmatic data: what specific activities are being funded and what results are achieved against set targets. A data portal, the Aidspan Portal Workbench (APW), allows people to filter and extract what they need for financial or programmatic analyses.
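The red flag described above – an active grant with no disbursement for a year – can be sketched as a simple check. This is an illustrative mock-up with synthetic data; the grant IDs and field names are hypothetical, not the Global Fund’s actual schema.

```python
from datetime import date

# Synthetic example data: each grant has a status and a list of
# disbursement dates. Field names are hypothetical.
grants = [
    {"grant": "KEN-H-001", "status": "active",
     "disbursements": [date(2014, 1, 15), date(2014, 11, 2)]},
    {"grant": "UGA-T-002", "status": "active",
     "disbursements": [date(2015, 6, 30)]},
]

def months_between(earlier, later):
    """Whole months from `earlier` to `later`."""
    return (later.year - earlier.year) * 12 + (later.month - earlier.month)

def red_flags(grants, today, stale_after_months=12):
    """Flag active grants whose most recent disbursement is stale."""
    flagged = []
    for g in grants:
        if g["status"] != "active" or not g["disbursements"]:
            continue
        last = max(g["disbursements"])
        if months_between(last, today) >= stale_after_months:
            flagged.append(g["grant"])
    return flagged

print(red_flags(grants, today=date(2015, 12, 1)))  # → ['KEN-H-001']
```

In practice such a check would run over the full disbursement table rather than a hand-written list, but the logic is the same: find the latest transaction per active grant and flag anything older than the threshold.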
The International Aid Transparency Initiative (IATI) is set up to fulfil this kind of need. I believe Aidspan uses IATI alongside other datasets?
We tried to use IATI but faced data quality issues. The most efficient source of data for us is the Fund’s web services. It made more sense to use those directly because of the discrepancies in the financial data, and particularly in the programmatic data, on the IATI platform. We then developed our own database (APW) so our analysts can slice and dice the data as needed – the web services provide data in a format that doesn’t allow us to analyse it easily.
If the IATI data was good enough, and accurate enough, would it be your preference – because it’s more flexible and you’d be able to incorporate it into your system more easily?
Well…we’d have to explore that. If you download all financial data for Global Fund grants from their web services, or from our portal, and take all Fund grants on IATI by, say, transaction, you should ideally get similar results. The IATI Registry is broader – the fields provided are many – but what the Fund publishes is not enough for broader analysis. Or, if it does give comprehensive data, then it’s not being reflected well enough. For instance, the indicators data has huge differences and an incomplete indicators catalogue. Maybe there’s a problem during upload into the registry, an ineffective way of presenting the stored data via the registry interface, or a gap in the Fund’s own data.
IATI could allow consistency in the type of data provided by publishers like the Global Fund and publishers like the UK’s DFID, for instance. At least on the fields that do have data, it would be possible to do a cross-donor analysis. That’s very useful. But when it comes to an analysis within one donor itself, or at more disaggregated levels, it becomes difficult. If the data quality were good, then IATI would be a key point of reference for us, particularly where we need to triangulate. If it worked out well, I don’t think we would go looking for other access points for the kinds of data that exist there. On financial data it just makes sense to go directly to the Global Fund; we’ve not had the IATI data as our default on this.
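The triangulation described above – aggregating the same grants from two sources and checking that the totals agree – can be sketched as follows. The figures and grant IDs here are synthetic mock-ups; real inputs would be totals computed from the Fund’s web services and from IATI transactions.

```python
# Synthetic per-grant disbursement totals from two sources (mock-ups).
fund_totals = {"KEN-H-001": 12_500_000, "UGA-T-002": 4_000_000}
iati_totals = {"KEN-H-001": 12_500_000, "UGA-T-002": 3_100_000}

def discrepancies(a, b, tolerance=0.01):
    """Return grants whose totals differ by more than `tolerance`
    (relative difference), mapped to the pair of conflicting totals."""
    out = {}
    for grant in sorted(set(a) | set(b)):
        x, y = a.get(grant, 0), b.get(grant, 0)
        denom = max(abs(x), abs(y), 1)  # avoid division by zero
        if abs(x - y) / denom > tolerance:
            out[grant] = (x, y)
    return out

print(discrepancies(fund_totals, iati_totals))
# → {'UGA-T-002': (4000000, 3100000)}
```

A grant missing entirely from one source counts as a zero total there, so it also surfaces as a discrepancy – which matches the complaint that not everything the Fund holds is reflected on the IATI platform.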
So IATI data is just not useful at the moment?
The platform itself is wide and could be quite useful, which it isn’t right now. The IATI standard is fine – the platform provided is fine. The challenge is that responsibility for data quality lies with the publishers: it is incumbent on them to ensure their data is as accurate, comprehensive and up to date as possible. We know for a fact, given the data we access directly from the Fund, that one can get real-time data. So why is that not the case with the data on the IATI platform? Also, any inconsistencies or errors in the Fund’s own data get translated onto the IATI platform. Maybe if more publishers used the data they published, as they are currently doing at DFID, they’d note these issues and fix them. As an analyst, you don’t want to start worrying about data integrity, particularly when you are working with multiple large spreadsheets. Noting errors becomes difficult, and you could get into trouble after you’ve already done an analysis. So it makes people shy away.
I agree that it certainly shouldn’t be your job to ensure the data quality of a dataset.
Correct. I feel that if this is not addressed then we can’t address the next thing, which is ensuring that the Global Fund encourages its grantees to get onto IATI. We can’t effectively advocate for that before we can answer “Is IATI useful?” I can’t help but imagine how great it would be if all Global Fund grantees had good-quality, comprehensive data on IATI. This would allow anyone to do better cross-donor, cross-national, national and sub-national analyses to really understand where money is going, what it is doing and how efficiently it is being used. That would improve how recipient countries plan for the future. It is extremely difficult to do this efficiently or accurately now.
What do you see as the big opportunity for open data in the post-2015 Development Agenda and Financing for Development?
The platform developers, data cleaners and formatters have done their job; the next bit now rests with others. The immediate gaps are getting more people to use the data provided and identifying the problems in the existing data. Many shy away from interacting with this data, thinking that it’s difficult to use. You’ll find national-level organisations that could, for instance, do great sub-national analysis, using the findings to promote critical debate about transparency, accountability and effectiveness in the implementation of donor grants. Instead, the majority are looking for funding from the donors or initiating projects in densely supported areas rather than areas that lack support. Top-level data will always matter until programmes are funded 100% by national governments, but more than ever country-level data is vital: who is doing what, where, how efficiently, who’s benefiting most, and how this links into future planning.
Any final thoughts?
Behaviour change on how to use data is almost as difficult as behaviour change on accepting watchdogging. People do accept that data are useful, but more of them need to start using those data in their day-to-day planning, reporting and other decisions – just as we are trying to get the Fund and its implementers to accept that being watched by people like us is useful to the success of their programmes.
You can follow Angela on Twitter here: https://twitter.com/akageni