
Accountability in Tanzania Phase 1 (AcT)


Tanzania, Sub-Saharan Africa

Active from:

Mar 2009 to Mar 2016

Implementing organisation(s):

International Development Advisory Services (IDAS) KPMG



Contact persons:

Kate Dyer, Amani Manyelezi


The Accountability in Tanzania (AcT) Programme was a DFID-funded governance initiative which ran for almost seven years, from May 2009 until March 2016. The Programme worked with mid-to-large sized Civil Society Organisations (CSOs) across a diverse range of issues in social services, inclusive growth, and climate change and environment. In total, AcT worked with 28 CSO partners who implemented 31 multi-year grants ranging from £55,000 to £600,000 per year, with some £27.7 million disbursed over this time. AcT provided partners with both core and project funding, preferring the former, where possible, as the most effective way to support organisations sustainably. In addition, the programme provided capacity-building support and facilitated opportunities for knowledge generation and partner learning.

A distinct component of AcT was the Climate Change and Environment (CC&E) Window, which was launched mid-way into the programme in December 2011, with funding from the UK's Investment Climate Facility (ICF) and DANIDA. Set apart from the rest of the programme, the window was a pioneer of donor support to civil society in Tanzania on climate change issues. Moreover, within AcT it was unique in bringing a dedicated focus to the governance dimensions of this important issue, while at the same time directly supporting the government and people of Tanzania to become more resilient, an ICF priority.

Objectives of the intervention:

How AcT was designed to contribute to governance changes in Tanzania is best captured in the theory of change, which in summary was:
…to support civil society organisations (CSOs) to implement context-specific strategic interventions that will enable them to influence positive change in the attitudes and behaviour of citizens, civil society and government, making government as a whole more responsive and accountable.

Why was OM chosen?

OM was chosen because the AcT programme was working to achieve governance outcomes in a complex, multi-stakeholder environment where results are unlikely to be achieved in a linear fashion. In addition to monitoring governance outcomes, the programme had a keen interest in the organisational development (OD) of its grantees. Key to the success of AcT was tailored OD support to our partners. In monitoring their level of organisational development, performance journal principles proved to be a useful starting point for developing a set of "progress markers for partners".

How was OM used?

A full set of documents describing the rationale, adaptations, approach and outcomes of the use of outcome mapping can be downloaded from the AcT website.

What was the experience of using OM?

OM is not for the faint-hearted. It takes a lot of investment of time and energy to get the full value out of it. A one-size-fits-all model does not work, but a good understanding of the methodology is important in order to help partner organisations see the way forward, given their different sectors, approaches, organisational histories and values.

o When AcT was first set up, it was assumed that training in OM could be a one-off intervention, and hence that a lot of the training and support could reasonably be outsourced. We learned in the early years that an ‘accompaniment’ model works better in terms of supporting partners.

o Most partners need to link their reporting to a log frame; there are examples to learn from, but each organisation needs to make its own fusion. Logic models have been encouraged as a more flexible approach than log frames.

o Some staff in partner organisations are concerned that Progress Markers lack objectivity. Ideally, they should be written so that anyone would interpret whether or not a PM has been achieved in the same way, but this is hard to achieve, especially if you want to avoid having too many boundary partners to monitor.

o Thinking differently. OM encourages users to think differently and more carefully about what will work. In many aid-dependent organisations, projects are conceived and log frames developed to meet donor interests without thinking sufficiently about what will work, the strategies required, or the skills required.

o Link to learning. With OM you are always monitoring your strategy; if it is not working, you change something. It provides evidence on which to base decision making: quarterly planning meetings are grounded in evidence, not the views of the most assertive participant.

o Progress Markers can be quite intuitive, so untrained community members can monitor them, unlike conventional indicators, which require an understanding of definitions, percentages and so on. This results in a huge shift in the balance of power, as there is no need for programme officers in the field with expert knowledge. On the other hand, PMs also support programme officers by giving them a very clear steer on what to look for and what to report on in their 'back to office' reports. This helps them get away from long narratives which don't actually capture the important programming results.

We are increasingly confident that, through the way we are merging OM and conventional indicators in our revised log-frame, we are in a stronger position than before to provide a detailed and systematic body of qualitative and quantitative evidence that takes us beyond anecdotes, and towards a nuanced understanding of what makes change happen.
