
Outcome Mapping Practitioner Guide

Systematic monitoring of progress markers

A systematic way of reflecting on the progress made for each progress marker, facilitating analysis and reporting of outcomes as well as supporting future planning.


Author: Steff Deprez

Published: Sunday 21 September 2014


Once your Intentional Design is developed and the programme is being implemented, there comes a time when the implementing team and the boundary partners need to reflect on the progress made on the progress markers that were formulated for each boundary partner. Below is a possible way of carrying out a systematic reflection process to monitor progress markers and support the planning of the next implementation period. It consists of 4 steps and some possible follow-up sessions. It is designed to be carried out during each monitoring cycle (e.g. every 4 or 6 months), but it could also be used for the more in-depth annual analysis of progress markers and to guide the planning of the following year.

The 4 steps below are meant to be applied to each progress marker, but you could also apply them to a selected number of PMs (see link) that are important in the given monitoring cycle. Keep in mind that step 4 is particularly useful for those PMs for which you already know there is no progress yet.

The systematic reflection process could be carried out during a joint workshop with boundary partners (and other important stakeholders), by one boundary partner doing an individual reflection exercise on their own, or by an implementing team that wants to reflect on the changes they have observed at their boundary partners.

1. Four steps to review each progress marker

Step 1: Read the progress marker

Read the progress marker and, if necessary, clarify it, as there might be people present who did not participate in the design / formulation of the progress markers or in the previous monitoring cycles.

Step 2: Discuss and describe the progress made during the given monitoring period

Reflect on the progress made on the behavioural change described in the progress marker. If done in a team or a workshop, allow sufficient time to discuss the progress and make sure that everybody can contribute and voice their opinion about the progress made. Describe the progress made in the agreed monitoring period in about 5 to 10 lines (to the point, but with enough detail to fully describe the change). If done during a workshop, read out the described change and ask for the participants' agreement before moving on. If this review process is done on a regular basis throughout the lifespan of the programme, start this step with a brief recap of the progress that was agreed upon in the previous period, in order to avoid repetition and to make sure that you build further on previous changes.

Remarks

  • It is necessary to clearly spell out the period being reviewed before you start the reflection (last year, last 6 months, last 4 months, …)
  • Describing the progress is not the same as listing the activities carried out by the boundary partner. It is a description of progress related to the statement written in the progress marker
  • In some cases, it might be necessary to provide evidence or specific monitoring data to be able to discuss the progress. Make sure that this evidence, data, information, … is available before or during the reflection. In many cases, there is sufficient knowledge – based on the observation and experience of the people in the room – to make a collective judgement on the progress made.

 

Step 3: Scoring the progress made

In this step, people give a score to the change described in step 2. There are different ways of scoring progress (colour coding, numeric scoring, percentages or 0-L-M-H scoring, …), but the general idea is to give a relative value to the progress made between the situation at the beginning of the project/programme and the final desired change (i.e. the change that people agree upon as being sufficient in that particular context). In other words, the scoring refers to the progress made from the start of the programme until the specific moment of review.

The advantage of scoring is that one can compare progress across progress markers, or across boundary partners using the same progress markers. Furthermore, the process of scoring forces people to substantiate the change that has already happened (step 2) and to express the change that is still expected (step 4). This is particularly powerful when it is discussed in workshops in which both the implementing team and boundary partners are present. After some discussion, a final score is agreed upon.

Remarks

  • Although the reflection focuses on describing the change that happened in the given monitoring period, the scoring gives a value to the progress compared to the start situation and the final desired situation. This implies that a score can increase (from 30% to 50%, or from ‘0’ to ‘L’), stay the same (no progress observed in the given monitoring period) or even decrease (from 70% to 40%, or from ‘M’ to ‘L’), as a particular change is not necessarily stable over time.
  • If done on a regular basis, always recall the score that was given in the previous monitoring period before you discuss the new one
  • It is important that all people involved understand the principle and the weighting of the scoring, and that the same type of scoring is used throughout the programme.
  • You can also do steps 2 and 3 at the same time, as the scoring process will assist in clearly describing the change (step 2) of a given period, and may even give hints towards step 4.
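The comparison of scores across monitoring cycles described above can be sketched in a few lines of code. This is an illustrative sketch only: the 0-L-M-H scale ordering and the example scores are assumptions based on the scales mentioned in this step, not a prescribed tool.

```python
# Illustrative sketch of step 3: classifying how a progress marker's score
# moved between two monitoring cycles, using the 0-L-M-H scale mentioned
# above. The scale ordering is an assumption for this example.

SCALE = {"0": 0, "L": 1, "M": 2, "H": 3}  # no change, Low, Medium, High

def score_trend(previous: str, current: str) -> str:
    """Classify the movement of a score between two monitoring cycles."""
    prev, curr = SCALE[previous], SCALE[current]
    if curr > prev:
        return "increased"
    if curr < prev:
        return "decreased"
    return "no progress observed in this period"

# A marker that moved from '0' to 'L', and one that slipped from 'M' to 'L'
print(score_trend("0", "L"))  # increased
print(score_trend("M", "L"))  # decreased
```

The same logic applies to percentage scales: only the mapping to ordinal values changes, while the comparison against the previous cycle stays the same.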

 

Step 4: Spell out the change that is still expected in the next period

After describing the progress made and scoring it, people discuss and spell out what kind of change should take place or is expected in the next period of the programme. In a workshop, it is important that everybody can express their thoughts, as there might be different ideas about the way forward. Especially when this discussion involves both the implementing team and boundary partners, it provides good insights into the mutual expectations about future changes for a particular progress marker. For the final description, it is recommended to be as specific as possible and not to keep it very general, so that everybody understands what exactly the change entails and whether it is a smaller change or a substantial one.

 

Keeping track of hindering and contributing factors

In steps 2, 3 and 4, it is good to spell out the most important hindering and contributing factors in the achievement of the progress markers. As a facilitator of this process, it is good to keep track of these and list them separately as you go through the reflection exercise.

Keeping track of unintended changes

Throughout this systematic reflection on the progress markers, people might come up with changes that were not included in the original set of progress markers. In this way, unintended changes are captured. As a facilitator of this process, it is good to keep track of these and list them separately as you go through the reflection exercise.

 

2. Possible follow-up sessions

2.1 Reviewing the progress marker set

After the full review, you can conclude the systematic reflection with some follow-up discussions or questions:

New progress markers? Sometimes it happens that, due to changes in the context or a better understanding of the change process, people identify new progress markers that are important to keep track of. One way of generating potential new progress markers is to review the list of unintended changes and discuss whether some of them are substantial enough, or an indication of a bigger change that was not originally intended but is seen as important to keep track of in the future. New progress markers can be taken up in the next monitoring cycle.

Stop tracking progress markers? It is possible that some progress markers have been reached to a sufficient level and should no longer be taken up in the next monitoring cycle.

Remove progress markers from the original set? It might happen that a specific progress marker – originally seen as interesting and important – is later on in the monitoring process no longer experienced as useful, i.e. even if it does provide data, it does not tell much about the general progress, or it is simply no longer seen as an important change. In that case, you can decide to remove the progress marker from the original list.

2.2 Discuss all hindering and contributing factors

Review with the team or people involved all the main hindering factors and discuss whether anything can be done to overcome them. Also, spell out the most important supporting / contributing factors and discuss what can be done to maintain them or use them as leverage for other purposes.

2.3 Identifying concrete actions and support strategies

An in-depth reflection on the progress markers through the 4 steps provides directions for the planning and prioritisation of concrete actions by the boundary partners and for support strategies/activities of the implementing team. By looking at the entire list of changes that were spelled out in step 4, and the main hindering and contributing factors, the boundary partner could discuss actions they could take to make the change happen, prioritise them and decide what is achievable in the next period. Based on this, the implementing team can reflect and discuss with the boundary partner how they could support the partner in the best possible way. The output of this session is a good basis for the development of concrete work plans & budgets.

3. Documenting the systematic review

The progress made on the outcomes of the boundary partners is normally captured in the Outcome Journal. Documenting the results of the reflection process described above implies a change in the format of the original outcome journal. An example of an Outcome Journal that is based on this approach can be found here.
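To make the adapted journal format concrete, the outputs of the 4 steps (plus the hindering/contributing factors and unintended changes tracked alongside them) could be structured as below. This is a minimal sketch under stated assumptions: all field names, the partner name and the example marker are hypothetical, not the format used in the linked example.

```python
# A minimal sketch of an Outcome Journal entry adapted to the 4-step
# reflection process above. Field and example names are assumptions,
# not a prescribed Outcome Mapping format.
from dataclasses import dataclass, field

@dataclass
class MarkerReview:
    progress_marker: str           # step 1: the progress marker text
    progress_description: str      # step 2: ~5-10 lines describing the change
    score: str                     # step 3: e.g. '0', 'L', 'M', 'H'
    expected_change: str           # step 4: change expected in the next period
    hindering_factors: list = field(default_factory=list)
    contributing_factors: list = field(default_factory=list)

@dataclass
class OutcomeJournal:
    boundary_partner: str
    period: str                    # the monitoring period being reviewed
    reviews: list = field(default_factory=list)
    unintended_changes: list = field(default_factory=list)

# Hypothetical example entry for one monitoring cycle
journal = OutcomeJournal(boundary_partner="Partner A", period="Jan-Apr")
journal.reviews.append(MarkerReview(
    progress_marker="Partner attends network meetings",
    progress_description="Attended 3 of 4 meetings and actively contributed.",
    score="M",
    expected_change="Partner co-organises a meeting in the next period.",
))
print(len(journal.reviews))  # 1
```

Keeping one such entry per progress marker per cycle makes it straightforward to recall the previous period's description and score at the start of steps 2 and 3, as the process above recommends.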


This nugget was applied in: VECO (global programme), Olof Palme Centre Western Balkan Programme, COP RBM European Social Fund, …

Related Practitioner Guide sections:



