Why do managers need alternatives to LogFrame, too?

February 23, 2011 | Adrian Gnägi | Learning Elsewhere |






by Adrian Gnägi




How is it possible that

  • social change is emergent and therefore cannot be precisely planned for, but
  • LogFrame is the standard tool in aid for planning and reporting on social transformation?

Is theory wrong or are development practitioners systematically lying about what they are doing? In this post I argue that the issue is not lying, but rather precariously muddling through. Imprecision and cascade reporting are the two main techniques used in our business to reconcile LogFrame and emergence. This is unhealthy.


In a recent blog post on what has gone wrong with MfDR (Managing for Development Results) I argued that support for social transformation should not be conceived using LogFrames. In a comment, Rick Davies expressed puzzlement at this demand. I can easily understand why people do not want to let go of LogFrame. The LogFrame approach is backed by the most powerful lobby in our organizations: it is the middle managers who make it our standard. LogFrames are still here after 50 years because middle managers get from them what they need: a nutshell project summary; the link between resources, activities and results; and indicators for measurement and reporting. LogFrames are a great tool for organizing funding relationships. Unfortunately, they are utterly inappropriate as guidance for implementation (see my earlier post on the usefulness of different program formats). This is why we need to take up the institutional struggle: the standard must fall.


LogFrames are, in essence, matrices with a double aggregation logic:

  1. The horizontal logic is: under certain planning assumptions, a given set of resources (time, money, knowledge) applied to stratagems (activity scripts) produces outputs characterized by indicators a, b, c…
  2. The vertical logic is: planning assumptions add up to risk assessment; resources add up to staffing needs, budget and TA requirements; stratagems add up to strategy; indicators add up to monitoring and reporting framework; and outputs add up to outcome and contribute (attribution gap) to impact.
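This double logic can be sketched as a small data structure: each row carries the horizontal chain from assumptions and resources to an output with its indicators, while the vertical logic aggregates the columns upward. The class and field names below are purely illustrative, not any agency's schema:

```python
# Minimal sketch of a LogFrame's double aggregation logic.
# All names and example values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Row:
    """Horizontal logic: assumptions + resources + activities -> output."""
    assumptions: list[str]
    budget: float          # resources, reduced to money for brevity
    activities: list[str]  # "stratagems" / activity scripts
    output: str
    indicators: list[str]  # a, b, c... characterizing the output

@dataclass
class LogFrame:
    outcome: str
    rows: list[Row] = field(default_factory=list)

    # Vertical logic: each column aggregates upward.
    def risk_assessment(self) -> list[str]:
        return [a for r in self.rows for a in r.assumptions]

    def total_budget(self) -> float:
        return sum(r.budget for r in self.rows)

    def monitoring_framework(self) -> list[str]:
        return [i for r in self.rows for i in r.indicators]

lf = LogFrame(
    outcome="local governments deliver services",
    rows=[
        Row(["reform stays on agenda"], 200_000.0,
            ["train staff"], "staff trained", ["% staff certified"]),
        Row(["budget law passes"], 300_000.0,
            ["draft procedures"], "procedures adopted", ["# procedures in use"]),
    ],
)
print(lf.total_budget())               # 500000.0
print(lf.monitoring_framework())       # the reporting framework, row by row
```

Note what the sketch makes visible: the matrix aggregates cleanly upward for a funder's view, but nothing in it feeds information back down into steering.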


Fair enough as a basis for funding decisions, but highly problematic as support for implementation (see Jacobs et al. 2010 for an excellent summary). I want to focus on two issues:

  1. Unreliability of planning assumptions over time: Operations are typically “sold” to funding decision makers with LogFrames spanning three or four years, but they are normally implemented with annual plans, reviewed at mid-term after six months. This six-monthly real planning rhythm reflects practitioners’ experience that their comfort zone with planning assumptions is not a few years but a few months (looking at the planning granularity of colleagues in SDC, I conclude that the average comfort zone is around four months). This experience-based comfort zone of aid practitioners corresponds well with the emergent character of social transformation as put forward in complexity theories.
  2. Attribution gap prevents impact feedback: When impact on social transformation is conceptualized with LogFrames, impact happens in an unspecified way in an unspecified future. This is due to the attribution gap: the unknowable contribution of any single change vector to overall social transformation. But the attribution gap is at loggerheads with the primary rule of change management: go for low-hanging fruit, show that change is happening, and gather momentum in your change project by motivating through proof. LogFrame-planned change initiatives not only get no energy from quick wins, they get no feedback from the impact level at all. Since whatever social transformation is happening cannot be attributed to a single influence, one cannot know whether the intervention needs to be adapted.


The missing impact feedback is a great worry to middle-level managers. They fear those evaluations showing that projects reached their objectives but did not have an impact on poverty.

Most development practitioners know that construction projects can be pre-planned in detail, but shifting power relations and delivery responsibility through decentralization cannot. They also know that the support we provide for social transformation does not vanish in an attribution void in some distant future, but is appropriated by local actors. Sometimes in the way we thought it would and should, sometimes not, but frequently earlier than we had assumed. Differences between plan and implementation reality normally show up within a few months. What do experienced implementers do to handle the differences? Well, sometimes they do lie by pretending to follow the plan, and sometimes they do follow the plan even though they know the operation will go wrong. Sometimes there just are good reasons for stupid behavior. But most often they do something else:

  • When designing operations, they build “flexibility” into their LogFrames. Lack of precision with indicators has been managers’ major complaint about LogFrames since USAID adopted the tool in the late 1960s. If people have not learned in 50 years how to formulate precise and measurable indicators, there must be compelling reasons for it. The higher up in the impact chain, the more important imprecision with indicators becomes. Only imprecise indicators survive emergence.
  • When designing their monitoring systems, they build in incongruities with their management systems. Keeping LogFrame indicator information out of an operation’s steering is key for operational survival. Typically, monitoring frameworks are geared at reporting, full stop.
  • When reporting on their operations, they avoid transparent plan-versus-implementation comparisons. There is a clear division of roles in the reporting chain for this. Staff close to the field mostly choose to be “non-strategic” in reporting, drowning indicator-related data in epic contextual, justificatory narrative. In the middle of the chain, data is rearranged (“rationalized”) into organization-specific formats. Towards the top of the chain, data is aggregated into unspecific result generalities.

Imprecise indicators, unspecific information, and keeping steering and reporting separate – those are the rules of the LogFrame versus emergence game. I think we should wholeheartedly support middle managers’ requests for more precise indicators. Precise indicators are the most lethal weapons in the fight against LogFrames. That is how ZOPP went down.

One could argue that this system of working misunderstandings is quite okay, since it allows people in different contexts to do what they are asked to do. But the opportunity costs are just out of proportion, and so are the foregone potentials. As explained in my last posts, one of the sad unintended outcomes is the debilitating waste of human energy and talent on useless data gathering, control and accountability exercises. I would like to close with potentials that middle managers might be interested in, though:

  1. If we start using appropriation hypotheses instead of planning assumptions, we build political economy analysis into our operations. Our assumptions about what drives change may still be wrong, but at least we will know early on and can adjust. And we gain credibility: fear of non-regular appropriation is one of the key drivers of critique in aid, one of the main causes of “obsessive measurement disorder”.
  2. If we start using progress markers for system change instead of results indicators, we will be able to tell within months whether change in the desired direction is happening. Just showing that the theory of change an operation is built on is correct makes for a far more compelling impact story than any attribution-gap-hampered data avalanche.


Comments to “Why do managers need alternatives to LogFrame, too?”

  1. Riff Fullan says:

    Dear Adrian,

    You make a compelling argument for looking beyond logframes when social transformation is a goal…I think part of the sub-text of your suggestions for middle managers is that to the extent local actors can be more closely engaged in project implementation and monitoring, you are likely to see more meaningful results (at least in the local context). A further implication is that – unlike in most existing project development cycle contexts – local actor engagement in planning as well has significant transformative potential, though how this can be reconciled with current practice in development institutions is not at all an easy question to answer.

  2. Katharina Häberli Harker says:

    Dear Adrian

    I fully share the points you raise in your blog on logframes.

    I wonder if one could also draw different conclusions. Logframes have their strong limitations, as does any model trying to reflect reality. My point would be that if logframes became living documents, at the centre not only of reporting but of steering, and thus were consequently adapted as understanding of what really is going on improves, they might still be the best (least bad) available instrument?

    What certainly is wrong is the following: that as an annexe to a contract we expect partners to stick to the original version of the logframe for years, and actually require them to ask for permission to change it, rather than expecting from professional project implementation that logframes are continuously adapted: a living document rather than a nuisance corset.

    Many thanks for the entire great blog

