Commentary

Monitoring in metamorphosis: attending to the seven features of thoughtful monitoring

Daniel Ticehurst - Member of EvalIndigenous and mentor at the African Leadership Centre, King’s College, University of London

Monitoring: accountable, responsive and attuned to needs

My passion for monitoring rests on how it helps those who provide support to listen to and learn from disadvantaged people and struggling communities and organisations. Monitoring has the potential to improve their performance in delivering support that is accountable, responsive and attuned to people’s needs.

If it is to meet the needs of decision-makers, a monitoring system's inquiry must centre on the process of interaction between frontline staff and crisis-affected people – on those people's reactions, perceptions and expectations.

Monitoring higher-level results such as income and resilience – as promoted by advocates of results-based management – is misguided. Such data do not resolve management's decision uncertainties in complex environments; they do not tell you what to do. Instead, it perpetuates the over-ambition of the M&E systems of some 40 years ago and the shortcomings of management by objectives – the precursor to results-based management – in the 1970s private sector. Even before then, the warning was familiar:

“Management by results – like driving a car by looking in a rear view mirror.”

What’s the main problem with monitoring?

The issues related to monitoring are rarely methodological or technical, but organisational and managerial: they centre on the lack of support for, and importance given to, the purpose, responsibilities, functional relationships and positioning of monitoring. Compared to evaluation, monitoring receives little attention (Ticehurst, 2013). It is easy to get the impression that evaluating aid programmes is seen as a more challenging and deserving task than delivering them.

Monitoring is primarily a learning function that needs to be integrated into the structures, functions and processes of a programme or organisation. These structures include management and associated planning, financing, operations, learning and decision-making.

In-house M&E experts and teams often become isolated from management. This is often because the learning function is contrived: in reality, many are set up to comply with the funder's planning and reporting requirements. The problem has been exacerbated by setting up third-party monitoring and contracting out MEL functions to consultancy companies. Both arrangements run in parallel to the structures, functions and processes of the organisation and the company that implements the programme or project. The problems of such arrangements can be well imagined.

The most important question managers and their teams need to ask at the outset is whether there is a need to have a monitoring (not an evaluation) professional on the team. The argument for having one rests on:

  • Whether existing management and those responsible for delivering the support have the capacity to fulfil the monitoring function.
  • How well existing decision-making, lesson learning and operational processes enable assessing, listening and learning.

If a dedicated person or team is deemed necessary, it is important to mitigate the risks of isolation, overdependence and the evidence produced being ignored by, or irrelevant to, management.

Seven ways monitoring can become thoughtful

  1. Ask yourself how monitoring can benefit those responsible for implementing and making decisions on programmes: how do they work, what are their information requirements, what decision uncertainties do they face, and when? Try not to begin the process by developing a theory of change and/or a results framework.
  2. Integrate monitoring into existing systems, processes and responsibilities. Do not confuse an M&E system with a data collection plan developed with sole reference to a results framework or theory of change – it has to connect with people, not concepts and theories.
  3. Make sure the monitoring function reflects the complex and uncertain environment the programme operates in, focusing on rapid feedback loops that improve the products and services and how they are delivered.
  4. Give attention and effort to adequately researching and then periodically checking assumptions. Monitoring assumptions is at least as important as monitoring indicators.
  5. Recognise how monitoring can learn from and help bring material value to indigenous knowledge systems that – unlike many western approaches to monitoring – embrace complexity.
  6. Balance accountability to those the programme supports with accountability to those who fund it, by adapting the results and by treating those in need as subjects of conversations that matter to them, not as objects of an interview driven by the needs of those in charge.
  7. Give voice to those who deliver the support to share their experiences, successes and challenges, and make sure senior management listens so as to improve the nature of the support and the ways it is delivered.

In conclusion, taking up a thoughtful approach to monitoring is at the discretion of the user, and its features can be applied as needed. Some organisations and programmes may only have the need and capacity to take on what they see as priority areas. Regardless of the starting point, I would encourage managers and M&E specialists or teams to reflect on their approach with the seven features of thoughtful monitoring in mind.