Hi colleagues
just wanted to seek your help on a couple of issues:
1. Are there CMAM reporting tools that include some data quality control? For example, flagging potential issues with some of the reported CMAM indicators: relapses vs. defaulters? Defaulters vs. new admissions?
2. Are there studies/reviews/analyses that assess the relationship between defaulters and relapses, and potential ratios between them?
Thanks again for any insight and guidance you may provide on this
best
In general I am very skeptical of considering defaulters and relapses (recurrence of wasting) as signs of programmatic failure. Having personally treated many tens of thousands of wasted children in sub-Saharan Africa, it is clear to me that, if we're honest, default/relapse happens often. In a typical MAM case, why would 4-6 weeks of supplementary feeding be expected to prevent 'relapse' 6-18 weeks later? In reality it doesn't, and treating MAM again should simply be done. In a study we did in Malawi almost 10 years ago we showed that one recurrence of MAM was not indicative of a poor outcome after a year; recurrence more than once is, of course, a poor prognosis.
Unfortunately, the typical QC benchmarks were not chosen because the best programs could achieve them in any circumstance. The best outcomes are seen in circumstances of acute, remarkable food shortage. More typically, though, we have children in poverty, in stable social situations, with a combination of poor diet quality and multiple inflammatory stimuli. When I choose program benchmarks I look foremost at coverage and then at the availability of therapeutic/supplementary foods: I strive for 75% coverage, determined by a real house-to-house survey, and stockouts less than 2% of the time.
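To make those two benchmarks concrete, here is a minimal sketch (in Python, with made-up field names and numbers; it is illustrative only and not part of any standard reporting tool) of checking a programme against coverage of at least 75% from a house-to-house survey and stockouts less than 2% of the time.

# Minimal sketch of the benchmark check described above. Thresholds follow the
# post (>= 75% coverage, stockouts < 2% of the time); field names are illustrative.

def meets_benchmarks(cases_in_programme: int, total_cases_found: int,
                     stockout_days: int, programme_days: int) -> dict:
    """Return simple pass/fail flags for coverage and stockout benchmarks."""
    coverage = cases_in_programme / total_cases_found if total_cases_found else 0.0
    stockout_rate = stockout_days / programme_days if programme_days else 0.0
    return {
        "coverage": coverage,
        "coverage_ok": coverage >= 0.75,      # aim for >= 75% coverage
        "stockout_rate": stockout_rate,
        "stockout_ok": stockout_rate < 0.02,  # stockouts < 2% of the time
    }

if __name__ == "__main__":
    # Example: 120 wasted children found in the survey, 96 of them enrolled;
    # 5 stockout days over a 365-day reporting period.
    print(meets_benchmarks(cases_in_programme=96, total_cases_found=120,
                           stockout_days=5, programme_days=365))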
Answered: 3 years ago

Hi Alex,
In terms of standards/reporting, you might find this a useful overview:
and some resources here: https://www.cmamreport.com ... although, as with any tools/standards, they need to be interpreted appropriately for the country/programme context. Even if you have good results/reports, are you sure the data collection and reporting is of reliable quality?
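As one concrete illustration of the kind of routine data quality check the original question asks about, here is a minimal sketch (in Python; the field names are illustrative and not taken from the CMAM Report tool) of a caseload balance check on a single site-month report, plus a simple flag for defaulters exceeding new admissions.

# Minimal sketch of a routine consistency check on a CMAM monthly report:
# the end-of-month caseload should equal the start-of-month caseload plus new
# admissions minus all exits. Field names are illustrative only.

def balance_check(report: dict) -> list:
    """Return a list of data-quality flags for a single site-month report."""
    flags = []
    exits = (report["cured"] + report["died"] + report["defaulted"]
             + report["non_recovered"] + report["transferred_out"])
    expected_end = report["start_of_month"] + report["new_admissions"] - exits
    if expected_end != report["end_of_month"]:
        flags.append(f"caseload balance off by {report['end_of_month'] - expected_end}")
    if report["defaulted"] > report["new_admissions"]:
        flags.append("more defaulters than new admissions - check recording")
    return flags

example = {"start_of_month": 40, "new_admissions": 25, "cured": 18, "died": 1,
           "defaulted": 6, "non_recovered": 2, "transferred_out": 3,
           "end_of_month": 38}
print(balance_check(example))  # ['caseload balance off by 3']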
I wholeheartedly agree with Mark that high coverage has to be your priority, with a slightly more nuanced view on defaulters: I agree that default is not indicative of programme 'failure', but a high default rate is usually associated with lower coverage and indicates the need for a programme response. The main reasons for default should be determined to inform the contextually appropriate changes needed to reduce default and maximise coverage.
Similarly with relapses: I think relapse happens more than we appreciate, and I don't know of many programmes that record information on relapses reliably. In any case, unless you have a high-coverage programme, it is likely that relapses will go unrecognised (i.e. children will spontaneously recover or die without detection). Ultimately I don't consider relapse rate to be a performance indicator for CMAM (unless the discharge and reporting of 'cured' cases is seriously defective, but that should be detected through routine supervision rather than through reliance on reporting of relapses).
In terms of the ratios of defaulters to relapses, even if your data were accurate, I'm not sure the comparison would be valid: defaulters have stopped attending the programme for some reason, whereas a relapsed child has previously been cured and has now been readmitted.
When looking at defaulters vs. new admissions, it is better to look at data trends rather than ratios. Interpret the trends against expected numbers of admissions and the reasons for changing levels of default; take a look at the SQUEAC manual for examples of comparing trends in data against seasonal calendars (a rough sketch of that kind of side-by-side look follows the link below). Again, this comes back to the importance of focussing on the direct assessment of coverage.
https://www.fantaproject.org/monitoring-and-evaluation/squeac-sleac
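As a rough illustration of looking at trends rather than a single ratio, here is a minimal sketch (in Python, with made-up numbers) that lines up monthly new admissions and defaulters and flags months where default rises while admissions do not. In practice these figures would come from the monthly reports and be interpreted against a seasonal calendar, as the SQUEAC manual describes.

# Minimal sketch of a side-by-side look at monthly trends. The numbers are
# invented for illustration; the flag is a prompt to investigate, not a verdict.

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
new_admissions = [30, 28, 35, 50, 48, 45]
defaulters = [3, 4, 4, 9, 12, 11]

prev_adm, prev_def = None, None
for month, adm, dft in zip(months, new_admissions, defaulters):
    note = ""
    if prev_def is not None and dft > prev_def and adm <= prev_adm:
        note = "  <- default rising while admissions are not: investigate"
    print(f"{month}: admissions={adm:3d}  defaulters={dft:3d}{note}")
    prev_adm, prev_def = adm, dft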
Answered: 3 years ago