Can You Prove That? Evaluating Communications in Humanitarian Aid

Timo Luege
2 min read · Jan 31, 2019

Originally published on January 31, 2019.

Evaluating communications activities is hard. Or, in the words of John Wanamaker’s famous quote: “Half the money I spend on advertising is wasted; the trouble is I don’t know which half.” While the situation has improved somewhat since Wanamaker’s time — particularly when it comes to online marketing, online sales and online fundraising — things are still largely the same when trying to evaluate the effectiveness of communications in the messy offline world, and even more so during humanitarian crises.

For example: How do you demonstrate that it was your information campaign that led to an increase in refugees applying for asylum in Greece, as opposed to information they may have heard in the camp or from another organization?

Because causality is often so difficult to prove, organizations frequently rely on evaluating outputs (e.g. the number of information sessions held) rather than outcomes (e.g. actions people took based on information that you shared). I find that very unsatisfying. The worst example I ever saw was a crisis where the only indicator for the success of a comms team’s work was how many sheets of paper it distributed each month. Not helpful.

I’m bringing this up because I think we need to do better. I also believe that protection activities face many of the same challenges as communications activities, particularly when it comes to attribution and causality.

For example: Did the number of rapes in this camp go down because we increased the number of focus group discussions about gender-based violence or because a WASH organization moved the latrines from the dark woods to a brightly-lit area at the edge of the camp?

In its new guide “Evaluation of Protection in Humanitarian Action”, ALNAP discusses many of these challenges and provides advice on how to deal with them as part of evaluations. If you are working in communications, you might find that many of the topics discussed in the guide can be applied to your programs as well. There are no easy solutions (are there ever?), but at the very least the guide can help you understand the thinking behind current best practice and common limitations.

Download ALNAP’s “Evaluation of Protection in Humanitarian Action”.
