In this post we report on an interview with Andrew Lowenthal from EngageMedia, one of the V4C.org network members. We ask Andrew about EngageMedia’s experience of measuring the impact of their Video for Change projects.

About EngageMedia
EngageMedia is a non-profit organisation, based in Australia and Indonesia, that employs twelve staff. Since 2005 EngageMedia have used video, the internet and free software technologies to create and support social and environmental change. EngageMedia are perhaps best known for their training events for video activists, social movements and human rights organisations working across the Asia-Pacific region. They are also known for Plumi, an open source software platform they have developed to support the distribution of Video for Change projects. This platform also supports video subtitling, and EngageMedia have their own subtitling group, which focuses on mobilising human rights and social justice video content across languages, particularly in the Southeast Asia and Pacific regions. The group has more than 350 members and 80 active subtitlers.

EngageMedia, Video for Change and Measuring Impact
Andrew Lowenthal, Executive Director and co-founder of EngageMedia, says they use the term ‘Video for Change’ because it covers a range of different kinds of work they are engaged with, including video activism, participatory video and video advocacy: “I would describe video for change as a practice whereby video provides the critical tool through which change-makers seek to augment the social impact of their work.”
Andrew says that EngageMedia believe that evaluating impact is crucial to their work, since it provides feedback loops: “you need a constant process of testing and feedback and reflection and modification of your strategy. If you don’t have that feedback then you keep doing the same thing without knowing whether it is the best approach.”

However, Andrew also identifies a range of challenges they face in evaluating impact: “The biggest problem we have with this is that funders don’t really fund impact assessment. So the whole process for us around assessing impact is really based on when we need to write a funding acquittal. So we go through the promised outputs and outcomes and success indicators and report on how much they were achieved or not, [but] there is no meta-level impact assessment to help us know how well we are doing at achieving broader organizational goals.”

“The other problem with us measuring impact as a small organization is just capacity: there is specific knowledge involved in doing this.” Andrew suggests that what could be useful to EngageMedia are appropriate tools, and particularly models and frameworks that have first been tested by a range of Video for Change organizations and projects. “I feel I need a model so I can say, ‘yes I can see how we can do this [measure impact]’”. Andrew accepts that creating a model to work across even their own very diverse projects would be a challenge: “The model would need to be flexible so it can be made context specific; or alternatively you would need a range of models.”

In terms of individual videos made through EngageMedia workshops or uploaded to the EngageMedia.org video-sharing portal, the most common way impact has been understood thus far is through video views and shares, and through anecdotal stories that are relayed to them. However, Andrew is concerned that too much emphasis can be placed on video views and shares by both donors and Video for Change organizations: “In many ways the Rodney King incident, the Kony video, these are aberrations. Most videos don’t get these kinds of viewer numbers. At the same time these kinds of videos get held up as totems that show the power of video, even though 99.99% of videos don’t have impact in that same way. In most cases Video for Change leads to impact through a cumulative cultural process that slowly shifts people’s thinking about particular kinds of issues. So for us, when we are asked to provide an example of impact relating to engagemedia.org, we can say ‘this platform helped to distribute this video and we helped subtitle it, and as a result these people in Korea used that video to screen to migrant workers, who were then able to mobilise more people around their issue; so the video was useful in helping achieve that.’ Because we work across so many issues we can’t start tracking developments on the rights of migrant workers in Korea and how effective or not the video has been in supporting those developments; but what we can say is this organisation working on that issue found our work useful and these are the things it helped them to do in an immediate sense.”

Understanding how impact works through the EngageMedia platform is a real challenge: “We can use analytics and see how many people downloaded it, at what time and from what country, but that tells you nothing about impact. We don’t know what they did with the video or how they responded to it. We can try and do a survey around this, but mostly we collect stories anecdotally.”

In the next post we will feature a Video for Change and Impact case study from EngageMedia, focusing on measuring the impact of one of their video training projects.