In October 2014 we asked Video for Change practitioners to complete a survey to help us better understand impact design and impact assessment practices and needs.

41 people completed our survey. By far the largest share were working in Asia (91%), followed by North America (21%), Latin America (19%) and Australia/Oceania (19%). There were also 4 respondents working in the MENA region and 4 in Sub-Saharan Africa. (Respondents could select more than one region, so percentages sum to more than 100%.) Just over half of the respondents (51%) worked as independent video-makers, 38% worked with organisations that engaged exclusively with video, and 36% worked with organisations that focus on social change but not necessarily on the use of video. Most respondents had more than five years' experience working in the area of Video for Change.

Although over half (57%) of the respondents felt satisfied or somewhat satisfied with their current processes for defining the strategy of their Video for Change initiatives, the vast majority (88%) still felt that they would benefit from more knowledge and processes to support this. Similarly, 88% of all respondents felt they would benefit from more knowledge and processes to support them in measuring or assessing impact, with just 5 respondents (12%) stating that they already knew enough and did not require this.

The survey findings suggest that most respondents (85%) believe the ethical practices they (or their organisations) most value in their work are to some degree reflected in their models and practices for designing for impact.

The responses also suggest that the four ethical principles we have proposed to guide the development of our Impact Toolkit resonate strongly with Video for Change practitioners. All respondents indicated that at least one of these four principles is critical to their way of working: one third (33%) selected just one principle, 10% selected two, 26% selected three and just under one third (31%) selected all four.
All of the respondents indicated that they were already using or had used at least one method for assessing the impact of their work. Interviews were by far the most popular method used for assessing impact, followed by capturing media or political attention and the use of online analytics tools.

Respondents also provided examples of resources or tools that they draw on when designing for and assessing impact. These included a number of resources we had not yet considered for our Impact Toolkit project. For example, one respondent said they had found this book on 'Participatory Statistics' very useful; another explained that their ongoing participation in D-Word, an online community and discussion forum for documentary-makers, was a key resource, since it allowed them to discuss needs and learn about new practices and resources. Another participant provided a useful link to a collection of resources that support the Most Significant Change evaluation technique.

Participants also shared persistent challenges that prevent them from carrying out effective and meaningful impact assessment. These included a lack of funding from clients, NGOs or funders; time constraints; a lack of knowledge and useful tools; and a lack of clarity about the benefits of spending limited resources in this way.

Overall, the survey findings are one of the methods we are using to inform the ongoing development of our Video for Change Impact Toolkit. Thanks to everyone who shared their experiences and thoughts with us!