Partnerships in Practice: Reflections on the Indashyikirwa Research-Programming Partnership in Rwanda
Partnerships between organisations implementing programmes and the research teams evaluating them can sometimes be fraught with tensions and misunderstandings. This can be due to team members’ different backgrounds, competencies and priorities, but also due to differing perspectives on what counts as good ‘evidence’ of programme success. However, when these partnerships work well, both the programme and the research can flourish and produce positive outcomes for all.
Indashyikirwa (Agents for Change) was a four-year intimate partner violence (IPV) prevention programme in Rwanda, implemented by CARE International Rwanda, Rwanda Men’s Resource Centre (RWAMREC) and Rwanda Women’s Network (RWN). The programme had four main components: (1) Intensive participatory training with couples (couples’ curriculum); (2) Community-based activism with a sub-set of trained couples; (3) Direct support to survivors of IPV through the women’s safe spaces; and (4) Training and engagement of opinion leaders.
The programme was implemented alongside an external impact evaluation, conducted as part of the What Works to Prevent Violence against Women and Girls Programme. Following the end of the programme in August 2018, several programme staff and I (from the research team) reflected on what we had learned from our partnership.
The programme team admitted to their initial scepticism and lack of clarity in 2014 about what the impact evaluation would entail. There was concern they would be scrutinised by ‘high up’ academics who did not understand the contextual or programme realities. Many of these practitioners did not appreciate the benefits of impact evaluations, since in their previous experience research findings arrived only at the end of a programme and did not generally align with what they saw in practice. Yet, they went on to describe how the Indashyikirwa research and programming partnership quickly became something different and valuable. We discussed why this was the case and came up with four main reasons.
Firstly, the evaluation conducted by the research team informed and strengthened the pilot programme in several concrete ways:
- The research team accompanied the pre-testing of the various curricula – observing sessions and interviewing participants and facilitators. This generated important insights and lessons, which were used to improve both the content and pedagogy. For example, this research highlighted the importance of having both a male and female facilitator for the couples’ curriculum.
- The formative research into the operation of social norms around gender roles and violence against women provided important insights that were used to shape the content of the activism materials as well as ongoing trainings with couples, opinion leaders, community activists and safe space facilitators.
- The ongoing longitudinal qualitative research and observations with couples, opinion leaders, women’s space facilitators and attendees, and community activists provided important real-time feedback to the programme team, who could then apply this in the field – especially when developing the ongoing trainings, and when responding to implementation challenges.
- The research included two rounds of qualitative interviews with programme staff to collect their valuable insights on programme successes and challenges (an area often neglected in impact evaluations).
Secondly, significant efforts were made by the research team to build the capacity of programme staff to understand and utilise the research. This included a series of trainings and workshops on understanding randomisation and randomised controlled trials, qualitative sampling, data analysis and writing skills. These efforts helped build trust and enhance understanding among staff of the value of different research methods and external impact evaluations. Programme staff especially appreciated being involved as co-authors of research publications.
Thirdly, programme staff were regularly asked to provide input on the research findings and to reflect on whether the findings matched what they were seeing on the ground. Programme staff also frequently presented their monitoring findings to the research team. This ensured that practice-based insights and evidence were applied, and helped to strengthen contextual and programmatic interpretation of the research findings. As a result, the monitoring data and the external impact evaluation data complemented each other to support both the research and programme implementation.
Finally, this was all made possible by the time and budget allocated by both teams for regular meetings, coordination, relationship building, and learning from each other. This certainly paid off, especially for impact evaluations of complex programmes, and of IPV prevention programmes, which are so context-dependent.
Overall, there was consensus among the research and programme teams that the Indashyikirwa programme would not have had the success it did if not for the kind of partnership we had.