Reflections from emerging evaluators in shaping Voluntary Organizations for Professional Evaluation capacity-building initiatives

Copyright: © 2020. The Authors. Licensee: AOSIS. This work is licensed under the Creative Commons Attribution License.

Voluntary Organizations for Professional Evaluation (VOPEs) are increasingly realising the importance of 'mainstreaming' emerging evaluators (EEs) in capacity-building initiatives to sustain the evaluation profession. This article addresses the importance and role of VOPEs in developing EEs. It describes the key global issues shaping VOPEs' interventions for EEs, the South African Monitoring and Evaluation Association's (SAMEA) Emerging Evaluators Programme and reflections by two EEs from different sectors on the future of Monitoring and Evaluation (M&E) in South Africa. The EEs' reflections show both differences and similarities across their sectors. Recommendations are proposed on the importance of developing EE programmes that are contextually relevant.

(1) Different career pathways into evaluation: Entrants follow distinct routes, such as those identified by Stevahn et al. (2005): the 'new evaluator', the 'accidental evaluator' and the 'professional in transition'. The 'new evaluator' has just joined the evaluation field after completing an M&E academic qualification, has no prior work experience in evaluation and needs relevant work experience. The 'accidental evaluator' enters the profession by virtue of being assigned evaluation tasks in their organisation and needs formal training. The 'professional in transition' makes a conscious decision to move from their current profession into evaluation and needs to navigate a new discipline after being accustomed to a different one. Common to all three pathways is the need for some form of capacity building, including training, exposure, mentoring and continuous professional development.
(2) The multi-disciplinary nature of M&E: M&E practitioners require 'extensive and deliberate socialization' to the field (Bertrand Jones 2014) and need to grasp its interdisciplinary aspects. (3) M&E as an evolving discipline: Methods and approaches have become more complex owing to big data, rapid technological development, growing connectivity and the era of the Sustainable Development Goals (SDGs), which is accompanied by broad targets and demanding policy agendas (Picciotto 2019; Schwandt et al. 2016). (4) The changed operating context and new policy directions are further driving demand for locally relevant skills. Picciotto (2019) clarified that all these changes mean evaluators will have to master social networking, crowd-sourced learning and big data analysis.
These four key issues mean that we need to rethink the way we support EEs and, because the discipline is evolving, ensure that they are equipped to remain relevant. For EEs to operate competently within the M&E profession, they require support in various areas to build these competencies, which necessitates continuous capacity development activities for EEs. As part of implementing the concept note strategies, SAMEA biennially sponsors more than 20 EE scholarships to attend and participate in the conference and confers two awards on the most promising EEs. These awards celebrate evaluators who, in the earliest stages of their careers, have made significant contributions to the field or practice of evaluation and demonstrated quality and effectiveness in their work. During the SAMEA 2019 conference, the 'M&E skills in a changing world' strand included a session on upskilling young and emerging evaluators. The session presentations and discussions covered the experiences of EEs, the development of mentorship opportunities for EEs, defining who should be a mentor and the skills EEs need. Beyond the conference session, SAMEA gave two EEs the opportunity to share in writing their reflections, experiences and opinions on M&E in a changing world, and on how SAMEA as the national VOPE can contribute to the development of EEs.

South African Monitoring and Evaluation Association Emerging Evaluators Programme
Obakeng is a public sector EE who reflects on the future of M&E by discussing his evaluation experience and the key developments that currently shape, and will continue to shape, the practice of evaluation in the public sector. His discussion also considers the skills the public sector should invest in for EEs. Amkelwa is a Non-Governmental Organisation (NGO) sector EE who reflects on the future of M&E by drawing on her evaluation experience and discussing her predictions for evaluation in the years to come. Both EEs conclude by highlighting the role SAMEA needs to take in supporting EEs.

Contextualising the reflections
These reflections are drawn from the experience of the author, Obakeng, as an EE within the South African public sector. For the past 5 years, I have worked as an evaluation officer at the national Department of Planning, Monitoring and Evaluation, a centre-of-government department that is the custodian of the South African National Evaluation System. My experience includes providing project management support to government-commissioned evaluations, contributing to the production of evaluation communication tools, participating in capacity-building initiatives such as workshops and evaluation design clinics, working closely with various evaluation specialists and being part of the revision of the National Evaluation Policy Framework (NEPF) that will shape the future of evaluations in the public sector. The reflections are guided by four perceived key developments that will shape the practice of evaluation in the public sector in the next 5 years: the revision of the NEPF, the adoption of a hybrid model in undertaking evaluations, constraints in government budgets and the need for rapid evaluations.

My context: South African public sector evaluation in the next 5 years
The practice of evaluation in the South African public sector can be said to have been formalised by the establishment of the Department of Planning, Monitoring and Evaluation (DPME) in 2010 and the cabinet approval of the NEPF in 2011, which paved the way for the implementation of a National Evaluation System (NES). The NEPF was used as a tool to formalise and institutionalise evaluation across the public sector by providing guidance on the practice of evaluation and building state capacity in the field. An evaluation of the NES prompted a revision of the NEPF, and of how the NES would be implemented going forward, which led to the development of the NEPF 2019-2024. The NEPF 2019-2024 provides guidance and shapes the practice of public sector evaluation over the next 5 years. The main changes to the NEPF include: the integration of state-owned entities into the NES; ensuring that evaluation projects take into account gender equality and women's empowerment priorities, the development needs of youth, and the concerns of persons with disabilities and other vulnerable groups in society; and devolving an all-encompassing evaluation capacity development approach that aims to empower the state to implement evaluations effectively (NEPF 2019). Furthermore, the revision process has indicated the need for a hybrid model in conducting evaluations, zooming in on the district model, and a co-production model for outsourced evaluations.
Based on the revised NEPF 2019-2024, public sector evaluations over the next 5 years will include a hybrid model. The NEPF revision process has opened a discussion on the need to adopt a hybrid model in undertaking public sector evaluations, to strike a balance between internally and externally conducted evaluations. The idea is to adopt some form of co-production model in which some evaluation deliverables are produced internally, as opposed to outsourcing every component of the evaluation. The DPME has also developed a rapid evaluation guideline to be piloted and used across government. Rapid evaluations will require internal government staff to play an active role. This will mean a shift in the role of the evaluation unit within DPME and other M&E units across government, from managing evaluations to carrying them out. At the same time, government is facing increasing fiscal constraints. The government budget for evaluation has declined; for example, the budget for 'consultants: business & advisory services' declined by approximately R4 709 000 between the 2015/2016 and 2019/2020 financial years. This will also reduce the resources available for commissioning evaluations. It presents both a challenge and an opportunity: a challenge in that it is likely to affect the wider evaluation ecosystem as government issues fewer evaluation tenders, and an opportunity for internal government staff to strengthen their evaluator capabilities as they carry out more evaluative assignments.

Strengthening skills for emerging evaluators in the public sector
Currently, EEs in the public sector are exposed mainly to the management of evaluation assignments. Capacity-building measures include an internship programme for new entrants to the evaluation field, a requirement for contracted service providers to work closely with EEs/officials from previously disadvantaged groups, and the introduction of a series of short courses and workshops in collaboration with the National School of Government. Despite these capacity-building initiatives, EEs still need technical skills in evaluation so that they can undertake evaluations rather than focus entirely on managing them. This could be achieved through a targeted experiential workplace immersion approach in which EEs learn by doing actual technical evaluation work. The technical skills that need to be strengthened include the ability to design and conceptualise evaluations, develop data collection instruments, collect and analyse data and write reports.

How can South African Monitoring and Evaluation Association as a Voluntary Organization for Professional Evaluation continue to support emerging evaluators?
South African Monitoring and Evaluation Association can continue supporting EEs by providing meaningful mentorship that allows EEs to learn by doing, and further develop specific competencies in evaluation. Other ways to support EEs are to provide capacity building in the form of workshops, as well as provide a platform (virtually) for emerging evaluators to interact, share ideas and learn from one another. Recognising and rewarding EEs' contribution to the sector are also important. Receiving the SAMEA Emerging Evaluators Award was my highlight of the conference. Furthermore, my presentation was well received and triggered some interesting discussions with key individuals in the M&E practice. This itself was such an amazing experience, as the two events symbolised growth in my career and the beginning of a brighter future for me in the evaluation space.

What I have learned
Being an EE in this dynamic world requires one to adapt quickly to change, learn on one's feet and be innovative and creative in the practice of M&E, because the world is changing at a fast pace. What works today might not work tomorrow. Emerging evaluators should also be prepared to work under serious resource constraints and in politically challenging environments. To grow in the field of evaluation, one needs to be involved in the technical work of evaluations, conducting evaluations and experimenting with different evaluation methods and approaches. Emerging evaluators should be pioneers of change in their communities.

Contextualising the reflections
The reflections that follow are drawn from the experience of the author, Amkelwa, as an EE within the South African NGO sector. I have been an M&E intern at JET Education Services (JET) since 2019 and have mainly been involved in the evaluation of education interventions. JET is a research NGO that works with stakeholders to improve the quality of education and the relationship between education and skills development. Our services include providing M&E expertise to clients, including government, non-governmental organisations, donors, foundations, corporates and multilateral institutions. My reflections on M&E in the next 10 years are based on my one-and-a-half years as an EE at JET, and are guided by my experience as an external evaluator, from which I predict the integration of M&E professionals into policy formulation and programme planning cycles.

Integration of M&E professionals in policy formulation and programme planning cycles
Most evaluations in Africa are commissioned by government departments, international donors or development agencies (Bamberger 2000; Mouton et al. 2014; Tirivanhu et al. 2018). In this regard, external evaluators are currently mostly commissioned to measure the progress of funded interventions, with no prior involvement in the planning or implementation of the programmes.
In future, M&E practitioners will promote the use of evaluation to inform policy and programme development through involvement in the planning stage of an intervention, to inform policymaking and achieve sustainable development. My impression is that programme funders in the education sector are shifting towards holistic interventions by engaging evaluators in the planning stage. This has helped develop structured programme implementation that speaks to the intended outcomes. In the near future, evaluations will focus on providing policy and programme implementers with information relevant to their context and environment. The use of evaluation to inform policymaking and programme development can create a positive interaction that helps achieve sustainable development. The focus will be not only on whether the intended intervention resulted in the desired outcomes but also on how it could be better implemented in the context of the community. By providing credible information based on robust evaluation findings, evaluation can play a role in influencing policymakers and programme implementers. Awareness of this role could help M&E practitioners secure better budgets for their work and tell their stories more effectively.

Rapid technological developments
Another likely change is the use of MERL Tech (technology for Monitoring, Evaluation, Research and Learning) and of information and communication technologies (ICT) by evaluators. M&E practitioners need to use technology to overcome the constraints of time and space: M&E findings need to be made available instantly to inform decision-making processes. Currently, M&E technology systems in South Africa are less advanced than in developed nations, and hence there is little use of technology in the field. South Africa's growth in this area is constrained by the fact that many individuals tasked with M&E duties lack the necessary skills and training in the use of ICT to collect and analyse data. There is, however, a need for a variety of systems that communities can easily use to collect data, because the use of technology in data collection and analysis is already making an impact in the M&E space. This will promote public participation in evaluating developmental programmes in communities. Furthermore, storing data in the cloud would allow evaluators to share and access data in a centralised location, helping evaluators to speak to each other's findings and allowing policymakers to make decisions informed by intersectoral findings. The use of data visualisation is also becoming important in simplifying information and making it accessible for public consumption and for policymakers who do not want to read long reports. Such information is user friendly and can be used as evidence in discussions taking place on social media.

Usefulness of conference discussions
Two things were of significance for me at the conference: (1) the role of technology in M&E and (2) the language used by practitioners in the field. M&E helps decision-makers make informed decisions based on real evidence, and the collection and analysis of data give legitimacy to the recommendations evaluators make. As mentioned earlier, data collection and analysis can be time consuming, especially where large amounts of data must be collected; the conference's focus on technology in M&E was therefore relevant. The conference showcased examples of technology use that promoted contextually relevant and timeous interventions. The discussions also highlighted remaining concerns about technology and digital data in evaluations that SAMEA needs to explore further: issues of privacy and confidentiality; data costs and poor network coverage in the country; the cost of some of the technology, particularly for small organisations; and the risk of loss of property because of crime during data collection.
Other discussions I found poignant focused on decolonising M&E. The language used in M&E is based on western standards, and the tools used to collect data measure success in western terms rather than being interpreted for our African context. There is a need to rethink instrument development for data collection. Mixed data collection methods should be used to promote public participation in evaluation, so that the concerns of all stakeholders and beneficiaries are taken into consideration in holding the funder accountable. However, the relevance of the tools to the cultural context in which they will be used needs to be taken into account. For example, in western culture, having children's books at home symbolises the parents' interest in reading to their children; in African culture, telling fairy-tale stories in the evening is the equivalent way in which parents teach their children. Oral education needs to be valued equally with reading books. Tools should not only focus on discovering whether the intended results were met but must also seek to promote organisational accountability, ensuring that the research is contextually relevant to the community and its needs and that post-research feedback is provided.

How can South African Monitoring and Evaluation Association continue to support emerging evaluators?
South African Monitoring and Evaluation Association is not a regulating body, which means membership is voluntary. The role of a voluntary organisation that seeks to professionalise the field is to promote professional development for its members (IOCE n.d.). SAMEA continues to promote the credibility of the profession: affiliation to the body gives EEs credibility for their work and a voice to present their findings. One way of promoting professional development is to advocate for a better curriculum at universities that offer M&E studies. SAMEA should help ensure that the curriculum is relevant to the changing face of M&E, that M&E responds to the need for contextual relevance to the African continent and that graduates are produced to a consistent standard. Although there is no body to regulate the profession, universities need to produce graduates of the standard required in the field, and the role of SAMEA should be to advocate for a curriculum that responds to those requirements. Beyond university training, the capacity development workshops that SAMEA hosts need to continue to build relevant skills for EEs, giving them the skills they need in their daily work, such as using different types of software for data collection or visualisation.
South African Monitoring and Evaluation Association needs to do more to encourage EEs to publish in the field, for example by hosting writing workshops or retreats. The resulting research outputs will assist with the professionalisation of the field, which also requires research that develops the discipline on the continent. SAMEA should also promote collaboration with research institutes to give EEs a space to interrogate the practice of M&E in the African context, and to deconstruct and decolonise the field to suit the context of the communities they work in. Finally, SAMEA should help create employment opportunities for its members. This does not mean the body needs to hire people, but rather that it needs to promote the field and make it marketable to potential employers, so that EEs gain opportunities for employment.

Conclusion
The reflections from the two EEs in two different sectors illustrate that the challenges they face are mainly capacity development issues, which call for the continuous training and mentorship of EEs. Both raised solutions that call for SAMEA to drive the facilitation of relevant and up-to-date capacity development workshops, promote EE opportunities and platforms, and develop evaluator competencies. As expected, these suggestions are aligned with the literature reviewed on the development of EE programmes for this paper.
Drawing on the SAMEA Emerging Evaluators concept note 2019, the literature reviewed for this paper and, above all, the reflections from the two EEs, this paper concludes that it is important for VOPEs to develop EE programmes whilst taking the following into consideration:
• The different career pathways of evaluators call for different capacity development approaches.
• Emerging evaluators in different sectors might have varying needs.
• It is important for VOPEs to facilitate training and to advocate for higher education institutions offering M&E programmes to include MERL Tech and the use of ICT in data collection and analysis.
• It is important to develop EE programmes and strategies that are contextually relevant to their communities and to the evaluation discipline.
• Based on the lessons learnt from this paper, it is important to include EE voices in VOPE-organised events, including conferences, workshops, seminars and publications.
It is therefore clear that the development of contextually relevant EE programmes by VOPEs is an urgent matter. Furthermore, the establishment of EE platforms is critical not only for each EE's development but also for the sustainability of the evaluation profession.