Voluntary Organizations for Professional Evaluation (VOPEs) are increasingly realising the importance of ‘mainstreaming’ emerging evaluators (EEs) in capacity-building initiatives to sustain the evaluation profession. This article addresses the importance and role of VOPEs in developing EEs. It describes the global key issues shaping VOPEs’ interventions for EEs, the South African Monitoring and Evaluation Association’s (SAMEA) Emerging Evaluators Programme and reflections by two EEs from different sectors on the future of Monitoring and Evaluation (M&E) in South Africa. The EEs’ reflections show the differences and similarities across their sectors. Recommendations are proposed on developing EE programmes that are contextually relevant.
This article is valuable for all VOPEs and stakeholders who intend to support emerging evaluators.
The Young and Emerging Evaluators initiative, introduced under EvalPartners by the International Organisation for Cooperation in Evaluation (IOCE) and the United Nations, has been taken up and supported by regional and national VOPEs (EvalPartners
A review of the literature highlighted that the key issues shaping VOPEs’ interventions for EEs are as follows: consideration of the different career pathways, the multidisciplinary nature of M&E, M&E as an evolving discipline and the demand for locally relevant skills. These issues are explained as follows:
The changed operating context and the new policy directions are further leading to a demand for
Taken together, these four key issues mean that evaluators need to rethink the way they support EEs and, because the discipline is evolving, be equipped to remain relevant. For EEs to operate competently within the M&E profession, they require support in various areas to build these competencies. This necessitates the facilitation of continuous capacity development activities for EEs.
The South African Monitoring and Evaluation Association’s (SAMEA) contribution to EE capacity building has been the initiation of the EEs Programme. The programme is guided by an EEs concept note developed by the association in 2019. It is facilitated as one of SAMEA’s portfolios and acts as a platform where EEs voluntarily participate, share information and gain exposure, access and referrals to other evaluation platforms. Beyond the consideration of a transformative agenda based on South Africa’s history, the strategies in the concept note were based on the following: the 2018 Twende Mbele study on
The five components of the strategies for the SAMEA concept note are as follows:
Identifying companies and institutions interested in hosting M&E interns and maintaining an online database of these companies.
Maintaining and communicating an updated list of relevant trainings and courses on the SAMEA website.
Increasing membership and participation in EE platforms.
Facilitating a mentorship/internship programme.
Facilitating EE networks.
As part of implementing the concept note strategies, SAMEA biennially sponsors more than 20 EE scholarships to attend and participate in the conference and confers two awards on the most promising EEs. These awards celebrate evaluators who, in the earliest stages of their careers, have made significant contributions to the field or practice of evaluation and demonstrated quality and effectiveness in their work. During the SAMEA 2019 conference, the ‘M&E skills in a changing world’ strand had a session on
Obakeng is a public sector EE who reflects on the future of M&E by discussing his evaluation experience and the key developments that are currently shaping, and will continue to shape, the practice of evaluation in the public sector. His discussion further looks at the skills in which the public sector would need to invest for EEs. Amkelwa is a Non-Governmental Organisation (NGO) sector EE who reflects on the future of M&E by discussing her evaluation experience and her predictions for evaluation in the next few years. Both EEs conclude by highlighting the role SAMEA needs to take in supporting EEs.
These reflections are drawn from the experience of the author, Obakeng, as an EE within the South African public sector. For the past 5 years, I have worked as an evaluation officer at the National Department of Planning, Monitoring and Evaluation, a centre-of-government department that is the custodian of the South African National Evaluation System. My experience includes providing project management support to government-commissioned evaluations, contributing to the production of evaluation communication tools and participating in capacity-building initiatives such as attending workshops, taking part in evaluation design clinics, working closely with various evaluation specialists and being part of the revision of the National Evaluation Policy Framework (NEPF) that will shape the future of evaluations in the public sector. The reflections are guided by four perceived key developments that will shape the practice of evaluation in the public sector in the next 5 years. These include the
The practice of evaluation in the South African public sector was formalised by the establishment of the Department of Planning, Monitoring and Evaluation (DPME) in 2010 and the cabinet approval of the NEPF in 2011, which paved the way for the implementation of a National Evaluation System (NES). The NEPF was used as a tool to formalise and institutionalise evaluation across the public sector by providing guidance on the practice of evaluation and building state capacity in the field of evaluation. An evaluation of the NES prompted a revision of the NEPF, and of how the NES would be implemented going forward, leading to the NEPF 2019–2024. The NEPF 2019–2024 provides guidance on, and shapes, the practice of public sector evaluation over the next 5 years. The main changes to the NEPF include the integration of state-owned entities into the NES; ensuring that evaluation projects take into account gender equality and women’s empowerment priorities, the development needs of youth, and the concerns of persons with disabilities and other vulnerable groups in society; and developing an all-encompassing evaluation capacity development approach that aims to empower the state to implement evaluations effectively (NEPF 2019). Furthermore, the NEPF revision process has indicated the need for a hybrid model of conducting evaluations, a closer focus on the district model and a co-production model for outsourced evaluations.
Based on the revised NEPF 2019–2024, evaluations in the public sector over the next 5 years will include a hybrid model of conducting evaluations. The NEPF revision process has opened a discussion on the need to adopt a hybrid model for undertaking evaluations in the public sector, to strike a balance between internally and externally undertaken evaluations. The idea is to adopt some form of co-production model in which some of the evaluation deliverables are undertaken internally, as opposed to outsourcing every component of the evaluation. The DPME has also developed a rapid evaluation guideline to be piloted and used across government. Rapid evaluations will require internal government staff to play an active role. This will mean a shift in the role of the evaluation unit within DPME and other M&E units across government, from managing evaluations to carrying them out. At the same time, government is facing increasing fiscal constraints. The government budget for evaluation has shown a declining trend; for example, the budget for ‘consultants: business & advisory services’ declined by approximately R 4 709 000 between the 2015/2016 and 2019/2020 financial years.
Currently, EEs in the public sector are exposed mainly to the management of evaluation assignments. Capacity-building initiatives include an internship programme for new entrants to the evaluation field, a requirement that contracted service providers work closely with EEs/officials from previously disadvantaged groups, and the introduction of a series of short courses and workshops in collaboration with the National School of Government. Despite the capacity-building initiatives put in place, EEs still need
South African Monitoring and Evaluation Association can continue supporting EEs by providing meaningful mentorship that allows EEs to learn by doing and develop specific competencies in evaluation. Other ways to support EEs are to provide capacity building in the form of workshops, as well as a platform (virtually) for emerging evaluators to interact, share ideas and learn from one another. Recognising and rewarding EEs’ contributions to the sector is also important. Receiving the SAMEA Emerging Evaluators Award was the highlight of the conference for me. Furthermore, my presentation was well received and triggered some interesting discussions with key individuals in the M&E practice. This was an amazing experience, as the two events symbolised growth in my career and the beginning of a brighter future for me in the evaluation space.
Being an EE in this dynamic world requires one to adapt quickly to change, think on one’s feet and be innovative and creative in the practice of M&E, because the world is changing at a fast pace. What works today might not work tomorrow. Emerging evaluators should also be prepared to work under serious resource constraints and in politically challenging environments. To grow in the field of evaluation, one needs to be involved in the technical work of evaluations, conducting evaluations and experimenting with different evaluation methods and approaches. Emerging evaluators should be pioneers of change in communities.
The reflections that follow are drawn from the experience of the author, Amkelwa, as an EE within the South African NGO sector. I have been an M&E intern at JET Education Services (JET) since 2019 and have been mainly involved in the evaluation of education interventions. JET is a research NGO that works with stakeholders to improve the quality of education and the relationship between education and skills development. Our services include providing M&E expertise to clients, including government, non-governmental organisations, donors, foundations, corporates and multilateral institutions. These reflections on M&E in the next 10 years are based on my one and a half years as an EE at JET. They are guided by my experience as an external evaluator, which points to the integration of M&E professionals into policy formulation and programme planning cycles.
Most evaluations in Africa are largely commissioned by government departments, international donors or development agencies (Bamberger
In the future, M&E practitioners will promote the use of evaluation to inform policy and programme development through involvement in the planning stage of an intervention. My impression is that programme funders in the education sector are shifting towards holistic interventions by engaging evaluators at the planning stage. This has helped to develop structured programme implementation that speaks to the intended outcomes. In the near future, evaluations will focus on providing policy and programme implementers with information that is relevant to their context and environment. The use of evaluation to inform policymaking and programme development can create a positive interaction that will help to achieve sustainable development. This will help focus not only on whether the intended intervention resulted in the desired outcomes but also on how the intervention could be better implemented in the context of the community. By providing credible information based on robust evaluation findings, evaluation can play a role in influencing policy makers and programme implementers. Awareness of this role could assist M&E practitioners in securing better budgets for their work and telling their stories in a better way.
Another likely change is the use of MERL Tech (Monitoring, Evaluation, Research and Learning technology), that is, the use of information and communication technologies (ICT) by evaluators. M&E practitioners need to use technology to collapse the barriers of time and space: M&E findings need to be made instantly available to inform decision-making processes. Currently, M&E technology systems in South Africa are less advanced than in developed nations, and hence there is little use of technology in the field. South Africa’s growth in this area is constrained by the fact that many individuals tasked with M&E duties lack the necessary skills and training in the use of ICT to collect and analyse data. There is, however, a need for a variety of systems that can be easily used by communities to collect data, because the use of technology in data collection and analysis is already making an impact in the M&E space. This will promote public participation in evaluating developmental programmes in communities. Furthermore, storage of data in the cloud would allow evaluators to share and access data in a centralised location. This would help evaluators to speak to each other’s findings and would allow policy makers to make decisions informed by intersectoral findings. The use of data visualisation is also becoming important in simplifying information and making it accessible for public consumption and for policy makers who do not want to read long reports. Such information is user friendly and can be used as evidence in discussions taking place on social media.
Two things were of significance for me at the conference: (1) the role of technology in M&E and (2) the language used by practitioners in the field. M&E helps decision makers to make informed decisions based on real evidence. Collection and analysis of data give legitimacy to the recommendations made by evaluators. As mentioned earlier, data collection and analysis can be time consuming, especially where one needs to collect a large amount of data; therefore, the conference’s focus on technology in M&E was relevant. The conference showcased examples of the use of technology that promoted contextually relevant and timeous intervention. The discussions also highlighted remaining concerns about technology and digital data in evaluations, which SAMEA needs to explore further. These included issues of privacy and confidentiality; data costs and poor network coverage in the country; the cost of some of the technology, particularly for small organisations; and the risk of loss of property because of crime during data collection.
Other poignant discussions focused on decolonising M&E. The language used in M&E is based on western standards. The tools used to collect data measure success in terms of western standards and are not interpreted for our African context. There is a need to revisit instrument development for data collection. Mixed data collection methods should be used to promote public participation in evaluation, so that the concerns of all stakeholders and beneficiaries are taken into consideration and the funder is held accountable. However, the relevance of the tools to the cultural context in which they will be used needs to be taken into account. For example, in western culture, having children’s books at home symbolises the parents’ interest in reading to their children; in African culture, telling fairy tale stories in the evening is the equivalent of parents teaching their children. Oral education needs to be valued equally with reading books. Tools should not only focus on discovering whether the intended results were met but must also promote organisational accountability by ensuring that the research is contextually relevant to the community and its needs, and that post-research feedback is provided.
South African Monitoring and Evaluation Association is not a regulatory body, which means membership is voluntary. The role of a voluntary organisation that seeks to professionalise the field is to promote professional development for its members (IOCE
South African Monitoring and Evaluation Association needs to do more to encourage EEs to publish in the field, for example by hosting writing workshops or retreats. The resulting research outputs will assist with the professionalisation of the field, which also requires research output towards developing the field on the continent. The association should also promote collaboration with research institutes to give EEs a space to interrogate the practice of M&E in the African context, and to deconstruct and decolonise the field to suit the contexts of the communities they work in. Finally, the association should help create employment opportunities for its members. This does not mean the body needs to hire people, but that it needs to promote the field and make it marketable to potential employers; by doing so, EEs will gain opportunities for employment.
The reflections from the two EEs in two different sectors illustrate that the challenges they face are mainly capacity development issues that call for the continuous training and mentorship of EEs. Both raised solutions calling for SAMEA to drive the facilitation of relevant and up-to-date capacity development workshops, promote EE opportunities and platforms, and develop evaluator competencies. As expected, these suggestions are aligned with the literature reviewed on the development of EE programmes for this paper.
From the SAMEA Emerging Evaluators concept note 2019, the literature reviewed for this paper and, above all, the reflections from the two EEs, this paper concludes that it is important for VOPEs to develop EE programmes whilst taking the following into consideration:
The different career pathways of evaluators, which call for different capacity development approaches.
Emerging evaluators in different sectors might have varying needs.
It is important for VOPEs to facilitate training, to advocate for higher education institutions to offer M&E programmes and to include MERL Tech, with the use of ICT, in data collection and analysis.
It is important to develop EE programmes and strategies that are contextually relevant to their communities and to the evaluation discipline.
Based on the lessons learnt from this paper, it is important to include EE voices in VOPE-organised events such as conferences, workshops, seminars and publications.
It is therefore clear that the development of contextually relevant EE programmes by VOPEs is an urgent matter. Furthermore, the establishment of EE platforms is critical not only for each EE’s development but also for the sustainability of the evaluation profession.
The authors acknowledge the South African Monitoring and Evaluation Association (SAMEA) for the content from their research, networks and strategies used in their Emerging Evaluators portfolio.
The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this research article.
N.T.N., O.G.M. and A.M. contributed equally to this research article.
This article followed all ethical standards for research without direct contact with human or animal subjects.
This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
Data sharing is not applicable to this article as no new data were created or analysed in this study.
The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.
The figures are based on an analysis of the DPME budget for evaluation for the financial years 2015/2016 to 2019/2020.