About the Author(s)


Nozipho T. Ngwabi
Centre for Research on Evaluation, Science and Technology (CREST), Faculty of Arts and Social Sciences, Stellenbosch University, Stellenbosch, South Africa

Obakeng G. Mpyana
Department of Planning, Monitoring and Evaluation, CD: Evaluation, The Presidency, Pretoria, South Africa

Amkelwa Mapatwana
Department of Monitoring and Evaluation, JET Education Services, Johannesburg, South Africa

Citation


Ngwabi, N.T., Mpyana, O.G. & Mapatwana, A., 2020, ‘Reflections from emerging evaluators in shaping Voluntary Organizations for Professional Evaluation capacity-building initiatives’, African Evaluation Journal 8(1), a509. https://doi.org/10.4102/aej.v8i1.509

Note: Special Collection: SAMEA 7th Biennial Conference 2019.

Original Research

Reflections from emerging evaluators in shaping Voluntary Organizations for Professional Evaluation capacity-building initiatives

Nozipho T. Ngwabi, Obakeng G. Mpyana, Amkelwa Mapatwana

Received: 03 Aug. 2020; Accepted: 13 Aug. 2020; Published: 05 Nov. 2020

Copyright: © 2020. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Voluntary Organizations for Professional Evaluation (VOPEs) are increasingly realising the importance of ‘mainstreaming’ emerging evaluators (EEs) in capacity-building initiatives for sustaining the evaluation profession. This article addresses the importance and role of VOPEs in developing EEs. It describes the key global issues shaping VOPEs’ interventions for EEs, the South African Monitoring and Evaluation Association’s Emerging Evaluators Programme and reflections by two EEs from different sectors on the future of monitoring and evaluation (M&E) in South Africa. The EEs’ reflections reveal both differences and similarities across their sectors. Recommendations are proposed for developing contextually relevant EE programmes.

This article is valuable for all VOPEs and stakeholders seeking to support emerging evaluators.

Keywords: Emerging Evaluators; Voluntary Organizations for Professional Evaluation (VOPEs); Capacity building; South African Monitoring and Evaluation Association; Monitoring and evaluation.

Voluntary Organizations for Professional Evaluation (VOPEs) are increasingly realising the importance of ‘mainstreaming’ emerging evaluators (EEs) in capacity-building initiatives for sustaining the evaluation profession. The Young and Emerging Evaluators initiative of EvalPartners, introduced by the International Organisation for Cooperation in Evaluation (IOCE) and the United Nations, has been taken up and supported by regional and national VOPEs (EvalPartners n.d.). The African Evaluation Association (AfrEA) recently initiated a regional network of EEs, and a number of national VOPEs in Africa are networking with and facilitating EEs’ activities (AfrEA 2019). To support EEs, the South African Monitoring and Evaluation Association (SAMEA) used the EvalYouth concept note as a foundation to define South African ‘EEs’ and to develop a SAMEA concept note contextualised to South Africa’s political and socio-economic background (EvalYouth 2016; SAMEA 2019). Furthermore, SAMEA is currently finalising a competencies framework that will guide its capacity development initiatives and possible career pathways in evaluation. This article outlines the key global issues shaping VOPEs’ interventions for EEs, SAMEA’s Emerging Evaluators Programme and reflections by two EEs on the future of monitoring and evaluation (M&E) in South Africa, and concludes with lessons for other VOPEs and stakeholders seeking to support EEs.

Global key issues shaping Voluntary Organizations for Professional Evaluation interventions for emerging evaluators

A review of the literature highlights four key issues shaping VOPEs’ interventions for EEs: the different career pathways into evaluation, the multidisciplinary nature of M&E, M&E as an evolving discipline and the demand for locally relevant skills. Each is discussed below:

  1. The consideration of different career pathways: Behind the scenes, M&E activities are supported by practitioners with varying levels of experience, capacity and backgrounds, depending on the career pathways they took. Although various VOPEs, scholars, institutions and donors have defined M&E pathways according to varying contexts, each evaluator is likely to have taken one of the three pathways defined by Stevahn et al. (2005): the ‘new evaluator’, the ‘accidental evaluator’ and the ‘professional in transition’. The ‘new evaluator’ has just joined the evaluation field after completing an academic qualification in M&E, has no prior work experience in evaluation and needs relevant work experience. The ‘accidental evaluator’ enters the profession by virtue of being assigned evaluation tasks in the organisation and needs formal training. The ‘professional in transition’ makes a conscious decision to leave their current profession to become an evaluator and needs to navigate an unfamiliar discipline. Common to all three pathways is the need for some form of capacity building, including training, exposure, mentoring and continuous professional development.

  2. The multidisciplinary nature of M&E: M&E practitioners require ‘extensive and deliberate socialization’ into the field (Bertrand Jones 2014) and need to grasp its interdisciplinary aspects.

  3. M&E as an evolving discipline: Methods and approaches have become more complex owing to big data and rapid technological development, growing connectivity and the era of the Sustainable Development Goals (SDGs), which is accompanied by broad targets and demanding policy agendas (Picciotto 2019; Schwandt et al. 2016).

  4. The demand for locally relevant skills: The changed operating context and new policy directions are leading to a demand for locally relevant skills. Picciotto (2019) argues that these changes will require evaluators to master social networking, crowd-sourced learning and big data analysis.

These four key issues mean that evaluators need to rethink how they support EEs and, because the discipline is evolving, that they need to be equipped to remain relevant. For EEs to operate competently within the M&E profession, they require support in various areas to build these competencies, which necessitates continuous capacity development activities for EEs.

South African Monitoring and Evaluation Association Emerging Evaluators Programme

South African Monitoring and Evaluation Association’s contribution to EE capacity building has been the initiation of the Emerging Evaluators programme. The programme is guided by an EE concept note developed by the association in 2019. It is facilitated as one of SAMEA’s portfolios and acts as a platform where EEs voluntarily participate, share information and gain exposure, access and referrals to other evaluation platforms. Beyond the consideration of a transformative agenda based on South Africa’s history, the strategies in the concept note were based on the following: the 2018 Twende Mbele study, Diagnostic on the supply and demand of evaluators in South Africa (Phillips 2018); the Zenex Foundation’s 2018 study, Monitoring and evaluation capacity: A landscape analysis (Zenex Foundation 2018); the SAMEA Emerging Evaluators Topical Interest Group (TIG) discussions; SAMEA Yahoo listserv discussions on emerging evaluators; and SAMEA member surveys conducted in 2017 and 2019.

The five components of the strategies for the SAMEA concept note are as follows:

  • Identifying companies and institutions interested in hosting M&E interns and maintaining an online database of these companies.
  • Maintaining and communicating an updated list of relevant training and courses on the SAMEA website.
  • Increasing membership and participation in EE platforms.
  • Facilitating a mentorship/internship programme.
  • Facilitating EE networks.

As part of implementing the concept note strategies, SAMEA biennially sponsors more than 20 EEs to attend and participate in its conference and confers two awards on the most promising EEs. These awards celebrate evaluators who, in the earliest stages of their careers, have made significant contributions to the field or practice of evaluation and have demonstrated quality and effectiveness in their work. During the SAMEA 2019 conference, the ‘M&E skills in a changing world’ strand had a session on upskilling young and emerging evaluators. The session presentations and discussions covered the experiences of EEs, the development of mentorship opportunities for EEs, defining who should be a mentor and the skills EEs need. Beyond the conference sessions, SAMEA gave two EEs an opportunity to share in writing their reflections, experiences and opinions on M&E in a changing world, and on how SAMEA as the national VOPE can contribute to the development of EEs.

Obakeng is a public sector EE who reflects on the future of M&E by discussing his evaluation experience and the key developments that are shaping, and will continue to shape, the practice of evaluation in the public sector. His discussion further considers the skills the public sector should develop in EEs. Amkelwa is a non-governmental organisation (NGO) sector EE who reflects on the future of M&E by discussing her evaluation experience and her predictions for evaluation in the next few years. Both EEs conclude by highlighting the role SAMEA needs to take in supporting EEs.

Reflections of an emerging evaluator in the public sector
Contextualising the reflections

These reflections are drawn from the experience of the author, Obakeng, as an EE within the South African public sector. For the past 5 years, I have worked as an evaluation officer at the national Department of Planning, Monitoring and Evaluation, a centre-of-government department that is the custodian of the South African National Evaluation System. My experience includes providing project management support to government-commissioned evaluations, contributing to the production of evaluation communication tools and participating in capacity-building initiatives such as attending workshops, participating in evaluation design clinics, working closely with various evaluation specialists and being part of the revision of the National Evaluation Policy Framework (NEPF) that will shape the future of evaluations in the public sector. The reflections are guided by four perceived key developments that will shape the practice of evaluation in the public sector over the next 5 years: the NEPF revision process, the adoption of a hybrid model for undertaking evaluations, constraints on government budgets and the need for rapid evaluations.

My context: South African public sector evaluation in the next 5 years

The practice of evaluation in the South African public sector can be said to have been formalised by the establishment of the Department of Planning, Monitoring and Evaluation (DPME) in 2010 and the cabinet approval of the NEPF in 2011, which paved the way for the implementation of a National Evaluation System (NES). The NEPF was used as a tool to formalise and institutionalise evaluation across the public sector by providing guidance on the practice of evaluation and building state capacity in the field. An evaluation of the NES prompted a revision of the NEPF and of how the NES would be implemented going forward, which led to the development of the NEPF 2019–2024. The NEPF 2019–2024 provides guidance on, and shapes, the practice of public sector evaluation over the next 5 years. The main changes to the NEPF include the integration of state-owned entities into the NES; ensuring that, when undertaking evaluation projects, the policy takes into account gender equality and women’s empowerment priorities, the development needs of youth and the concerns of persons with disabilities and other vulnerable groups in society; and developing an all-encompassing evaluation capacity development approach that aims to empower the state to implement evaluations effectively (NEPF 2019). Furthermore, the revision process has indicated the need for a hybrid model of conducting evaluations, zooming into the district model and a co-production model for outsourced evaluations.

Based on the revised NEPF 2019–2024, evaluations in the public sector over the next 5 years will include a hybrid model of conducting evaluations. The NEPF revision process has opened a discussion on the need to adopt a hybrid model for undertaking evaluations in the public sector, to strike a balance between internally and externally undertaken evaluations. The idea is to adopt some form of co-production model in which some of the evaluation deliverables are undertaken internally, as opposed to outsourcing every component of the evaluation. The DPME has also developed a rapid evaluation guideline to be piloted and used across government. Rapid evaluations will require internal government staff to play an active role. This will mean a shift in the role of the evaluation unit within the DPME and other M&E units across government, from managing evaluations to carrying them out. At the same time, government is facing increasing fiscal constraints. The government budget for evaluation has shown a declining pattern; for example, the budget for ‘consultants: business & advisory services’ declined by approximately R 4 709 000 between the financial years 2015/2016 and 2019/2020.1 This will reduce the resources available for commissioning evaluations. It presents both a challenge and an opportunity: a challenge in that it is likely to affect the wider evaluation ecosystem as government issues fewer evaluation tenders, but an opportunity for internal government staff to strengthen their evaluator capabilities as they carry out more evaluative assignments.

Strengthening skills for emerging evaluators in the public sector

Currently, EEs in the public sector are exposed mainly to the management of evaluation assignments. Capacity-building initiatives include an internship programme for new entrants to the evaluation field, which requires contracted service providers to work closely with EEs/officials from previously disadvantaged groups, and the introduction of a series of short courses and workshops in collaboration with the National School of Government. Despite these initiatives, EEs still need technical evaluation skills so that they can undertake evaluations rather than focus entirely on managing them. This could be achieved through a targeted experiential workplace immersion approach in which EEs learn by doing actual technical evaluation work. The technical skills that need strengthening include designing and conceptualising evaluations, developing data collection instruments, data collection, data analysis and report writing.

How can the South African Monitoring and Evaluation Association, as a Voluntary Organization for Professional Evaluation, continue to support emerging evaluators?

South African Monitoring and Evaluation Association can continue supporting EEs by providing meaningful mentorship that allows EEs to learn by doing and to further develop specific evaluation competencies. Other ways to support EEs are to provide capacity building in the form of workshops and to provide a virtual platform for EEs to interact, share ideas and learn from one another. Recognising and rewarding EEs’ contributions to the sector is also important. Receiving the SAMEA Emerging Evaluators Award was my highlight of the conference. Furthermore, my presentation was well received and triggered some interesting discussions with key individuals in M&E practice. This was an amazing experience, as the two events symbolised growth in my career and the beginning of a brighter future for me in the evaluation space.

What I have learned

Being an EE in this dynamic world requires one to adapt quickly to change, think on one’s feet and be innovative and creative in the practice of M&E, because the world is changing at a fast pace. What works today might not work tomorrow. Emerging evaluators should also be prepared to work under serious resource constraints and in politically challenging environments. To grow in the field of evaluation, one needs to be involved in the technical work of evaluations, conducting evaluations and experimenting with different evaluation methods and approaches. Emerging evaluators should be pioneers of change in communities.

Reflections of an emerging evaluator in the NGO sector
Contextualising the reflections

The reflections that follow are drawn from the experience of the author, Amkelwa, as an EE within the South African NGO sector. I have been an M&E intern at JET Education Services (JET) since 2019 and have mainly been involved in the evaluation of education interventions. JET is a research NGO that works with stakeholders to improve the quality of education and the relationship between education and skills development. Our services include providing M&E expertise to clients, including government, non-governmental organisations, donors, foundations, corporates and multilateral institutions. My reflections on M&E in the next 10 years are based on my one and a half years as an EE at JET and are guided by my experience as an external evaluator, which leads me to predict the integration of M&E professionals into policy formulation and programme planning cycles.

Integration of M&E professionals into policy formulation and programme planning cycles

Evaluations in Africa are largely commissioned by government departments, international donors or development agencies (Bamberger 2000; Mouton et al. 2014; Tirivanhu et al. 2018). External evaluators are thus currently commissioned mostly to measure the progress of funded interventions, with no prior involvement in the planning or implementation of the programmes.

In future, M&E practitioners will promote the use of evaluation to inform policy and programme development through involvement in the planning stage of an intervention. My impression is that programme funders in the education sector are already shifting towards holistic interventions by engaging evaluators at the planning stage. This has helped to develop structured programme implementation that speaks to the intended outcomes. In the near future, evaluations will focus on providing policy makers and programme implementers with information that is relevant to their context and environment. The use of evaluation to inform policymaking and programme development can create a positive interaction that helps to achieve sustainable development: the focus falls not only on whether the intended intervention resulted in the desired outcomes but also on how it could be better implemented in the context of the community. By providing credible information based on robust evaluation findings, evaluation can play a role in influencing policy makers and programme implementers. Awareness of this role could help M&E practitioners to secure better budgets for their work and to tell their stories more effectively.

Rapid technological developments

Another likely change is the adoption of MERL Tech (monitoring, evaluation, research and learning technology), that is, the use of information and communication technologies (ICT) by evaluators. M&E practitioners need to use technology to collapse the constraints of time and space: M&E findings need to be made instantly available to inform decision-making processes. Currently, M&E technology systems in South Africa are less advanced than those in developed nations, and hence there is little use of technology in the field. South Africa’s growth in this area is constrained by the fact that many individuals tasked with M&E duties lack the necessary skills and training in the use of ICT to collect and analyse data. There is, however, a need for a variety of systems that communities can easily use to collect data, because the use of technology in data collection and analysis is already making an impact in the M&E space. This will promote public participation in evaluating developmental programmes in communities. Furthermore, storing data in the cloud would allow evaluators to share and access data in a centralised location. This would help evaluators to speak to each other’s findings and would allow policy makers to make decisions informed by intersectoral findings. Data visualisation is also becoming important in simplifying information and making it accessible to the public and to policy makers who do not want to read long reports. Such information is user friendly and can be used as evidence in discussions taking place on social media.

Usefulness of conference discussions

Two things were of significance to me at the conference: (1) the role of technology in M&E and (2) the language used by practitioners in the field. M&E helps decision makers make informed decisions based on real evidence, and the collection and analysis of data give legitimacy to the recommendations made by evaluators. As mentioned earlier, data collection and analysis can be time consuming, especially where one needs to collect a large amount of data; the conference’s focus on technology in M&E was therefore relevant. The conference showcased examples of technology use that promoted contextually relevant and timeous interventions. The discussions also highlighted remaining concerns about technology and digital data in evaluations, which SAMEA needs to explore further. These included issues of privacy and confidentiality; data costs and poor network coverage in the country; the cost of some of the technology, particularly for small organisations; and the risk of loss of property because of crime during data collection.

Other discussions I found poignant focused on decolonising M&E. The language used in M&E is based on western standards. The tools used to collect data measure success in terms of western standards and are not interpreted for our African context. There is a need to look at instrument development for data collection. Mixed data collection methods should be used to promote public participation in the evaluation, so that the concerns of all stakeholders and beneficiaries are taken into consideration to hold the funder accountable. However, the relevance of the tools to the cultural context in which they will be used needs to be taken into account. For example, in western culture, having children’s books at home symbolises the parents’ interest in reading for their children; in African culture, telling fairy tales in the evening is the equivalent of parents teaching their children. Oral education needs to be valued equally with book reading. Tools should not only focus on discovering whether the intended results were met but must also promote organisational accountability by ensuring that the research is contextually relevant to the community and its needs whilst also ensuring that post-research feedback is provided.

How can the South African Monitoring and Evaluation Association continue to support emerging evaluators?

The South African Monitoring and Evaluation Association is not a regulatory body, which means membership is voluntary. The role of a voluntary organisation that seeks to professionalise the field is to promote professional development for its members (IOCE n.d.). The association continues to promote the credibility of the profession: affiliation to the body gives EEs credibility for their work and a voice to present their findings. One way of promoting professional development is to advocate for a better curriculum at universities that offer M&E studies. The association should help to ensure that the curriculum is relevant to the changing face of M&E, that M&E responds to the need for contextual relevance on the African continent and that graduates are produced to a consistent standard. Although there is no body to regulate the profession, universities need to produce the standard of graduates required in the field, and the role of SAMEA should be to advocate for a curriculum that responds to those requirements. Beyond university training, the capacity development workshops that SAMEA hosts need to continue to build relevant skills for EEs, giving them the skills they need in their daily work, such as using different types of software for data collection or visualisation.

The South African Monitoring and Evaluation Association also needs to do more to encourage EEs to publish in the field, for example by hosting writing workshops or retreats. Such research outputs will assist with the professionalisation of the field, which requires research aimed at developing the discipline on the continent. The association should also promote collaboration with research institutes to give EEs a space to interrogate the practice of M&E in the African context and to deconstruct and decolonise the field to suit the communities they work in. Finally, SAMEA should help create employment opportunities for its members. This does not mean the body needs to hire people, but that it needs to promote the field and make it marketable to potential employers. By so doing, EEs will gain opportunities for employment.

Conclusion

The reflections from the two EEs in two different sectors illustrate that the challenges they face are mainly capacity development issues, which call for the continuous training and mentorship of EEs. Both raised solutions that call on SAMEA to drive the facilitation of relevant and up-to-date capacity development workshops; promote EE opportunities and platforms; and develop evaluator competencies. As expected, these suggestions are aligned with the literature on the development of EE programmes reviewed for this paper.

Drawing on the 2019 SAMEA Emerging Evaluators concept note, the literature reviewed for this paper and, above all, the reflections from the two EEs, this paper concludes that it is important for VOPEs to develop EE programmes whilst taking the following into consideration:

  • The different career pathways of evaluators call for different capacity development approaches.
  • Emerging evaluators in different sectors might have varying needs.
  • VOPEs should facilitate training and advocate for higher education institutions offering M&E programmes to include MERL Tech and the use of ICT in data collection and analysis.
  • EE programmes and strategies should be contextually relevant to the communities they serve and to the evaluation discipline.
  • EE voices should be included in VOPE-organised events, including conferences, workshops, seminars and publications.

It is therefore clear that the development of contextually relevant EE programmes by VOPEs is an urgent matter. Furthermore, the establishment of EE platforms is critical not only for each EE’s development but also for the sustainability of the evaluation profession.

Acknowledgements

The authors acknowledge the South African Monitoring and Evaluation Association (SAMEA) for the content from their research, networks and strategies used in their Emerging Evaluators portfolio.

Competing interests

The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this research article.

Authors’ contributions

N.T.N., O.G.M. and A.M. contributed equally to this research article.

Ethical considerations

This article followed all ethical standards for research without direct contact with human or animal subjects.

Funding information

This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

Data availability

Data sharing is not applicable to this article as no new data were created or analysed in this study.

Disclaimer

The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.

References

African Evaluation Association (AfrEA), 2019, ‘African evaluation guidelines: Standards and norms’, presented at the African Evaluation Association Conference, Abidjan, Côte d’Ivoire, March 2019.

Bamberger, M., 2000, ‘The evaluation of international development programs: A view from the front’, American Journal of Evaluation 21(1), 95–102.

Bertrand Jones, T., 2014, ‘Socializing emerging evaluators: The use of mentoring to develop evaluation competence,’ New Directions for Evaluation 2014(143), 83–96.

Department of Planning, Monitoring and Evaluation, 2019, National Evaluation Policy Framework, viewed 20 February 2020, from https://www.dpme.gov.za/keyfocusareas/evaluationsSite/Evaluations/National%20Policy%20framework%20Nov%202019.pdf.

EvalPartners, n.d., Home, viewed 06 July 2020, from https://www.evalpartners.org/.

EvalYouth, 2016, ‘EvalYouth concept note’, in EvalPartners, viewed 06 July 2020, from https://www.evalpartners.org/sites/default/files/documents/evalyouth/EvalYouth%20Concept%20Note%20-%20July%202016.pdf.

IOCE, n.d., Mechanisms for professionalization, viewed 29 July 2020, from https://vopetoolkit.ioce.net/en/section/53-mechanisms-professionalization.

Mouton, C., Rabie, B., Cloete, F. & De Coning, C., 2014, ‘Historical development & practice of evaluation’, in F. Cloete, B. Rabie & C. De Coning (eds.), Evaluation Management in South Africa and Africa, pp. 28–78, Sun Press, Stellenbosch.

Phillips, S., 2018, ‘Twende Mbele diagnostic on the supply and demand of evaluators in South Africa’, viewed 09 January 2020, from https://www.dpme.gov.za/keyfocusareas/gwmeSite/GovermentWide%20M%20and%20E/Diagnostic%20report%20Twende%20Mbele%20demand%20and%20supply%20of%20evaluation%2021%20January.pdf.

Picciotto, R., 2019, ‘Is evaluation obsolete in a post-truth world?’, Evaluation and Program Planning 73, 88–96.

Schwandt, T., Ofir, Z., Lucks, D., El-Saddick, K. & D’Errico, S., 2016, ‘Evaluation: A crucial ingredient for SDG success’, in IIED Briefing Papers, IIED, n.l.

South African Monitoring and Evaluation Association (SAMEA), 2019, SAMEA Emerging Evaluators Programme Concept note, viewed 06 February 2020, from https://www.samea.org.za/summernotefile/dump?summernotefile_id=124.

Stevahn, L., King, J.A., Ghere, G. & Minnema, J., 2005, ‘Establishing essential competencies for program evaluators’, American Journal of Evaluation 26(1), 43–59.

Tirivanhu, P., Robertson, H., Waller, C. & Chirau, T., 2018, ‘Assessing evaluation education in African tertiary education institutions: Opportunities and reflections’, South African Journal of Higher Education 32(4), 229–244.

Zenex Foundation, 2018, ‘Monitoring and evaluation capacity: A landscape analysis’, viewed 09 January 2020, from https://www.zenexfoundation.org.za/images/Final_Report_on_Landscape_Study_29_August_2018_LD.pdf.

Footnote

1. The figures are based on an analysis of the DPME budget for evaluation for the financial years 2015/2016 to 2019/2020.


 
