About the Author(s)


Lesedi S. Matlala
School of Public Management, Governance, and Public Policy, College of Business and Economics, University of Johannesburg, Johannesburg, South Africa

Citation


Matlala, L.S., 2024, ‘Empowering emerging evaluators in evaluation project phases: Perspectives and recommendations’, African Evaluation Journal 12(1), a780. https://doi.org/10.4102/aej.v12i1.780

Original Research

Empowering emerging evaluators in evaluation project phases: Perspectives and recommendations

Lesedi S. Matlala

Received: 12 Aug. 2024; Accepted: 30 Oct. 2024; Published: 04 Dec. 2024

Copyright: © 2024. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background: Engaging emerging evaluators (EEs) in evaluation projects offers substantial benefits for both evaluators and organisations; however, effective guidance on empowering them is often limited.

Objectives: This article provides actionable recommendations for empowering EEs across all phases of evaluation projects, drawing primarily on insights from EEs and their mentors. Leveraging the Eval4Action campaign’s standards for meaningful youth engagement, this article aims to enrich EEs’ learning experiences, foster their professional growth, and enhance evaluation quality and impact.

Method: A qualitative and participatory approach was used, incorporating the author’s mentorship experiences, discussions with EEs and mentors, and consultations with established programmes such as the South African Monitoring and Evaluation Association’s (SAMEA) EE initiative and the EvalYouth Global Network. A comprehensive literature review on youth engagement and partnerships further informs this study.

Results: Findings indicate that effective EE engagement relies on well-defined roles, open communication, and genuine appreciation of EEs’ contributions. Despite these needs, many EEs reported being limited to data collection roles, which restricts their opportunities for growth and limits their developmental impact within evaluations.

Conclusion: Adopting these recommendations can foster a culture of empowerment and inclusion, enhancing both the quality and effectiveness of evaluation projects.

Contribution: This study underscores the importance of engaging EEs in meaningful roles throughout projects, advancing inclusivity and effectiveness in evaluations. Empowering EEs leads to more comprehensive assessments, providing deeper insights for policy and programme evaluation.

Keywords: emerging evaluators; evaluation projects; empowerment; practical recommendations; engagement.

Introduction

In a world with a population of 8 billion, leveraging demographic changes is essential to advance the Sustainable Development Goals (SDGs) and address global crises such as climate change. Creating opportunities and spaces for the growing youth population in various spheres, including Monitoring and Evaluation (M&E), is critical to achieving an inclusive and sustainable world. The United Nations (UN) Secretary-General’s ‘Our Common Agenda’ emphasises the significance of intergenerational collaboration in tackling today’s complex issues (United Nations 2018). This collaboration is essential in the field of evaluation, where the involvement of young and emerging evaluators (EEs) can bring fresh perspectives and innovative approaches to longstanding challenges.

The landscape of M&E in South Africa and across Africa is undergoing significant transformation, driven by a rising demand for rigorous evaluation to inform evidence-based decision-making in both the public and private sectors. Despite the establishment of the National Evaluation System by the Department of Planning, Monitoring and Evaluation (DPME) in 2011 and the expansion of M&E training programmes in South African higher education, a substantial gap in evaluation capacity persists (DPME 2019; Phillips 2018). This gap is most evident in the shortage of skilled evaluators, whose supply has not kept pace with the increasing demand for high-quality evaluations (Zenex Foundation 2018).

Numerous organisations and initiatives in South Africa have started hosting EEs to address the capacity challenge in the evaluation sector. Among these, the South African Monitoring and Evaluation Association (SAMEA) initiated an EE programme. This initiative provides a platform for EEs to participate voluntarily, share information, gain exposure and access evaluation opportunities. The SAMEA EE programme, a key portfolio within the association, requires participants to be members to benefit from its core activities and future leadership opportunities. This structured approach bridges the capacity gap by nurturing a new generation of skilled evaluators (SAMEA 2019). The relevance of the SAMEA EE programme is underscored by recent studies, including the Twende Mbele initiative – a collaborative effort among the governments of South Africa, Uganda and Benin – and a study by the Zenex Foundation on the M&E capacity landscape. These studies highlighted the dynamics of evaluation demand and supply in South Africa and emphasised the necessity of strategies specifically designed to support EEs (Phillips 2018; Zenex Foundation 2018). Feedback from these studies, along with input from the SAMEA EE Technical Interest Group (TIG), SAMEA Yahoo listserv discussions, and surveys conducted by SAMEA in 2017 and 2019, pointed to the critical need for a structured strategy to develop and empower EEs in South Africa.

Despite these efforts, explicit guidance on effectively empowering and engaging EEs throughout the evaluation process is still lacking. While programmes such as the SAMEA EE initiative provide a foundational framework, there is an ongoing need for practical recommendations and strategies to maximise the potential of EEs. This gap in guidance is significant because, without a clear, actionable plan, the risk of underutilising EEs persists, impeding the development of their skills and the overall quality of evaluation projects.

Similar challenges in engaging and empowering emerging professionals can be observed across various fields and countries. For instance, in the healthcare sector, young professionals often struggle to gain meaningful roles beyond basic responsibilities. Studies have shown that young healthcare practitioners frequently experience limited opportunities for involvement in decision-making processes, which hinders their professional growth and the overall improvement of healthcare services (Catalano et al. 2019).

In adolescent health and medication safety, youth advisory boards have been implemented to involve young people in research planning and execution. However, even in these programmes, there are barriers to full participation and effective engagement, such as insufficient training and support, which can limit the contributions of young advisers (Abraham, Rosenberger & Poku 2023). This mirrors the experiences of EEs, who often find themselves restricted to data collection roles without the opportunity to engage in more substantive aspects of evaluation projects.

Globally, the problem of engaging young researchers and professionals is also evident. For example, in youth civic engagement programmes, youth councils sometimes inadvertently reinforce social inequalities by not adequately including diverse youth voices or providing sufficient mentorship and support (Martini et al. 2018). In mental health research, young people participating in co-production initiatives with researchers often face challenges such as tokenistic involvement and a lack of genuine collaboration, which limits their ability to contribute meaningfully (Greer et al. 2018; Wadman et al. 2019).

Addressing these issues requires an intentional focus on creating environments where young professionals and EEs can meaningfully contribute to all phases of projects, from planning and design to execution and reporting. Effective engagement and empowerment of EEs can lead to more comprehensive and insightful evaluations as diverse perspectives and fresh approaches are integrated into the evaluation process (Hawke et al. 2020). However, many EEs currently find themselves limited to data collection roles, with few opportunities for deeper involvement and professional growth (SAMEA 2019).

This article aims to provide practical recommendations for empowering EEs across all stages of evaluation projects, with a specific focus on insights gathered from EE mentors across diverse sectors, including government, non-governmental organisations (NGOs), private consulting firms and Voluntary Organisations for Professional Evaluation (VOPEs). By leveraging the standards for enhancing meaningful youth engagement in evaluation developed by the Eval4Action campaign, the article aims to establish a structured framework for guiding these recommendations. Drawing on mentorship experiences spanning various sectors, the article endeavours to enrich the professional development journey of EEs, facilitating their growth and competence in evaluation practice. Ultimately, this approach seeks to elevate evaluation projects’ overall effectiveness and impact, contributing significantly to capacity-building initiatives within the evaluation community in South Africa and beyond.

The remainder of the article is structured as follows. Firstly, the literature review section will explore existing research on the engagement and empowerment of emerging professionals, drawing parallels from other fields such as healthcare, youth civic engagement and mental health research. This section will provide a comprehensive context for understanding the challenges and opportunities in involving EEs in evaluation projects, grounded within a theoretical framework of capacity-building and participatory approaches in evaluation. Secondly, the methodology section will detail the research design, data collection methods, sampling approach and analysis techniques used in this study. This section will explain how mentors’ perspectives from the SAMEA EE programme were gathered and analysed to develop practical recommendations. The findings section will present the key themes and insights derived from the data, highlighting specific strategies and practices that have effectively engaged EEs. The discussion section will interpret these findings in the context of the broader literature and theoretical framework, identifying implications for both practice and theory. Finally, the conclusion will summarise the study’s main contributions, offering actionable recommendations for evaluators, organisations and stakeholders to enhance the inclusion and empowerment of EEs in evaluation projects. The article will close with suggestions for future research and potential pathways for further strengthening evaluation capacity in South Africa and beyond.

Literature review

The engagement of emerging practitioners, including interns and early-career professionals, in research and evaluation projects is a topic that has been explored across various disciplines and geographical contexts. Globally, there is a consensus on the value of involving emerging practitioners to enrich research outcomes and foster professional development. However, numerous studies have identified significant challenges and gaps in effectively integrating these individuals into research projects.

For instance, according to the findings of Hawke et al. (2020), young professionals in health research in the United States (US) often encounter significant limitations in their roles, being primarily confined to data collection tasks without opportunities for deeper involvement in research design or analysis. Their study, which involved a comprehensive survey and in-depth interviews with young researchers across various health research institutions, revealed that this restricted participation not only hampers the professional growth of these emerging professionals but also limits the potential for innovative contributions that diverse perspectives can bring to research projects. Hawke et al. (2020) concluded that such constraints indicate a broader trend whereby EEs are not fully integrated into the core phases of research, thus stifling their ability to influence project outcomes meaningfully. Consequently, this lack of engagement results in missed opportunities to leverage the unique insights and creativity that young professionals offer, ultimately affecting the overall quality and impact of the research.

Similarly, in mental health research, McCabe et al. (2023) discovered persistent barriers to meaningful involvement of young researchers despite concerted efforts to include them in substantive roles. Their study surveyed and conducted focus groups with young professionals in various mental health organisations, highlighting power dynamics and organisational culture issues that often led to tokenistic participation. Young researchers reported feeling undervalued and excluded from critical stages of the research process, such as the design and analysis phases, which significantly undermined their engagement and the overall quality of the research. These findings are consistent with reports from other sectors, including public health (Bower et al. 2023) and environmental science (Shamrova & Cummings 2017), where emerging practitioners similarly struggle to transcend peripheral roles and become integral members of research teams.

In the United Kingdom, Laurence (2021) explored the engagement of young people in mental health research, focussing specifically on organisations such as the National Health Service (NHS) and various mental health charities. This study identified several barriers to meaningful involvement, including a lack of mentorship and clear guidance. The study emphasised that the inclusion of young researchers often fails to translate into substantive engagement without intentional efforts to provide structured support and mentorship. This is particularly relevant in evaluation projects, where the complexity and scale of the work require clear roles and responsibilities to ensure the effective participation of all team members. The study further revealed that young researchers often felt their contributions were undervalued and that they were not given opportunities to develop the critical skills necessary for career advancement. Additionally, the research highlighted the importance of creating an inclusive environment that fosters open communication and provides young researchers with a platform to voice their ideas and concerns. These findings underscore the need for systemic organisational changes to facilitate genuine and impactful engagement of EEs (Laurence 2021; Hawke et al. 2021).

Beyond health research, emerging practitioners face similar challenges across various sectors, both globally and in Africa. In environmental science, for example, a study by Geza et al. (2020) in Namibia examined the involvement of young scientists in climate change research. The study found that young researchers often encountered obstacles related to inadequate funding and limited opportunities for collaboration with experienced researchers, which hindered their ability to contribute effectively and gain recognition for their work. Additionally, the study pointed out that traditional academic structures often marginalised young scientists, making it difficult for them to access necessary resources and support.

In the agricultural sector, a study conducted by Kadzamira and Kazembe (2015) focussed on the participation of young researchers in agricultural innovation projects in Kenya. They discovered that young researchers frequently lacked access to advanced training and professional networks, which impeded their ability to implement innovative solutions and gain valuable field experience. Their research highlighted that young agricultural researchers were often confined to routine tasks rather than involved in strategic planning and decision-making processes, stifling their professional growth and contributions.

Similarly, emerging researchers in Africa face significant barriers in the education sector. According to a study done by Khanaposhtani et al. (2022) on educational research in Ethiopia, young researchers often struggled with limited access to academic resources, such as research databases and journals, as well as insufficient mentorship from senior academics. The study emphasised that young researchers were frequently excluded from critical stages of the research process, including project design and data analysis, which limited their professional development and the impact of their contributions.

In South Africa, the engagement of emerging researchers is critical across various sectors because of the country’s growing demand for skilled professionals. The health sector, for instance, has witnessed significant efforts to incorporate emerging practitioners into research activities. A study by Mandoh et al. (2021) highlighted that emerging health researchers often lack adequate training and access to research resources, which restricts their active participation in significant phases of research projects. Similar issues are present in the agricultural research sector. Chipfupa and Tagwi (2021) noted that emerging agricultural scientists often encounter difficulties accessing mentorship and collaborative opportunities, which are crucial for their professional growth and the sector’s advancement. In the education sector, Hopma and Sergeant (2015) reported on the challenges faced by new educators and researchers in contributing to educational research projects. Their study found that emerging researchers often contend with inadequate mentorship and limited involvement in project design and implementation despite the high demand for innovative educational research. This stifles their professional development and limits the infusion of fresh perspectives and creative approaches in the field. Furthermore, Hawke et al. (2020) observed similar issues in environmental research, where emerging researchers are often relegated to peripheral roles, primarily because of traditional hierarchical structures and a lack of supportive frameworks that facilitate their active engagement in all research phases.

The challenges faced by emerging researchers across various sectors, as highlighted in studies by Adeogun (2015), Wadman et al. (2019), Chipfupa and Tagwi (2021) and Hawke et al. (2020), underscore the critical need for structured mentorship, adequate training and institutional support. These findings are consistent with the barriers identified in health research and other fields, revealing pervasive issues that transcend disciplinary boundaries. The current study seeks to address these gaps by focussing on the experiences and perspectives of EEs in South Africa, aiming to identify effective strategies for their engagement and professional development. By exploring these themes in the context of evaluation projects, this research contributes to the broader discourse on youth participation in research and provides actionable insights for fostering a more inclusive and supportive environment for emerging practitioners in the evaluation community.

Theoretical framework: Standards for meaningful engagement of youth in evaluation

Understanding and enhancing youth participation in evaluation projects requires a robust theoretical foundation and practical guidelines. This section integrates two complementary frameworks: Hart’s Ladder of Participation and the Eval4Action standards. Hart’s Ladder provides a structured framework for assessing the depth of youth engagement, from tokenistic involvement to genuine partnership in decision-making. This model is particularly relevant for evaluating the roles and opportunities of EEs within evaluation projects. Complementing Hart’s framework, the Eval4Action standards offer comprehensive guidelines to foster meaningful youth participation across diverse sectors. Together, these frameworks provide an integrated lens through which to explore and recommend strategies that enhance the effectiveness and impact of EEs in evaluation practice.

Hart’s ladder of participation

This study is underpinned by Hart’s Ladder of Participation (see Figure 1), a model that offers a structured framework for understanding and enhancing youth participation in various initiatives. Hart’s model, developed initially to categorise the varying degrees of child and youth participation in decision-making processes, is particularly relevant for evaluating the engagement of EEs in evaluation projects. The model comprises eight rungs, ranging from non-participation (manipulation, decoration and tokenism) to varying degrees of genuine participation (assigned but informed, consulted and informed, adult-initiated shared decisions with children, child-initiated and directed, and child-initiated shared decisions with adults) (Hart 1992).

FIGURE 1: Hart’s ladder of participation.

Applying Hart’s Ladder to the context of EEs provides a robust framework for assessing EE engagement and identifying strategies to move towards more meaningful and impactful participation. At the lower rungs of the ladder, EEs may find themselves in tokenistic roles where their involvement is superficial and lacks genuine influence over evaluation processes. This aligns with findings from various studies indicating that EEs often face limited involvement in critical phases of evaluation projects, such as planning, design and analysis, which restricts their professional development and the diversity of perspectives in evaluation outcomes (SAMEA 2019; Zenex Foundation 2018).

To address these challenges, it is essential to strive towards the higher rungs of Hart’s Ladder, where EEs are actively engaged in decision-making processes and share leadership responsibilities with more experienced evaluators. This can be achieved through structured mentorship programmes, clear role definitions and opportunities for EEs to lead specific aspects of evaluation projects. Such approaches not only enhance the skills and competencies of EEs but also contribute to the overall quality and impact of evaluation projects by incorporating fresh insights and innovative perspectives (Greer et al. 2018; Hawke et al. 2020).

Moreover, Hart’s Ladder emphasises moving beyond consultation to genuine collaboration and shared decision-making. The context of this study underscores the need for a paradigm shift from traditional hierarchical structures to more inclusive and participatory approaches in evaluation projects. This involves creating an enabling environment where EEs can contribute meaningfully and are recognised as valuable partners in the evaluation process. Empirical evidence from global contexts, such as the work of Wadman et al. (2019) in Kenya and Chipfupa and Tagwi (2021) in South Africa, supports the notion that empowering emerging practitioners through participatory frameworks leads to more effective and impactful research outcomes.

In sum, Hart’s Ladder of Participation provides a compelling theoretical foundation for this study by highlighting the importance of meaningful engagement and shared decision-making in enhancing the involvement of EEs in evaluation projects. By leveraging this model, the study aims to identify and recommend strategies that move EEs towards greater participation, ultimately enriching their professional development and the quality of evaluation projects.

Eval4Action standards

The Eval4Action campaign has formulated comprehensive standards to foster meaningful youth participation in evaluation across diverse sectors, including government, private enterprises and VOPEs (EvalYouth 2016). These standards are structured into six dimensions, each designed to address critical aspects of integrating youth perspectives and contributions into evaluation practices (EvalPartners 2022):

Leadership and accountability

This dimension emphasises the pivotal role of organisational leadership in championing youth inclusion within evaluation initiatives. Leadership commitment is essential for endorsing youth engagement principles and ensuring accountability throughout organisational operations. In governmental contexts, this involves policymakers advocating for policies that support youth participation in evaluations, thereby embedding a culture of accountability towards the youth demographic (EvalPartners 2022).

Practice

The Practice dimension advocates for integrating youth participation across all stages of evaluation processes. It entails developing evaluation guidelines and tools that explicitly incorporate mechanisms for engaging youth in planning, data collection, analysis and dissemination. Adapting organisational practices in sectors such as private enterprises and VOPEs to include youth voices enhances the relevance and comprehensiveness of evaluation outcomes (EvalPartners 2022).

Advocacy and capacity development

This dimension focusses on mobilising national governments and local partners to prioritise initiatives to enhance youth engagement in evaluation. Collaboration between governmental bodies, NGOs and international organisations is crucial for advocating robust capacity-building programmes that equip young professionals with the skills to contribute effectively to evaluations, thereby strengthening the evaluation workforce (EvalPartners 2022).

Knowledge management and communication

Highlighting the importance of effective communication strategies and knowledge-sharing platforms, this dimension promotes the value of youth engagement in evaluation. Establishing platforms for youth to share their experiences and perspectives enriches organisational knowledge bases and supports continuous learning in VOPEs and private enterprises (EvalPartners 2022).

Human resources

The human resources dimension focusses on facilitating young professionals’ access to employment opportunities within the evaluation labour market. In governmental and private sectors, this involves creating pathways and initiatives that enable youth to access internships, mentorship programmes and job placements in evaluation roles. Supporting youth in gaining practical experience enhances their contribution to evaluation practices, nurturing a pipeline of skilled evaluators who bring diverse perspectives to address societal challenges (EvalPartners 2022).

Financial resources

Emphasising the allocation of adequate funding within organisational budgets, this dimension sustains youth engagement in evaluation initiatives. Securing resources for capacity-building workshops, research projects and networking events supports youth-led evaluation efforts and ensures the long-term viability of youth engagement strategies in sectors such as VOPEs and private enterprises (EvalPartners 2022).

Overall, the dimensions provide a framework that complements and supports the study’s objectives by addressing the necessary elements for effectively empowering and integrating EEs into evaluation project phases. They offer a structured approach to ensuring that EEs are involved, valued and developed throughout the evaluation process. Focussing on leadership commitment, practical integration, capacity-building, knowledge-sharing and resource allocation aligns with the study’s emphasis on creating supportive environments and opportunities for EEs. This alignment enhances the study’s recommendations by providing a comprehensive foundation for implementing strategies that foster meaningful engagement and growth for new evaluators across various project phases.

Research methods and design

Grounded in a constructivist epistemological paradigm, this research aims to understand the experiences and perspectives of EEs concerning their roles and contributions to evaluation projects. The qualitative component involves Focus Group Discussions (FGDs) with EEs and semi-structured interviews with EE mentors and other relevant stakeholders, such as youth advisory groups. These methods allow for an in-depth exploration of participants’ experiences, challenges and suggestions for improvement. Furthermore, the research employs a participatory approach, actively involving EEs in the research process. This participatory methodology empowers EEs by giving them a voice and ensures that their insights and perspectives are integral to the study’s findings and recommendations.

Sampling approach

A purposive sampling approach was used to select participants with first-hand experience as EEs in evaluation projects and mentors actively mentoring EEs. This method allowed for intentionally selecting participants based on specific criteria aligned with the study’s objectives, ensuring the collected data were rich and relevant (Patton 2006). Participants were chosen based on their availability, willingness to participate and diversity in terms of background, experience and expertise in evaluation. The sampling criteria included factors such as experience level, work sector, geographic location and type of evaluation projects involved.

The recruitment of EEs began with outreach to the SAMEA EE programme, which has a well-established network of young evaluators. Additionally, EEs were recruited from various organisations across different sectors, including government agencies, NGOs, and private sector entities.

An invitation detailing the study’s objectives and participation requirements was disseminated through professional networks, social media platforms and organisational mailing lists to ensure a broad and representative sample. Interested individuals were asked to complete preliminary selection questions providing background information on their current roles and experience levels. These questions helped to identify a diverse pool of candidates for the interviews, from which a final selection was made to ensure a balanced representation of sectors and experience levels. Mentors were recruited based on their active involvement in mentoring EEs within the selected organisations. Invitations were extended to experienced evaluators known for their contributions to capacity-building initiatives within their respective fields. These invitations were sent via professional networks and direct emails. Mentors were selected to ensure they represented a wide array of sectors and brought varied perspectives on the mentoring process and the development of EEs. This comprehensive recruitment process aimed to gather diverse insights and experiences, thereby enriching the overall quality and depth of the data collected (Nieuwenhuis 2016).

The participants

The study involved 15 EEs and 12 mentors, aiming to capture a comprehensive range of experiences and perspectives within the evaluation field. The EEs included graduate students, early-career professionals and individuals transitioning into evaluation roles. Participants were recruited from various evaluation organisations (including VOPEs, private organisations, NGOs and government organisations), as displayed in Table 1. This recruitment strategy ensured a diverse pool of participants with different levels of experience and expertise, which is vital for thoroughly exploring the experiences and perspectives of EEs across various settings. The diversity of the participants, including geographic location, organisational affiliation and evaluation experience, is detailed in Table 1, which outlines the organisations and qualifications of the EEs.

TABLE 1: Emerging evaluators’ organisations and their qualifications.

Table 2 provides similar information for the mentors involved in the study. It highlights the mentors’ extensive experience and high level of education, which positioned them well to provide valuable insights into the mentorship process and the professional development of EEs. The mentors’ diverse affiliations with different organisations – from government to private consultancies and NGOs – ensured a broad perspective on the challenges and strategies involved in mentoring EEs. Table 1 and Table 2 highlight the range of backgrounds, ensuring a multifaceted examination of mentorship and development processes within the evaluation field.

TABLE 2: Mentors’ organisations and their qualifications.
Data collection methods

Data were collected primarily through FGDs with EEs and semi-structured interviews with EE mentors. These methods provided platforms for participants to share their insights and experiences regarding their involvement in evaluation projects. The FGDs and interviews were conducted online to facilitate accessibility and convenience. Four FGDs were conducted, each comprising a mix of EEs to ensure diverse perspectives. In addition to the FGDs and interviews, supplementary data were collected through document requests and email consultations, allowing further exploration of themes and perspectives that emerged during the interviews. These data collection methods align with the principles of qualitative research, enabling the exploration of complex phenomena from the perspectives of those directly involved (Creswell 2015). This approach ensured a comprehensive understanding of the experiences and challenges faced by EEs and their mentors, contributing to the depth and richness of the collected data.

Data analysis methods

Thematic analysis was employed to analyse the qualitative data collected from the interviews and FGDs with EEs and their mentors. This method involved identifying patterns, themes and categories within the data, allowing for the organisation and interpretation of qualitative findings (Braun & Clarke 2006). The process began with familiarisation, involving thorough reading and re-reading of transcripts, followed by generating initial codes to represent meaningful data elements. These codes were then sorted into potential themes, which were reviewed and refined to ensure accuracy and coherence. Finally, clear definitions and names for each theme were developed, and the findings were woven into a coherent narrative.

To ensure rigour and coherence in theme development, the analysis was grounded in a coherent theoretical framework, informed by Hart’s Ladder of Participation and the Eval4Action standards. This approach provided a structured lens for interpreting the data, addressing potential inconsistencies in thematic analysis (Holloway & Todres 2003). The study revealed several key themes across the data. The FGDs and interviews highlighted the barriers to engagement, the need for mentorship, organisational support, recognition and appreciation, inclusive participation, methodological adaptations, comprehensive engagement, capacity-building and the impact of empowerment. Each theme was substantiated by patterns identified in the data. Table 3 summarises the key themes identified through thematic analysis, distinguishing between FGDs and interviews.

TABLE 3: Summary of themes identified from focus group discussions and interviews.
Ethical considerations

Ethical approval to conduct this study was obtained from the University of Johannesburg School of Public Management, Governance and Public Policy Research Ethics Committee (No. 24PMGPP155). Letters of permission were obtained from the participants’ organisations. Ethical considerations were paramount throughout this research. Informed consent was obtained from all participants before their involvement, ensuring they were fully aware of the study’s purpose, procedures and rights as participants. Confidentiality and anonymity were strictly maintained; all personally identifiable information was removed or anonymised when reporting findings. Participants were assured that their participation was voluntary and that they could withdraw from the study at any point without any negative consequences. The study adhered to the ethical guidelines established by the university’s ethics review board, ensuring full compliance with ethical standards for conducting research involving human participants (Creswell & Creswell 2017).

Findings and discussion

The findings from the study reveal significant insights into the experiences of EEs and their mentors within the evaluation projects. This section presents an overview of the key themes identified through FGDs with EEs and interviews with mentors. The discussion integrates these findings with relevant literature to provide a comprehensive understanding of the barriers, support mechanisms and strategies for enhancing the engagement of EEs.

Barriers to effective engagement of emerging evaluators

The findings from both the FGDs and individual interviews with EEs illuminate a complex landscape of challenges and experiences in their engagement with evaluation projects. Drawing on theoretical frameworks such as Hart’s Ladder of Participation (Hart 1992), the study reveals that EEs across various organisational settings encounter distinct barriers that shape their professional trajectories. In governmental and non-profit sectors, EEs frequently encounter structural constraints that confine their involvement to peripheral tasks, such as data collection or report preparation, without meaningful input into decision-making processes. This marginalisation hinders their professional growth and perpetuates a cycle of limited influence in shaping evaluation outcomes (EvalPartners 2022). The consequences of this limited engagement are multifaceted, affecting both the immediate contributions of the EEs and their long-term career development. By relegating EEs to minor roles, organisations forfeit the fresh perspectives and innovative ideas these emerging professionals can bring. This issue underscores the importance of re-evaluating the roles assigned to EEs to ensure that they have opportunities to engage in more substantive aspects of evaluation work.

For instance, GEE15 stated, ‘I find myself relegated to administrative tasks in my role at a government organisation, which limits my ability to gain deeper insights into evaluation methodologies’, reflecting a common frustration among EEs in bureaucratic settings.

Conversely, EEs hosted by VOPEs and private consultancies often report more favourable conditions facilitating their active engagement throughout the evaluation lifecycle. These contexts enable EEs to participate in planning, design and strategic decision-making, fostering their skill development and professional autonomy (EvalPartners 2022). Such immersive experiences enhance their technical proficiency and cultivate a deeper understanding of evaluation processes and their application in diverse contexts. The involvement of EEs in all stages of evaluation projects in these settings demonstrates a commitment to their professional growth and a recognition of their potential contributions. This comprehensive engagement is crucial for developing well-rounded evaluators who can confidently navigate the complexities of evaluation work and contribute meaningfully to their organisations.

For example, VEE3 shared, ‘At the organisation where I work, sometimes I am involved in designing evaluation frameworks and analysing data, which has significantly broadened my expertise’, underscoring the value of hands-on involvement in shaping evaluation outcomes.

Tokenism emerged as a pervasive issue affecting EEs across all sectors, although its impact varied. Many EEs recounted instances where their contributions felt superficial, aimed more at meeting diversity quotas than genuinely leveraging their insights for meaningful impact. This tokenistic engagement undermines EEs’ sense of professional agency and diminishes their motivation to contribute actively to evaluation projects. The phenomenon of tokenism highlights a critical gap in how organisations perceive and utilise the contributions of EEs. By focussing on surface-level diversity rather than meaningful inclusion, organisations fail to harness the full potential of their emerging professionals (EvalPartners 2022). Addressing tokenism requires a fundamental shift in organisational culture and practices to ensure EEs are valued for their skills and insights rather than being included as a formality.

For instance, NEE10 expressed, ‘I often feel my opinions are sought just to fulfil a checkbox, rather than being valued for their potential to improve our evaluation strategies’, highlighting the disillusionment caused by tokenism.

Mentors, providing insights into these dynamics, underscored the profound implications of tokenistic practices on EEs’ professional development and organisational culture. They emphasised that meaningful engagement is pivotal for fostering EEs’ confidence and competence in evaluation, thus preparing them for future leadership roles (EvalPartners 2022). Mentors noted that EEs’ professional growth is significantly accelerated when they are genuinely involved in substantial tasks, such as designing evaluation frameworks and analysing data. This involvement enhances their technical skills and boosts their confidence in their ability to contribute meaningfully to evaluation projects. Furthermore, mentors highlighted that creating an environment where EEs feel valued and heard encourages a culture of continuous learning and innovation within the organisation. Addressing tokenism requires an organisational commitment to inclusive practices that prioritise genuine collaboration and respect for diverse perspectives, aligning with principles advocated by participatory evaluation frameworks. Mentors suggested that organisations adopt structured mentorship programmes and provide regular feedback to ensure that EEs are integrated into all stages of evaluation work, fostering a more inclusive and productive organisational culture.

The role of mentorship in enhancing emerging evaluators’ participation

The findings of the study underscore the pivotal role of mentorship programmes in shaping the participation and professional development of EEs across diverse organisational settings. Drawing on theoretical perspectives such as social learning theory (Bandura 1977), the study illuminates how structured mentorship fosters knowledge transfer, skill acquisition and professional identity formation among EEs. Participants consistently highlighted the importance of structured mentorship programmes in providing guidance, feedback and opportunities for hands-on learning, which are crucial for navigating the complexities of evaluation projects. This structured support is essential for EEs to develop a comprehensive understanding of evaluation methodologies and gain the confidence to apply these methods in real-world settings. The mentorship process also allows EEs to build a professional network, which can be invaluable for their career progression.

For instance, PEE7, emphasising the transformative impact of mentorship on their professional trajectory, stated:

‘I attribute much of my growth to the mentorship I received early in my career, which not only taught me technical skills but also instilled confidence in my ability to contribute meaningfully.’

Mentors, in turn, play a central role in supporting EEs by providing tailored guidance and creating opportunities for experiential learning. They view mentorship as a reciprocal process that enriches both parties, fostering a culture of knowledge-sharing and continuous improvement within evaluation practice. By engaging with EEs, mentors can help them navigate organisational challenges and develop a nuanced understanding of evaluation methodologies and practices (EvalPartners 2022; SAMEA 2019). This relationship also benefits mentors by providing fresh perspectives and innovative ideas from EEs, enhancing the overall evaluation process. Additionally, mentors can identify and address specific learning needs of EEs, ensuring that the mentorship experience is personalised and effective.

Mentor VMM3, highlighting the multifaceted approach to mentorship that supports EEs’ holistic development, shared, ‘As a mentor, my goal is to not only impart technical skills but also nurture critical thinking and decision-making abilities in EEs’.

Structured mentorship programmes are seen as instrumental in mitigating the challenges faced by EEs, such as limited access to decision-making processes and tokenistic engagement. By providing a supportive environment for skill development and professional growth, mentorship programmes empower EEs to navigate organisational hierarchies and contribute effectively to evaluation projects (EvalPartners 2022; SAMEA 2019). Moreover, mentors underscored the role of advocacy in promoting the visibility and recognition of EEs’ contributions within their organisations, advocating for inclusive practices that amplify diverse voices and perspectives. This advocacy is crucial in creating an organisational culture that values and leverages the unique insights of EEs, leading to more comprehensive and effective evaluation outcomes. By integrating EEs into core activities, organisations can enhance the quality and impact of their evaluation projects.

Mentors stressed the importance of adapting mentorship strategies to align with EEs’ individual learning needs and career aspirations, enhancing their readiness to assume leadership roles in evaluation. This tailored approach ensures that EEs receive the specific support they need to grow professionally and contribute meaningfully to their organisations.

Organisational support and its influence on emerging evaluators’ engagement

Organisational support is crucial in fostering meaningful engagement of EEs within evaluation projects, as highlighted by the study’s findings. Clear delineation of roles and open communication channels emerged as fundamental factors that enhance EE participation and satisfaction in their roles. Emerging evaluators preferred organisational environments where expectations and responsibilities were clearly defined, enabling them to contribute effectively and align their efforts with project goals. This clarity not only mitigates confusion but also empowers EEs to take ownership of their tasks and responsibilities, thereby promoting a sense of professional autonomy and accountability. Such environments allow EEs to focus on delivering high-quality outputs and encourage them to take initiative, fostering a proactive approach to their work (EvalPartners 2022; SAMEA 2019).

PEE7 stated, ‘I found that having clear roles from the outset allowed me to focus on delivering quality outputs without second-guessing my contributions’, reflecting on the importance of role clarity in facilitating their engagement in evaluation projects.

Moreover, open communication channels were identified as essential for fostering a collaborative and inclusive work environment. Emerging evaluators emphasised the value of being able to voice their perspectives and concerns, feeling that their input was valued and considered in decision-making processes. This transparent communication fosters trust between EEs and their colleagues or supervisors, laying the foundation for constructive feedback and continuous improvement. Open dialogue ensures that EEs feel heard and respected, which is critical for maintaining their motivation and commitment to the project (EvalPartners 2022; SAMEA 2019). Furthermore, it allows for the timely resolution of any issues or misunderstandings, contributing to a more efficient and harmonious working environment.

Highlighting the role of open communication in promoting EE engagement and satisfaction, VEE2 mentioned, ‘Being able to discuss ideas and challenges with my team openly made me feel like a valued member of the evaluation process’.

In contrast, organisational cultures that lack clear roles or suppress open communication risk alienating EEs and stifling their contributions. Emerging evaluators cited instances where ambiguous expectations or hierarchical barriers limited their ability to participate fully in decision-making or innovation within evaluation projects. Addressing these barriers requires an organisational commitment to fostering inclusive practices and creating supportive environments that prioritise open dialogue and mutual respect. By actively dismantling hierarchical barriers and promoting a culture of inclusivity, organisations can ensure that all team members, including EEs, feel empowered to contribute their best work.

Recognition and appreciation of EEs’ contributions emerged as another critical factor influencing engagement and motivation. Emerging evaluators expressed a desire for their efforts to be acknowledged and valued by organisational leaders and peers, reinforcing their sense of professional identity and encouraging continued commitment to their roles (Zenex Foundation 2018). Effective recognition practices include celebrating milestones, showcasing successes and integrating feedback mechanisms that acknowledge the unique perspectives EEs bring to evaluation projects. Such recognition boosts morale and validates the EEs’ efforts, making them feel integral to the project’s success and encouraging them to continue striving for excellence.

For instance, PEE8 noted, ‘When my contributions were recognised and appreciated, it motivated me to actively seek out new challenges and opportunities for growth within the evaluation field’, underscoring the impact of recognition on EE engagement and morale.

These findings underscore the importance of organisational structures and cultures that actively support and value the contributions of EEs. By providing clear roles, fostering open communication and recognising individual achievements, organisations can create an environment where EEs feel empowered and motivated to engage in evaluation projects fully. This benefits the EEs by enhancing their professional development and job satisfaction, and enriches the evaluation process with diverse perspectives and innovative ideas. Investing in these supportive practices is crucial for cultivating a dynamic and inclusive evaluation community that can adapt to emerging challenges and opportunities in the field. Moving forward, organisations should consider these elements integral to their strategies for engaging EEs, ensuring their potential is fully realised and their contributions are maximised.

Strategies for avoiding tokenism and promoting inclusive participation

In evaluation projects, strategies for avoiding tokenism and promoting genuine, inclusive participation among EEs are critical for fostering meaningful engagement and maximising their contributions. Best practices identified from the study emphasise the importance of intentional involvement throughout all evaluation phases, from initial planning to final implementation and dissemination. This approach ensures that EEs are not merely included for appearance but are integral to decision-making processes and outcome assessments. Such inclusive practices are aligned with participatory evaluation frameworks, which advocate for the active involvement of all stakeholders to enhance the relevance and effectiveness of evaluation outcomes (EvalPartners 2022; SAMEA 2019).

For instance, mentor MNT2 mentioned, ‘Engaging EEs from the outset in project planning allowed us to benefit from their fresh perspectives and innovative ideas, which enriched the evaluation process significantly’. This highlights the impact of early and sustained involvement in promoting genuine participation.

Moreover, case studies from successful EE engagement underscored the effectiveness of mentorship programmes that pair EEs with experienced evaluators who actively guide and support their professional development. These programmes provide practical skills and knowledge, and cultivate a supportive environment where EEs feel empowered to contribute meaningfully and confidently (EvalPartners 2022; SAMEA 2019). By fostering a culture of learning and collaboration, mentorship programmes help bridge the gap between novice and experienced evaluators, ensuring that EEs can leverage their unique insights while gaining valuable professional expertise.

For instance, mentor MNT5 stated, ‘One of our successful strategies was to pair EEs with mentors who provided technical guidance and encouraged them to take on leadership roles within the evaluation team’. This illustrates the role of mentorship in promoting authentic participation.

Conversely, tokenistic practices often arise when EEs are relegated to passive roles or excluded from critical decision-making processes, undermining their potential impact and discouraging long-term engagement. To mitigate tokenism, organisations must establish frameworks that outline roles, responsibilities and expectations for EEs, ensuring their contributions are substantive and valued. These frameworks should be supported by ongoing training and professional development opportunities that equip EEs with the skills and knowledge to navigate complex evaluation landscapes effectively.

For example, EEE12 noted, ‘Tokenism becomes apparent when EEs are asked to participate, but their inputs are not seriously considered or integrated into the evaluation outcomes’. This highlights the challenges faced in projects where their involvement felt superficial.

Methodological adaptations to accommodate emerging evaluators

In evaluation methodologies, adapting approaches to accommodate EEs is crucial for fostering active participation and ensuring their perspectives are integrated meaningfully. Flexibility in evaluation methods allows for adjustments that cater to the unique needs and strengths of EEs, thereby enhancing their contributions to evaluation projects (EvalPartners 2022).

Mentor GMMT9, emphasising the role of methodological adaptability in empowering EEs, stated, ‘Adopting flexible methods like participatory evaluation techniques enabled us to leverage the diverse skills of EEs, encouraging them to lead aspects of data collection and analysis’.

Innovative approaches to EE inclusion go beyond traditional methodologies to embrace new technologies and participatory frameworks that amplify their voices and contributions. Techniques such as digital storytelling and collaborative data analysis workshops provide platforms for EEs to share their perspectives and insights in ways that are accessible and impactful (Greer et al. 2018).

For example, VEE2 noted, ‘We found that incorporating digital tools and interactive workshops not only engaged EEs more effectively but also diversified our evaluation methods, yielding richer insights and recommendations’, highlighting the benefits of innovative approaches.

Methodological adaptations also involve revisiting traditional evaluation practices to ensure they are inclusive and equitable for EEs from diverse backgrounds and experiences. This includes challenging biases and power dynamics within evaluation teams and fostering environments where EEs feel empowered to challenge assumptions and contribute authentically.

Mentor PMMT7, illustrating the transformative potential of methodological adaptations, shared, ‘By rethinking our approach to data collection and analysis, we created spaces where EEs felt comfortable sharing their perspectives, which enriched our understanding of programme impacts’.

Through the lens of literature and participant experiences, it becomes evident that when EEs are integrated into the evaluation process substantively – being provided with clear roles, consistent mentorship and opportunities for authentic involvement – their contributions are significantly enhanced, benefiting both the individual and the organisation. Conversely, practices that border on tokenism or marginalise EEs’ input not only stifle their potential but also undermine the overall effectiveness of the evaluation process. These insights highlight the importance of creating environments where EEs are genuinely valued and can fully contribute, aligning with broader goals of inclusive and participatory evaluation practices.

The broader impact of empowering emerging evaluators

The involvement of EEs in evaluation projects has far-reaching implications that extend beyond individual assignments to impact the overall quality and effectiveness of evaluation practices. By actively involving EEs, there is a noticeable enhancement in data collection, analysis and interpretation processes. Emerging evaluators bring fresh perspectives and innovative approaches that enrich the depth and breadth of evaluations, leading to more comprehensive insights and actionable recommendations, as supported by the literature (Wadman et al. 2019).

Mentor VMM1 emphasised this, stating: ‘The active participation of EEs in our evaluations has significantly enriched our understanding of programme outcomes, allowing us to identify nuanced impacts that would have been overlooked otherwise’.

Moreover, the study’s findings indicate that empowering EEs contributes to the growth and evolution of the evaluation community. When organisations commit to fostering an inclusive environment, they improve their immediate evaluation outcomes and contribute to the evaluation profession’s broader development. Including diverse perspectives strengthens the workforce, making it more robust and adaptable to emerging challenges. This, in turn, enhances the credibility and relevance of evaluation findings, aligning with the principles of continuous learning and improvement within the profession (SAMEA 2019).

Mentor NMM5 underscored this broader impact by observing, ‘EEs bring unique insights and methodologies that challenge conventional practices and stimulate innovation in our evaluation approaches’.

Furthermore, the study reveals that empowering EEs is crucial for building a pipeline of skilled evaluators well-prepared to tackle complex societal challenges through evidence-based practices. By investing in the professional development of EEs and providing them with opportunities for leadership and mentorship, the evaluation community can ensure a sustainable future. This future is one where diverse voices contribute to informed decision-making and the development of sound policies, thus extending the impact of evaluation practices far beyond individual projects. Reflecting on this perspective, Mentor GMM10 stated, ‘Investing in EEs enhances our project outcomes and strengthens the evaluation field by fostering a new generation of evaluators passionate about creating meaningful change’.

The collective insights from this study affirm that integrating EEs into evaluation projects is not merely beneficial but essential for advancing the field of evaluation. As EEs contribute fresh perspectives and innovative methods, they enhance the quality and depth of individual evaluations and drive the evolution of evaluation practices on a broader scale. The active involvement and empowerment of EEs create a dynamic and forward-looking evaluation community capable of addressing complex societal challenges with rigour and creativity. The continuous investment in the development and inclusion of EEs ensures that the field remains responsive, inclusive and capable of generating meaningful, impactful findings that resonate within and beyond the immediate scope of evaluation projects. This ongoing cycle of empowerment and contribution underscores the vital role EEs play in shaping the future of evaluation, making their involvement a cornerstone of sustainable and effective evaluation practices.

Limitations

This study had several limitations that need to be acknowledged. First, the use of purposive sampling, while effective for targeting participants with relevant experience, may limit the generalisability of the findings. Second, although the sample size was sufficient for qualitative analysis, it was relatively small and might not capture the full diversity of experiences within the broader population of EEs and mentors. Additionally, relying on self-reported data from interviews and FGDs may introduce bias, as participants might have provided socially desirable responses or may not accurately recall past events. Despite these limitations, the study provides valuable insights into the experiences and perspectives of EEs and mentors in the evaluation field, contributing to the development of effective strategies for their empowerment and engagement.

Recommendations
For government agencies

Government agencies can promote youth participation in evaluation processes and foster inclusive, evidence-based policymaking. To engage EEs effectively, agencies should:

  • Develop and implement clear policies that mandate the inclusion of EEs in evaluation teams and decision-making processes.
  • Initiate structured internship programmes that offer EEs hands-on experience in evaluation methodologies and practices.
  • Invest in training and capacity-building initiatives tailored to EEs, focussing on evaluation skills and professional development.
For private enterprises

Private enterprises can leverage EEs’ perspectives to enhance the effectiveness and relevance of their evaluation practices by:

  • Incorporating EEs into evaluation teams to bring diverse perspectives and innovative approaches to assessment processes.
  • Establishing mentorship programmes where experienced evaluators guide EEs, fostering skill development and knowledge transfer.
  • Adapting evaluation methods to be more inclusive and responsive to the needs and insights of EEs, ensuring their meaningful participation.
For non-governmental organisations

Non-governmental organisations can empower EEs to contribute meaningfully to evaluation efforts aimed at driving social impact by:

  • Providing platforms for EEs to lead and contribute to evaluation projects, empowering them to influence outcomes.
  • Advocating for policies and practices prioritising youth inclusion in evaluation initiatives within the NGO sector.
  • Facilitating knowledge-sharing and learning exchanges between EEs and experienced evaluators to enhance skills and expertise.
For voluntary organisations for professional evaluation

Voluntary organisations for professional evaluation can harness the potential of EEs to strengthen community-driven evaluation practices by:

  • Engaging local communities: Involve EEs in evaluations directly impacting local communities, ensuring evaluations reflect community needs and priorities.
  • Building collaborative partnerships: Foster partnerships with other organisations to expand opportunities for EEs to engage in diverse evaluation projects.
  • Offering continuous support: Provide ongoing support and mentorship to EEs throughout evaluation processes, enhancing their professional growth and contribution.

Conclusions and areas for further research

This study explored the complexities of engaging EEs in evaluation projects, guided by frameworks such as Hart’s Ladder of Participation and the Eval4Action standards. The findings underscore the critical importance of meaningful youth engagement and inclusive practices in evaluation contexts. Emerging evaluators often contend with tokenism, unclear roles and inadequate recognition, which can hinder their full participation and professional development. However, structured mentorship programmes, robust organisational support and flexible evaluation methodologies are crucial elements that can enhance EEs’ involvement and contributions. By empowering EEs and integrating their diverse perspectives into evaluation processes, organisations can significantly improve the quality and impact of evaluation outcomes, thereby advancing evidence-based decision-making and the Sustainable Development Goals (SDGs).

Acknowledgements

Competing interests

The author declares that they have no financial or personal relationship(s) that may have inappropriately influenced them in writing this article.

Author’s contributions

I declare that I am the sole author of this research article.

Funding information

This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

Data availability

Data sharing is not applicable to this article as no new data were created or analysed in this study.

Disclaimer

The views and opinions expressed in this article are those of the author and are the product of professional research. The article does not necessarily reflect the official policy or position of any affiliated institution, funder, agency or that of the publisher. The author is responsible for this article’s results, findings and content.

References

Abraham, O., Rosenberger, C.A. & Poku, V.O., 2023, ‘Implementing a youth advisory board to inform adolescent health and medication safety research’, Research in Social and Administrative Pharmacy 19(4), 681–685. https://doi.org/10.1016/j.sapharm.2022.12.003

Adeogun, S.O., 2015, ‘Participatory diagnostic survey of constraints to youth involvement in cocoa production in Cross River State of Nigeria’, Journal of Agricultural Sciences 60, 211–225. https://doi.org/10.2298/JAS1502211A

Bandura, A., 1977, Social learning theory, Prentice Hall, Englewood Cliffs, NJ.

Braun, V. & Clarke, V., 2006, ‘Using thematic analysis in psychology’, Qualitative Research in Psychology 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Catalano, R.F., Skinner, M.L., Alvarado, G., Kapungu, C., Reavley, N., Patton, G.C. et al., 2019, ‘Positive youth development programs in low- and middle-income countries: A conceptual framework and systematic review of efficacy’, Journal of Adolescent Health 65(1), 15–31. https://doi.org/10.1016/j.jadohealth.2019.01.024

Chipfupa, U. & Tagwi, A., 2021, ‘Youth’s participation in agriculture: A fallacy or achievable possibility? Evidence from rural South Africa’, South African Journal of Economic and Management Sciences 24(1), a4004. https://doi.org/10.4102/sajems.v24i1.4004

Creswell, J., 2015, Research design: Qualitative, quantitative and mixed methods approaches, Pearson Education Inc., Thousand Oaks, CA.

Creswell, J.W. & Creswell, J.D., 2017, Research design: Qualitative, quantitative, and mixed methods approaches, 4th edn., Sage, Newbury Park, CA.

Department of Performance Monitoring and Evaluation (DPME), 2019, National evaluation policy evaluation framework, viewed 12 May 2024, from https://www.dpme.gov.za/keyfocusareas/evaluationsSite/Evaluations/National%20Policy%20framework%20Nov%202019.pdf.

EvalPartners, 2022, Standards to enhance the meaningful engagement of youth in evaluation, viewed 06 July 2024, from https://www.evalpartners.org/.

EvalYouth, 2016, EvalYouth concept note, EvalPartners, viewed 06 July 2024, from https://www.evalpartners.org/sites/default/files/documents/evalyouth/EvalYouth%20Concept%20Note%20-%20July%202016.pdf.

Geza, W., Ngidi, M., Ojo, T., Adetoro, A.A., Slotow, R. & Mabhaudhi, T., 2020, ‘Youth participation in agriculture: A scoping review’, Sustainability 13(16), 9120. https://doi.org/10.3390/su13169120

Greer, A.M., Amlani, A., Pauly, B., Burmeister, C. & Buxton, J.A., 2018, ‘Participant, peer and PEEP: Considerations and strategies for involving people who have used illicit substances as assistants and advisors in research’, BMC Public Health 18(1), 834. https://doi.org/10.1186/s12889-018-5765-2

Hart, R.A., 1992, Children’s participation: From tokenism to citizenship, United Nations Children’s Fund International Child Development Centre, Florence.

Hart, R.A., 2008, ‘Stepping back from “the ladder”: Reflections on a participatory work model with children’, in Participation and learning: Perspectives on education and the environment, health and sustainability, pp. 19–31, Springer, Netherlands.

Hawke, L.D., Darnay, K., Relihan, J., Khaleghi-Moghaddam, M., Barbic, S., Lachance, L. et al., 2020, ‘Enhancing researcher capacity to engage youth in research: Researchers’ engagement experiences, barriers, and capacity development priorities’, Health Expectations 23(3), 584–592. https://doi.org/10.1111/hex.13032

Hawke, L.D., Monga, S., Korczak, D., Hayes, E., Relihan, J., Darnay, K. et al., 2021, ‘Impacts of the COVID-19 pandemic on youth mental health among youth with physical health challenges’, Early Intervention in Psychiatry 15(5), 1146–1153. https://doi.org/10.1111/eip.13052

Holloway, I. & Todres, L., 2003, ‘The status of method: Flexibility, consistency and coherence’, Qualitative Research 3(3), 345–357. https://doi.org/10.1177/1468794103033004

Hopma, A. & Sergeant, L., 2015, Planning education with and for youth, IIEP Policy Forum Series, UNESCO IIEP, Paris.

Kadzamira, M.A. & Kazembe, C., 2015, ‘Youth engagement in agricultural policy processes in Malawi’, Development Southern Africa 32(6), 801–814. https://doi.org/10.1080/0376835X.2015.1063984

Khanaposhtani, M.G., Ballard, H.L., Lorke, J., Miller, A.E., Pratt-Taweh, S., Jennewein, J. et al., 2022, ‘Examining youth participation in ongoing community and citizen science programs in 3 different out-of-school settings’, Environmental Education Research 28(12), 1730–1754. https://doi.org/10.1080/13504622.2022.2078480

Laurence, J., 2021, ‘The impact of youth engagement on life satisfaction: A quasi-experimental field study of a UK national youth engagement scheme’, European Sociological Review 37(2), 305–329. https://doi.org/10.1093/esr/jcaa059

Mandoh, M., Redfern, J., Mihrshahi, S., Cheng, H.L., Phongsavan, P. & Partridge, S.R., 2021, ‘Shifting from tokenism to meaningful adolescent participation in research for obesity prevention: A systematic scoping review’, Frontiers in Public Health 9, 789535. https://doi.org/10.3389/fpubh.2021.789535

Martini, M., Rollero, C., Rizzo, M., Di Carlo, S., De Piccoli, N. & Fedi, A., 2023, ‘Educating youth to civic engagement for social justice: Evaluation of a secondary school project’, Behavioral Sciences (Basel) 13(8), 650. https://doi.org/10.3390/bs13080650

McCabe, E., Amarbayan, M.M., Rabi, S., Mendoza, J., Naqvi, S.F., Thapa Bajgain, K. et al., 2023, ‘Youth engagement in mental health research: A systematic review’, Health Expectations 26(1), 30–50. https://doi.org/10.1111/hex.13650

Nieuwenhuis, J., 2016, ‘Qualitative research designs and data-gathering techniques’, in K. Maree (ed.), First steps in research, 2nd edn., pp. 72–103, Van Schaik Publishers, Pretoria, SA.

Patton, M.Q., 2006, Qualitative evaluation and research methods, 2nd edn., Sage, Thousand Oaks, CA.

Phillips, S., 2018, Twende Mbele diagnostic on the supply and demand of evaluators in South Africa, viewed 09 January 2020, from https://www.dpme.gov.za/keyfocusareas/gwmeSite/GovermentWide%20M%20and%20E/Diagnostic%20report%20Twende%20Mbele%20demand%20and%20supply%20of%20evaluation%2021%20January.pdf.

Shamrova, D.P. & Cummings, C.E., 2017, ‘Participatory action research (PAR) with children and youth: An integrative review of methodology and PAR outcomes for participants, organizations, and communities’, Children and Youth Services Review 81(C), 400–412. https://doi.org/10.1016/j.childyouth.2017.08.030

South African Monitoring and Evaluation Association (SAMEA), 2019, SAMEA emerging evaluators programme concept note, viewed 06 February 2024, from https://www.samea.org.za/summernotefile/dump?summernotefile_id=124.

United Nations, 2018, World youth report: Youth and the 2030 agenda for sustainable development, United Nations publication, New York, NY.

Wadman, R., Thwaites, R., Palmer, M., Boulton, M. & Clarke, D., 2019, ‘Experience of the transition from child and adolescent mental health services to adult mental health services: A qualitative thematic review’, BMJ Open 9(5), e027007. https://doi.org/10.1136/bmjopen-2018-027007

Zenex Foundation, 2018, Monitoring and evaluation capacity: A landscape analysis, viewed 09 January 2020, from https://www.zenexfoundation.org.za/images/Final_Report_on_Landscape_Study_29_August_2018_LD.pdf.
