About the Author(s)


Zeenat Ishmail
Chief Directorate Strategic Management Information, Department of the Premier, Western Cape Government, Cape Town, South Africa

Victoria L. Tully
Chief Directorate Strategic Management Information, Department of the Premier, Western Cape Government, Cape Town, South Africa

Citation


Ishmail, Z. & Tully, V.L., 2020, ‘An overview of the provincial evaluation system of the Western Cape Government of South Africa as a response to the evaluation of the National Evaluation System’, African Evaluation Journal 8(1), a425. https://doi.org/10.4102/aej.v8i1.425

Note: Special Collection: 9th AfrEA International Conference 2019.

Original Research

An overview of the provincial evaluation system of the Western Cape Government of South Africa as a response to the evaluation of the National Evaluation System

Zeenat Ishmail, Victoria L. Tully

Received: 15 Aug. 2019; Accepted: 13 Feb. 2020; Published: 09 Apr. 2020

Copyright: © 2020. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background: The Western Cape Government (WCG) has recently implemented its seventh Provincial Evaluation Plan, aligned with the National Evaluation Policy Framework adopted by South Africa in November 2011. The evolution of the National Evaluation System (NES) at the provincial level has been aligned with the WCG’s Province-wide Monitoring and Evaluation System (PWMES), which the Department of the Premier drives centrally. The institutionalisation of a Provincial Evaluation System (PES) serves as a model for any sub-national government practising government evaluations at provincial, regional or local levels. To date, 33 evaluations have been completed and a further 11 are at various stages of implementation.

Objectives: Since the initial rollout of the PES in the WCG, the system has developed, strengthened and matured. This maturation stems from a need for better access to, and understanding of, evaluations.

Method: The article draws on the recent evaluation of the NES, which selected the Western Cape as a case study, and uses the findings to put improvements in place in the areas of policy, methodology, organisation, capacity, participation with other actors, and use.

Results: The main value in sharing this work is the building of an enabling environment to roll out evaluations at a regional level that are in line with the national evaluation approach, whilst at the same time facilitating integration with provincial strategic priorities.

Conclusion: This paper reflects on the way in which the WCG of South Africa has undertaken the implementation and maturation of the NES at a sub-national level.

Keywords: Evaluation systems; Regional evaluation system; National evaluation system; South Africa; M&E; Evaluation capacity development; Institutionalisation; Evaluation use.

Introduction

The National Evaluation Policy Framework (NEPF) (Department of Planning, Monitoring and Evaluation [DPME] 2011) sets the approach for establishing a National Evaluation System (NES) for South Africa. The undertaking of institutionalising government evaluations at a provincial level was piloted in the Western Cape Government (WCG) during 2012–2013, and the Province is now embarking on its seventh year of implementation.

The Department of the Premier (DotP) drives evaluations in the province. The provincial institutionalisation of the NES is significant in that the rollout resulted in alignment with the Province-wide Monitoring and Evaluation System (PWMES). The authors of this article reflect on the way in which the WCG of South Africa has implemented the Provincial Evaluation System (PES), and how this has evolved and matured over time. Reference is made to the findings of the recent evaluation of the NES, which selected the Western Cape (WC) as a case study. The WCG uses these findings to put improvements in place in the areas of policy, methodology, organisation, capacity, participation with other actors and use. The learnings from the provincial institutionalisation of the South African NES and its subsequent maturity provide firm building blocks for an enabling environment that allows for the strengthening of regional evaluation ecosystems.

Departments are the key stakeholders in conducting government evaluations on key programmes and projects that are aligned to transversal provincial priorities and to departmental mandates. Since March 2013, the WCG has completed two ‘three-year rolling’ provincial evaluation plans (PEPs) within the context of the NES and guided by the NEPF. Progress is documented annually in an evaluation update.

This institutionalisation has strengthened the DotP’s role in advocating the generation of evidence from evaluations to demonstrate objective results from the implementation of these programmes and projects, rather than relying on performance monitoring evidence alone. This represents a significant mindset change in government towards achieving the outcomes of its provincial priorities.

The institutionalisation is further strengthened through the profiling of all evaluations conducted, the development of a directory of evaluations and the provision of access to evaluations via the WCG’s monitoring and evaluation technology platform, namely the Business Intelligence (BI) system. Through these developments in the PES, the WCG is positioning itself firmly to adopt an evidence-based approach.

This approach provides a sound basis for the evaluation journey and how it has matured over the years. It also provides an opportunity to take stock of the current evaluation eco-system and to reposition evaluations within the broader strategic context of leveraging data and evidence as a strategic asset in the WCG. This means that evaluations serve a bigger purpose together with other key data and evidence services already being driven by the DotP. The timing is also opportune, as the WCG embarks on a new 5-year trajectory and there is a demand for evidence to account to the people of the province. Evaluation has contributed to the WCG’s objective of becoming a data- and evidence-driven organisation.

This article commences with an overview of the NES in South Africa and its evolution at the provincial level. The next section examines the institutionalisation of evaluations in the WC and how the PES has matured, looking at key achievements. The section ‘Elements of the National Evaluation System: A lens on the Provincial Evaluation System’ applies the findings of the evaluation of the NES and examines how these can be utilised to strengthen the PES, looking at policy, methodology, organisation, capacity, participation of other actors, and quality and use. The article concludes with a summary of the key milestones and describes the repositioning approach for evaluations in the WC over the next 5 years. Finally, the strategic approach to building an evidence agenda is described.

Background of National and Provincial Evaluation Systems

Understanding the National Evaluation System

The development of the NES of South Africa is a mandate of the Department of Planning, Monitoring and Evaluation (DPME) in the Presidency. The DPME is the custodian of the NEPF, which was approved by the cabinet on 23 November 2011. The NEPF provides the basis for a system of evaluations across government. It promotes quality evaluations that can be used for learning and to improve the effectiveness and impact of government. The NEPF suggests that the purpose of evaluation relates to improving policy or performance; improving accountability; and improving decision-making and knowledge about what works and what does not with regard to public policies, plans, programmes or projects. In addition to linking evaluation to the planning and budgeting processes, the framework also advocates improving the quality of evaluations undertaken and ensuring that evaluation findings are utilised to improve government performance.

The implementation of the NES involved the setting up of a unit in the DPME for outcomes evaluations and research, establishing technical support structures, developing a procurement system to source service providers and putting in place capacity-building initiatives for the various stakeholders involved in the process. Work undertaken included an audit of evaluations, the development of standards and competencies as well as the generation of a series of guidelines that provide templates and parameters within which evaluations and improvement plans should be produced. The first rolling 3-year National Evaluation Plan (NEP), from 2013–2014 to 2015–2016, was approved by the cabinet in November 2012, and the third 3-year plan commenced at the start of the 2019–2020 financial year. By March 2019, 44 evaluation reports had been commissioned and 65 evaluations were active.

The development of the NES should not be viewed in isolation. The Policy Framework for the Government-wide Monitoring and Evaluation System (GWMES), approved by the cabinet in November 2007, describes three ‘data terrains’ that underpin the monitoring and evaluation (M&E) system, namely programme performance information, official statistics and evaluation information. Although the Presidency is the custodian of the GWMES, the Framework for Managing Programme Performance Information, published with National Treasury (The Presidency 2007), and the South African Statistical Quality Assessment Framework (Statistics South Africa 2008) provide the policy frameworks for the first two terrains. The NEPF complements the set of policies that make up the GWMES.

A schematic representation of the GWMES approach, adapted by WCG to include the responsibility areas, is presented in Figure 1.

FIGURE 1: The three data terrains underpinning the Government-wide Monitoring and Evaluation Systems, adapted by the Western Cape Government to include responsibility areas.

As defined by the NEPF, evaluation is carried out throughout the intervention cycle through six recommended types of evaluation: diagnostic evaluations, design evaluations, implementation evaluations, impact evaluations, economic evaluations and evaluation synthesis. Each type of evaluation has a specific objective and purpose, to be applied where most appropriate so as to gain maximum benefit from the evaluation. A schematic representation of these evaluation types is presented in Figure 2.

FIGURE 2: The evaluation types in the National Evaluation Policy Framework.

The National Evaluation System and its evolution at a provincial level

The first PEPs were piloted in 2012–2013 in selected provinces. Currently, there are eight PEPs in South Africa.1 Since 2012, the WCG has institutionalised the NEPF at a provincial level through the development and implementation of the PEP and has evolved towards conducting quality evaluations to improve government’s effectiveness, efficiency, impact and sustainability. The WCG was selected as a pilot province to input into the development of the country’s first formal evaluation system and the rollout of the NEPF. Since then, the province has steadily progressed in terms of the maturity of the evaluations conducted, and implementing departments increasingly value the utility and usability of planned evaluations.

The WCG confirmed a need for a structured evaluation system for better use of evaluations and for measuring outcomes across the provincial government, leading to improved and informed evidence-based decision-making. This need emerged when the WC was selected as a pilot province for developing a PES from the recently established NES. In 2011, during this design phase, the WCG conducted an audit of all evaluations and research that had been conducted in the province and found that 118 studies had been undertaken. Many of these studies had not been written up in a coherent or structured manner and had been undertaken sporadically. Emanating from this work, the first directory of evaluations was developed. Many of the studies could not meaningfully contribute as evidence, but they provided a departure point for valuing the role of evaluations in complementing the already strong monitoring systems in place.

The DotP approached the practice of government evaluations within the broader framework of the PWMES, which is premised on the GWMES and the related data terrains. This approach could be taken because the WCG had a centralised transversal approach to lead and institutionalise Results-based Monitoring and Evaluation (RBME) across all WCG departments. The institutionalisation of the NES at a provincial level was centred on the existing programme and project performance information system, and the RBME2 approach was adopted by the WCG. This facilitated the integration of provincial evaluations of WCG policy interventions towards management for results. It also meant that these data terrains became sub-systems of the broader PWMES. Within this context, one needs to understand that at a national level, evaluation evidence forms part of the NES, whereas at a provincial level, the evaluation system is a sub-system that is integral to the PWMES.

The evaluation journey can be mapped with timelines for the period from the pilot to 2019. Figure 3 illustrates this timeline. The period 2011–2012 covered the pilot and evaluation audit. The period 2013–2019 covered the institutionalisation of the evaluation system, including the strengthening of the elements of the system, the number of evaluations conducted and the annual documentation of the evaluation journey in the form of an evaluation update.

FIGURE 3: Institutionalising evidence through South Africa’s National Evaluation System, presented by Western Cape Government at the Global Evidence Summit, Australia 2018.

The rollout of the Provincial Evaluation System in the Western Cape

Emerging from the establishment of the NES, the WCG has set a platform across South Africa for building an institutional enabling environment for evaluations amongst its 13 departments. Its use is ultimately to enable evidence-based development for informed service delivery and to give effect to the post-2015 development agenda, government priorities and relevant WCG policies, programmes and projects. The criteria and process used for the selection of evaluations for the WC PEP are also in line with the NEPF and the NEP. Consistent with the NEPF, it is imperative to note that the WC PEP focusses on a variety of government interventions, with an emphasis on key strategic policy thematic areas as promoted by the Provincial Cabinet, currently referred to as the Provincial Strategic Goals (PSGs).

The ‘Call for Evaluations’ takes place annually. Departments are required to respond to this call by submitting evaluation proposals, in the form of concept notes, for key policies, programmes and projects that they propose to evaluate. The management of the PEP follows a 3-year cycle, with each annual cycle aligned to its own business processes, from the ‘call for evaluations’ to the tracking of recommendations through the development and monitoring of improvement plans. These phases are transversally supported and managed by the PWME Directorate, situated within the DotP.

Institutionalising the Provincial Evaluation System

The status of evaluations in the Provincial Evaluation System

The first 3-year PEP cycle concluded in March 2016, and the second 3-year cycle concluded in March 2019. During the first PEP (2013–2016), 23 evaluations were commissioned. The findings and recommendations of the 14 completed evaluations are already being used to improve performance and accountability. During the second ‘3-year cycle’ (2016–2017 to 2018–2019), 29 evaluations were commissioned, 20 of which were completed by March 2019. At this point, 11 of the 13 provincial departments have participated in the PEP process.

To date, the WCG has produced five of the six types of evaluations, with an emphasis on impact (16) and implementation (16) evaluations. It is envisaged that through the recent membership of 3ie, that is, the International Initiative for Impact Evaluation, there will be an increased focus on conducting impact evaluations in the province.

The number and type of evaluations that have been conducted in the WCG to date are shown in Figure 4.

FIGURE 4: A schematic representation of the number and types of evaluations in the Western Cape Government Provincial Evaluation Plans 2013–2019.

Strengthening the Provincial Evaluation System using the findings from the evaluation of the National Evaluation System

The WCG continuously strives to improve the PES. It was envisaged that the findings of the implementation evaluation of the NES conducted by Genesis Analytics (DPME 2018a, 2018b, 2018c) would navigate the way forward to strengthen the NES and its institutionalisation, and its subsequent evolution at a provincial level.

The purpose of the evaluation of the NES was to assess whether its implementation is having an impact on the programmes and policies evaluated, the departments involved and other key stakeholders, and to determine how the system needs to be strengthened. The WCG has made a concerted effort to use the recommendations emanating from this evaluation to address implementation issues at a provincial level.

As discussed in Goldman et al. (2019), the evaluation made use of a theory-based approach including international benchmarking; five national and four provincial case studies; 112 key informant interviews; a survey with 86 responses; and a cost–benefit analysis of a sample of evaluations. From 2011 to the end of December 2016, 67 national evaluations were completed or underway, covering over $10 billion of government expenditure. Seven of the nine provinces had PEPs, and 68 of 155 national and provincial departments had departmental evaluation plans (DEPs). The key findings provided a range of recommendations to strengthen the system, ranging from legislation to strengthen the mandate, through greater resources for the NES and strengthened capacity development, to better communication and better tracking and use of evaluation recommendations and improvements.

Elements of the National Evaluation System

A lens on the Provincial Evaluation System

The evaluation of the NES used Holvoet and Renard’s (2007) six characteristics as the theoretical framework and as a guide for the analytical framework. These characteristics are policy, methodology, organisation, capacity, participation of other actors, and quality and use. The evaluation provides a summary comparison of these characteristics between South Africa and other selected African countries.

The following section examines these characteristics in relation to the PES. It provides insight into the institutionalisation of the NES at a sub-national level and the learnings that can be applied. The findings of the NES evaluation are then used to put improvements in place for the WC PES.

Policy

In the NES, evaluation plans exist on three levels. The DPME Evaluation Research Unit supports all NEP evaluations and coordinates the NES across government, with Offices of the Premier playing a similar role in the provinces. In 2016, evaluation results were used for the first time to inform the national budget process.

The WC has institutionalised the NEPF, and evaluation plans exist on two levels. There is a PEP in place, with a focus on the PSGs leading to the National Objectives (NOs). Departmental evaluation plans are also in place with a departmental focus.

There were four key recommendations from the evaluation of the NES with regard to evaluation policy in the WC:

  1. Financial support and co-funding arrangements to be made to incentivise evaluations.

  2. Linkage of evaluations to the decision-making around budget allocations.

  3. Linkages of the evaluations in the PEP with the PSGs with considered synthesis.

  4. More departments to be involved in conducting evaluations.

The WC is advocating evidence-based policy-making, with advanced thinking towards linking evaluations to planning and budget cycles well underway. A guideline for linking evaluations to planning and budgeting (DotP 2018a) was developed in 2018. This guideline directed a standardised approach by the driver or transversal departments, namely the DotP and Provincial Treasury (PT), to the funding of evaluations for strategic interventions, including provincial budget policy priorities linked to the five PSGs, which have received budget allocations since 2014. In support of this, PT allocated funding for evaluations. Planning for evaluations is now embedded in the provincial planning cycle, with proposed budgets and annual performance plans (APPs) being assessed for evaluation evidence, whether planned evaluations, evaluations that are underway or the utilisation of evaluation evidence. This has had a positive impact on the number of departments involved in implementing evaluations in the province and on the growing understanding of their benefits and use. Furthermore, this is to ensure that multi-year interventions with large budgets are evaluated during their lifecycle, so that learnings can be drawn from this evidence to assist decision-makers with future planning in line with the ‘MITS’3 model: to maintain, innovate, terminate or sustain such an intervention.
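To make the MITS terminology concrete, the following minimal Python sketch encodes the four decision outcomes as described above and in the accompanying footnote. It is a hypothetical illustration only: MITS is a budget-process heuristic, not software, and the enum name, the mapping of each letter to a description and the example usage are assumptions made for illustration.

from enum import Enum

# Hypothetical sketch: encodes the four MITS outcomes named in the text.
# The descriptions paraphrase the article's footnote; the exact mapping of
# each letter to a description is an assumption.
class MITSDecision(Enum):
    MAINTAIN = "continue the intervention as is"
    INNOVATE = "change and improve the intervention"
    TERMINATE = "stop the intervention"
    SUSTAIN = "keep the intervention running, possibly replacing it with another"

# For example, an evaluation's overall recommendation could be tagged
# with one outcome to feed into the budget process:
recommendation = MITSDecision.INNOVATE
print(f"{recommendation.name}: {recommendation.value}")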

The WCG’s evaluation process serves as a standing item on the agendas of both the Departmental Executive Committee and the provincial top management (PTM), which further enhances its integrative approach. Regular briefing notes and updated communiqués are compiled on the function and process, and this is further enhanced through identified champions within the province, who continuously promote a culture of evaluations through various sharing initiatives and presentations. Provincial top management keeps evaluations on its planning agenda, and strategic evaluations, which typically straddle the transversal space, are viewed as key to unlocking the full potential of evaluation and quality evidence use for better decision-making.

The WCG is currently well underway with a policy initiative to invest in a data governance programme. In this regard, a Provincial Data Office oversees the availability, accessibility, security and ethics of data and evidence. It is also envisaged that improved data quality will unlock the potential to undertake high-quality impact evaluations and to ensure the methodological soundness of future evaluations. This work supports the building of an evidence architecture for evaluation and research evidence.

Within the broader evaluation policy context, the DotP is ready to review the PES and utilise this evidence to reposition its approach to respond to the demand for evidence for the provincial strategic priorities. This paves the way for the WCG to embark on a new 5-year trajectory to account to the people of the province.

Methodology

In the NES, programmes are selected for evaluation, and the methods applied are appropriate for the stage of that intervention. There is a mix of bottom-up and strategic proposals from the DPME and National Treasury. Most evaluations undertaken are implementation evaluations, as they feed back more rapidly into policy. Guidelines are utilised for the different evaluation types.

In the WC PES, there is a demand for the various types of evaluations, with five of the six evaluation types utilised to date. For the first 4 years (2012–2016), impact and implementation evaluations were favoured. The past 2 years have seen a spread across five types of evaluations: not only impact and implementation evaluations (16 of each), but also diagnostic, design and economic evaluations. The availability of good-quality data and programme documentation is key to ensuring the methodological soundness of an evaluation. The DotP provided technical input and commentary on the DPME guidelines for the six types of evaluation and has since developed a series of additional guidelines. This is an ongoing process and forms part of the development of the strategic frameworks and methodologies attributed to the PWME.

The selection of evaluations follows a formal written process whereby evaluations can be spontaneously proposed or interventions can be identified for evaluation through the planning and budgeting process.

The criteria used for the selection of departmental evaluations to be included in the PEP remain in line with the NEPF. The WCG has placed a priority on evaluations of existing interventions that:

  1. are a provincial priority

  2. are innovative

  3. signify a keen public interest

  4. have not been evaluated recently

  5. are at a critical stage, where decisions need to be taken for which an evaluation is required to provide the necessary data and information

  6. have monitoring data and/or spatial information to inform the evaluation process

  7. have a potential budget for evaluation.

The provincial guideline that sets out the specification and criteria for departments that wish to apply for PT funding for evaluations notes the following criteria to be applied.

A proposed evaluation must be:

  1. a policy priority – strategic and transversal intervention linked to the PSGs and the NOs

  2. a budget policy priority – key policy allocations provided in current and previous Medium-term Expenditure Frameworks (MTEFs)

  3. innovative – signify areas where it is important to improve and learn, ultimately changing the lives of the people

  4. a design, implementation or impact evaluation

  5. linked to a budget priority of the department or a large budget programme.

Concept notes are submitted for proposed evaluations. All concept notes that apply for strategic funding are assessed by the Provincial Evaluation Steering Committee (PESC), utilising the above criteria. It is envisioned that, going forward, the evaluation of interventions will be predetermined and aligned to the accountability framework of the provincial strategic plan (PSP).
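To illustrate how the screening of a concept note against these funding criteria might work, the following is a minimal Python sketch. It is hypothetical: the PESC assessment is a deliberative process rather than an automated one, and every name in the sketch (ConceptNote, meets_funding_criteria and the field names) is invented for the example.

from dataclasses import dataclass

# Hypothetical illustration of screening a concept note against the
# published PT funding criteria; all names are invented for the example.

ELIGIBLE_TYPES = {"design", "implementation", "impact"}  # criterion 4

@dataclass
class ConceptNote:
    title: str
    evaluation_type: str          # one of the six NEPF evaluation types
    linked_to_psg: bool           # criterion 1: policy priority (PSGs/NOs)
    mtef_budget_priority: bool    # criterion 2: key MTEF allocation
    innovative: bool              # criterion 3: area to improve and learn
    large_budget_programme: bool  # criterion 5: departmental budget priority

def meets_funding_criteria(note: ConceptNote) -> bool:
    """Return True only if the note satisfies every criterion."""
    return (
        note.linked_to_psg
        and note.mtef_budget_priority
        and note.innovative
        and note.evaluation_type in ELIGIBLE_TYPES
        and note.large_budget_programme
    )

note = ConceptNote(
    title="Evaluation of a provincial priority programme",
    evaluation_type="implementation",
    linked_to_psg=True,
    mtef_budget_priority=True,
    innovative=True,
    large_budget_programme=True,
)
print(meets_funding_criteria(note))  # True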

The WCG produced two provincial guidelines during 2014–2015 to facilitate a standardised PEP implementation approach across the 13 departments. These guidelines cover the development of a project plan to implement an evaluation and the provision of an annually updated milestone and activity schedule for the WC PEP. In line with the RBME approach, these guidelines were translated into an official project plan as part of the broader PWME and the Biz Projects System on the BI.

The evaluation of the NES found the following:

  1. There is still some confusion in the WC regarding the types of evaluations available.

  2. Some interventions are not suitable for impact evaluations because of a lack of quality data.

Organisation

Nationally, the NES is centrally located to manage evaluations. There are decentralised M&E units in departments and agencies, but few of these have the capacity to support evaluations. Some national departments have evaluation directorates, but these are rare.

In the WCG, the DotP serves as the coordination and oversight structure for evaluations in the province, whilst also hosting and coordinating the Evaluation Technical Working Group (ETWG). The department provides support and technical advice to the other WCG departments as they progress through the evaluation ‘journey’, working with departmental M&E officials and programme managers.

The evaluation of the NES found the following:

  1. There were gaps in departmental evaluation technical expertise, and these needed to be identified and addressed.

  2. Programme managers needed to be included more in the evaluation process as this would make evaluations easier to conduct and improvements easier to implement.

In response to the recommendation from the evaluation of the NES to establish partnerships, the WCG’s participation in the national ETWG steered the provincial establishment of a cross-government ETWG. The working group receives preference for capacity-building courses and participation in the NES and serves as a forum for evaluation stakeholders within the province. The WCG utilises the ETWG as a learning and sharing space for all stakeholders involved in evaluations. Programme managers are encouraged to become members for the duration of the implementation of an evaluation of their programme. DotP officials serve as members of the departmental strategic evaluation steering committees, in conjunction with the departmental M&E officials and the programme managers. The DotP’s role is to provide technical advice and coordination.

The DotP also coordinates progress reporting to the DPME, assists with improvement plans and supports provincial evaluation capacity building.

The PESC was established in the WC in 2018. This is a collaboration between PT and DotP and is mandated to manage the transversal allocation of budgets for key funded strategic evaluations and to support the advocacy for PES evaluations.

The WCG is a key advocate for the promotion of evaluation and evidence-based policy-making and programming at international, national, regional and local levels. This advocacy is achieved through contributions to the international evaluation community, including participation in various Voluntary Organisations for Professional Evaluation (VOPEs) and presentations at national and international evaluation and evidence forums.

Capacity

Capacity weaknesses exist in the NES. South Africa has not undertaken a skills assessment of technical staff, and there is further work to be carried out to develop competencies for evaluators and for government staff who manage evaluations. Capacity needs have emerged through practice. Evaluation courses have been developed and rolled out. Advocacy campaigns have taken place to promote M&E.

For evaluations to be effective, staff members need to understand the terminology as well as the evaluative process. Ongoing advocacy and learning are crucial, and in the WCG this is championed by the DotP through the previously mentioned ETWG engagements. In the NES evaluation, capacity concerns in the WC were highlighted as a constraint on conducting evaluations. A number of initiatives have been undertaken by the DotP to provide capacity building for programme managers and M&E officials, supported by the Provincial Training Institute (PTI) and the DPME. The DotP provides one-on-one capacity-building workshops and design clinics with implementing departments and provides them with technical guidance on various approaches to evaluations within the NEPF. This includes the development of concept notes, terms of reference and DEPs.

The evaluation of the NES highlighted the need for senior managers to understand the benefits and use of evaluation evidence. To serve this purpose, the WCG has initiated participation in a high-level evidence course provided by a university. It is proposed that this will be rolled out in 2019–2020 as part of the ongoing capacity-building and advocacy initiatives for PTM.

Participation of other actors

The evaluation of the NES found that a key role is played by the South African Monitoring and Evaluation Association (SAMEA) and CLEAR AA, through their participation in the steering committee for the evaluation of the NES. In South Africa, there is systematic engagement with parliament on the results of national evaluations. Universities also deliver capacity development work and may bid to undertake evaluations. More work is needed to bring evidence brokers, such as think tanks, on board to actively contribute in the evaluation space.

The evaluation of the NES found the following:

  1. There is a need for more evaluators.

  2. Universities should be utilised to increase the pool, act as peer reviewers and support co-funding.

In the WCG, academic institutions play an important learning function in the development of evaluation culture and practices, as they often act as service providers and some also offer M&E courses. The universities in the WC also serve as peer reviewers and supplement evaluation budgets through co-funding opportunities.4 Some WCG departments also have good connections with private sector partners, who participate in evaluations by forming part of steering committees and providing comments on deliverables such as the terms of reference.

The WCG is in the process of establishing a panel of vetted service providers for evaluations and research projects, which should assist in streamlining the supply chain process, provide more opportunities for emerging evaluators and go some way to address the supply-and-demand challenges currently experienced in the field.

Quality and use

The evaluation of the NES did not look at quality in relation to the PES.

In the NES, there is a focus on ensuring the use and benefits of evaluations. Findings are presented to stakeholders and senior management. There is a process of dissemination through policy briefs and thematic workshops. Results of evaluations are presented to the cabinet, which gives weight to the implementation and use of findings. There is a formal follow-up process through the development of improvement plans, and six-monthly progress reports are required for 2 years. The DPME also holds a repository of completed evaluations that is fully accessible to the public. Since 2016, evaluation evidence has started to be used to inform the budget process.

The evaluation of the NES found, with regard to use in the WC PES, the following:

  1. Understanding evaluation evidence was key to its usefulness.

  2. Demonstrating the value and benefits of evaluation will encourage further evaluations to be conducted and recommendations to be used for programme improvements.

In the WC, there is a focus on creating an enabling environment for the evaluative process. The WCG has utilised the evaluation quality assessment system developed by the DPME. All evaluations completed in the PEP to date have been sent to the DPME for quality assessment (QA), and all receive a score before being placed in the DPME repository. These assessments are followed by engagements with the relevant heads of the departments that submitted the evaluations. Importantly, the discussions cover the QA reports as well as the strategy for deepening or strengthening future evaluations conducted by the departments.

The WCG has also employed the services of an external peer reviewer to assist in maintaining quality and standards in the evaluations undertaken and to support their implementation. The peer reviewer ensures that quality standards are upheld for the evaluations of the six strategic programmes currently being implemented. This external expert furthermore helps to identify improvement interventions and conducts associated capacity and advocacy workshops for learning and improvement, to elevate the internal capacity within the DotP and the WCG.

The audit of evaluations conducted in the WC in 2011 is grounded in the data management and data assessment processes within the broader PWME and the WCG BI system. Emanating from the pilot of the NES, DotP has produced a core directory of common data sources where evaluations conducted are profiled. This informs the use of evaluation findings to support the global policy agenda and sub-national data requirements.

All completed evaluation reports are transferred to the WCG BI system, ‘BizBrain’, and archived in the WCG’s evaluation database on the Evaluation Tile. An Evaluation Dictionary, or directory of evaluations, in which all completed WC PEP evaluations are profiled, has been compiled, and this is updated as new evaluations are completed. Evidence is stored in respective folders with all metadata relating to the evaluation. This Evaluations Repository also serves as a key WCG knowledge hub, which is particularly useful during the annual Management Performance Assessment Tool (MPAT) process, when evidence is required to substantiate the standard relating to evaluations and its link to the government strategic management process. Emanating from this, the PES in the WCG now has seven phases, as the system has been strengthened with the inclusion of a data-profiling phase (Ishmail, Tully & Wynford 2017).
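The article does not publish the structure of the Evaluation Dictionary, but the kind of metadata profile it describes can be sketched as follows. This is a hypothetical Python illustration: every field name, the BizBrain path shown and the filtering helper are assumptions made for the example.

from dataclasses import dataclass, field

# Hypothetical sketch of one directory entry; field names are assumed.
@dataclass
class EvaluationProfile:
    title: str
    department: str
    evaluation_type: str    # one of the six NEPF evaluation types
    pep_cycle: str          # e.g. '2016/17-2018/19'
    status: str             # e.g. 'completed', 'underway'
    report_location: str    # where the archived report sits
    keywords: list[str] = field(default_factory=list)

def by_type(directory: list[EvaluationProfile], evaluation_type: str) -> list[EvaluationProfile]:
    """Filter the directory, e.g. to list all completed impact evaluations."""
    return [entry for entry in directory if entry.evaluation_type == evaluation_type]

directory = [
    EvaluationProfile(
        title="Implementation evaluation of a provincial programme",
        department="Department of the Premier",
        evaluation_type="implementation",
        pep_cycle="2016/17-2018/19",
        status="completed",
        report_location="BizBrain/Evaluation Tile/implementation/2019",
    )
]
print(len(by_type(directory, "implementation")))  # 1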

In the evaluation of the NES, M&E officials in the WCG noted that having access to the evaluations conducted by other departments via the BI system was very useful, as it showcased the work completed, its benefits, successes and the lessons learnt from the process.

Key achievements

To describe the development of the PES, the following illustration was developed to highlight the work carried out across the years. It highlights the key milestones along the journey, for example, the initial audit of evaluations, the concerted efforts towards capacity building and advocacy, the establishment of the ETWG and the identification of evaluation champions in the departments. The development of the evaluation tile repository on the Biz system coincided with the commencement of the quality assurance process, leading to a peer reviewer being appointed for strategic evaluations. In 2018, the guideline for linking evaluations to planning and budgeting saw six funded strategic evaluations being undertaken on big-budget priority interventions. Following the evaluation of the NES, in which the WC was a case study, the emphasis has shifted to the utilisation of evidence, with the development of evidence briefs and evidence gap maps.

Figure 5 illustrates the evolution of the WCG Provincial Evaluation System, highlighting the key milestones.

FIGURE 5: The Evolution of the Western Cape Provincial Evaluation System.

Conclusion

The key achievements relating to the WC PES can be attributed to promoting the importance of evaluations for sound evidence-based decision-making and to linking evaluations so that they ultimately benefit government programmes and citizens. This leads to the creation of a data-driven environment that consistently demonstrates the benefits and use of evaluation evidence and maximises its value.

The WCG continues to strive for excellence in evaluations, building and strengthening the PES, with a clear focus on the recommendations to strengthen the system emanating from the evaluation of the NES. As it commences its seventh year of implementation and reflects on the past 5-year trajectory, the WCG is committed to building on lessons learnt and progressively developing and strengthening the evaluation eco-system for the future, in which large-budget interventions are supported by project plans, evaluation research and evidence.

Key to this approach is a review of the PES towards a repositioned approach to strengthen evidence use for the next 5 years, 2019–2024. This will outline the strategic content of evaluations and research towards the building of a solid evidence architecture to support the provincial priorities, with planned evaluation activities for key interventions over the 5-year period and beyond.

Acknowledgements

The authors would like to thank the Heads of Departments and the Director-General of the Western Cape Government, Mr H.C. Malila. They would also like to thank Dr Ian Goldman, currently an internal advisor in Evaluation and Evidence Systems; Ms Amina Mohamed, Director of Monitoring and Evaluation, who has led the Provincial Evaluation System since 2016; and Ms Rowina Wynford and Mr Faizel Noordien, both senior officers of Monitoring and Evaluation, who have been part of the evaluation journey since 2012.

Competing interests

The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.

Authors’ contributions

Z.I., Chief Director of Strategic Management Information, Department of the Premier, leads the institutionalisation of the Provincial Monitoring and Evaluation System in the WCG. V.L.T. has been a senior officer of Monitoring and Evaluation since 2016, specialising in evaluation systems. Z.I. guided the writing of this article, which builds on the work conducted since the first paper on WCG evaluation work was presented at the American Evaluation Association in 2015. V.L.T. undertook the writing of this article.

Ethical considerations

This article followed all ethical standards for research without direct contact with human or animal subjects.

Funding information

This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

Data availability statement

Data sharing is not applicable to this article as no new data were created or analysed in this study.

Disclaimer

The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.

References

Department of Planning, Monitoring and Evaluation (DPME), 2011, National Evaluation Policy Framework, Department of Planning, Monitoring and Evaluation, Pretoria.

Department of Planning, Monitoring and Evaluation (DPME), 2018a, Report on the Evaluation of the National Evaluation System – Summary report, Department of Planning, Monitoring and Evaluation, Pretoria.

Department of Planning, Monitoring and Evaluation (DPME), 2018b, Report on the Evaluation of the National Evaluation System – Full report, Department of Planning, Monitoring and Evaluation, Pretoria.

Department of Planning, Monitoring and Evaluation (DPME), 2018c, Presentation on the Evaluation of the National Evaluation System, Department of Planning, Monitoring and Evaluation, Pretoria.

Department of Planning, Monitoring and Evaluation (DPME), 2018d, Evaluation update, Department of Planning, Monitoring and Evaluation, Pretoria.

Department of the Premier (DotP), 2015a, Strategic framework for provincial-wide monitoring and evaluation, Department of the Premier, Cape Town.

Department of the Premier (DotP), 2015b, An annual evaluation review on the implementation of the Western Cape Government Provincial Evaluation Plan: Evaluation update No. 2, Department of the Premier, Cape Town.

Department of the Premier (DotP), 2018a, A guideline for linking evaluations to planning and budgeting, Department of the Premier, Cape Town.

Department of the Premier (DotP), 2018b, An annual evaluation review on the implementation of the Western Cape Government Provincial Evaluation Plan: Evaluation update No. 5, Department of the Premier, Cape Town.

Department of the Premier (DotP), 2019, An annual evaluation review on the implementation of the Western Cape Government Provincial Evaluation Plan: Evaluation update No. 6, Department of the Premier, Cape Town.

Goldman, I., Deliwe, C.N., Taylor, S., Ishmail, Z., Smith, L. & Masangu, T., 2019, ‘Evaluation2 – Evaluating the national evaluation system in South Africa: What has been achieved in the first 5 years?’, African Evaluation Journal 7(1), a400. https://doi.org/10.4102/aej.v7i1.400

Holvoet, N. & Renard, R., 2007, ‘Monitoring and evaluation under the PRSP: Solid rock or quick sand?’, Evaluation and Program Planning 30(1), 66–81. https://doi.org/10.1016/j.evalprogplan.2006.09.002

Ishmail, Z., Tully, V. & Wynford R., 2017, ‘Strengthening the provincial evaluation system through the development of profiling standards for better understanding and access to the evaluations conducted – A Western Cape Government case’.

Statistics South Africa (StatsSA), 2008, South African statistical quality assessment framework, Statistics South Africa, Pretoria.

The Presidency, 2007, Framework for managing programme performance information, The Presidency, Pretoria.

Footnotes

1. Updated figures as at 11 June 2018, DPME.

2. The RBME approach is an indicator and data management system that measures a compendium of outcome indicators, within the broader context of the Province-wide Monitoring and Evaluation Framework, to inform the development and the strategic agenda of the province.

3. This is a system utilised by Provincial Treasury to assess during the budget process whether interventions should continue as is, be changed and improved, be stopped or kept running and replaced with another intervention.

4. See http://www.chec.ac.za/.


 
