
Case Study: Design Evaluation of the SWD@Schools Programme

November 12, 2020

Another of our case studies shines a light on the design evaluation of the SWD@Schools Programme, which started in March 2019 and was completed in October 2019.

The project was completed on behalf of the Western Cape Government Department of Economic Development and Tourism (DEDAT), together with the Western Cape Education Department (WCED), Oracle SA, 10 Western Cape schools, and post-school training service providers (On the Ball College [OTBC], CapaCiTi and EOH).

The project was funded by DEDAT, and you can read more here.

Project Outline:

The SWD@Schools programme was conceptualised in recognition of the challenges and opportunities in addressing critical skills shortages within the ICT sector. The programme took the form of a strategic partnership between Western Cape DEDAT, WCED and Oracle SA.

The four phases of the programme that learners move through are:

1) after-school/extracurricular training in Java for school learners in grades 10-12 from eleven schools in the Western Cape;

2) a three-month post-school programme component where the matriculants are provided with advanced Java training and professional skills training;

3) a six-month internship component within an IT-related position, and

4) a component where learners sit for an Oracle certification exam.

WCG DEDAT commissioned DWC to conduct an independent design evaluation of the SWD@Schools programme, looking at its Theory of Change (ToC) and Theory of Action (ToA), assumptions, indicators and data systems. The stakeholders sought to understand how the programme was and is designed to work, as well as whether the design was robust and likely to yield the anticipated outcomes and impact.

A further requirement was that the evaluation help the Department understand whether further evaluations are viable, under what conditions, and what programme fundamentals should be in place for them.

Project Deliverables:

To address the above, the evaluation produced:

  • A literature review outlining the need for ICT programming skills in South Africa, similar initiatives internationally and locally, a comparison of the SWD@Schools programme and these similar initiatives, and principles to consider for software training programmes targeted at school-aged learners.
  • A programme theory (consisting of ToC and ToA [logic model]) based on consolidating programme documents, stakeholder engagements and a programme theory workshop.
  • Outcome indicators for the programme.
  • A full evaluation report discussing the robustness of the programme design and areas for improvement, as well as a 1/5/25 summary report.

Our Approach:

A mixed-method design and a formative approach were used to understand how the programme was designed, what design and implementation changes occurred, whether the programme design was robust, what was working, and what could be improved. The secondary data review comprised a desktop review of programme documents and a targeted literature review.

Primary data collection took place through Focus Group Discussions (FGDs) and interviews tailored to five distinct stakeholder groups: the DEDAT Trade Sector Development division and WCED; the DEDAT Skills Development and Innovation division; Oracle SA; implementing partners (OTBC, CapaCiTi and EOH); and teachers. After the document review, literature review and FGDs/interviews, a programme theory (consisting of a revised ToC and a ToA) and outcome indicators were drafted.

This formed the basis for developing and finalising the current ToC and ToA in consultation with SWD@Schools stakeholders through two participatory, interactive workshops, ensuring all stakeholders had a common understanding of the intended programme achievements, the extent to which each activity had to be performed, and the underlying assumptions. The first workshop entailed an interactive review of the draft ToC and ToA; during the second, the ToC and ToA were refined and draft outcome indicators were agreed on. These served as reference points for answering the evaluation questions (EQs).

Data analysis was organised around the EQs. Triangulation across different sources and types of data was used to strengthen confidence in the reliability of findings. Microsoft Excel was used for analysing quantitative data, while key themes and relevant quotes were identified from qualitative data through thematic analysis.

Value:

This evaluation provided the opportunity to derive an articulated programme theory for the current SWD@Schools programme through a highly consultative process with key programme stakeholders (although not all stakeholders could participate fully at all times). It further provided an opportunity for reflection on the programme theory, allowing key design issues to be identified.

While the idea behind the SWD@Schools programme is certainly novel in fast-tracking learners into employment in IT-related positions, the programme theory had room for improvement. DWC made several recommendations related to the programme's contextual strategic positioning, design, monitoring and future evaluations, all geared towards improving the programme and the achievement of its desired outcomes.

Highlights:

The evaluation highlighted the complexity of fast-tracking learners into employment without tertiary education, and the challenges of realising this goal. While the programme had been running for several years, there was no clearly articulated programme theory. The programme theory is therefore a valuable output of the evaluation: it can serve as a live working document and provide a strong basis for amending the programme design and guiding programme planning going forward. A redesign of the programme that takes up the learnings from the evaluation would allow resources to be allocated more efficiently to the activities and processes that lead to the desired outcomes, and would support more robust monitoring systems to track performance and ensure accountability.

Encouraging proper documentation of changes made over time will also support more systematic reflection and learning. The evaluation further highlighted key data issues and the importance of collecting data not only as required by annual performance plans, but also for the purpose of continuous learning and of ensuring that the programme is being implemented as intended and is on track to meet its desired outcomes. Since an outcome evaluation is expected to be undertaken in the near future, the evaluation also provided several suitable options for comparison groups for future evaluations.