
Case Study: Design Evaluation of the SWD@Schools Programme


Another one of our case studies shines a light on the design evaluation of the SWD@Schools Programme, which ran from March to October 2019.

The project was completed on behalf of the Western Cape Government Department of Economic Development and Tourism (DEDAT), together with the Western Cape Education Department (WCED), Oracle SA, 10 Western Cape schools, and post-school training service providers (On the Ball College [OTBC], CapaCiTi and EOH).

The project was funded by DEDAT.

Project Outline:

Recognising the challenges and opportunities in addressing critical skills shortages within the ICT sector, the SWD@Schools programme was conceptualised. The programme took the form of a strategic partnership between Western Cape DEDAT, WCED and Oracle SA.

The four phases of the programme that learners move through are:

1) after-school/extracurricular training in Java for school learners in grades 10-12 from eleven schools in the Western Cape;

2) a three-month post-school programme component where the matriculants are provided with advanced Java training and professional skills training;

3) a six-month internship component within an IT-related position, and

4) a component where learners sit for an Oracle certification exam.

WCG DEDAT commissioned Development Works Changemakers (DWC) to conduct an independent design evaluation of the SWD@Schools programme, looking at its Theory of Change (ToC) and Theory of Action (ToA), assumptions, indicators and data systems. The stakeholders sought to understand how the programme was and is designed to work, and whether the design was robust and likely to yield the anticipated outcomes and impact.

A further requirement of the evaluation was that it needed to help the Department understand whether further evaluations were viable, under what conditions, and what programme fundamentals should be in place for future evaluations.

Project Deliverables:

To achieve the above, the evaluation produced:

  • A literature review outlining the need for ICT programming skills in South Africa, similar initiatives internationally and locally, a comparison of the SWD@Schools programme and these similar initiatives, and principles to consider for software training programmes targeted at school-aged learners.
  • A programme theory (consisting of ToC and ToA [logic model]) based on consolidating programme documents, stakeholder engagements and a programme theory workshop.
  • Outcome indicators for the programme.
  • A full evaluation report discussing the robustness of the programme design and areas for improvement, as well as a 1/5/25 summary report.

Our Approach:

A mixed-method design and a formative approach were used to understand how the programme was designed, what design and implementation changes occurred, whether the programme design was robust, what was working, and what could be improved. Secondary data review included a desktop review of programme documents and a targeted literature review.

Primary data collection took place through Focus Group Discussions (FGDs) and interviews tailored to five distinct stakeholder groups: the DEDAT Trade Sector Development division and WCED; the DEDAT Skills Development and Innovation division; Oracle SA; implementing partners (OTBC, CapaCiTi, and EOH); and teachers. After the document review, literature review and FGDs/interviews, a programme theory (consisting of a revised ToC and a ToA) and outcome indicators were drafted.

This formed the basis for developing and finalising a current ToC and ToA in consultation with SWD@Schools stakeholders through two participatory and interactive workshops, to ensure all stakeholders had a common understanding of intended programme achievements, the extent to which each activity had to be performed, and the underlying assumptions. The first workshop entailed an interactive review of the draft ToC and ToA. During the second, the ToC and ToA were refined and draft outcome indicators were agreed on. These served as reference points for answering the evaluation questions.

Data analysis was organised around the evaluation questions. Data triangulation between various sources and types of data was used to strengthen confidence in the reliability of findings. Microsoft Excel was used to analyse quantitative data, while key themes and relevant quotes were identified from qualitative data through thematic analysis.
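As a purely illustrative aside, the triangulation logic described above can be sketched in a few lines of code: a theme is considered "triangulated" when it surfaces in more than one data source. Everything here (the data, theme names, and the `triangulate` helper) is hypothetical and not part of the actual analysis, which was done in Microsoft Excel and through manual thematic coding.

```python
# Hypothetical sketch of data triangulation across coded qualitative sources.
from collections import defaultdict

# Each record is a (source, theme) pair produced by thematic coding.
# All sources and themes below are invented for illustration.
coded_segments = [
    ("FGD_teachers", "resource constraints"),
    ("FGD_teachers", "learner motivation"),
    ("interview_DEDAT", "resource constraints"),
    ("interview_Oracle", "resource constraints"),
    ("document_review", "learner motivation"),
]

def triangulate(segments, min_sources=2):
    """Return themes supported by at least `min_sources` distinct sources."""
    sources_per_theme = defaultdict(set)
    for source, theme in segments:
        sources_per_theme[theme].add(source)
    return {theme: sorted(sources)
            for theme, sources in sources_per_theme.items()
            if len(sources) >= min_sources}

print(triangulate(coded_segments))
```

Themes that appear in only one source would be flagged for follow-up rather than reported as findings, which is the confidence-building role triangulation played in the evaluation.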


This evaluation provided the opportunity for an articulated programme theory of the current SWD@Schools programme to be derived through a highly consultative process with key stakeholders of the programme (although not all stakeholders could participate fully at all times). The evaluation further provided the opportunity for reflection on the programme theory, allowing key issues of the design to be identified.

While the idea behind the SWD@Schools programme is certainly novel in fast-tracking learners into employment in IT-related positions, the programme theory had room for improvement. DWC made several recommendations related to the contextual strategic positioning of the programme, programme design, monitoring and future evaluations; all geared towards improving the programme and the achievement of its desired outcomes.


The evaluation highlighted the complexity of fast-tracking learners into employment without tertiary education, and the challenges in realising this goal. While the programme had been running for several years, there was no clearly articulated programme theory. The programme theory is therefore a valuable output of the evaluation; it can serve as a live, working document and provide a strong basis for amending the programme design and guiding programme planning going forward. With a redesign of the programme that incorporates the learnings from the evaluation, resources can be allocated more efficiently to the activities and processes that lead to desired outcomes, supported by more robust monitoring systems to track performance and ensure accountability.

Encouraging proper documentation of changes made over time will also enable more systematic reflection and learning. The evaluation further highlighted key data issues, and the importance of collecting data not only as required by annual performance plans, but also for the purpose of continuous learning and ensuring that a programme is being implemented as intended and is on track to meet desired outcomes. The evaluation also identified several suitable options for comparison groups for future evaluations, as an outcome evaluation is expected to be undertaken in the near future.


Case study: TIMS M&E Systems Strengthening Assessment


The Development Works Changemakers’ portfolio is expansive, with several case studies in a variety of development niches. Between January and March 2020, we worked for Wits Health Consortium (WHC) to provide an M&E systems review.

Funded by the Global Fund to Fight AIDS, Tuberculosis and Malaria, we conducted a Monitoring and Evaluation (M&E) systems strengthening assessment focused on TB in the mining sector across Southern Africa.

Find out more below about the project outline, our approach, the deliverables, and the value of the work.

Project Outline

The Global Fund grant “TB in Mining Sector in Southern Africa” (TIMS) is a ten-country regional grant, supporting a programme which aims to help to reduce the burden of Tuberculosis (TB) in the mining sector in Southern Africa. The grant is being implemented in Botswana, Eswatini, Lesotho, Malawi, Mozambique, Namibia, South Africa, Tanzania, Zambia, and Zimbabwe, in selected Districts with a high burden of TB, large mining populations, and/or high levels of artisanal and small-scale mining.

The Wits Health Consortium (WHC) is the Principal Recipient (PR) of the grant, while implementation at the country level is undertaken through two Sub-Recipients (SRs), with each SR managing five countries. These SRs work closely with the National TB Programmes in each of the countries in which they work.

The TIMS programme gathers data from a range of healthcare facilities and occupational health centres in participating countries, specifically data relating to TB prevalence among mining Key Populations (KPs). WHC relies on national health system workers (e.g. nurses, District officials) to collect, capture and report on these indicators to the PR, with the SRs acting as a conduit for this data, and playing a key monitoring, verification and support role.

The PR reports to the Global Fund on the results, which are aggregated across all 10 countries. To help manage this data flow and reporting, WHC established data management systems and processes intended to help it consolidate all programme data for submission as part of its semi-annual reports to the Global Fund. WHC, however, experienced a number of issues with these data management systems and processes, and thus contracted DWC, as M&E experts, to conduct a technical review of its systems, processes and tools and to advise on appropriate measures to overcome these challenges.

Project Deliverables

Project deliverables were:

  • Reviewing the systems, processes and tools at the PR level, SR level, and NTP level to identify areas of weakness and make recommendations to improve the systems. An M&E System Review Report and recommendations were produced, as well as two country mission reports after in-country assessment visits. A customised M&E good practice checklist was also developed and provided.
  • Reviewing the progress update report and supporting documents prior to submission to the Global Fund. A report was produced on the review of all data submitted to the GF.
  • Conducting training of M&E staff at PR and SR levels on data quality assurance aligned to PUDR reporting and GF guidelines. A training workshop was conducted, and a workshop report was produced.


Our Approach:

This was a rapid technical review of the TIMS M&E systems, conducted over a 25-day period in early 2020. The project took place in four phases:

Phase I: Rapid Review of Systems, Processes and Tools at PR Level

The following activities and methods were used in Phase I:

  • An inception meeting was held with the M&E team from the WHC, which led to the development of an inception report and work plan.
  • A document review was then conducted of all relevant documentation on the TIMS M&E systems and tools. In excess of 20 documents were reviewed.
  • An M&E checklist, based on criteria for a good M&E system, was drawn up by the evaluation team. This checklist was used as a guiding reference and standard for reviewing the systems in Phases I and II. It also informed the development of all interview tools.
  • A physical inspection of the M&E system at a PR level was conducted.
  • The initial meetings, document review and M&E checklist were used to generate structured interview/survey tools, which were used for interviews with key staff and other role-players at PR level. These interviews were conducted via Skype/WhatsApp in the week of 20 January 2020 and focussed on key questions around how the PR was experiencing the system, challenges, system gaps and failures, user-friendliness and good practices. In all, six key individuals were interviewed.

Phase II: Rapid Review of Systems, Processes and Tools at SR and NTP Level

Following the review at PR level, DWC undertook two in-country visits to review the systems, processes and tools at SR and National TB Programme (NTP) level. Country visits were conducted in South Africa and Botswana from 27-29 January 2020.

  • DWC undertook a review of key documents at SR and NTP level (e.g. tools, SOPs, quarterly reports) in cases where these were not included in the PR system review. A total of 11 additional documents were reviewed in this phase, including SOPs, and M&E tools and protocols.
  • DWC then undertook a physical inspection of the M&E systems, processes and tools in two of the grant countries. Based on discussions with the PR, South Africa and Botswana were identified, as visits there were most cost-effective and the two SRs are based in these countries. Of the two days spent in each country, one was spent with the SR learning about the systems and issues faced at the management site, and one was planned to be spent in the field learning how data is gathered and what the ground-level dynamics and constraints of data collection are.
  • A structured interview/survey was developed based on the learnings from Phase I and the additional SR- and NTP-related document review. The evaluators conducted the interviews with those involved in the data flow at the country level, including SR staff and available NTP staff. In all, 11 individuals were interviewed during country visits.

Phase III: Rapid Review of Draft Progress Update Report and Supporting Documents

In the third stage of the consultancy, DWC undertook a rapid remote review of the draft progress update report (PUDR) and supporting documents ahead of WHC’s submission to the Global Fund.

This review served to ensure that there were no discrepancies between source documents and data, and the final reported results. Where variances were identified, these were documented in a rapid report and thereafter amended in the PUDR.

Phase IV: One-day Training Session

The fourth phase of the assignment involved the design and delivery of a one-day training session for PR and SR M&E staff.  The training session was informed by the findings of the systems and PUDR reviews, as well as GF M&E protocols.


This technical review of M&E systems was highly valuable to the WHC, as well as all other role players within TIMS. It allowed for the independent verification and documentation of the many challenges faced in gathering, monitoring, verifying, collating and reporting on TB data from key mining populations in 10 Southern African countries. This was not a normal M&E systems review. The TIMS M&E Unit was well trained and had set up reasonably good systems, but the complexity of managing a 10-country data gathering and management system caused challenges.

One of the main challenges related to the sheer complexity of trying to manage a health systems intervention across 10 different countries in one region. In this case, TIMS was trying to get each country to gather disaggregated data on mining key populations and their families within the data they already collected in their TB Registers. Although all part of the SADC region, and using WHO TB tools and processes, each country's health system differs in many respects.

Although each of these 10 countries signed on to the TIMS programme at a high level, the challenge lay in standardising data gathering, monitoring and reporting tools, processes and systems across all of them. At the onset of TIMS, none of the countries was gathering TB data specifically on mining key populations, although they all used the WHO TB Register system, albeit in slightly different formats, or even languages.

The TIMS programme, therefore, worked with each National TB Programme to introduce a sticker system which would allow those being recorded in TB Registers at primary health facilities to also be identified as a miner, ex-miner, family of these two groups, or living in a mining community.

Each month, health staff were supposed to record and report on the details of those marked by the sticker system. These data were then meant to be captured and reported up to measure against certain targets which TIMS had set itself. However, there were many problems with the entire data chain, from the use of the sticker system to the recording of the data at the facility level to the capturing at the District level, and then to the collation of data at country level, and finally, reporting to the GF.
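To make that data chain concrete, the sketch below shows, in hypothetical form, how sticker-tagged facility records might be validated and rolled up to district and country totals before onward reporting. All field names, tags and records are invented for illustration and do not reflect the actual TIMS tools or registers.

```python
# Hypothetical sketch of a TIMS-style data chain: facility records tagged by
# the sticker system are validated, then aggregated to district and country
# level. Every name and value here is illustrative only.
from collections import Counter

facility_records = [
    {"facility": "A", "district": "D1", "kp_tag": "miner"},
    {"facility": "A", "district": "D1", "kp_tag": "ex-miner"},
    {"facility": "B", "district": "D2", "kp_tag": "mining community"},
]

# The four sticker categories described in the case study.
VALID_TAGS = {"miner", "ex-miner", "family", "mining community"}

def aggregate_to_country(records):
    """Validate tags, then roll facility records up to a country-level count."""
    errors = [r for r in records if r["kp_tag"] not in VALID_TAGS]
    valid = [r for r in records if r["kp_tag"] in VALID_TAGS]
    district_totals = Counter(r["district"] for r in valid)
    country_total = sum(district_totals.values())
    # Consistency check: the country total must equal valid records counted.
    assert country_total == len(records) - len(errors)
    return district_totals, country_total, errors

totals, country, errs = aggregate_to_country(facility_records)
```

Even a simple roll-up like this shows why the chain is fragile: an unrecognised tag, a late district submission, or a re-keyed spreadsheet at any step propagates errors all the way up to the figures reported to the GF.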

Problems included the fact that the TIMS M&E Unit was understaffed: there was no way that two full-time staff could build relationships, train staff, verify data and ensure capturing was accurate in all 10 countries. Their capacity was highly stretched. The NTPs also reported their statistics late, which left the M&E Unit little time to check accuracy and enter the data correctly into the reporting tools for the GF. The use of MS Excel spreadsheets also caused many problems with data accuracy, which were only later resolved when a new Management Information System was introduced to improve on this system and iron out the mistakes.

DWC was able to explore the causes of these challenges and complexities in detail, which assisted not only in documenting the TIMS experience but also in the identification of a number of recommendations for how the systems could be improved. The PUDR review also identified a number of discrepancies and mistakes in the data, which WHC could then correct in time to submit to the GF. Finally, the M&E Systems training was highly useful for PR and SR staff in getting them to understand how to address the challenges they faced and improve their systems.

To stay up to date on all of our projects and industry news, you can follow us on Facebook, Twitter and LinkedIn.

Case study: Assessment of after school programming at low and no-fee schools in the Western Cape


From January to March 2019, we worked with the Western Cape Government's Department of the Premier and its After School Game Changer to conduct a research assessment of after school programmes at schools. The education/youth development project was funded by the Western Cape Government.


Project Outline

The After School Game Changer programme was one of seven interventions or “game changers” identified by the Western Cape Government that were seen as most likely to improve opportunities and address some of the greatest challenges facing the citizens of the Western Cape. The Game Changer programmes were: skills development; energy security; high-speed broadband;  eLearning; after-school activities; better living models; and alcohol harm reduction.

They were implemented by the Department of the Premier between 2015 and 2019 and have since been absorbed into their respective line departments.

The After School Game Changer was implemented in some of the 1059 no- and low-fee schools in the Western Cape province. This programme has since been incorporated into the Department of Cultural Affairs and Sport as the After School Programme Office (ASPO).

The After School Game Changer worked to increase the participation of learners from no- and low-fee schools in after school activities, ensuring regular attendance by significantly improving the attractiveness and quality of such programmes for learners. By 2019 the target was to get 112,000 learners participating regularly in quality programmes – 20% of learners in no- and low-fee schools. This was a joint Game Changer in partnership with local government, provincial government departments, and a number of NGOs. The Mass participation; Opportunity and access; Development and growth (MOD) Programme was the flagship after school programme run by the Department of Cultural Affairs and Sport (DCAS).

Between 2015 and 2018, baseline assessments were undertaken of all 181 MOD Centres and 93 After School Partial Care sites. These assessments were aimed at determining the scope of after school programmes offered at each school, run by the school and its staff, as well as by external organisations. In 2019, the After School Game Changer commissioned another assessment with the aim of measuring the progress in a sample of Western Cape schools with regard to After School Programmes (ASPs) against the baseline assessments conducted in 2018.

The main assessment question was "What is the status of facilities/centres that provide school after-care services as part of the After School Game Changer programme compared to the situation at the time of the baseline assessment?" Two further key questions were considered: "To what extent are the schools developing and sustaining a culture of after school programming?" and "What does an after school programme at a no- or low-fee school ideally look like?"

Development Works Changemakers (DWC) was contracted to conduct this assessment in January 2019.

Project Deliverables

Project deliverables were:

  • Assessment visits to 112 low- or no-fee schools in eight districts around the Western Cape.
  • The production of 112 school reports covering a large range of issues relating to each school’s after school offering.
  • A comprehensive summary report outlining the methods used and the key trends from the data coming out of all 112 assessments.
  • A PowerPoint presentation on the results.


Our Approach

DWC assembled a team of experienced researchers who were specifically trained over a two-day period by the lead DWC researcher in all aspects of the fieldwork and data gathering protocols, tools and processes. This team of 12 individuals traversed the province, conducting site visits at each of the 112 schools.

All schools were informed of the research by the Western Cape Education Department (WCED) ahead of the visit, but the actual visits were unannounced so that researchers could observe each school's natural after school environment rather than a pre-arranged display of activities. Researchers spent up to five hours at each school, interviewing key individuals and observing ASPs.

Detailed interviews were conducted with at least three key individuals at each school with insight into the ASP. These interviews used a set of carefully designed questionnaires, developed in Survey Monkey and loaded onto tablet devices. Surveys were tailored to different interviewees, such as principals, ASP coordinators, teachers and NGO partners. Researchers conducted each interview on a tablet and uploaded the completed survey to the cloud as soon as the interview was over, using mobile data loaded on each device. Quality checking could thus happen very quickly, with the lead researcher monitoring the uploaded surveys as they came in and providing feedback to the researchers.
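The rapid quality checking described above can be illustrated with a short sketch: as uploads arrive, incomplete records are flagged for follow-up with the field researcher. The field names and records below are hypothetical; the actual checks were performed by reviewing Survey Monkey responses, not by this code.

```python
# Illustrative sketch (not the actual system): flagging incoming survey
# uploads that are missing required fields. All names are hypothetical.

REQUIRED_FIELDS = ["school_id", "respondent_role", "asp_activities"]

def quality_check(surveys):
    """Return a list of (index, missing_fields) for incomplete uploads."""
    issues = []
    for i, record in enumerate(surveys):
        missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
        if missing:
            issues.append((i, missing))
    return issues

uploads = [
    {"school_id": "WC-001", "respondent_role": "Principal",
     "asp_activities": "sport"},
    {"school_id": "WC-002", "respondent_role": "",
     "asp_activities": "drama"},
]
print(quality_check(uploads))  # flags the second record
```

Checking uploads as they arrive, rather than at the end of fieldwork, meant gaps could be corrected while the researcher was still at or near the school.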

In addition to the interviews, where possible, at least two after school activities were observed during visits. A detailed observation form was also loaded on the tablet device used by the researchers. These forms were completed during observations and also uploaded to the cloud for checking and analysis.

The project was conducted under considerable time pressure. School visits were conducted concurrently by the 12 researchers, and as the data came in, another team of four researchers analysed it and wrote up each school report, following a detailed standard report structure. These reports were then edited and streamlined by the lead researcher and another member of the research team.

The lead researcher then analysed the overall data from the 112 Principal/coordinator surveys, as well as data relating to each school’s ASP contained in the school reports. This analysis informed the overall summary report.


This assessment was of high value to the After School Game Changer and its partners. It provided a comprehensive picture of the ASPs running at low- and no-fee schools in the Western Cape and showed what an after school programme at such a school can look like. The assessment produced 112 high-quality and comprehensive reports on schools across the province, which show the range of activities and how schools have overcome difficulties to run ASPs.

The reports also allowed a broader analysis of what is happening in each district, based on the sample included in the study. It showed trends, such as which districts had higher numbers of good ASPs, where schools focussed their ASP offerings, and common challenges and gaps. The assessment showed that no-fee schools can offer ASPs, but it also showed the conditions under which these can thrive, and what is needed for success in this endeavour.

The assessment is also a resource for the ASPO as it seeks to engage schools and the WCED in a conversation about what can be done to improve after school programming at schools, and what support the WCED can provide in this regard.

Read more

We have been documenting our recent case studies on our blog. Browse our case studies to learn more about our involvement in the research, evaluation and development space.

Case study: Assessment of impact of online courses on digital finance services practitioners


From October 2018 to March 2019, Development Works Changemakers completed an assessment of the impact of Digital Frontiers Institute (DFI) online courses on practitioners in the Digital Financial Services (DFS) sector.

The project was funded by FSD Africa, Bill & Melinda Gates Foundation, and Omidyar Network, covering a geographic scope of Mozambique, Malawi, Zambia, Rwanda, and Uganda.


Project Outline

DFI aims to develop the capacity and enhance the professional development of DFS professionals working in the private, public, or development sectors. By closing the DFS capacity gaps currently experienced in developing markets, the organisation, in the long term, aims to accelerate financial inclusion.

To achieve this, DFI provides online training and education courses, consisting of seminars delivered through a built-for-purpose online campus. These primarily focus on foundational DFS knowledge and skills, but also include areas of leadership development and change management.

Additionally, DFI facilitates a network of professionals, or communities of practice (CoP) which include in-country face-to-face meetings of DFI students and DFI-affiliated professionals, as well as the moderation of online meetings through DFI’s built-for-purpose digital network and series of global seminars. DFI’s first full year of courses was in 2016.

In 2017, DFI undertook focus group research to understand the impact that one of its foundational training courses, the Certificate in Digital Money (CIDM), was having on alumni and their organisations. Data was collected in Zambia, Rwanda and Uganda. In 2018, Development Works Changemakers (DWC) was commissioned to undertake follow-up data collection.

Unlike the 2017 data collection, DWC’s assessment considered the impact of all DFI’s online training courses and specifically focused on the extent to which DFI funders’ M&E indicators were being achieved.

The primary purpose of this assessment was to:

1) assess the extent to which DFI funders’ M&E indicators were being met; and

2) assess the impact DFI training courses have had on participants, their organisations and the industry to date.

Primary data was collected through in-country visits to five DFI markets in Sub-Saharan Africa (SSA), namely Mozambique (Maputo), Malawi (Blantyre), Zambia (Lusaka), Rwanda (Kigali), and Uganda (Kampala).

Project Deliverables

  • Development of primary data collection tools, including a survey for practitioners, and interview and focus group discussion (FGD) guides for practitioners, CoP facilitators, line managers and HR managers, and institution representatives.
  • Final reports including 1) an overall executive summary; 2) an overall introduction, method, a summary of secondary survey data (collected by DFI at six and 18-month follow-up) and recommendations; and 3) individual country reports for the five aforementioned countries, reporting the achievement of indicators and the perceived impact of the courses.
  • An infographic per country depicting the number of students trained, number of trainings attended, number of participants in this study, and findings per indicator and in terms of overall impact.

Our Approach

The assessment responded to a select list of key indicators of interest. The extent to which these indicators were being achieved in the five markets was explored by a combination of cross-cutting data sources and data collection instruments.

The combination of data collection sources and tools aided methodological and data triangulation, which further allowed for the verification of data and a more textured, comprehensive account of DFI’s impact. A mixed-method approach was utilised. This incorporated both qualitative and quantitative data collection and analysis methods that were inclusive and complementary.

This approach allowed for data gathering in multiple ways and the team was able to elicit a variety of perspectives on DFI courses and impact.

Secondary data was collected via

1) a document review of previous DFI reports; and

2) analysis of programme monitoring data, namely the six-month and 18-month ex-post surveys, which are administered to CIDM graduates only.

Primary data was collected from evaluation participants in four target groups, namely

1) practitioners (individuals who had completed DFI training courses or were in the process of completing them);

2) line managers and/or HR managers (individuals who oversee/manage the practitioners or are involved in recruitment and/or development within their companies);

3) CoP facilitators (individuals who facilitate the in-country CoP meetings); and

4) institution representatives (individuals who work for key institutions within the DFS sector and could provide broad insight into the DFS market/sector within their country).

Data was collected from these participants using online surveys for practitioners and for line managers/HR managers respectively, an FGD/interview guide for practitioners, and an interview guide for line managers/HR managers, CoP facilitators and institution representatives. Surveys were administered online via Survey Monkey (and also administered in person in-country to gather more responses), and participants were incentivised to participate with the offer of discounted DFI courses.

FGDs and interviews were conducted primarily face-to-face in each of the country capitals. The practitioners were invited via email to a CoP meeting, where DWC team members conducted the FGDs. Practitioners in FGDs and in surveys provided the contact details of their line managers and/or HR managers, whom DWC then followed up with for interviews. CoP facilitators made themselves readily available and also assisted in arranging interviews with individuals in major institutions, including government ministries, banks, and interbanks.

The evaluation team analysed both primary and secondary data that was collected using ATLAS.ti for thematic analysis of the qualitative data and Microsoft Excel for descriptive statistical analysis of quantitative data.


The assessment provided a valuable opportunity for DFI to take stock of its achievements since 2016. It gave DFI insight into the extent to which its set indicators were being met, as well as where gaps existed, and which areas for improvement or best practices could be expanded upon in different countries.

It also provided insight into the value of the courses and the impact they may be having on individuals and their companies. Based on the findings, several recommendations were made that, if implemented, could improve the DFI courses going forward.

Recommendations focused on improving CoP attendance and morale, amendments to and additional DFI courses, identifying local partnerships to reduce costs and increase reach, course support, and monitoring and evaluation (M&E). M&E recommendations noted the challenges experienced with data collection and made suggestions for follow-up data collection in 2019 and 2020.

DWC also suggested that the report be used as a tool for learning, not only for DFI’s internal planning but also for each country’s alumni and CoPs. Specifically, the initiatives being developed in different countries and their achievements should be shared with alumni and CoP facilitators, who may be able to learn how to implement such initiatives themselves. This led to the development of an infographic per country, showing how each country fared against the indicators and what overall impact was reported.

The infographics can serve as marketing tools, showing how participation in a course can add value to one’s personal and career development, and as a learning tool for other countries, especially in terms of launching their own formalised CoPs.

We’d love to work with you

We hope that by showcasing our case studies we have shared insight into our areas of expertise. If you have any questions about the research, evaluation, monitoring and development industry, feel free to contact us.

Case Study: IREX – Evaluation of Mandela Washington Fellowship Programme (YALI)

By Case Study, Evaluation

With years of experience behind us, we’ve been showcasing some of our case studies in various niches of the development, evaluation and research space.

From December 2018 to June 2019, we provided a final impact evaluation for the International Research and Exchanges Board (IREX). The project was funded by USAID and the geographic scope covered 49 countries in sub-Saharan Africa.

The final impact evaluation examined the USAID-funded, Africa-based follow-on activities of the Mandela Washington Fellowship (MWF) Program, which focuses on leadership development.

Project Outline

The Young African Leaders Initiative (YALI) was launched in 2010 by President Barack Obama as a signature effort to invest in the next generation of African leaders. The Fellowship commenced in 2014 as YALI’s flagship program. It is aimed at empowering young African leaders (between the ages of 25 and 35), building their skills to improve the accountability and transparency of government, start and grow businesses, and serve their communities, through academic coursework, leadership training and networking.

The Fellowship is implemented by the international non-profit organisation IREX as a cohort-based program, with six annual cohorts, one for each calendar year from 2014 to 2019[1]. The program consists of attending a US-based leadership institute and the Mandela Washington Fellowship Summit. Some Fellows also have the opportunity to participate in a professional development experience in the U.S.

The United States-based activities are funded by the U.S. Department of State and managed separately from the Africa-based activities, which are funded by the United States Agency for International Development (USAID). This evaluation focused on the Africa-based, USAID-funded component only.

Steps in the programme

During their stay in the US, each Fellow is expected to put together a Leadership Development Program (LDP), which they finalise on completing their Leadership Institute and share online for comment and peer review. The LDPs form part of the USAID-funded component of the program and were, over time, voluntarily adopted by the US-based institutes. LDPs are distributed at pre-departure orientations to connect the US-based and Africa-based parts of the programme, and to guide Fellows in implementing their US-based learning when they return to their home countries.

Image: Ghana drone shot

Upon returning to their home countries, Fellows continue to build the skills they have developed during their time in the United States through support from US embassies, the YALI Network, USAID, the Department of State, and affiliated partners[2]. Through these experiences, Mandela Washington Fellows are able to access ongoing professional development and networking opportunities, as well as support for their ideas, businesses, and organizations. Fellows may also apply for their American partners to travel to Africa to continue project-based collaboration through the Reciprocal Exchange Component.

The Africa-based activities are designed to support Fellows as they develop the leadership skills, knowledge, and attitudes necessary to become active and constructive members of society. They may also choose to participate in a number of USAID-supported follow-on activities, including professional practicums, mentorships, Regional and Continental Conferences and conventions, Regional Advisory Boards (RABs), Speaker Travel Grants (STGs), Continued Networking and Learning (CNL) events, and Collaboration Fund Grants (CFGs).

To assist with the implementation of these Africa-based follow-on activities, IREX has collaborated with three regional partners in Southern Africa (The Trust), East Africa (VSO Kenya), and West Africa (WACSI).

Project Deliverables

The purpose of this final impact evaluation of the USAID-funded, Africa-based follow-on activities of the Mandela Washington Fellowship (MWF) program was to determine and portray the emerging results of the program and to inform current and future youth leadership programming.

The deliverable was an impact evaluation report that answered the following main evaluation questions:

  1. What is the impact of follow-on activities on male and female Fellows’ skills, knowledge, and attitudes necessary to become active and constructive members of society, compared to those men and women who did not participate in the follow-on activities?
  2. How has the program impacted practices of male and female Fellows in supporting democratic governance through improving the accountability and transparency of government in Africa?
  3. Has the program helped male and female Fellows to start new businesses? To what extent has participation in the program helped Fellow-led businesses expand and become more productive?
  4. How has the program impacted male/female Fellows’ identification with, and participation in, community challenges/social responsibility?
  5. To what extent is the network for Mandela Washington Fellowship male and female alumni who collaborate on issues of democratic governance, economic productivity and civic engagement a self-sustaining network? How have USAID-funded follow-on activities contributed to this?

In addition, cross-cutting themes that had to be considered included: empowerment of women and other marginalised youth, including the disabled and LGBTQI, to address inequalities and development challenges; increase of youth participation overall, with an emphasis on how these empowered youth can contribute to their countries’ development; and the establishment of significant partnerships with the private sector to leverage resources, increase impact, and enhance sustainability of planned activities.

Image: Kenya nightscape

Our Approach

The evaluation adopted a mixed-method approach, gathering both quantitative and qualitative data from a large sample of Fellows who had, and who had not, participated in Africa-based follow-on activities. Quantitative data was gathered through an online survey of 1292 Fellows, 35 percent of the total Fellow population. Qualitative data was gathered through one-on-one interviews, conducted either face-to-face or via Skype, with Fellows, program staff and partners, and through focus group discussions with Fellows during country visits to six African countries.
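As a quick sanity check on the sampling figures above, a 1292-Fellow survey said to represent 35 percent of the total implies a Fellow population of roughly 3,700. A minimal sketch (the rounding is our own; the report does not state the exact population):

```python
# Back-of-the-envelope check of the survey coverage figures quoted above.
surveyed = 1292   # Fellows who completed the online survey
coverage = 0.35   # stated share of the total Fellow population

implied_population = surveyed / coverage
print(round(implied_population))  # prints 3691, i.e. roughly 3,700 Fellows
```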

In this way, a wide range of stakeholders was included in the evaluation. Quantitative and qualitative data were cleaned, transcribed, analysed and incorporated into the findings of the evaluation. Both quantitative and qualitative data was also gathered from secondary sources, including literature on leadership in Africa, and a range of sources provided by IREX on the Africa-based follow-on activities. In addition to the main report, five case studies were produced, highlighting specific programme outcomes.


The value of this evaluation was two-fold. It showed that the aims and methods of the Mandela Washington Fellowship, including the Africa-based follow-on activities, are highly relevant and in line with literature and best practice on youth leadership development in Africa. It also contributed to the body of knowledge on (youth) leadership development programmes, specifically those based on the ethos of values-based servant and transformational leadership.

The evaluation showed that the MWF is highly relevant in fostering individual, group and community values within young people, so that they can become true leaders in their own sectors and communities. It also showed how these young people solidify their leadership roles within their careers and sectors at a crucial time in their progression, thereby becoming more respected and influential in their workplaces and communities, and more active in society.

The Africa-based follow-on activities enabled Fellows to solidify the knowledge and skills gained in the US, to ground and root the US-based learning, and helped Fellows put their new knowledge into practice. The program has significantly strengthened many of the values that the Social Change Model (SCM) of leadership focuses on, especially consciousness of self, congruence, commitment, collaboration, common purpose and citizenship (not only of Fellows’ home countries but also of Africa in general).

The evaluation showed that amongst other gains, experiential learning through participation in follow-on activities promoted innovative thinking, facilitated shifts in attitudes towards gender roles, rights and sexuality, and motivated Fellows to engage in social entrepreneurship.

Client testimonial

“Development Works Changemakers was selected from a competitive pool of applicants. One of the aspects of their proposal which stood out was their focus on applying an inclusivity lens to their approach. As well as their demonstrated understanding of the leadership field. And, more specifically, their knowledge and experience with leadership in the African context, in addition to the participatory methodology proposed. Once the evaluation got underway, the timeline proposed was adhered to, despite some difficulties with timely responses from key informants.

Development Works Changemakers sifted through an enormous amount of program document data. They collected and analyzed information from program participants and stakeholders. And worked collaboratively with us to surface the most useful data points and findings to highlight program impact and challenges. Their research was insightful and grounded. Where possible with relevant outside data sources that triangulated findings or demonstrated the nuances they found were unique to our circumstances.

The finished report highlighted the most important findings for our research questions. It provided as much detail as could be extrapolated from the data available. Particularly blending the quantitative and qualitative data findings into a cohesive narrative. Development Works Changemakers were professional, insightful, thorough, and responsive to feedback.  I would highly recommend them for a range of evaluation and assessment work.”

– Cheryl Schoenberg, Deputy Director, Leadership Practice, IREX, and former Chief of Party for the Mandela Washington Fellowship

Development Works Changemakers Evaluation

Over the next couple of months, we’ll be showcasing more of our case studies and highlighting the various methods of our approach.

To stay up to date with industry news and happenings, you can sign up to our newsletter and follow us on social media.




[1] This evaluation excludes the 2019 cohort.

[2] YALI has also established four Regional Leadership Centres (Ghana, Senegal, South Africa and Kenya), and a number of satellite centres, to offer leadership training programs to young leaders between the ages of 18 and 35. The four RLCs are based at higher-education institutions in their host countries.

Case Study: UNODC – Baseline, endline and impact evaluation of the LULU programme

By Case Study, Evaluation

At Development Works Changemakers, our passion for change can be seen in our many case studies. The Baseline, Endline and Impact Assessment of the Line Up Live Up (LULU) Programme in South Africa began in May 2019 and was recently completed in January 2020.

The client, the United Nations Office on Drugs and Crime (UNODC), worked with the Western Cape Government Department of Cultural Affairs and Sports (DCAS) to provide a baseline, endline and impact assessment focusing on the areas of Khayelitsha and Mitchells Plain in the Western Cape.

Image: Children in a classroom

Project Outline

The Line Up Live Up (LULU) programme is a sport-based life skills training curriculum developed to improve youths’ knowledge/awareness, perceptions, attitudes, life skills and behaviours, building resilience to violence, crime and drug use. The programme is designed to be delivered over 10 sessions to male and female youth between the ages of 13 and 18.

Each session includes interactive sports-based activities, interspersed with reflective debriefing spaces in which life skills are imparted. These sessions are envisaged to lead to various outcomes, which in the long-term include youth engaging less in risk and antisocial behaviours and demonstrating resilient behaviour.

The LULU programme is being piloted in Brazil, Colombia, the Dominican Republic, Kyrgyzstan, Lebanon, Peru, Palestine, Tajikistan, Uganda, Uzbekistan and in 2019, it was piloted in South Africa. In South Africa, the programme is run in cooperation with the Western Cape DCAS as part of its flagship MOD afterschool programme.

In 2019, DWC was commissioned to conduct a baseline, endline and impact assessment of the LULU programme in nine schools in Khayelitsha and Mitchells Plain, two high-crime areas in the Western Cape, South Africa. The purpose was to assess only the short-term outcomes (knowledge and perceptions) and selected medium-term outcomes (attitudes and behaviours) of the LULU programme. The findings of this study are intended to be used for cross-country comparisons, and for informing programme improvements.

Project Deliverables

As part of the assessment, DWC produced:

  • Adjusted data collection tools, provided by UNODC and adapted to the South African context and made more youth-friendly; these included a baseline/endline survey for youth, a self-reporting survey for youth participating in LULU, and focus group discussion (FGD) guides for youth, coaches, area managers and DCAS management;
  • A literature review focused on the context of crime in South Africa and the Western Cape province specifically; the profiles of Khayelitsha and Mitchells Plain; policy and other approaches to tackling crime in South Africa; and international and local examples of the sports-based life skills approach;
  • A baseline report outlining participating learners’ profiles (including demographics and experiences of family/home life, school and community) and the outcomes of interest prior to launching the LULU programme in schools;
  • An endline report comparing baseline data to endline data to assess changes in the outcomes of interest following the completion of the LULU programme; and
  • An impact report, building on the endline report by additionally discussing lessons learned and recommendations.

DWC also produced an executive summary and a summary report of the final impact report.

Our Approach

The assessment followed a mixed-method approach, which combined qualitative and quantitative data analysis in order to bring a robust and credible set of findings to the report.

A non-equivalent, multiple-group time-series design was employed, whereby data was collected at baseline, before the programme commenced, and at endline, once the programme concluded. Data was collected from learners at nine schools across Khayelitsha and Mitchells Plain, comprising:

  • Learners who participated in the LULU programme (treatment group)
  • Learners involved in the afterschool MOD programme (control group I); and
  • Learners who do not participate in any afterschool activities (control group II).

While the initial design of this evaluation assumed all LULU learners would have attended all 10 sessions, this was not the case. Due to the high proportion of learners who did not attend all sessions, all learners were still included, but additional analyses were incorporated to assess the extent to which attendance at 1-6 vs 7-10 sessions influenced outcome indicators.
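The attendance analysis described above can be sketched as a simple grouping of learners into dosage bands, followed by a comparison of outcomes per band. The records, field names and scores below are hypothetical, purely to illustrate the approach (the study's actual outcome indicators and data structures are not specified here):

```python
# Illustrative only: hypothetical learner records (sessions attended, outcome score).
learners = [
    {"sessions_attended": 3, "score": 54},
    {"sessions_attended": 5, "score": 58},
    {"sessions_attended": 8, "score": 66},
    {"sessions_attended": 10, "score": 71},
]

def band(sessions):
    """Assign a learner to the low- or high-dosage band used in the analysis."""
    return "1-6 sessions" if sessions <= 6 else "7-10 sessions"

# Group outcome scores by attendance band.
bands = {}
for learner in learners:
    bands.setdefault(band(learner["sessions_attended"]), []).append(learner["score"])

# Compare the mean outcome score per band.
for label, scores in sorted(bands.items()):
    print(label, sum(scores) / len(scores))
```

A real analysis would, of course, also control for baseline differences between the bands rather than compare raw means.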

Secondary data was also collected through a literature and programme document review. Primary data was collected using surveys and focus group discussions (FGDs) provided by UNODC, which DWC adapted to be more child-friendly, use colloquial language, ensure that all outcomes were adequately measured (by adding questions), and shorten the surveys to keep learners engaged. In terms of primary data collection:

  • Baseline survey data was collected from 724 learners (313 LULU learners; 204 MOD learners; and 207 non-intervention learners);
  • Follow-up endline survey data was collected from 658 learners (262 LULU learners; 195 MOD learners; and 201 non-intervention learners);
  • Endline self-administered survey data was collected from 210 LULU learners; and
  • FGDs were conducted with a) 8-10 learners at each of five schools; b) 16 coaches from all nine schools; c) four Area Managers covering the two Metros; and d) two DCAS programme management staff.
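The baseline and endline counts above imply an overall survey retention rate of about 91 percent (658 of 724 learners re-surveyed). A minimal check of that arithmetic, with the per-group counts taken from the bullets above:

```python
# Survey retention between baseline and endline, per group and overall.
baseline = {"LULU": 313, "MOD": 204, "non-intervention": 207}
endline = {"LULU": 262, "MOD": 195, "non-intervention": 201}

for group in baseline:
    retention = endline[group] / baseline[group]
    print(f"{group}: {retention:.0%} retained")

overall = sum(endline.values()) / sum(baseline.values())
print(f"overall: {overall:.0%} retained")  # prints: overall: 91% retained
```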

Ethical approval from a research ethics committee was granted for this evaluation. The programme and the study itself were constrained by a highly limited timeline, which impacted the implementation of the programme: the study period and school timelines forced the programme to be implemented within five weeks rather than the intended 10, which limited the programme’s dosage and duration.

The study period also did not allow sufficient time for LULU participants’ learnings to be fully absorbed and advanced. There were also issues with programme fidelity, and most learners did not attend all 10 LULU sessions as required. These issues made outcomes, especially the more medium-term outcomes of attitude and behaviour change, difficult to achieve. These challenges were highlighted for consideration when the findings of the study were interpreted.

Data from primary and secondary data collection was analysed using ATLAS.ti for thematic analysis of the qualitative data, and Microsoft Excel and IBM SPSS for descriptive and inferential statistical analysis of the quantitative data.
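The inferential side of such an analysis typically compares treatment and control groups on an outcome measure. As an illustration of the kind of test involved (the report does not specify which tests were run, and the scores below are invented), a two-sample Welch's t-statistic can be computed with only the standard library:

```python
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    standard_error = (var_a / len(sample_a) + var_b / len(sample_b)) ** 0.5
    return (mean_a - mean_b) / standard_error

# Hypothetical outcome scores, for illustration only.
treatment = [62, 68, 71, 74, 66, 70]
control = [58, 61, 64, 60, 63, 62]

t = welch_t(treatment, control)
print(f"t = {t:.2f}")  # prints: t = 3.73; a large |t| suggests a real group difference
```

In practice the t-statistic would be converted to a p-value (SPSS reports this directly); the sketch only shows the shape of the comparison.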


The evaluation produced valuable information, including significant lessons learned and recommendations that may help inform the improvement of the LULU programme in South Africa going forward. Lessons and recommendations included the need for key stakeholder buy-in; longer and more intensive coach training; support to coaches and area managers; and psychosocial support for both learners and coaches.

Further, those short-term outcomes that were achieved can provide evidence to potentially support funding and buy-in for the ongoing implementation of the programme in the future. Finally, the data can be used for comparison with the other programme implementation pilot countries, and lessons learned from this assessment can guide programme implementation and the study thereof in these other countries going forward.

Overall, given the difficulties faced, the programme and its implementers/managers should also be commended on the outcomes realised; what was achieved suggests that had the programme been implemented as intended (in terms of dosage, duration and fidelity) and under the right conditions (with full attendance by all learners and enough time for change to manifest within the study period), further outcomes could have been achieved.

Development Works Changemakers Evaluation

Over the next couple of months, we’ll be showcasing more of our case studies and highlighting the various methods of our approach.

To stay up to date with industry news and happenings, you can sign up to our newsletter and follow us on social media.




Case study: Design and Implementation Evaluation of the Cash Plus Care Programme

By Case Study, Evaluation

At Development Works Changemakers (DWC) we have a passion for social change, as seen in our many case studies and successful evaluations. In April 2019 we started working with the Western Cape Department of Health (WCG: Health) and the Desmond Tutu HIV Foundation (DTHF) to provide a design and implementation evaluation in the health and social development sector. The project continued until August 2019 and was funded by the Global Fund to Fight AIDS, Tuberculosis and Malaria (GF).

Project Outline

The Cash plus Care programme is a component of the Western Cape Department of Health’s (WCG Health) Young Women and Girls Programme (YW&G), funded by the Global Fund (GF). This programme was implemented in two neighbouring sub-districts in Cape Town, Mitchells Plain and Klipfontein.

The aims of the YW&G programme were to: decrease new HIV infections in girls and young women (aged 15-24); decrease teenage pregnancies; keep girls in school until completion of grade 12, and increase economic opportunities for young people through empowerment. The objectives of the intervention were to: enhance the sexual and reproductive health of participants through preventative and promotive health and psychosocial interventions whilst enhancing their meaningful transition to adulthood; and to reduce HIV and STI incidence, unintended pregnancy and gender-based violence amongst Cape Town women in late adolescence.

The programme, also called “Women of Worth” (WoW), provided 12 weekly empowerment sessions and access to youth-friendly healthcare, seeking to address a range of biomedical, socio-behavioural, structural and economic vulnerabilities amongst participating young women. Cash incentives of R300 per session were provided to the young women for participation in empowerment sessions. Implementation of the programme was sub-contracted to the Desmond Tutu HIV Foundation (DTHF) by the Western Cape Department of Health in November 2016, and implementation of the programme began in April 2017.

DWC was contracted by the WCG: Health to conduct an evaluation to assess the Cash plus Care programme’s design and implementation. Specifically, the evaluation focussed on the appropriateness of the programme design, incentives, and recruitment processes; beneficiary and stakeholder satisfaction; and the quality of implementation. The evaluation also identified successes and challenges/barriers and made recommendations to the Global Fund and WCG: Health to inform the design and implementation of future programmes.


Image: Pentecostal Upper Hall Church

Project Deliverables

Project deliverables were:

  • A theory of change workshop with all key stakeholders from WCG: Health and the DTHF
  • A theory of change diagram showing all key contextual factors, assumptions, inputs, outputs and outcomes
  • A draft evaluation report
  • A final evaluation report
  • A final report in 1:5:25 format

Our Approach

This evaluation was formative and clarificatory, aimed primarily at learning and improvement, but it included some elements of early impact assessment (particularly unintended consequences) and summative aspects that informed decision-making on the future of the programme and similar programmes.

The evaluation adopted a mixed-methods evaluation design, which used mostly qualitative data. Existing quantitative data was drawn on where appropriate from the various programme and other documents reviewed.

Both primary and secondary qualitative data were gathered and analysed to answer the evaluation questions. This evaluation, which was essentially a rapid assessment, relied on the design strength produced by a mixed-methods approach which allows for triangulation of method, observers and measures. Given that the programme experienced several changes in its initial design, the developmental approach followed emphasised learning and improvement.

Secondary data was obtained from project documents, while primary data was obtained from the following sources:

  • A Theory of Change workshop with a large range of stakeholders;
  • Key Informant Interviews with WCG Health and DTHF staff (16 interviews);
  • One day site visits to five of the 11 operating empowerment session venues;
  • During the site visit, interviews with the site facilitator, at least one FGD with current participants at the site, and one-on-one interviews with any other participants, where targeted. Empowerment sessions were also observed and key aspects were noted on an observation tool;
  • Telephonic interviews with previous graduates who were “cash-no”;
  • Telephonic interviews with programme dropouts, both “cash-yes” and “cash-no”.

In all, interviews and FGDs involving 73 beneficiaries were conducted.


Image: Philippi Village


The value of this design and implementation evaluation was, firstly, that it provided WCG: Health and the DTHF with an independent examination of the complexities of implementing the Cash Plus Care programme in the Western Cape. A number of challenges had been experienced by the programme implementers since its inception. Many of these related to various delays in contracting the sub-recipient (DTHF) and the sub-recipient’s subsequent rush to catch up on various aspects of the programme. Another key challenge was that this programme was both an intervention (with the above-stated aims) and an academic randomised controlled trial, designed to inform the GF of the efficacy of using cash incentives to change risky behaviour.

Many of the design and implementation challenges had to do with trying to align the needs of a randomised controlled trial with the realities of implementing an empowerment programme with young women living in a complex socio-economic setting. The fact that half of the targeted participants were randomly allocated to the “cash-yes” group, while half were allocated to the “cash-no” control group, caused numerous problems.

For a start, those in the control group quickly learnt that they were not receiving cash, and many then dropped out. Recruitment of young women for the study was also not effective at the beginning of the programme, which meant it struggled to reach its targeted numbers in time. Only once cash incentives were made available to all participants and a proper community mobilisation team was put in place did the numbers pick up, enabling the programme to reach its goal. This evaluation brought many of these issues to light and made recommendations for how to mitigate them in future programmes of this kind.

The delayed commencement of the programme, and the subsequent rush, also meant that the biometric system used to manage participation and incentives was not properly ready. Many glitches were experienced, which had to be corrected and mitigated along the way. This evaluation helped to document these problems and the ways in which they were solved.

The evaluation also brought to light the experiences of participants and the value they felt the programme had brought to their lives. It showed some emerging elements of empowerment and behaviour change, as well as new forms of social cohesion forming between attendees. The many recommendations made, based on these findings, were of great value to the programme implementers and funders, who received the evaluation very positively.

Image: Tell Them All International

Development Works Changemakers Evaluation

Over the next couple of months, we’ll be showcasing more of our case studies and highlighting the various methods of our approach.

To stay up to date with industry news and happenings, you can sign up to our newsletter and follow us on social media.