This blog is based on Save the Children's SLEIC learning report which was authored by Fergal Turner, Alpha Jalloh, Lydia Kamara, Meera Rajasooriar, Richard Germond and Magdalene Abu, and which is linked below. This is the fourth in a series of blogs written by Save the Children on SLEIC. If you aren't familiar with the programme, we recommend reading previous blogs in this series, which are linked at the bottom of this page.
In our last blog we shared our immediate response to the results of the final evaluation of the Sierra Leone Education Innovation Challenge (SLEIC).
The headlines are (relatively) simple: we had an impact on Maths scores but not on English scores. Our results fell short both of the targets set by EOF and of the results achieved by some other providers. The positive impact on learning levels in Maths was meaningful and should not be ignored, despite not reaching the levels we had hoped for.
There is a tendency in our work to walk away from evaluations at this point. Teams move on to other projects, and energy is needed elsewhere. We are determined that this won't be the case for SLEIC.
In the final year of implementation, we supplemented our monitoring data with learning workshops with teachers, headteachers and School Quality Assurance Officers, and with focus group discussions with parents, teachers, children and School Management Committee (SMC) members. Now that the programme is over, we have brought together our own monitoring data, the results of the evaluation, and the opportunity to compare notes with our peers.
So what have we learned?
The main output of this work is a comprehensive learning report, which brings together all the reflections, analysis and interpretation we could squeeze out of our rich evidence base. For those interested in the details of what we did, what worked and how we know, we recommend accessing the report here: Sierra Leone Education Innovation Challenge: Final Learning Report.
However, for the time-poor or marginally interested, we provide here examples of what went well and what didn't, as well as our reflections on why.
One thing that went well, and one thing that didn’t
To illustrate what we have learned, it is useful to look at the cases where things clearly worked well, and cases where they clearly didn’t.
School Supportive Supervision
Our approach to teacher professional development was split into group training cascaded down to sub-district clusters of teachers, and regular visits to schools for observation, coaching and mentoring. We referred to this latter activity as School Supportive Supervision (SSS).
How do we know it worked?
From our data we have seen the following:
- Between the beginning and end of the programme, more than 7 in 10 teachers reported improving in their teaching competencies.
- Between the end of the second year, and the end of the third year of implementation we saw an 11-percentage-point increase in target pedagogical practices observed in classrooms.
- Government School Quality Assurance Officers (SQAOs) were central to the approach, with almost 60 percent of observations and coaching being carried out by SQAOs by the end of the programme.
- In most schools, we saw documented evidence that headteachers too were increasing their support to teachers, with 5 in 10 leading more teacher peer learning circles and 6 in 10 leading more observations and coaching conversations.
- Within the cohort, volunteer teachers were almost twice as likely to feel they had improved, and more likely to show observable improvements in their practice[1].
Without a counterfactual we can't directly attribute these changes to our intervention. However, they give us a plausible basis for thinking that the School Supportive Supervision approach was working[2]. This aligns with reflections from other providers that coaching and mentoring were the engine room for impact in SLEIC.
What made it work?
On reflection, and particularly through the workshops we held with SQAOs, the following points have come out consistently:
- The approach built on existing responsibilities: Headteachers and SQAOs already had a mandate to support teachers; what they needed were additional tools, training and support to deliver that mandate.
- Co-creation and simple, usable tools: The digital SSS tool was designed in collaboration with SQAOs. This ensured that it aligned with their view of what was most useful and practical, and that they felt ownership of the approach.
- Iteration and consistency: Over the second and third years of the programme we focused on cycles of three weeks in schools, followed by one week to reflect, go through data and plan for the next cycle. While we struggled to turn around detailed analysis of the SSS data for these reflection weeks, the opportunity for SQAOs and project officers to meet regularly, discuss their work and plan for subsequent cycles was of significant value.
Learning Clubs and Teaching at the Right Level (TaRL)
We had two elements of our theory of change which drew on the TaRL approach. The first was TaRL sessions in school for children in the upper grades of primary (P4-6); the second was after-school learning clubs, informed by the TaRL approach, to support learners struggling in Maths and English.
How do we know it wasn’t working?
Our data on this are not as complete as for other aspects of the programme, but from our monitoring data we can say that:
- Learning clubs and TaRL sessions happened much less frequently than intended (the 'why' for this comes in the next section). They were meant to occur twice a week, but only 50 percent of observations reported that at least one session had happened in the previous week.
- By the end of the programme, 5 of the 65 schools had decided to stop running learning clubs. This reflected the challenge of maintaining a cohort of learning club facilitators, most of whom were volunteer teachers, a challenge exacerbated by the fact that we could not provide stipends for the additional hours worked.
- Qualitative reports from SQAOs and headteachers indicate that, at all levels, there was confusion about how the TaRL approach should be implemented, including how groups would be split, how assessment data would be used, and how lessons could be differentiated for different groups.
What went wrong?
From our reflections, the challenges that were highlighted included:
- A failure to address the challenge of motivation, both for children participating in the after-school learning clubs and for the volunteer facilitators, given that we could not provide stipends or additional feeding for children staying after school.
- An approach which was not clearly enough articulated, and which those tasked with its delivery found confusing and difficult to put into practice in the specific context of the schools where we worked.
- A training model which relied on introducing the approach through district-level cascade training. Reports highlight that what was missing was practical, school-level modelling of how the approach could be delivered in practice.
Children reported that they appreciated the extra time for learning and the play-based approach used in TaRL sessions and learning clubs, but the approach was undermined by a lack of fidelity and consistency in delivery.
Why, why, why?
We can't attribute the learning gains we achieved to specific activities within our theory of change, but we can reflect on individual activities to see what went well and what didn't. From our two case studies it isn't hard to draw out common lessons about what drove success in implementation: when we were iterative, co-creative and focused on clear, simple tools and communication, our activities worked well; when we weren't, they didn't. Some common lessons we have arrived at are:
We tried to do too much too quickly: Our theory of change for SLEIC was holistic, encompassing teacher professional development, learning clubs, book banks, community engagement, and work on safeguarding and child protection. With a small budget, a small team and a short time frame, we struggled to deliver all these diverse activities with fidelity. Our lesson from this is to do less and do it well.
Moving quickly, and missing steps: We had a very short inception period for SLEIC, and there was an emphasis on delivering as many activities as possible in the first year. This meant that we didn't spend as much time as was needed on the design and adaptation of activities. Approaches such as the learning clubs were informed by successful Save the Children programmes in other countries but were delivered without sufficient consideration of local context and schools' needs. Our lesson from this is to move slowly, create strong foundations and build on them.
"Settling in" to an outcomes-driven mindset: In our first year of implementation we focused on activities, and our approach to monitoring and evaluation was not designed to support rapid reflection on whether those activities were leading us towards outcomes. This improved dramatically from the middle of year two (hence the data presented above exclude the first year). Across providers we have agreed that the ability to quickly analyse and use data to change course was a huge driver of success, and we were slow to get into this mindset. Our lesson from this is to define and track what success looks like at each stage of delivery, and to be prepared to act on what we see.
These may seem like truisms: that implementation beats design, and that simplicity done well beats complexity done inconsistently. However, I doubt anyone who works on education programmes will read this without a sense of familiarity.
Our task is to make sure we are learning our lessons, building on what works and not shying away from discussing what doesn't. Through our technical packages, Save the Children has an arsenal of evidence-based approaches. What we are learning from SLEIC can help us make sure that we consistently and effectively turn these quality approaches into quality programmes.
What comes next?
We have two more pieces of this puzzle to share:
- Our reflections on Outcomes Based Financing, how it shaped our engagement in SLEIC, and what it will mean for us in the future.
- A short paper articulating what we have learned from our data on Teacher Professional Development (TPD).
These will be published in the coming months. In the meantime, read back through previous blogs, or dive into the full learning report.
In Case You Missed It
Read our previous blogs in this series on learning from SLEIC:
Reflections on year 2 evaluation results
Reflections on a final national learning event in Freetown
The results are out! A summary of Save the Children's evaluation results
Read More
Read the full learning report here:
Sierra Leone Education Innovation Challenge: Final Learning Report
[1] Volunteer teachers are unpaid, usually unqualified community volunteers, who made up almost 50 percent of the teachers we supported.
[2] We will share a more detailed run-down of the data we have on our approach to Teacher Professional Development in a separate blog in the coming months.