Adaptive testing and iteration: Scaling interventions in global education
Heather Leigh Kayton
Scaling educational interventions effectively remains one of the most pressing challenges in global education. At the inaugural What Works Hub for Global Education conference, four speakers shared their approaches to addressing this challenge using adaptive testing and iteration to enable data-driven scaling of educational interventions.
The session, chaired by Sabina Morley from the UK Foreign, Commonwealth & Development Office, showcased how different organisations are using iterative testing cycles to refine and improve their educational interventions. The session included presentations from the following speakers:
- Nancy Gikandi, Research and Development Manager at Dignitas
- Claire Cullen, Head of Research and Innovation at Youth Impact
- Dewi Susanti, Senior Director of Research at Global School Leaders
- Rebecca Daltry, Research Manager at Jigsaw and EdTech Hub
These presenters highlighted a common theme: the power of iterative testing to inform programme improvements. Each speaker shared insights from their experiences across different contexts. Their collective experiences illustrated diverse approaches to evidence-based programme development, revealing how ongoing testing enables organisations to not only improve their interventions but also adapt them for successful scaling.
Day 2: Session 1 – Adaptive testing and iteration (© What Works Hub for Global Education 2024)
Optimising education interventions for Impact@Scale: Lessons learned from Dignitas’ LeadNow testing
Nancy Gikandi opened the presentations with a compelling vision:
“Imagine a world where schools are vibrant places where children can thrive and succeed.”
This vision, she explained, motivates Dignitas to keep optimising and improving their interventions to create impact at scale.
Gikandi’s presentation described lessons learned from interventions implemented using the Dignitas LeadNow training and coaching app. Using an A/B test design, the study evaluated three approaches to digital leadership training in Kenya: providing school leaders with the training app only, combining the app with in-school professional learning communities, or pairing the app with remote coaching calls. The training focused on the formative assessment practices teachers use to support learning in their classrooms. The results revealed that participants who received coaching alongside the training app were more likely to complete the training modules; they also showed increased confidence and improved formative assessment practices in the classroom.
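As a purely illustrative aside, the sketch below shows one way completion rates from a three-arm design like this could be compared. The arm names and all counts are hypothetical; this is not Dignitas data or analysis code.

```python
# Minimal, hypothetical sketch of comparing module-completion rates across
# three arms of an A/B-style test: app only, app + professional learning
# communities, app + remote coaching. All counts are invented for illustration.
from scipy.stats import chi2_contingency

arms = {
    "app_only":          {"completed": 52, "not_completed": 48},
    "app_plus_plc":      {"completed": 61, "not_completed": 39},
    "app_plus_coaching": {"completed": 74, "not_completed": 26},
}

# Build a contingency table (rows = arms, columns = completed / not completed)
table = [[v["completed"], v["not_completed"]] for v in arms.values()]
chi2, p_value, dof, _ = chi2_contingency(table)

for name, counts in arms.items():
    total = counts["completed"] + counts["not_completed"]
    print(f"{name}: completion rate = {counts['completed'] / total:.0%}")
print(f"chi-square = {chi2:.2f} (dof = {dof}), p = {p_value:.3f}")
```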
Gikandi emphasised the value of iterative testing for understanding how best to create impact, suggesting a two-step approach: first testing innovations with small groups, then scaling up with careful attention to sampling and representativeness.
(View Nancy Gikandi’s slide deck [PDF].)
Identifying scalable models: Aggregating evidence across 8 rounds of a numeracy intervention
Claire Cullen introduced A/B testing as a valuable method for optimising educational interventions in real time. She noted that at Youth Impact, A/B testing has transformed how the organisation develops and refines its educational interventions, making continuous improvement a core part of its approach.
Their work with the ConnectEd programme in Botswana demonstrated the method’s effectiveness in real-world settings. The study employed an iterative A/B testing approach to optimise a phone-based tutoring programme: participants were randomly assigned to receive either the standard version or a modified version of the programme (such as one adding WhatsApp videos), and student learning was assessed regularly via phone to compare the effectiveness of the different variations.
Youth Impact’s A/B testing revealed valuable insights in two key ways: identifying adaptations that improve the programme’s efficacy (‘effectiveness-increasing innovations’) and identifying adaptations that reduce cost without compromising learning impacts (‘cost-reducing innovations’).
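To make the two categories concrete, here is a small hypothetical sketch, not Youth Impact’s actual analysis or data, that flags whether a programme variant looks effectiveness-increasing or cost-reducing relative to a standard version.

```python
# Hypothetical sketch: classify programme variants relative to a baseline as
# 'effectiveness-increasing' (more learning at similar cost) or 'cost-reducing'
# (similar learning at lower cost). All figures are invented for illustration.
variants = {
    "standard":      {"learning_gain_sd": 0.12, "cost_per_child_usd": 8.0},
    "plus_whatsapp": {"learning_gain_sd": 0.16, "cost_per_child_usd": 8.5},
    "shorter_calls": {"learning_gain_sd": 0.12, "cost_per_child_usd": 6.0},
}

baseline = variants["standard"]
for name, v in variants.items():
    labels = []
    if v["learning_gain_sd"] > baseline["learning_gain_sd"]:
        labels.append("effectiveness-increasing")
    if (v["cost_per_child_usd"] < baseline["cost_per_child_usd"]
            and v["learning_gain_sd"] >= baseline["learning_gain_sd"]):
        labels.append("cost-reducing")
    ratio = v["learning_gain_sd"] / v["cost_per_child_usd"]  # SD gained per dollar per child
    print(f"{name}: {ratio:.3f} SD per USD -> {', '.join(labels) or 'baseline'}")
```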
One of the effective approaches Cullen highlighted was engaging caregivers at home in the phone call tutorials. The modification followed a suggestion from implementers, who observed that the tutorials ran more smoothly and generated more engagement when caregivers were involved. When tested, encouraging caregivers to co-lead the tutorial substantially increased the impact of the phone calls.
Iterating the role of school principals in improving foundational numeracy through individual and group targeting in India and Indonesia
Dewi Susanti presented research on leveraging school principals to improve foundational numeracy. She noted that while evidence shows school leaders play a pivotal role in enhancing student outcomes, we do not yet know how best to support them to do this effectively.
Through a series of studies in India and Indonesia conducted over three years, researchers at Global School Leaders employed an iterative testing approach, combining teacher training interventions with different levels of school leader involvement. The studies used mixed methods, pairing high-frequency monitoring to establish effectiveness with qualitative research to understand the mechanisms at work, and found that school leaders’ effectiveness depends heavily on their capacity and motivation.
The research highlighted an important tension: in hierarchical societies, school leaders often prioritise administrative duties over instructional support. A particularly challenging finding was that school leaders consistently struggled with classroom observations and teacher feedback, often hesitating between asserting authority and respecting teacher autonomy.
Susanti’s team found that a mix of instructional and managerial leadership training approaches was necessary, and posed a critical question for future research: ‘How could school leaders find more time, improve their skills, and provide direct supervision without being intrusive of teachers’ autonomy?’
(View Dewi Susanti’s slide deck [PDF].)
Digital personalised learning in Kenya: Findings from a multi-strand implementation research study
In the final presentation of the session, Rebecca Daltry presented findings from a multi-strand implementation study of digital personalised learning in Kenya. The research, conducted in partnership with EIDU, examined how classroom-integrated digital tools can support learning outcomes.
The study evaluated an innovative app that combines teacher lesson plans with adaptive student content, all mapped to Kenya’s competency-based curriculum. Through a randomised controlled trial, the research revealed nuanced impacts of digital personalisation on learning outcomes.
Key findings showed that while the tool had significant positive effects, particularly for lowest-performing learners, there were important implementation challenges. A potential ‘trail-off’ effect emerged after the midline results, and researchers observed that ‘fast-learners’ received more access to the tool as teachers typically provided it when students completed their regular work. Daltry emphasised that teachers are the primary agents in successfully implementing these tools. This insight challenges assumptions about automated personalisation, highlighting instead the crucial role of teacher judgment in effectively integrating digital tools into classroom practice.
(View Rebecca Daltry’s slide deck [PDF].)
Addressing key challenges
The session concluded with a rich discussion about contextual considerations when implementing and scaling educational interventions. The central role of data and evidence-based iteration resonated throughout the session; as Rebecca Daltry noted, success comes from ‘implementing, reflecting, iterating’ repeatedly to understand what works in different contexts.
Key themes also emerged in the discussion regarding the importance of strong relationships between government and implementation partners. A particularly interesting exchange focused on how different organisations approach government engagement. While Dignitas emphasised co-creation at the county level in Kenya, Youth Impact highlighted the value of building on long-standing relationships and identifying champions within existing programmes. These varied approaches underscore the importance of context-specific strategies when scaling educational interventions.
Guidance on best practices for approaches like adaptive testing and iteration, applied with careful attention to contextual needs, will help organisations build on these experiences to scale their educational interventions more effectively. Closing the session, Sabina Morley highlighted the exciting future of the What Works Hub for Global Education:
“It’s really exciting that the What Works Hub is going to be creating guidance on how best to do this. We’re looking forward to hearing more of that guidance in years to come and at future conferences.”
Watch the full session on YouTube.
Kayton, H. L. 2024. Adaptive testing and iteration: Scaling interventions in global education. What Works Hub for Global Education. 2024/004. https://doi.org/10.35489/BSG-WhatWorksHubforGlobalEducation-BL_2024/004