
IEFG BIG Series: Measuring what Matters in EdTech

July 14, 2024 | International Education Funders Group (IEFG)


Welcome to the IEFG Brains in Gear Series. In this conversation, our hosts and guests will demystify how we measure the impact of EdTech. They will discuss why and how we ensure we're collecting the right kind of evidence to ascertain that a particular EdTech product has its intended effect on students' learning.

The hosts for this episode are: 

  • John Soleanicov, the Co-Lead of the Learning Schools Portfolio at the Jacobs Foundation. He is a Romanian-born, U.S.-raised professional with more than 15 years of global experience in financial services, management consulting, and education. 
  • Suraj Shah is the Lead for Strategic Partnerships for the Mastercard Foundation Centre for Innovative Teaching and Learning. He is responsible for collaborations between the Mastercard Foundation and the various Governments/Ministries of Education in Africa and Strategic Partnerships with education stakeholders. His primary focus is aligning with governments across Africa to bridge education challenges with innovative EdTech solutions. 

Joining them will be:

  • Ciku Mbugua, Innovation Manager at Brink, works on one of Brink’s programmes, EdTech Hub, across two teams: on the innovation team, she supports teams to iteratively test and grow promising EdTech interventions using the sandbox tool, and on the engagement team she serves as the country lead in Kenya. As country lead, Ciku supports government, development partners, and EdTech entrepreneurs with evidence and technical assistance to accelerate technology integration in education. 
  • Loïc Watine, Chief Research & Policy Officer at Innovation for Poverty Action (IPA), is responsible for leading IPA’s overall strategy to ensure that its research is relevant to and used by decision-makers, and to equip them to use data and evidence to improve the lives of people living in poverty. He provides strategic guidance and leadership to a team of senior leaders overseeing IPA’s sector programs, Policy team, and Right-Fit Evidence unit, who together form the Research & Policy global team and work closely with IPA’s country office teams. 

Here are some resources if you would like to learn more about evidence measurement in EdTech:

  • Here's a link to the EdTech Hub website and evidence library.
  • Learn more about LEVANTE here.
  • Learn more about m-Lugha here.

Here is a resource bank of events, resources, and collaboratives to learn more about EdTech. You can add more resources and comment on the current list to enrich it further.

This podcast was brought to you by the International Education Funders Group, curated and edited by Anjali Nambiar, with post-production by Sarah Miles. You can learn more about the IEFG at www.iefg.org.

Subscribe to the podcast so you never miss an episode! And don't forget to rate and recommend this podcast to your colleagues.

You can follow the IEFG on LinkedIn: https://www.linkedin.com/company/international-education-funders-group-iefg


Ciku: The reality of EdTech is that it cannot only wait for large RCT studies to tell us what it is that we need to know. 

Suraj: One of the things that really caused a few alarm bells was the GEM Report from UNESCO, which said that while we're seeing a lot of interventions of education technology over the years, especially in low- and middle-income countries, we're still not seeing the evidence of whether that is actually improving education outcomes.

Loic: Another thing that is probably more unique to edtech is that you are very data rich. You can constantly do A/B tests and the like and constantly use the feedback loops, even after designing the initial version, to improve your product. 

John: One of the problems with that is people always think of tech in isolation, and they'll get a study saying, you know, look at the impact I've had in one of these contexts.

And this is not, I think in the research world, there's not often a lot of external validity. So you've proven something at one point, in one place, with one set of parameters. It's not necessarily going to work in another place because those conditions aren't the same. 

Anjali: Welcome to the IEFG Brains in Gear Series.

In this conversation, we'll hear our hosts, John Soleanicov from the Jacobs Foundation and Suraj Shah from the Mastercard Foundation, demystify the evidence behind good and bad EdTech. Joining them will be Ciku Mbugua from EdTech Hub Africa and Loïc Watine from Innovation for Poverty Action.

John: Hello. Hi, I'm John Soleanicov from the Jacobs Foundation. Today we'll dig into the critical questions around what we know about what works and what doesn't work in EdTech in low- and middle-income countries. We'll discuss the research field on EdTech in LMICs and what type of evidence the field needs.

Suraj: Thank you, John. And I'm Suraj Shah from the Mastercard Foundation. Together with John, we'll be discussing some successful EdTech interventions. So today we have with us Ciku Mbugua from EdTech Hub Africa and Loïc Watine from Innovation for Poverty Action, joining us to contextualize these discussions with a lot of data and evidence.

John: We at the Jacobs Foundation think a lot about evidence. How do we know what works, and what do we think about evidence? Not all evidence is created equal. Little of it seems tailored to the context in which we work. So I'm curious, Suraj, if you see the same, and what's on your mind when it comes to this topic?

Suraj: You're right, uh, John. There's a lot of evidence, and there's different evidence which doesn't seem to be connected. One of the things that really caused a few alarm bells was the GEM Report from UNESCO, which said that while we're seeing a lot of interventions of education technologies over the years, especially in low- and middle-income countries, we're still not seeing the evidence of whether that is actually improving education outcomes.

And I'm thinking, is there way too much that is not connected? And if so, what do we need to do? What's worked? What's still to be proven? 

Loic: Of the few lessons that are clear enough across contexts, one is the importance of adaptation and individualization, which is really a big promise of EdTech. And in fact, when the EdTech is used to individualize the learning approach to the child, that tends to work.

Another one is about the use of technology to improve the quality of teaching, standardizing a little bit. I was involved in one study in Ghana on the impact of broadcasting lessons from a studio, with some interaction with the teacher in the classroom, and that showed partly good impact. You get some tricky debates there.

Are you replacing teachers, or are you negating what they can bring in themselves? I'm not sure how possible it is to do that sustainably and at that scale. There's certainly something there, and I think it's probably most relevant to settings where you have problems with teaching quality, where it's particularly weak or you don't have teachers; you know, you can think of remote areas or refugee settings where this kind of approach could be worth supporting.

On the other side, some of the things we've learned as well is that tech alone is not going to work. It's not really a silver bullet. You can think about, you know, using mobile to do SMS reminders for school engagement, for parental engagement, sharing information. You could think about using AI to do coaching for teachers, or tutoring, but the tech by itself hasn't had much impact.

John: And I think one of the problems with that is people always think of tech in isolation, and they'll get a study saying, you know, look at the impact I've had in one of these contexts. And this is not, I think in the research world, there's not often a lot of external validity. So you've proven something at one point, in one place, with one set of parameters.

It's not necessarily going to work in another place because those conditions aren't the same.

Suraj: John and Loic, you're right. We work at the intersection of working with policymakers around expanding access to education using technology. It's putting the emphasis on delivering regular education with technology as a tool. We used to have books; we now have technology to kind of enhance that learning experience.

And this is where I think even sort of large-scale projects tend to stall, because there's a lot of emphasis on the infrastructure and what the devices will be like and, you know, how we'll place them, et cetera, and there's very little focus around how we're skilling the educator. What is the content we're looking at?

What is it that we're trying to do with this large-scale national implementation project? Has either one of you seen a place where maybe that has started to work, where we are, you know, focusing on the North Star? And that's improving education skills and outcomes, rather than just saying, you know, X number of people have digital skills.

John: I don't have an answer top of mind, but I have another element to the problem. One of the challenges we sort of create ourselves in the funding community is that, you know, tech is, for better or worse, very attractive. And so funding sort of follows the things that are, for lack of a better word, sexy, attractive, interesting.

And they think of tech in isolation, this idea of let's do tech for tech's sake. And funding kind of follows that. And then if you're on the other side, trying to get that funding, you have to position your project in a way that speaks to the funder, and it's also an interesting reflection for the funding community.

And when I say that, I mean not only foundations but bilateral and multilateral funders. They often fall into the same trap. I'm afraid the hype is going to just make that worse in a way. And when you add this to contexts where maybe you're missing connectivity, you're missing the tech ability of teachers... but hopefully somebody has an example of where the holistic intervention approach has worked. I don't have one off the top of my head. 

Suraj: So Ciku, you have some evidence of what has worked so far. How has it worked? 

Ciku: I think if I were to think about what we are uncovering as working, it is co-creating the intervention with those it's going to serve; we have found that's really important.

We've done a couple of studies with parents, just using technology to kind of upskill them to the place where learning can continue at home. Same case with the teachers. We are about to conclude some very interesting digital personalized learning studies in Kenya. And I think one of the things that's becoming more certain is that this only works if it's put alongside structured pedagogy; so again, the role of the teacher becomes increasingly crucial.

It's like enabling the teachers to understand they have a substitute teacher in the technology. What's not working? Making humongous capital investments in technology without understanding why. I think Kenya, again, is a really good example. We have spent a lot of money on the digital literacy program, but until today, we're not very clear on what the goal of that program was.

There was nobody, within all of the different government sectors, that we could say: this is the one person that's responsible for this program. And so now what we're trying to do as a country is kind of go back, rolling things back, to make sure we can put some things in place that would enable us to enjoy that investment.

Loic: One thing I would add on government is I actually think governments are often left out of the equation. And maybe that's similar to what you were saying at a higher level. The way edtech companies work, and the way those companies are typically funded and run, their incentives are not necessarily pushing them towards partnering with government and thinking about how their solution is going to complement the government systems, really, and how the teachers work, and so on and so forth.

So there's way more focus on going straight to the learners, because, you know, that allows you to have more of a common Silicon Valley playbook, basically, than on the things that are really going to complement the existing system. And I think philanthropy has a role there to sort of correct that market challenge.

Suraj: This is something where we've got experience now, over the last four years, as the Mastercard Foundation. We're creating enabling edtech ecosystems in various countries. Currently, our EdTech Fellowship program runs in nine countries through eight different partners. And the idea is to get everyone sitting together.

It's not a private-sector-do-it-alone, and it's not a government-do-it-alone. Neither should we ignore the third factor, which is the users: the teachers, the learners, the youth, the parents, you know, not leaving anyone behind. Since COVID, I think we've found that a lot of governments have realized this is not a government-do-it-alone.

So they've started to sort of say, okay, we need to see what the private sector is doing. You know, if they see the impact of what you're doing, they can provide the scale for you. And this is what we're trying to establish in some of these countries, and kind of show how that successful model works. But, um, again, you know, to convince policymakers, et cetera, we still need to give evidence, and not just, you know, a few examples of how things work.

Where is that evidence? Where are the best places to get that evidence on edtech? You know, is there a place to go and learn about what works and what doesn't? I'm open to either one of you. John, like, uh, maybe you could say a few things around that. 

John: One thing I would say is that every policy environment is different.

And so, one of the things that governments ultimately need, right, to do better policies, is better capabilities. And what we've seen emerging is these delivery units or labs, housed in government; we call them ed labs. And the goal there is to sort of source evidence wherever it is and translate it in a way that makes sense for policymakers.

And I think that's where, you know, that's the way to do it. I don't think going to Google and searching, or even going into academic journals, is going to solve it. I think you need a sort of translation mechanism, which really requires taking what's out there and translating it to the local context and needs.

Uh, so I know IPA has done a little bit of work there. Like, I don't know if there's something you can share on that work. 

Loic: You've summarized it pretty well. Maybe I can give a little bit of the story here. So we're basically working with somewhere like 12 different ministries of education in LMICs, in Africa, Latin America, and Asia, to build that internal capacity, those units. It takes very different forms, because it's very organic and very context-specific.

The goal is to have the government, and the system as a whole, be learning itself, so to speak, and test things out and, just like you said, look at the evidence out there. One of the examples that's most related to technology is in Rwanda, where the Ministry of Education has really taken the initiative to use technology to test learners at scale, to have this information about learning outcomes to drive policies and actions at all levels in the system, from the teachers all the way up to the government.

And we've supported that effort, hoping that this can be leveraged again at all levels, with lots more informed decisions and also cheaper studies, because you certainly have all the learning outcomes at your fingertips. 

Ciku: I like that. And if I could add, we're very fortunate to be part of what IPA is spearheading with the Kenyan government. But now, speaking from a researcher's perspective, it's taking a couple of steps back and recognizing that the reality of edtech is that it cannot only wait for large RCT studies to tell us what it is that we need to know.

And so how do we infuse, yes, the long-term studies, but then how do we infuse innovation into how we generate evidence? So at EdTech Hub, we have a team that I'm part of that runs sandboxes. Sometimes it's really helpful to demystify an assumption, and we've found that sandboxes are brilliant at that.

I think something else that we're beginning to lean on is implementation research. So there are lots of organizations that are on the ground implementing, but may not have the capacity to hold that data and keep it in a way that's translatable, and I think we would like to encourage that. When I think about disseminating evidence as well, our teachers need to understand what good looks like in the classroom.

They are, after all, part of crafting some of those solutions, and they are part of generating, actually, the data that we then use, you know, in boardrooms to make decisions. How do we not just keep information at the top, which I think is what we often do? 

John: An additional challenge to that, though, and I see this often, is this focus on: let's spread the evidence that we have.

I think the missing link there is, you know, being a teacher, I'm sure you all know, it's not an easy job. There's a lot going on. There's not a lot of extra capacity to go look around for evidence. And often the evidence out there is not solving an imminent and pressing need the teacher has. And so I think, the other way around, if you want to generate evidence relevant for teachers, you really need to start with a problem facing teachers.

In a lot of countries, sometimes it's really just about the administrative burden that they have, the challenge of managing students with different abilities in the same classroom, large class sizes. So I think, you know, if we're not working on evidence that's directly relevant to a teacher in the classroom, it's not gonna get a lot of uptake.

And I think it's a lot about the ethos and the openness and willingness to experiment and to look. Sometimes there are experiments that can be done in the classroom by teachers, and so we look at a lot of ideas around that, around what teachers can learn just in a sort of micro-experiment on a day-to-day basis.

And that can be very instructive, I think, to iterate and improve even small things in the classroom. 

Suraj: To add to John, instead of just going into product redefinition or design or, you know, refinement, we can also get that evidence out there, collate it, and put it together for everyone to kind of understand. I know there are several; there's one in South Africa called NET, and

all it is doing is simply providing a place for teachers to network on how they use technology. What are their problems of teaching in a given environment, et cetera? It has nothing to do with learning skills, et cetera. It's not teacher professional development, but look at the importance of it: I think it's got nearly half a million users across Africa, where

teachers share peer-to-peer best practices that are working and kind of take them back to their classrooms. Ciku did mention that demystifying is very important, whether it's at the parent level, teacher level, learner level, or even the government level. Whenever governments hear of improving the reach of edtech, they start thinking money and, you know, budgets, whereas when teachers think of technology, they start thinking this is going to replace my job.

So this demystifying of what technology does, again, it points to evidence to say, you know, if X number of children from different environments in this country spent so much time learning with technology, with their teachers also suitably equipped, these are the results we're seeing. We need more and more of those success stories coming out and kind of, you know, being hammered in.

I don't know whether Loïc or Ciku have those sorts of success stories to talk about. 

Ciku: I actually have a couple. I think Suraj will know: because of the floods we've experienced in Kenya over the last two months or so, there was a disruption to the school calendar. And so our learners ended up going back to school

almost two weeks late because, you know, there was damage. Some of the schools were being used as places of refuge for people who had been displaced by the floods. And reports that are coming out of Nairobi say public schools in Nairobi continued learning through technology. So to see schools having changed to the point where they can actually continue learning with technology, I think, for me, is huge.

What other success stories do we have? There are good stories coming out of the frontier counties, counties that have traditionally been, and still are, you know, left behind in terms of school enrollment, school retention, learning outcomes. And we're beginning to see technologies such as m-Lugha, which, you know, embrace mother-tongue instruction, because I think it's really critical, when you're thinking about foundational literacy, to consider mother-tongue instruction.

One of the key results for me is the fact that the community embraces these forms of technology. And of course, it has a lot to do with who is developing the technology and how much time they took to co-create with the community, because they understand the community really well. Those are the things that we see that encourage us.

It seems like we have a long way to go, but step by step, we're getting there. 

Loic: I wholeheartedly agree with this importance of co-creation and taking feedback, taking innovations and ideas from the teachers and the users themselves, though that is true for the whole field of education. I think another thing that probably is more unique to EdTech

is that you are very data rich, so you can constantly do A/B tests and constantly use the feedback loops, even after designing the initial version, to improve your product. And as we know, this is a key part of the playbook in Silicon Valley, again. But it's surprising to see that those practices are fairly rare in edtech companies that are focused on LMICs.

And so at IPA we have an initiative, in partnership with the Jacobs Foundation, that tries to help those edtech organizations in doing that. I think this is also some of what you guys at EdTech Hub are working on. That being said, you know, this kind of work of iterating with the data of the app itself, while it can be transformational,

also has some limits. You know, if all you do is use the data within the app, you will not be able to measure the outcomes for kids who are not on the app. So there is still room, I think, for evaluations of the more classical type, outside of the app. And I honestly don't think this is happening nearly enough.

And I think this is why we're still, to a large extent, having debates in the space that are not very evidence-informed, about EdTech that works and EdTech that doesn't work, when actually it's about which EdTech works, and how, and where. 

John: Yeah, absolutely. That brings me to one of my, well, pet peeves. So great to hear others' pet peeves as well.

The number of companies or products I see with just before-and-after results, using that to claim, hey, look, these students used my app and then they learned something, and here's the before and after, as if the alternative is, you know, the student sitting on his thumbs at home. But we all know there's a different alternative, which is their regular day-to-day class, and you need to do better

than that alternative, and most people don't think about that. And yes, it's more difficult to measure, because you have to step outside your app; you have to go and get some benchmark data, you have to try to get data on a comparable classroom. And I think people then say, well, I can't do an RCT because it's too costly, or there are ethical issues, but it's not about RCT or nothing, right? There's this whole spectrum of things. And that's one of the things we've spent some time with IPA working on: different types of study designs that you can use. Just having a comparison group, which is actually not as complicated as it seems, already is going to increase the wealth of information you have available to you.

And I think that will allow us to have better and more robust discussions around how tech works and under what conditions. 

Suraj: John, yes, I support that pet peeve. There are two things around this. One is: are we actually improving learning, even using an app, for example? And this is something that we've started to use in the kind of nine countries we're talking about, where we've started in Africa, and we're going to add four more countries.

Where we're creating these edtech ecosystems, we're introducing a concept called the science of learning, from Carnegie Mellon University, to the tech developers, and it is incredible how they stop in their tracks to understand. It's one thing to create some really snazzy-looking apps, et cetera, and have a hundred thousand downloads a day or something.

It makes you stop and re-look at your technology solution: are you actually making learning happen by doing this? I think there are some new ways to help all these education technology companies across the board. And the second thing is, again, data that is pertinent to one company, one app, one style. But I believe if we collate all of that, then we can create better knowledge, more summarized and informed decision-making, some strategy coming out of it.

I know for a fact that in Africa, on quality data, ADEA, the development arm of the African Development Bank, the Association for the Development of Education in Africa, they are very, very particular about this: quality is defined as timely, relevant data. And most ministries, even though they have EMISes,

some countries have as many as five different EMIS systems running, uh, they're still not getting the right data on time to make informed decisions. So I think collectively we need to come together. ADEA, through our partnership, is going to start empowering 30 countries over the next five years in understanding that data: where that data is going to be, how they will collate it, and, you know, what they'll do about it.

Then education will be kind of nimble to adapt to new needs, et cetera. 

John: Yeah, I think tech is the answer to that problem, right? And that's the tech people don't talk as much about, right? People like talking about the shiny edtech that goes in front of students, but there's the whole back-end part of it.

And you're right, some of these systems are, you know, still very analog, and they're not interoperable, and sometimes they're not comparable. And one of the things we work on at the Jacobs Foundation, in more of a research-driven way: pharmaceutical companies trial their drugs; they have multiple sites and they're all testing the same thing.

So we're creating a thing we call LEVANTE, which is a Learner Variability Network Exchange of different universities, trying to really run assessments in the same way on students all over the world. It's going to create a big pool of data that's going to allow researchers to do really interesting things that have never before been done.

So it's not going to solve the ministry problem in the short term, but I think in the long term, it's going to create a wealth of data that will help us advance the science of learning and help us learn more about what works and what doesn't across multiple different contexts. 

Ciku: I have two pet peeves.

Let me start with the one that's closer to home, which is our ability as researchers and evidence generators to just create reports that no one reads, and not tailoring what we know to the audiences that we want to reach. And the other one is the other extreme: just how rigid government policy is around edtech.

I think in Kenya, especially, we saw during COVID just how much that rigidity cost us. It took a really long time for learners to be able to access technology that was available in the schools, just because a policy had said it could not leave the school, and yet the children could not go to school, you know.

So how do we, as we continue to interact with government, whether it's through these evidence labs or through the technical assistance that we offer them as they're designing their programs, begin to work towards a more agile way of policymaking, one that acknowledges the reality of the speed at which technology is moving?

So those are my two pet peeves. 

Suraj: And one good example is the one Loic mentioned, Rwanda. Rwanda has advanced a lot on testing things out, and that is all now translating into an ICT strategy, which is ready. As we speak, the edtech policy is about to be presented for final approval. We've been involved in supporting the drafting of that edtech policy in Rwanda with our partners at the World Bank.

Every statement in that policy is driven by data and evidence, by what works and what doesn't in the Rwandan context, given some of the past successes but also some failures that everyone knows about. The approach was not to encumber the policy with a very heavy-handed document, but to produce something with a few points on where the results need to be, not the how.

And so, you know, that allows for a lot of different players to also work within that policy environment. 

Loic: I'd say it's the focus on scale without impact. I think we're probably all familiar with this problem: how edtech companies tend to get funded, and the burden of proof they face, which is honestly quite weak.

And it is quite possible in this space to reach millions without real proof that you're actually having an impact on learning outcomes. So I think, again, philanthropy has a role to play here in shaping the incentives, or in correcting where the market incentives fail, for rigorous studies of impact.

It's hard to expect the companies to want to pay for those themselves when they can get away with anecdotal evidence. So I think that's a unique area where philanthropic funders can come in.

Suraj: Yeah. And funding for edtech, especially funding for edtech in Africa, has hardly seen any major leaps.

The bulk of edtech users in low- and middle-income countries is going to be those in public schools, which are the government's domain. Governments will not have enough funding to invest in edtech companies themselves. There is also impact funding, which puts the organization seeking the funding to task: this is the agreed impact that we would like to see.

And then once you prove that, you get funding. I don't think anyone's cracked that yet, unless somebody on this podcast knows something we don't. We're still trying to figure out the sweet spot where you cover the bulk of users using edtech, but somebody still needs to bear that price.

You know, if you make data zero-rated, then who's paying for that? So on that whole investor profitability versus social angle, we need to create some form of a good solution, make examples of a few companies, and see how that works.

John: I can share one that we've been working on. Actually, before that, I'll tell you that the problem is maybe even worse than you describe it, because it's not just that you need to be profitable.

The venture model is such that you need to be exponentially profitable, because there's high risk in these companies. For every 20 you invest in, maybe three or four will succeed, so you need those to be very profitable and scalable to justify the whole fund economics.

So you need this to work, and you need somebody paying the bill. But then, how do you know it's going to work, and how do you tie that to results? The thing we've been working on, with the Swiss Development Cooperation, a bilateral agency from Switzerland, our home country, is really around what's called results-based financing.

We have a structure called the Impact-Linked Financing Facility for Education, and we've been focused on Sub-Saharan Africa, more West Africa, and the Middle East. The idea is we want to give funding that's tied to results. And the results are where we spend a lot of time: we've had IPA help us measure something that is robust, as opposed to something that can be gamed, and that's easy to use before and after.

So we spent a lot of time thinking about what the trigger is, but we tell the company: we're going to pay you if you prove that you're delivering learning outcomes. We signed a forward agreement with them saying, if you show that you're improving learning outcomes, here's some money, no strings attached from us.

And then they use that money, in theory, to go raise capital. They can say: look, I have already secured a contract, in a way, a payer, somebody paying the bill, and that allows them to go find financing in the capital markets. So it's early days, but we've spent a lot of time on the technical side, defining the triggers.

I think there's potential, if it works, to get much more capital in and scale it. It is ultimately a funding issue, but it's funding tied to results, and if we can prove that link, I think we get a lot more money into paying for results.

The beauty of this, as opposed to some of the development impact bonds that have gotten a lot of attention, is that these are quite lean operations: the administrative costs are much lower than for a development impact bond. So there's perhaps some promise on the horizon for this problem, but it's still too early to make a definitive claim.

Suraj: Thanks. We look forward to learning a bit more around that, John.

John: All right, happy to share more.

Before we finish, let's go around and get some advice from everybody. Three pieces of advice for practitioners or philanthropists. 

Suraj: I'd offer three pieces of advice to the three different groups that come together for successful edtech. Start with policymakers: is your policy, your regulatory framework, adaptable enough for everyone to participate?

And if not, how can the private sector and the development world help you bring the data and evidence you need to develop that and open it up? Then to the private sector: the balance, and I think, John, you put it quite well, between sustainability and the serious profitability needed to survive, because there are lots of changes and lots of new competition coming up.

So the idea is to stay a step ahead and be innovative, but also to make sure that we're not leaving anyone behind. And for the development world: this is where I think we can drive research and evidence collection for both parties, the private sector and the policymakers, as well as for the third party, the users.

And the other thing the development world can do is convene people together every so often to understand what works and what doesn't.

John: Mine are three general ones, hopefully simple. Number one: listen to teachers and go to classrooms. We can talk all the theory we want, but if you don't see how technology is used in the classroom, you won't learn a lot.

It's all about implementation: how does it work in the classroom? Number two: it's not RCT or nothing. There's a wide spectrum of evaluation methodologies in between, so it's all about right-sizing to what works. And lastly, data. I think this is the thing that technology is going to enable.

We talk about student-facing tech; let's also talk about digitizing data, collecting it, aggregating it, and getting it into the hands of policymakers. Let's not forget about data when we think about technology.

Ciku: Okay, so my first piece of advice: I'd love to see some focus on generating evidence on how teachers can use technology for teaching and learning.

What's the science of learning, and how does technology weave itself in? Second, how do we move from rigid RCT studies, and I'm not saying we shouldn't do them, to a more experimental type of evidence generation that includes teachers and learners a lot more, so that we are learning alongside one another?

And that means being able to pivot before we make massive investments in things that are not necessary because they're not targeting the problem. So how do we get a bit more experimental? And then thirdly, how do we involve government in evidence generation a bit more?

I think one of the challenges, and why it's so important to build relationships with government, is that it's not enough to generate the evidence and come to them and say, this is what we're learning. How do we take them along the journey? I think that's what will help improve, one, how they allocate budgets, and two, how they formulate policy. So how do we get government to be at the core of our evidence generation?

So those are my three.

Loic: Okay, my three relate to evidence. One, I'd say it's important to be demanding on the rigor of the evidence. We had some discussions about the role of RCTs here, and folks should not shy away from engaging with this. There are many ways of doing RCTs, especially in the edtech sector, that are not that costly.

You can be very creative. You can also design those studies in such a way that you don't need to deny access to anyone: encouragement designs, randomized rollouts, things like that. So it's possible to engage, and I don't think you should shy away from it.

And I think it's very important that we answer with rigor the questions that otherwise folks will never agree on. My second piece of advice actually puts some nuance on the first: you need to adapt the kind of evidence you're seeking or expecting to the stage of maturity of the solution you're working with.

Early on, you really want methods of learning that are much more iterative and much faster, that maybe make use of your own data to improve the solution you have. Once you have something mature enough, then maybe it's time for the RCT. And once the RCT shows very good evidence, maybe you don't need to do RCTs over and over again.

Maybe you can use indicators, proxies, to check that you're still getting close to your North Star as you scale. The third piece of advice, maybe something we haven't talked nearly enough about here, but it's really the elephant in the room for edtech in LMIC countries: keep considering access and equity issues when you talk about edtech.

The market incentives, again, are not there for that. But during the pandemic, when we had this unfortunate live experiment where everyone had to move to tech, all the studies, and I was part of one in Côte d'Ivoire, showed that this was really inequitable, especially when you were relying on a single solution that not everyone could access.

So sure, maybe this is going to improve over time, but I don't think it's going away anytime soon. And I think there's an important role for philanthropy in thinking about ways to use edtech, maybe more teacher-oriented ones, that are going to enable equitable access.

John: Okay, I think that's a wrap from my side.

It was a great conversation. 

Anjali: We hope that you enjoyed listening to the conversation. To sum up the discussion on EdTech's potential to bridge the equity gap: we heard that it starts with defining what we mean by equity, and what that would look like in different contexts. We also discussed how early-stage organizations may not always be able to start by serving bottom-of-the-pyramid learners, but they can have an eye towards it from the get-go and build competencies and design elements into their platforms that allow for inclusive participation.

Philanthropy, again, can play a crucial role with innovative funding options, such as providing stopgap capital while the procurement process with the government is underway, and can support edtech organizations in focusing their attention on all learners, not just the ones that can afford it. This podcast was brought to you by the International Education Funders Group, curated and edited by Anjali Nambiar, with post-production by Sarah Miles.

You can learn more about the IEFG at www.iefg.org, and do subscribe to the podcast for more thought-provoking conversations.