Somewhere on Earth: The Global Tech Podcast

Using AI to identify threats to human rights and political activists

Somewhere on Earth Episode 40

Political activities such as hustings, campaigns and voting are well underway in many countries, but alongside these come incidents of reprisals and voter intimidation in certain regions. Ushahidi is an NGO based in Kenya that maps these incidents. It has partnered with Dataminr, one of the world's leading AI companies, which specialises in global risk detection. Dataminr's platform uses public data to identify risks in advance: its AI performs trillions of computations daily, analysing billions of public data inputs from nearly a million sources, spanning text, images, video, audio and other real-time information. Dataminr has helped develop new AI tools for Ushahidi that streamline data collection, improve geolocation and provide real-time translation into local languages, all of which is improving safety for individuals. Jessie End, VP of Social Good at Dataminr, and Angela Oduor Lungati, Executive Director of Ushahidi, are on the show.

The programme is presented by Gareth Mitchell and the studio expert is Wairimu Gitahi.

More on this week's stories:
Dataminr: AI for Good
Leveraging Citizen‑Generated Data In The Age Of AI - And How We're Making That Happen

Support the show

Editor: Ania Lichtarowicz
Production Manager: Liz Tuohy
Recording and audio editing: Lansons | Team Farner

For new episodes, subscribe wherever you get your podcasts or via this link:
https://www.buzzsprout.com/2265960/supporters/new

Follow us on all the socials:

If you like Somewhere on Earth, please rate and review it on Apple Podcasts

Contact us by email: hello@somewhereonearth.co
Send us a voice note via WhatsApp: +44 7486 329 484

Find a Story + Make it News = Change the World

00:00:00 Gareth Mitchell 

Hello world. This is the Somewhere on Earth podcast. I'm Gareth. It is Tuesday the 9th of July 2024 and I'm in London. We have voices from, let's see, look it up on the map here, Nairobi and New York City. So let's jump in. 

00:00:25 Gareth Mitchell 

And with us today for some experting, again, second week running. Lucky me. Lucky us. Wairimu Gitahi. Hello Wairimu. How are you doing? 

00:00:32 Wairimu Gitahi 

Hi Gareth, I'm very, very well. And you? 

00:00:35 Gareth Mitchell 

Good. I'm exceedingly well. And you're always in such fine voice actually, I'd love to know what your secret is. Do you just drink a lot of tea or something? Or is it just you? 

00:00:42 Wairimu Gitahi 

Yeah, a lot of Kenyan tea. But I also laugh easily, so it makes it much easier. 

00:00:49 Gareth Mitchell 

Well, we like you, especially if you laugh at anything I say. Even better if it's intentional. Right, there we are. Off to a good start already. Let's try and keep the mirth going. We do have some serious topics today, but hopefully we can get a little bit of laughter in there as and when we can. Great. Shall we do it, then, Wairimu? 

00:01:06 Wairimu Gitahi 

Yep, let's go. 

00:01:08 Gareth Mitchell 

As they say in podcasting and YouTube videos, let's totally do this, here we go. 

00:01:12 

Yeah. 

00:01:13 Gareth Mitchell 

And it's that laugh again. 

00:01:19 Gareth Mitchell 

And coming up today. 

00:01:23 Gareth Mitchell 

In this year of elections around the world, not only is it a big year for hustings, campaigns and voting, but also in some places, reprisals and voter intimidation. Today, we're meeting one organization that began life mapping such incidents, things like voter intimidation. Now it's become a global nonprofit with a much wider tech for good remit.  And that organization is now working with a large AI company that specializes in global risk detection, and we're speaking to a senior person from that body as well. There's a whole lot of tech for good breaking out today, and it's all right here on the Somewhere on Earth podcast. 

00:02:08 Gareth Mitchell 

All right then, let's start by finding out about Dataminr. Now, this company spells its name without an E, so that's Dataminr, one word, no E at the end, and it bills itself as one of the world's leading AI companies, with a platform that uses public data to detect risks long before other platforms, or indeed any of us, are even aware of these threats. 

00:02:27 Gareth Mitchell 

Dataminr's AI runs trillions of computations every day across billions of public data inputs from nearly a million data sources. It ingests text, images, video, audio and lots of other kinds of live data. They've been working in partnership with Ushahidi. Now, that's the nonprofit that began life in 2008 to map election violence and reprisals, and it's now more widely about improving lives and enriching communities. We'll meet Ushahidi in just a moment. But first, let's say hello to Jessie End, VP of Social Good at Dataminr. 

00:03:01 Gareth Mitchell 

So just a reminder for everybody: we've got a few organization names floating around here. We're going to be meeting Ushahidi in a moment. We're talking about Dataminr now with you, Jessie. So thanks for joining us. How are you? 

00:03:12 Jessie End 

I'm doing well. Thanks for having us. I'm looking forward to this chat and doing so with Angie too, because she's a wonderful partner. 

00:03:18 Gareth Mitchell 

Marvellous. That's Angie, who'll be joining us from Ushahidi, and who's got her little oar in there before we've even introduced her. So there we had a little preview of Angie. So, Jessie, what about Dataminr? I've given a bit of a preview in the introduction, but what else do we need to know about you? What do you do? 

00:03:33 Jessie End 

You've done an excellent job, so rather than repeating what you've already shared, the thing I would add is that we are an AI company that applies AI to publicly available data in order to alert on emerging events as they unfold around the world. 

00:03:56 Jessie End 

I would say the other thing that's worth adding is that social good, while I lead that function at Dataminr, is really part and parcel of the fabric of the company. If you listen to our founder and CEO talk about the origin of Dataminr, it has always been about saving lives. It goes back to him being a young man when 9/11 happened, recognizing the power of publicly available data and imagining what that might be able to do in the service of saving lives. So I have the privilege in my role of taking that mission and running further down the field with it, working with nonprofit and social good organizations in humanitarian response and all sorts of other impact areas. 

00:04:40 Gareth Mitchell 

What do we mean when we say publicly available data then? 

00:04:44 Jessie End 

So at Dataminr, it's all data that you could have access to publicly, right? We're not looking at private tweets or things like that. Our publicly available data ranges from sensor data, so think traffic cameras or air quality sensors, to social media, government blogs, websites, news sites and even the dark web, which comes in mostly in the context of cyber security. 

00:05:14 Gareth Mitchell 

Oh, that's interesting. How trawlable is the dark web? Because I always think of the dark web as the bit of the Internet, the information space, that isn't indexed by a big search engine like Google, for instance. So I wouldn't even know how you go about finding usable data from the dark web. In fact, I've never been on it. Journalistically, I suppose I should have been by now, but I'm too much of a good boy, I suppose. 

00:05:38 Jessie End 

I think that probably makes all of us, and certainly I can count myself in that group. I'm not a technical expert on that, so the how of it I wouldn't be able to explain, but I know that we mostly use it for cyber security incidents and a little bit more around illicit trafficking as well. 

00:05:59 Gareth Mitchell 

Well, that makes sense. If your whole business is about risk prediction and detection, you would want to go to places where bad things might be being discussed or where data is going around. So I can imagine that's an interesting aspect of the story. Let's come up with a few examples then, because so far this might sound quite abstract. There's all this data out there, and I gave some mind-boggling stats in the introduction about the amount of data and the number of computations that you do, but what is it that pops out the other end that is useful to your clients or your partners like Ushahidi? 

00:06:33 Jessie End 

Across our business, no matter whom we're serving, when we look at our alerting, why it's valuable, it really comes down to three things, speed, scope and depth of alerting. And I'll share a little bit more on each of those with some examples to your point to try and make it a bit more tangible. 

00:06:50 Jessie End 

So when it comes to speed, having access to real-time information is critical in helping organizations stay ahead of crises and threats so that they can better protect their people and assets in the field, and, in the case of humanitarian response, their beneficiaries as well. So thinking about things like duty of care. 

00:07:10 Jessie End 

When a disaster strikes, nonprofits have to react quickly and thoughtfully. They have a lot at stake, and the speed of that alerting is really critical. 

00:07:17 Jessie End 

So as an example of that, we work with one nonprofit organization that doesn't actually have boots on the ground. They work through smaller local organizations in the countries that they serve, providing medical aid in crisis settings. 

00:07:37 Jessie End 

And when Russia first invaded Ukraine, Ukrainian refugees were crossing the border into Poland, and their very first signal that this was happening, and of the degree to which it was happening, was through the Polish press. It was actually an article that they found via a Dataminr alert, and through that one article they were able to 

00:08:02 Jessie End 

essentially geocode the border checkpoints and refugee reception centers over a week prior to when that same information was being published by the UN Office for the Coordination of Humanitarian Affairs, which most people know as OCHA. So that speed, that ability to get this local source over a week ahead of the official stats, was really critical to their being able to position aid for the refugees entering Poland. 

00:08:31 Gareth Mitchell 

OK. And I know you mentioned there are these other aspects along with speed; I think you said one of them is depth. So you have these important parameters against which these risks are determined. And I think that's a great example, as that's the kind of thing I was looking for, a kind of early warning, if you like, of that full-scale invasion of Ukraine. I know one of the case studies on your website is around the Baltimore bridge collapse, for instance, with data coming from the vessels and other sources and sensors and webcams and so on. Absolutely fascinating. We will carry on this conversation with you, Jessie. But it would be good also to bring in now Angie, who's with Ushahidi. 

00:09:14 Gareth Mitchell 

So Ushahidi is best known for its crowdsourcing and mapping tool, and I say that because that's how I know of Ushahidi, Angie. This is Angie Oduor Lungati, by the way, who's the executive director at Ushahidi. I remember, back in the day, long before many of us were even thinking about mapping data and data visualizations and geolocation and so on, Ushahidi's work was around reprisals following, or in the run-up to, the elections in Kenya at that time. But now you've branched out quite a lot, haven't you? So give maybe a slightly less garbled introduction to what you do than I've just given. 

00:09:52 Angela Oduor Lungati 

I thought that wasn't garbled, that was pretty good. But yeah, I think I would describe us, and many people do describe us, as a global nonprofit technology organization that's empowering people, through citizen-generated data, to develop solutions that will strengthen their communities or help improve them in one way or the other. We spent some time really thinking about what our purpose is, reflecting on where we've come from and where we're going now, and we've kind of summarized that into this: 

00:10:21 Angela Oduor Lungati 

what we're here to do is to make sure that people everywhere can easily gather data and generate the insights they need to help them tackle the issues that matter most to them. We want people who are rarely heard by decision makers, disenfranchised communities, to share their data in ways that allow them to help each other or get the outcomes that they want. 

00:10:42 Angela Oduor Lungati 

We want decision makers to also have access to data and insights that help them better support the communities they are committed to serving. And we want to be operating in a world where community-generated data sets turn into 

00:10:56 Angela Oduor Lungati 

public goods that can serve as a resource to improve society. That citizen perspective sometimes tends to get lost, and yet it adds so much flavor and better understanding of some of the pressing issues that we are facing today, so we're really just trying to make sure that it's still at the heart of the different conversations that we're working in. 

00:11:17 Angela Oduor Lungati 

So there's still a lot of focus on raising the voices of those disenfranchised communities, but also on thinking about how those raised voices then transform into insights that people can make decisions from. 

00:11:30 Gareth Mitchell 

Yeah. So Wairimu, I guess you will have been covering Ushahidi for some time. I mean, they're an important technology organization in Nairobi and in Kenya. So what have your dealings been with Ushahidi? How well do you know them? 

00:11:47 Wairimu Gitahi 

I first heard of Ushahidi in 2008, during the election time. And as Angela was speaking, I said I need to ask her something. During the elections, the communities were giving information about what was happening in different regions, and I'm wondering: how do you evaluate what is objective and what is not, what is one-sided and what is not, when you're trying to detect whether there's a risk, or when you're mining or collecting data from communities? 

00:12:18 Angela Oduor Lungati 

Mm-hmm. That's a very good question. It's one that comes up every time we talk to anyone who's looking to use a platform to crowdsource. The risk of bias has always been there, but we've put in several checks and balances to try and tackle that. So from a technological perspective, one, nothing goes live on the map unless somebody has reviewed it to make sure that it actually aligns. 

00:12:39 Angela Oduor Lungati 

And even then, for example, in the case of the 2022 elections, we purposely shared some of the things that were tagged as misinformation, just to show people the extent of the pervasiveness of misinformation, say 70% at this point versus 30%. I think that's still valuable. But at bare minimum, we make sure that there are ways you can tag the information as to what is credible and what isn't, what is public opinion versus what is fact. 

00:13:09 Angela Oduor Lungati 

And from a non-technological perspective, leveraging existing partnerships, you know working with groups like the Baraza Media Lab and the Fumbua team to help us really corroborate some of the information that's coming through. So that also pairs up with the technological aspect of how we're flagging things because at the end of the day, we are a small nonprofit organization that's focusing on building the 

00:13:29 Angela Oduor Lungati 

infrastructure. But the people on the ground and some of those civil society organizations already have trust established and know a whole lot more. So that just makes it a much more powerful engagement. 

00:13:42 Gareth Mitchell 

Yeah. And again, just to help people out in terms of what you actually do, what the platform's all about and also the citizen element of it: for instance, there was Storm Daniel, a major storm that hit Greece last September. And I know a bit about it because I happened to be on one of the Greek islands, thankfully for me anyway, 

00:13:55 Angela Oduor Lungati 

Oh wow. 

00:14:02 Gareth Mitchell 

further west from where this storm hit. Yeah. So I wasn't that far away. Thankfully for me, the worst that we had was some very unseasonably wet and crazy weather. I mean, luckily it didn't get any worse than that. But of course, as was reported on the news at the time, further east, 

00:14:17 Gareth Mitchell 

places like Skiathos, for instance, were very, very badly hit with absolutely catastrophic floods. But there were volunteers using the Ushahidi platform there, it's one of the case studies on your website, to gather data about storm water levels and lots of other aspects that they could pool together 

00:14:37 Gareth Mitchell 

and bring all that information to just one place where it could be useful, and map that information. They used the Ushahidi platform to do that, and it aided the relief effort. So that's just one of many case studies that you can be credited with, I guess, at Ushahidi. 

00:14:54 Angela Oduor Lungati 

You've done your homework. I like it. 

00:14:55 Gareth Mitchell 

We do our best. Yeah. And I just happened to be in that region at the time; I hadn't realized that Ushahidi was involved. But anyway, look, it gives an idea of what you do. So what you and Dataminr both have in common, of course, is that you are about pulling together data. In your case, at Ushahidi, this is crowdsourced data, not purely crowdsourced, but that's kind of your USP, I suppose, that I've certainly been aware of, and at Dataminr, it's these public data sources. 

00:15:17 Gareth Mitchell 

So what brings your two organizations together? And I suppose what does that mean, and ultimately, how is that making the world a better place for the rest of us? That's really where I'm going with this. So perhaps we can come back to you, Jessie, at Dataminr: what does this collaboration mean? 

00:15:32 Jessie End 

Yeah. So I might just take a quick step backwards and give you a little context on how this partnership came to be. When I began with Dataminr, or I think shortly after I began, we were starting to hear this term 'polycrisis' 

00:15:52 Jessie End 

being bandied about. And for those listeners who might not be familiar with the term, it refers to the simultaneous occurrence of multiple catastrophic events, and more specifically one where individual crises become increasingly entangled and the resulting impact is larger than the sum of its parts. So you can think about climate change, conflict, food insecurity, migration, threats to public health, 

00:16:20 Jessie End 

and threats to human rights. And I think when we were faced with this growing polycrisis, we recognized that the demands on the humanitarian and social good organizations in that space were only going to increase. So we were looking for 

00:16:41 Jessie End 

additional ways we could actually relieve some of that burden beyond just our alerting. And so in 2023 we launched our AI for Good program, which sits under the umbrella of our social innovation lab. 

00:16:56 Jessie End 

We ran a pilot program with UN Human Rights before that, but in 2023 we launched our official cohort, and Ushahidi was one of the applicants. We released a public RFP, essentially inviting social impact organizations to share their ideas on how AI could make a profound, game-changing impact on the work that they do, and Ushahidi was one of the organizations we selected to work with. 

00:17:22 Jessie End 

And I can let Angie maybe share a little bit more on what their proposal was and what we've been working on. 

00:17:30 Angela Oduor Lungati 

To give some background into how something like the example in Greece that you shared, or some of the election monitoring work, tends to come to life: 

00:17:38 Angela Oduor Lungati 

it tends to be a massive effort, pulling in a lot of different stakeholders and volunteers to help us process the incoming data, with a very intricate workflow: identifying what's coming in from different media sources, then putting it through another layer of geolocating it and translating it, 

00:18:00 Angela Oduor Lungati 

then going through a final level of review to publish and verify it, while also figuring out how to escalate to different partners for response. Every time we have deployed the platform, for instance in Kenya, and many other people who've used the Ushahidi platform will say the same, when you're dealing with hundreds of thousands of data sets, it means that you need hundreds of volunteers, probably spread out across different time zones, to make sure that there is a continuous cycle of data being processed. Right. 

00:18:32 Angela Oduor Lungati 

And while we've tweaked the workflow over the years to reduce the amount of time between the point somebody on the ground sends a message saying 'I'm in trouble' or 'I need help' and the point where it gets responded to, we recognized that there was still an inefficiency there. And so when we saw the RFP from Dataminr open up, it was such a fantastic opportunity for us to think about how we could ease the pain 

00:19:00 Angela Oduor Lungati 

of manual data processing on the platform, so that anyone who's using it to advocate for change and collect data can focus on telling the story rather than on the manual processing of the data. And so we applied for it, we got it, and that was amazing. 
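
The workflow Angela describes, automatic classification, geolocation and translation feeding a human review queue before anything is published, can be pictured as a simple triage pipeline. Here is a minimal sketch, assuming hypothetical function names and trivial keyword stand-ins in place of the partnership's actual models, which are not public:

```python
# A minimal, illustrative triage pipeline: classify, geolocate, translate, then
# hold for human review before publishing. All names are hypothetical; the
# stand-in functions below only mimic the role the real AI models would play.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Report:
    text: str
    category: Optional[str] = None     # e.g. "polling_station_closed"
    location: Optional[str] = None     # place name pulled out of the text
    translation: Optional[str] = None  # English rendering of the original message
    status: str = "pending_review"     # nothing goes live until a human approves it


def classify(report: Report) -> Report:
    # Stand-in for the automatic classification model.
    report.category = (
        "polling_station_closed" if "closed" in report.text.lower() else "uncategorised"
    )
    return report


def geolocate(report: Report) -> Report:
    # Stand-in for the automatic geolocation model: look for a known place name.
    known_places = ["Nairobi", "Tana River", "Skiathos"]
    report.location = next((p for p in known_places if p in report.text), None)
    return report


def translate(report: Report) -> Report:
    # Stand-in for the automatic translation model (identity here).
    report.translation = report.text
    return report


def triage(raw_text: str) -> Report:
    """Run one incoming message through the pipeline; a human still reviews it."""
    return translate(geolocate(classify(Report(text=raw_text))))


if __name__ == "__main__":
    queued = triage("Polling station in Nairobi was closed this morning")
    print(queued)  # category, location and translation filled in; status stays pending_review
```

In practice each stand-in would call a trained model, but the shape of the flow, and the fact that a human reviewer still signs off before publication, is the point of the partnership described here.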

00:19:17 Gareth Mitchell 

And so you mean the platform would sort all that out for you, effectively. So you want the lowest possible friction for getting onto the platform and sharing the data? 

00:19:25 Angela Oduor Lungati 

Not only to get onto the platform and share it, but also for the people who are trying to organize that data and make sense of it. So for example, when I send in a text message and 

00:19:34 Angela Oduor Lungati 

say polling station X was closed, and Wairimu says polling station Y was closed, that's helpful. But then you also want to see it at scale: that 10, 20, maybe 100 other people are saying the same thing. And it's valuable for people to come onto the platform and see how many people are reporting 

00:19:54 Angela Oduor Lungati 

incidents of polling stations being closed, and where they are located. The process of figuring that out, extracting that information, properly categorizing it, classifying it and verifying it, takes hundreds of people to manage the thousands of reports that are coming in. 

00:20:13 Gareth Mitchell 

Yeah, because it's so important to verify this data, especially during politically heightened times of national tension. And Wairimu, how do people respond then? What's your experience in Kenya of people using Ushahidi and indeed looking at it as an information source? How important is it? 

00:20:37 Wairimu Gitahi 

I think it's very important because, as Angela was saying, you're able to get information from various places. But allow me to ask you about something else that I've been observing, and I keep wondering how Ushahidi deals with it. And probably Jessie can answer this too, because it's all about mining data, particularly from communities. 

00:20:58 Wairimu Gitahi 

We're living in a world where privacy concerns are very key, and even if someone gives information and doesn't really reveal who the person is, and so on and so forth, there's still an aspect of revealing something that would otherwise be personal. So I'm wondering how you deal with that, and also about the barriers involved. 

00:21:20 Wairimu Gitahi 

Because I'm thinking, during the Kenyan elections there was, for example, some violence going on in very remote areas. And of course you need a certain level of technology to be able to share this information. So could we say that some of the information from people who did not have the facilities to share it wasn't captured? And of course that also brings up a problem of objectivity. 

00:21:48 Angela Oduor Lungati 

Jessie, let me start, and I'm going to tackle that last question you asked, around whether there are people who are left out. 

00:21:56 Angela Oduor Lungati 

That is something we've factored into the design of how the platform works from the very beginning. We recognize that we're trying to raise the voices of people who tend to be left out of critical conversations, the world's most disenfranchised communities. And that means that we have to design the tools in ways that meet them where they are. 

00:22:15 Angela Oduor Lungati 

And the way that we have done that, or at least did it from the very beginning, was to start with the most ubiquitous tool that people have access to: the mobile phone. My mother has it. My grandmother has it. My grandfather and my father do as well, right? It's the lowest barrier to entry, or the lowest means of being able to engage, allowing people to share information. 

00:22:34 Angela Oduor Lungati 

But of course, just to try and widen that space, people are also able to share reports via smartphone applications and via the web. They're also able to do that via email and via X currently. And we've recently also been integrating WhatsApp and USSD as lower-cost options to SMS. 

00:22:55 Angela Oduor Lungati 

So we are also constantly thinking about the fact that the way people are showing up is different and it's constantly evolving. So how do we consistently make sure that we're meeting them where they are so that we are lowering those barriers of interaction, or 

00:23:08 Angela Oduor Lungati 

at least from a technological perspective. The second thing, of course, is making sure that they know about the existence of a tool that allows them to share. The interesting thing is that when you talk to many people, they probably don't know about Ushahidi itself. They'll know that they sent an SMS to this short code number 

00:23:28 Angela Oduor Lungati 

because partner Y, it could be the Constitution and Reform Education Consortium Kenya (CRECO) or Baraza Media Lab, told them: if you share a message here, something different is going to happen. So most of the time the respondents, the people on the ground, will not know who Ushahidi is; they will know the partner that is using Ushahidi to help raise their voice. 
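
The multi-channel intake Angela describes, SMS short codes, WhatsApp, USSD, the web, email and X all feeding the same map, comes down to normalising each message into a common record before it enters the review workflow sketched earlier. A minimal sketch, with hypothetical channel adapters rather than Ushahidi's actual integrations:

```python
# Illustrative only: normalise reports from different channels into one record.
# Field names and payload shapes are assumptions for the sketch, not a real API.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class IncomingReport:
    channel: str        # "sms", "whatsapp", "ussd", "web", "email", "x"
    sender: str         # phone number, handle or email, kept only for follow-up
    text: str
    received_at: datetime


def from_sms(gateway_payload: dict) -> IncomingReport:
    # e.g. a message forwarded by an SMS gateway for a short code campaign
    return IncomingReport(
        channel="sms",
        sender=gateway_payload["from"],
        text=gateway_payload["body"],
        received_at=datetime.now(timezone.utc),
    )


def from_web(form_fields: dict) -> IncomingReport:
    # e.g. a submission from a public web form
    return IncomingReport(
        channel="web",
        sender=form_fields.get("contact", "anonymous"),
        text=form_fields["description"],
        received_at=datetime.now(timezone.utc),
    )


if __name__ == "__main__":
    # Whichever channel a report arrives on, it lands in the same queue and goes
    # through the same classification, geolocation, translation and review steps.
    queue = [
        from_sms({"from": "+254700000000", "body": "Polling station X was closed"}),
        from_web({"description": "Flooding near the river crossing"}),
    ]
    for report in queue:
        print(report.channel, "->", report.text)
```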

00:23:50 Gareth Mitchell 

Right. And just a thought for you, Jessie, as well then. Is there a bit of a law of unintended consequences around, say, data protection legislation? Because one of the points that Wairimu made there to Ushahidi was around data privacy, for instance. So is it possible that in some jurisdictions, well-meaning attempts to protect people's data take what could be very useful data out of the public sphere and rather impoverish, make your models less useful in that way? 

00:24:24 Jessie End 

Yeah, thank you for the question. I think it's a really important one to frame, especially in the context of the groups that we work with, right, humanitarian response. There's plenty of sensitive data there. The thing about Dataminr is that we are, at our heart, an event detection platform. So we're not looking for trends, sentiments, things about people's personal lives. Our AI is trained to identify when an event is happening. All that other information that might be 

00:24:54 Jessie End 

in there, our AI sees as irrelevant, so it would never show up in our platform, because it is beyond the scope of what our platform is designed to do. Obviously, the added benefit is that we don't have to worry about those privacy things in the same way. That said, we are a tech company and we are compliant with GDPR and all the other regulations that are out there, which we believe 

00:25:21 Jessie End 

are important protections to have in place, so we comply with all of those. And I will also say, from what I've heard from our partners, we go above and beyond that. But the biggest thing that we have going for us is what we're designed to do as a platform, right? 

00:25:38 Gareth Mitchell 

OK. And so just finally then, as VP of Social Good, tell us: I slightly flippantly said earlier that we're going to find out how you're making the world a better place. If it's not been evident already, where, for you, as a parting thought, is the social good around this AI? 

00:25:55 Jessie End 

So for me, it's really always about the work that our partners are doing. Whether that's an AI for Good partner like Ushahidi, or UN Human Rights, who are using our First Alert platform to monitor attacks on human rights defenders, it is about the work that our partners do, and so we're actually very careful in how we select our social good partners. 

00:26:19 Jessie End 

And our platform is essentially just a tool to enable them to do their jobs better, more effectively, and to keep people safe. So the social impact of what we do really is just in enabling partners that are doing great work. 

00:26:36 Gareth Mitchell 

OK, so in your case then, Angie, over at Ushahidi, it's to help you achieve your mission: as you say on your website, to generate solutions and mobilize communities for good. And I'm not just reading out your slogan to promote you. I'm a journalist, I'd never do that. 

00:26:50 Gareth Mitchell 

But I suppose I'm throwing it back at you, just to challenge you really in your final reply about how that's happening. It sounds lovely, but what does that mean for communities? Perhaps give me one example where you can say, look, this is a community where, without this data, there would have been election reprisals, people could have got hurt, or 

00:27:10 Gareth Mitchell 

we could get law enforcement to polling stations when people were being beaten up outside, or there was this natural disaster and we had some insights and we managed to prevent something much worse. Just give me an example of how you're going about this rather grand-sounding mission statement. 

00:27:25 Angela Oduor Lungati 

One, I'll first speak to how the partnership with Dataminr is really supporting and helping us. We've managed to come up with three artificial intelligence models through this partnership: one that supports automatic classification of data, another that helps us with automatic geolocation, and another with automatic 

00:27:46 Angela Oduor Lungati 

translation of the data. That is already proving to help us process a lot of information much faster. We haven't published any reports around this yet, but we're actively using it right now to track the protests that are going on in Kenya. But to your point around how we're trying to achieve our lofty goals, there are so many examples that I can use around 

00:28:09 Angela Oduor Lungati 

how being able to raise voices has been extremely valuable and has helped to effect some level of change. Whether it's in Nigeria back in the day, where the tool really helped to increase voter turnout because people were engaged, or using it with a local community in Tana River that's trying to understand energy consumption patterns in an area that tends to be ravaged by climate change, and being able to surface the fact that nursery tree 

00:28:36 Angela Oduor Lungati 

farmers have been getting taxed more than people who are cutting down trees to sell charcoal, right? And that is now being tabled with the local government for them to figure out how to shift this, potentially leading to a different conversation around how to address climate change in the area. Or the platform being used to raise the voices of women who were being harassed on the streets of Cairo, Egypt, years ago, shifting the conversation from 'is this happening or not' to 'how do we change this narrative, how do we shift behavior and start to have conversations around what needs to change'. There are so many. 

00:29:11 Gareth Mitchell 

Real life, that's what I was after, just those real-life outcomes. I love the slogans, don't get me wrong, but I'm glad that we got to some real-life examples there. Look, we'll leave it there, everybody. That's been absolutely fascinating. Thank you for that discussion, Angela Oduor Lungati, who's executive director of Ushahidi, joining us there from Nairobi. 

00:29:31 Gareth Mitchell 

And also from New York City, we have had the pleasure of Jessie End's company. She's VP of Social Good with Dataminr. And Wairimu, of course, is our journalist, expert commentator and all-round good person and friend in Nairobi as well. That'll do for today. A big warm hug from all of us here in the studio. That's Dylan and Keziah doing the sound, Liz doing the production managing, Ania doing the editing and producing, and me just turning up and waffling on and then trying to keep it vaguely coherent. So thank you, everybody, and we'll see you next time. Bye bye. 

00:30:00 Wairimu Gitahi 

Bye 
