Ethical AI Podcast

Episode 4: AI in the Classroom of the Future

October 17, 2022 | Season 1, Episode 4
The IU Center of Excellence for Women & Technology

A discussion with Krista Glazewski and Cindy Hmelo-Silver on the ways AI can be used to create optimal K-12 classroom environments that empower students and teachers to support learning, problem-solving, and collaboration.

Krista Glazewski
Professor of Instructional Systems Technology and Chair of the Instructional Systems Technology department in the IU-Bloomington School of Education

Cindy Hmelo-Silver
Distinguished Professor of Learning Sciences and Barbara B. Jacobs Chair in Education and Technology in the Bloomington School of Education. She's also Director of the Center for Research on Learning and Technology and co-principal investigator for the NSF AI Institute for Engaged Learning.

Transcript

INTRO MUSIC 1

 

Laurie Burns McRobbie:

Welcome to Creating Equity in an AI-Enabled World: conversations about the ethical issues raised by the intersections of artificial intelligence technologies and us. I'm Laurie Burns McRobbie, University Fellow in the Center of Excellence for Women and Technology. Each episode of this podcast series will engage members of the IU community in discussion about how we should think about AI in the real world, how it affects all of us, and, more importantly, how we can use these technologies to create a more equitable world.

 

INTRO MUSIC 2

 

Laurie Burns McRobbie:

AI is not just for grownups. In fact, some of the most adept users of AI technologies like voice recognition are young children; my five-year-old grandson is an example. He got fed up with Siri one day and yelled, "Siri, will you shut up?" To which Siri replied, "That's not a very nice thing to say." John's response: "Siri, will you please shut up?" Applying human communication expectations to artificial intelligence. Well, my grandson's generation is growing up with these technologies in their classrooms and play spaces. What are the implications of AI in these environments? How are they being used today? And what is the potential for how they could be used to enhance learning? That's the conversation we're going to have today with two IU professors of education, Krista Glazewski and Cindy Hmelo-Silver. Krista is Professor of Instructional Systems Technology and Chair of the Instructional Systems Technology Department in the IU Bloomington School of Education. Cindy is a Distinguished Professor of Learning Sciences and the Barbara B. Jacobs Chair in Education and Technology in the Bloomington School of Ed. She's also Director of the Center for Research on Learning and Technology and Co-PI for the NSF-funded AI Institute for Engaged Learning. Both Krista and Cindy are currently working on several projects involving artificial intelligence in K-12 classrooms. In other words, AI in the classroom of the future. Welcome, Krista and Cindy.

 

Krista Glazewski:

Thank you for having us.

 

Cindy Hmelo-Silver:

Thank you so much.

 

Laurie Burns McRobbie:

I'd like to start by having both of you talk about the vision you have for optimal K-12 classroom learning environments. All of your efforts to work with AI tools start with that vision in mind, right?

 

Cindy Hmelo-Silver:

So I think our vision of a classroom of the future is one where kids are actively learning, trying to solve problems, learning how to use information critically, and, in general, being active and working collaboratively. And we think about how AI can help make those kinds of learning experiences that allow kids to engage deeply in academic content in the service of solving problems. But how can AI help us do this in a diverse range of classrooms?

 

Krista Glazewski:

Yeah, and I would add to that, you know, when we think of the classroom that we're imagining, we're imagining a classroom where there's lots of collaborative engagement, where there's lots of activity, where kids are talking about ideas and negotiating problem spaces, where the kinds of problem spaces that they're negotiating and working toward feel consequential, where they feel like their ideas matter, where they feel the freedom to bring their whole sense of self to the endeavor, where they feel like, "this matters to me, and I matter in this environment." And so that's the classroom we're imagining. Sometimes when we picture classrooms, we might picture the very common setting where there are desks in rows and a teacher at the front of the classroom. And we might have experienced more of a chalkboard environment; kids these days might have a whiteboard, but fundamentally, those roles have remained unchanged for decades. But at the same time, there have always been educators who have imagined placing kids and their interests and their ideas at the center of problems, and engaging them in work that feels consequential, meaningful, and relevant to their own lives.

 

Cindy Hmelo-Silver:

And I think even beyond that, we see the role of AI in these kinds of settings as part of a system of supports. So we see the role of the teacher as still being critically important. So we want to think about: what can we do to help the kids, what can we do to create these consequential kinds of learning settings, and how can we support the teachers to support the kids? So we see it as sort of this system of contexts and supports at both student and teacher levels.

 

Laurie Burns McRobbie:

So, putting the student and the teacher at the center?

 

Krista Glazewski:

Yeah, and you know, Cindy and I use this phrase, we've written about this idea, we have kind of created an umbrella phrase called ambitious learning practices. And what we're getting at with this idea of ambitious learning practices is really changing the nature of how we envision the relationship between teaching, learning, and the materials of the learning environment. So in that context, AI can function to help create spaces that are engaging and immersive, in an immersive digital problem world or problem space. And AI can also serve a role in supporting learners to do the complex collaborative work that we're talking about.

 

Laurie Burns McRobbie:

You mentioned ambitious learning practices. Can you say more about that? And maybe offer us an example?

 

Krista Glazewski:

Yeah, sure. So we would define ambitious learning practices as a constellation of practices that really help define what the role of the teacher is and what the role of the learner is. In this context, the role of the learner is, again, to address and solve some sort of complex problem that feels meaningful, important, and relevant to their lives. It also offers opportunity for collaborative engagement, and for students to really enact their own agency, so that they have many pathways toward meeting success. And in this context, the role of the teacher is really one of facilitation: to keep reminding students of what the problem space is and what they're working toward, to help them go deep when they need to go deep, and also to offer suggestions for support when they need more support, assistance, or help in the environment.

 

Cindy Hmelo-Silver:

But of course, this is also challenging for the teacher, because they're juggling lots of balls in the air. So when we try to think about how we support both the student and the teacher, we think about it with this in mind. One example of a project we're working on now is with the software environment Crystal Island: EcoJourneys. This software is a multiplayer game that situates students on an island in the Philippines, where fish are dying on a fish farm, and the learners need to help the local population investigate this. And it's situated very much from the kids' perspective: we did focus groups with kids, and they wanted to be kids who were doing a field experience; they didn't want to be scientists, or journalists, or any of the other things that we thought might be interesting. So they need to gather information, scientists in the game ask them what kinds of information they might have, and they have to use equipment to analyze samples that they can collect in the game world. But we also try to provide support to help them collaborate well, so we have places where the multiple children in the group come together to share their findings. And we're trying to provide support that will adapt the kinds of assistance it gives to the students based on how they're doing, but we're thinking about how they're doing as a group-level phenomenon. So we want to help them collaborate, and at the same time, we're trying to present information to the teacher. So for things where the system can provide some advice to the kids, we can do that; there are other places where it might be useful to alert the teacher to what's going on, and maybe give the teacher some advice to help them in orchestrating the classroom environment.
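
To make this concrete, here is a minimal sketch, in Python, of the kind of group-level support routing Cindy describes, where some situations trigger a nudge to the students and harder ones become an alert for the teacher. Everything here (the GroupState fields, the thresholds, the messages) is invented for illustration and is not from the EcoJourneys system:

```python
# Illustrative sketch only: a toy version of group-level support routing.
# All names, fields, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class GroupState:
    group_id: str
    shared_findings: int      # findings posted to the group's shared board
    idle_minutes: float       # time since the group last acted in the game
    off_track_score: float    # 0..1 estimate that the group has lost the problem thread

def route_support(state: GroupState) -> tuple[str, str]:
    """Decide whether the system nudges the students directly or alerts
    the teacher, based on group-level (not individual) signals."""
    if state.off_track_score > 0.7:
        # Hard cases go to the teacher, who orchestrates the classroom.
        return ("teacher", f"Group {state.group_id} may have lost the problem thread; consider checking in.")
    if state.idle_minutes > 10:
        return ("students", "Try analyzing one of the water samples you collected.")
    if state.shared_findings < 2:
        return ("students", "Share what you've found so far on the group board.")
    return ("none", "")

print(route_support(GroupState("A", shared_findings=0, idle_minutes=3.0, off_track_score=0.2)))
```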

 

Laurie Burns McRobbie:

So what you're talking about here is an environment in which teachers can respond to a diverse range of learners. When you think about equity issues in the context of learning, you know, kids are different, and they are going to arrive at a solution to a problem, or something else in a lesson, at different speeds, at different times, and so forth. And so systems have to be responsive to that. When we think about the current environment, Krista mentioned how we really haven't changed this paradigm of kids in rows of desks and the teacher in the front. And there's already technology in current classrooms. Can you talk a little more about what the shortcomings are, particularly from this standpoint of meeting the needs of a diverse range of learners?

 

Krista Glazewski:

I would start by saying that one of the fundamental things we have to consider is: what do we want learners to show us about what they know, and how do we want them to do that? And so this asks the question of how expansive we can be. It would be really tempting to channel learners, because we have the types of technologies where we could assess them, or run diagnostics that would give us information; we might be tempted to lean into those kinds of technologies, when in fact what we're arguing is actually the opposite. We're arguing for expansive ways of looking at what children know and can do, and how they can show us what they know and can do. We're also advocating for the kinds of learning experiences that are going to feel really messy, so that the kids aren't channeled, and the analytics aren't just collected about them so that we can assess what they know and have this really diagnostic picture of what they do. Instead, we think really expansively about how they can engage in learning environments and the ways that they can show us and enact their learning, whether that's through the kinds of problem solving that they do and the kinds of artifacts they can produce, and whether those artifacts are written or take some other form. So what we are imagining is the kind of learning environment that feels really generative, and again, kids are at the center producing the work. I think some of the shortcomings, in addition to not wanting to just channel learners, are that we also want to have designed environments, like EcoJourneys, that present this kind of immersive world for kids to engage in, one that can feel narratively very important and very rich for them, where at the same time the actions that they take might trigger the kinds of supports we're talking about, where AI might offer support in certain kinds of ways and help groups go more deeply with their work and their ideas. One of the main shortcomings is that we don't have a lot of data models to work from. So when we talk about NLP, for example, natural language processing might be able to capture what a group is doing, and it might be able to offer really targeted kinds of support that would help a group and scaffold their learning. But we don't have the kind of data models that are rich enough to create those kinds of supports, to trigger the kinds of actions that would feel really rich and be that kind of just-in-time support.
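
As one way to picture the NLP piece Krista mentions, here is a deliberately crude sketch of talk analysis triggering a just-in-time scaffold. A real system would use trained language models and validated categories of talk; the keyword list, threshold, and prompt below are hypothetical stand-ins:

```python
# Illustrative sketch only: a keyword heuristic standing in for the kind
# of NLP that might classify group talk and trigger a scaffold.

REASONING_MARKERS = ("because", "so that", "evidence", "what if", "i think the cause")

def is_reasoning_move(utterance: str) -> bool:
    """Crude proxy for whether an utterance shows explanatory reasoning."""
    text = utterance.lower()
    return any(marker in text for marker in REASONING_MARKERS)

def suggest_scaffold(group_chat: list[str]) -> str | None:
    """If too little of the group's recent talk involves reasoning,
    return a prompt nudging them to explain their thinking."""
    if not group_chat:
        return None
    reasoning_ratio = sum(is_reasoning_move(u) for u in group_chat) / len(group_chat)
    if reasoning_ratio < 0.2:
        return "What evidence from your samples supports that idea?"
    return None

chat = ["look at the fish", "click that one", "lol", "ok done"]
print(suggest_scaffold(chat))  # prints a nudge, since no reasoning talk was detected
```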

 

Cindy Hmelo-Silver:

I think another issue, too, is not only not having enough data for the data models, but also making sure that we do our work in diverse populations, so that we understand how data might be interpreted differently in different cultural contexts, you know, urban versus rural, or just in different kinds of settings.

 

Laurie Burns McRobbie:

Right, right. So there are probably third-party systems and software in classrooms now that, as you say, don't have these adequate models. And one of the issues that's come up in other conversations is that as companies start to develop these products, they're training them on datasets that are derived from past experience, or from limited cases. And so they really are limited, I think, in how much they can actually do. That is echoed in other environments as well.

 

Krista Glazewski:

Yeah, and I think that when third-party companies are creating solutions for classrooms, it's very much with this kind of assessment, diagnostic use in mind. And that's maybe narrowly conceived, and it might be the kind of issue or problem that is easier to grapple with. In the cases that we're talking about, what this really involves for us is partnership with teachers and trust building over time, and really engaging with communities for many years in a row, before we can have the kind of partnership that enables us to be present in classrooms and have these tools and technologies in classrooms where there's trust in what we're doing. Because we're asking parents to consent to allowing their students to basically be instrumented, right? We can collect voice, we can collect text, we can collect in-game actions. And this kind of instrumentation also inherently means that people place a level of trust in what we're doing.

 

Cindy Hmelo-Silver:

So issues of privacy and bias come up both in the actual data we're collecting and in who we're collecting data from.

 

Laurie Burns McRobbie:

So these are risks we're dealing with right now, but they're also risks going forward, as you think about the development of these kinds of tools, and the degree to which, I assume, there will still be partnerships with outside companies who are going to sell to school districts and so forth. And yeah, the whole issue of tracking kids, of course, is front and center, certainly for parents. I'm wondering if there are also issues, across the diversity of school districts in the country, where some of the better-implemented solutions will be adopted by wealthy school districts, and less wealthy school districts will not have them and may have to continue to rely on the kinds of tools that you're seeing now, which are more prescriptive. What are you seeing happen there?

 

Cindy Hmelo-Silver:

So I guess what I'd say is that the kind of stuff we're developing, I think we're committed to making freely available, when it's ready for that. I think some of the vision of this, when we started, was the notion of being able to support teachers who might be less experienced, and who might not be in the school districts that have lots of professional development, by providing just-in-time support for teachers, so that we might be able to better serve the needs of diverse learners in diverse school districts.

 

Krista Glazewski:

Yeah, I would add to that that part of our vision is AI-enabled learning that puts analytics in the hands of the teacher, so that the kinds of analytics we might be able to collect in the classroom can be made actionable by the teacher, and consumable by the teacher, so that a really informed look at where a group is, or where an individual student is, really could empower the teacher. And a teacher can look and say, "Oh, okay, for this group, at this status and where they're at, this is actually a really good place." And for a group at a very similar place, a teacher might look at that same group and say, "Oh, unacceptable, because I know that that group needs to go much more deeply with their thinking, with their reasoning, and with their understanding." So we really envision a kind of classroom where a teacher is empowered to make the kinds of decisions about teaching and learning informed by analytics, but not directed by the analytics.
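
A tiny sketch of what "informed by analytics, but not directed by the analytics" might look like in practice: the system describes where a group is and deliberately makes no recommendation. The field names and numbers here are invented for illustration:

```python
# Illustrative sketch only: analytics surfaced to the teacher as
# information, not as a prescribed action. Field names are hypothetical.

from dataclasses import dataclass

@dataclass
class GroupSnapshot:
    group_id: str
    stage: str               # where the group is in the investigation
    reasoning_ratio: float   # share of recent talk showing explanatory reasoning

def teacher_view(snapshot: GroupSnapshot) -> str:
    """Describe where a group is; deliberately make no recommendation,
    because the same status can be fine for one group and not another."""
    return (f"Group {snapshot.group_id}: stage '{snapshot.stage}', "
            f"{snapshot.reasoning_ratio:.0%} of recent talk is reasoning talk.")

# Two groups with identical analytics; the teacher, who knows the kids,
# may judge one as on track and the other as needing a push.
for g in (GroupSnapshot("A", "collecting samples", 0.25),
          GroupSnapshot("B", "collecting samples", 0.25)):
    print(teacher_view(g))
```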

 

Laurie Burns McRobbie:

Are there other things you're worried about, as you look ahead to how these systems evolve? Particularly, you know, Cindy, you mentioned issues around data privacy, which are always crucial, but are there other things you worry about?

 

Cindy Hmelo-Silver:

I think one thing that I worry about is: how do we create narratives that will speak to a range of different children, that are going to be relevant for learners where they are? And trying to figure out how you support ambitious learning practices across a curriculum. You know, it's easy to do a one-shot experience, but trying to think about how we do this more broadly, and what the implications are for the kinds of development that we need to do. I think those would be two of the risks, or challenges, that I would think about.

 

Krista Glazewski:

I think these emerging technologies are at a point where we don't yet know what's possible in an everyday classroom. And I worry that we might become too familiar with the technologies, and not critical enough about what gets instrumented in classrooms, what kinds of analytics are collected, and what they're used for. I also worry that we might be working from impoverished data models that are not rich enough to capture the range of experiences that are represented in classrooms across the country today. And then we also worry that we would be really freely giving over data to third parties, and that would not be the best use of a child's information. So I worry that we might trust the technologies too much, in much the same way that we've been really apt to give over our data to social media platforms, or to online companies, in exchange for use of services. I mean, we have consented to that implicitly. But I worry, with children's data, that it can be used to create models that are not rich enough to capture the range of learning experience, and I'm worried, with children's data, that it would be used to track children in some way. So I think we should be rightly skeptical about what's collected, why it's collected, who's collecting it, what it can be used for now, but also how it gets used in the future, so that there are plenty of protections in place around children's actions and work and activity in the classroom, and so that it's really clear and explicit to parents what it's used for, how it can be useful, and how it can help learners. Because I think that ultimately there's a lot that can be done to really enrich the learning experience, and that's the goal here, but not to track learners into a certain pathway or into a certain channel.

 

Laurie Burns McRobbie:

And I think one can certainly observe that there are trends, and maybe these happen at all times, but they certainly seem to be present now, where there's a desire to do just those things that you're worried about: a trend that is prescriptive, that says we need children to be in this narrow lane. And what I'm hearing you talk about is really not only a child-centric, or a learner-centric, but a teacher-centric model, where a teacher is in a position to control those environments, and maybe even develop narratives on the fly, possibly, depending on what is going on in the classroom and who he or she is actually coming in contact with in terms of the learners. Is that a fair way to think about this?

 

Krista Glazewski:

I mean, this would be a good point to talk about the vision of the AI Institute for Engaged Learning, because it's very much in line with what you've described.

 

Cindy Hmelo-Silver:

So some of what we expect to be doing is to be able to adapt narratives, and also to have both the bigger multiplayer games and shorter forms that can be developed more quickly, so that we can begin to create narratives that cover a whole range of content and contexts. And there are AI techniques, like procedural content generation, that I think can be used to adapt narratives and generate stories, not necessarily from scratch, but by adapting stories and narratives that are out there.
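
As a toy illustration of adapting an existing narrative rather than generating one from scratch, here is a minimal template-based sketch in the spirit of procedural content generation. The story skeleton and slot values are invented for the example; real PCG systems are far richer:

```python
# Illustrative sketch only: a template-based generator that adapts an
# existing story skeleton. The skeleton and slot values are invented.

import random

# A fixed narrative skeleton (the "story that's out there") with open slots.
SKELETON = ("Something is wrong at the {site}: the {species} are {symptom}. "
            "Your team's job is to investigate, starting with the {first_clue}.")

# Slot values that could be swapped to match different classrooms and contexts.
SLOTS = {
    "site":       ["fish farm", "community garden", "local creek"],
    "species":    ["tilapia", "tomato plants", "mayflies"],
    "symptom":    ["dying off", "wilting", "disappearing"],
    "first_clue": ["water samples", "soil readings", "trail camera photos"],
}

def adapt_narrative(seed: int | None = None) -> str:
    """Fill the skeleton with one consistent set of slot choices.
    Picking a single index keeps site/species/symptom/clue coherent."""
    rng = random.Random(seed)
    i = rng.randrange(len(SLOTS["site"]))
    return SKELETON.format(**{slot: values[i] for slot, values in SLOTS.items()})

print(adapt_narrative(seed=1))
```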

 

Laurie Burns McRobbie:

Right, and that makes sense for what's actually going on with those particular learners in that particular classroom.

 

Krista Glazewski:

Yeah, and in this vision for narrative-centered learning, one of the collaborators on the project likes to remind us how much of our human activity and experience is organized around narrative. Narrative isn't just the stories that we tell that relate our own experiences; it's also the stories we tell about who we can become and how we can become that. And so one of the collaborators likes to say that, in this way, we are taking one of our oldest forms of technology, narrative storytelling, and we are combining it with one of our newest forms of technology, AI-enabled teaching and learning.

 

Laurie Burns McRobbie:

That's wonderful. One of the things I want to make sure we talk about in this whole podcast series is, you know, that AI isn't just something we have to be concerned about. We need to be concerned about it, certainly, but it is an enabling set of technologies, and we should be thinking about how to develop them and implement them in ways that really enhance human functioning. And that's really what you're speaking to; it's very exciting. What would you say are the investments that need to happen? What needs to be out there to make it possible for us to realize what you're talking about, whether financial or policy investments? Obviously teacher training investments. What else might you point to?

 

Cindy Hmelo-Silver:

It feels like a combination of all of those things. I mean, we need funding for research and development. We need funding to support teachers, and particularly to support teachers as partners in the design endeavor. They know kids better than we do, and they are the best adaptive support that we have, and that our children have. And so investing in teachers and helping teachers be part of the design process is really critical. I think policy needs to understand both what AI can do and what it can't do, and, you know, what the appropriate roles for technology are. And I would say that's probably across technologies, not just AI.

 

Krista Glazewski:

And I would add to that, you know, one of the problems we run into in schools, honestly, is infrastructure issues. These AI-enabled kinds of learning environments and supports rely on high-quality machines that are really fast, on devices that function and can handle these kinds of ambitious new technologies, but they also hinge on really robust networks. And believe it or not, we like to think that this has been solved; we like to think that schools are all connected and kids have one-to-one devices, meaning that every kid in a school has a device. But that's not to say that all devices are the same, or that all networks are the same. So we've run into issues, just in our daily work with classrooms and our partnerships with teachers, where infrastructure is a barrier. And I actually hate to say this, because I used to think, oh, we left the digital divide behind; right around the 2010s, we started to feel like we'd kind of solved this digital divide issue. But we're seeing now, with what these AI-enabled technologies require, this kind of second wave of a digital divide: some schools are wonderfully connected and their devices are wonderfully up to date, and others are not. So, in summary, one of the investments we need is infrastructure: infrastructure that supports devices, really robust networks, and security and access to the kinds of technologies that we're talking about.

 

Laurie Burns McRobbie:

Yeah, I think we may have seen some of the downsides, and the pandemic may have exposed some of this: just the array of devices and connectivity that students had, obviously, coming from home. But even in school, some school districts are able to support their remote learners better than others do. So we're really very much dealing with these issues, even without thinking about adding new technologies to the classroom.

 

Krista Glazewski:

Yeah, Laurie, and I'm actually really surprised that this is not taking more of a center stage. I do think you're right, the pandemic has exposed, you know, the wide range, and really helped us to see where some of these inequities do exist. And yet I really have seen states not answer the call; there hasn't been kind of a unified call to really address what some of these network and technology issues are, just from an infrastructure standpoint.

 

Laurie Burns McRobbie:

So I'm going to turn to maybe another aspect of thinking about AI in the classroom of the future, or AI in general in our lives, which is teaching about the technology itself. So we now have, I forget what year this went into effect, maybe 2018, a requirement in Indiana that computer science be taught in Indiana public schools. And that's now being folded into the curriculum, and we're starting to see the teacher training that needs to happen so that teachers are equipped to teach computing concepts. And of course, when we think about AI technologies, that's just a natural extension of that. Are you thinking about how K-12 students in classrooms might learn about these technologies, and what that curriculum itself might look like?

 

Cindy Hmelo-Silver:

So we have been thinking about that. We have a project, Primary AI, that Krista is the lead on, working with upper elementary school students.

 

Krista Glazewski:

Yeah, this project was conceived a few years ago, and we're kind of in our final stages of wrapping it up. But this project asks the question: how can we teach kids about AI? And we're targeting kids in the upper elementary grades, grades three through five. Again, we're talking about an immersive story world where kids learn about a problem, and in this case it's a conservation problem; they are immersed in a world with a very real problem. The yellow-eyed penguin on the South Island of New Zealand is endangered, and it's also a notoriously shy penguin. So any attempts to investigate and learn more about the penguin and what the threats are can actually cause more harm, because the penguins might abandon their nests; they might get scared off. And so in this context, we ask kids to develop and program an autonomous robot, a roving penguin, or robo-penguin, that goes into the context. Of course, it's all simulation-based, and the animation is really cute. And so they program this using concepts and practices from AI, from AI planning, computer vision, and machine learning, to go and investigate: what are the threats experienced by the penguins, and what can we do to address some of these threats? At the same time, kids tackle a conservation problem in their own local environments. So you might be wondering, why would a kid in Indiana care about the yellow-eyed penguin in New Zealand? We have actually worked extensively with teachers and co-designed with teachers, and I want to just give a shout-out to them, because they've been so wonderful to work with. This is Chrissy Pablo at Lakeview Elementary School, Jill Toi at Ligotti, and Amanda Moore at Center Grove. They've been working with us for a number of years now to create materials and activities that will feel really relevant to kids. What the teachers do is install trail cameras local to their site, capture the wildlife there, and start asking questions and conducting their own investigations, local to their school site, about what kinds of wildlife are present: where do we have concerns, and what do we know about the habitats? And so again, this is very much from the ecosystems conservation standpoint, trying to get a sense of our own local environments: what's here, what are the threats, what's endangered, and what's invasive.

 

Cindy Hmelo-Silver:

As well as, perhaps, using some of what we call unplugged activities, to think about how the computer is doing this, but in low-technology kinds of ways. So, how would they recognize the animals they're trying to look for on these trail cameras?
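
As an illustration of what such an unplugged activity translates to in code, here is a hand-built decision procedure for recognizing trail camera animals, the sort of rule set kids might write on paper before contrasting it with what a machine learning classifier learns. The features and species are invented for the example:

```python
# Illustrative sketch only: an "unplugged"-style decision procedure for
# recognizing animals from trail camera observations, written as code.
# This is a hand-built stand-in for what a trained classifier would learn.

def classify_animal(has_antlers: bool, body_size: str, has_striped_tail: bool) -> str:
    """Classify from a few observable features, the way kids might
    build a decision tree on paper before seeing machine learning."""
    if has_antlers:
        return "white-tailed deer"
    if has_striped_tail:
        return "raccoon"
    if body_size == "small":
        return "squirrel"
    return "unknown -- collect more features"

# A kid "runs" the classifier on what they observed in one photo.
print(classify_animal(has_antlers=False, body_size="medium", has_striped_tail=True))
```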

 

Laurie Burns McRobbie:

And, thinking about the equity issues that are informing this series, is part of this also helping students to understand the data environment? In other words, the data that they're giving, that's being collected, and that they need to protect and look out for?

 

Krista Glazewski:

Yeah, so what we're really trying to make legible for kids is how these technologies function: what are some of the ideas behind AI-enabled data collection, for example? When we entered this project, we asked kids: what do you think about AI? What are your ideas about AI? Where do you engage with AI in your daily life? And their responses were really creative and kind of funny. You know, they actually imagined cases in which they do a Google search and there are people on the other end of their Google search supplying answers to them, or people in real time responding to questions and queries. And they knew that wasn't quite right, but they didn't actually know exactly how to explain it. And so we're also trying to open the black box, so to speak, and make legible to them the ways in which these technologies function.

 

Cindy Hmelo-Silver:

I think, in the three years since we wrote the proposal, AI has become so much more ubiquitous. When we started, I was probably skeptical myself about why we were teaching this to elementary school kids. And in the time since, you know, Siri and Alexa and all these AI-enabled devices have just exploded, and having people understand that it's not magic matters.

 

Laurie Burns McRobbie:

Yeah, yeah. I mean, the pace of change has been just extraordinary; I think we're all dealing with it. And what you're raising is also going to spill over to what parents are going to have to start to understand as well. But that is where we are: we do all need to understand what these issues are, because these children are going to be out in the workforce in 20 years, hopefully producing great products that we can all continue to use in equitable manners.

 

Krista Glazewski:

Yeah. And I think, you know, not every child is going to grow up with an interest in computing or wanting to become an AI scientist, sure. But certainly, as responsible consumers, we want people to be really aware of how these technologies work and function, what they can do, what they can't do, and where there is potential for harm. So teaching kids about these technologies is really important in this context. And we also know that not every kid may become a computer scientist, but every kid is going to have a role in thinking about the responsible use and applications of these technologies, and in continuing to return to this time and time again: every time there's a new advancement, we could be asking the question about where there's potential for harm, where there's potential for something really ambitious and new to happen, and where there's potential for joy. We want all of that to be asked of these technologies. And simultaneously, when we think of what these kids are going to become, they're going to become whatever they want to become, but we also want them to bring their full sense of self and interest, in an informed way, to these endeavors.

 

Cindy Hmelo-Silver:

So AI will be in their worlds, and having them be able to have a sense of how it works, how they might use AI in ways that are responsible, and where there are trade-offs, I think, will be really important.

 

Laurie Burns McRobbie:

Yeah, absolutely. Well, I can't think of a better ending for our conversation today. Thank you both so much for sharing the work you're doing and the insight you're gaining from it. And I will just say, as a citizen, I'm also enormously grateful for the work that you and your colleagues do to educate the next generation. It's really sacred work, and I thank you both so much for what you do. Thank you.

 

Guests:

Thank you, Laurie.

 

OUTRO MUSIC

 

Laurie Burns McRobbie:

This podcast is brought to you by the Center of Excellence for Women and Technology on the IU Bloomington campus. Production support is provided by Film, Television, and Digital Production student Lily Schairbaum and the IU Media School; communications and administrative support is provided by the Center; and original music for this series was composed by IU Jacobs School of Music student Alex Tedrow. I'm Laurie Burns McRobbie. Thanks for listening.