Educational Relevance

Instructional Practice Inventory

June 26, 2024 · Olivia Wright

Dr. Jerry Valentine is a professor emeritus at the University of Missouri. Jerry brings a half century of knowledge and experience and shares the Instructional Practice Inventory (IPI), a process for collecting data and communicating with educators about what is happening inside classrooms to engage students.

For more information on this process, contact Dr. Jerry Valentine at ValentineJ@missouri.edu.




Thanks for listening. If you would like to share your thoughts or topic ideas, or would like to be a guest, you can find Educational Relevance on Facebook and YouTube, or email us at oliviaw1201@educationalrelevance.org, brwright44@gmail.com, or mark@educationalrelevance.org.



Transcript



bryan-r-wright_1_04-11-2024_092148:

Hello! Welcome to Educational Relevance, a platform for experienced educators to share proven, successful strategies with today's leaders in the classroom and in administration, and to talk about how we can help the students of today. My name is Bryan Wright. I am currently an adjunct professor at Concordia University. I'm alongside two fine individuals today: Mark McBeth, my partner in crime. Mark is a former administrator and a leader in educational circles around Missouri and Kansas. And today we have a guest, Dr. Jerry Valentine, professor emeritus from the University of Missouri. Jerry brings a half century of knowledge and experience to us today, and he's going to be discussing the IPI, which stands for Instructional Practice Inventory. I'm going to turn this over to my partner, Mark. Mark, good morning.

mark_1_04-11-2024_092147:

Good morning. So it's quite an honor. Dr. Valentine and I go back many years, into the nineties, when I was a first-year principal and thought I knew everything, and found out I didn't know much of anything. But Dr. Valentine introduced me to the Instructional Practice Inventory. It's an opportunity to collect data and have a conversation with educators about what we're doing inside the classrooms to engage students. With that, Dr. Valentine: how do we go about collecting data for student engagement? How do people go about using that data? And why is that valuable?

squadcaster-6ia1_1_04-11-2024_092149:

Mark, when we built the process, I had the opportunity to do it with one of my former doctoral students, Brian Painter, and he managed to sneak a dissertation out of our early work too, which was a win-win for him and for all of us. Brian and I put our heads together to see if we could produce a way of getting in and out of classrooms and having insight into how kids are thinking and how they're engaged. It was all part of a large, comprehensive, systemic school improvement project that I directed at Mizzou. It was designed to work with elementary, middle, and high schools and help them look at the big picture of everything they're doing: school culture, school climate, principal leadership, teacher leadership, curriculum, instruction and assessment, how kids learn. We worked with multiple schools around the state for many years, engaging with those schools and trying to help them refine what they were working on so that we'd have better outcomes for kids.

Well, in designing that in 1995, one of the things I thought would be really helpful would be a process that would allow us to understand how kids are thinking as they go through a class, and to look at that collectively from a school-wide picture, because all the work we were doing on comprehensive school improvement was designed to be school-wide. We worked at first trying to get in and out of classrooms and codify student engagement strategies: are kids involved in cooperative learning, are they doing some kind of inquiry, et cetera? And we found out, as we played with that for a few months, that we were barking up the wrong tree. It really was not detailed enough. It wasn't giving us insight. You expect to put kids in a cooperative, collaborative learning experience, you expect them to be tackling a real problem of some sort, and you don't necessarily always have higher or deeper thinking going on. So we settled on the notion that we were going to have to fine-tune this, and we built it around basically six categories: two of those focus on higher, deeper thinking; two of them focus on lower, surface thinking; and one of them focuses on disengagement.

And to make that happen, we had to figure out how to get in and out of a classroom with a data collector who is trained to collect the data, get in and around the classroom while being as unobtrusive as possible, without interrupting, without creating distractions, and then slip out of that classroom and codify how the majority of the kids were thinking, or how all the kids were thinking, depending on whether it was the same process or whether there were different thought processes going on in the classroom. It took us the better part of a few months to really get the protocols down and to figure out how to make that happen. But we finally did.

We found out that we could train teacher leaders to do this. We wanted to make the process a teacher-driven process, not an administrator-driven process. We wanted a team of teachers to be the collectors of the data. We wanted a team of teachers to be the ones who looked at the data once they had collected it, took that first look, and decided how we want to engage the faculty with the data. And the same teacher leaders were then responsible for leading the faculty in the study of the data, with the whole notion that we could look at a big profile of how our kids are thinking across the school day.
That's something I had thought about many years earlier: wouldn't it be nice as a principal to have that? Wouldn't it be nice as a teacher to have that, so I could look at myself and think about how I fit into the impact that we're having on all the kids in the school? We built the process so that the teacher leaders would be the ones who engage their colleagues in the study of the data.

Getting in and out of the classroom without being disruptive, and having a real good handle on how kids are thinking, meant that the collector walking into the classroom would have to stand there at the door for just a moment, scan the room, and get a big picture of how everyone is engaged, and then quickly move in around the room and among the students: sometimes leaning over and visiting with students about what they're doing and how they're thinking, sometimes simply looking at the work they're processing, sometimes standing at the back of the room and letting the teacher talk to the kids and lead, as teachers do, and as they do quite a bit of the time. It took a while to figure out that process, but eventually we felt we had it down to where we could have validity, accuracy of the coding process; reliability, consistent accuracy within a data collector's own time spent collecting; and inter-rater reliability, which is the consistency of accuracy across data collectors as multiple teachers help to collect the data across the school.

So that's basically what we did. We put it in the hands of the teachers to engage the faculty in reflective thought about the data profile they were looking at: are we comfortable with the amount of time our kids spend doing higher or deeper thinking? Are we comfortable with the amount of time our kids spend doing lower or surface thinking? Do we really like the percentage of disengagement that we have within the classrooms across our school? Do we really like the head counts of how many kids are disengaged, and how they're thinking? And we bubble up those kinds of conversations after each data collection.

We found we need to collect data four times a year. One data collection a year really didn't even move the needle in terms of growth and change, and two data collections started to build a vocabulary. Three data collections moved the needle, and moved it statistically significantly: we were seeing significant increases in higher, deeper thinking and significant decreases in disengagement. And yet we also thought that if we tried four, we might have something better, and we did; it moved the needle a little further. We had schools that said, okay, if four is good, then let's do eight or nine a year, every month, why not? But I found out when I looked at the data that more frequent data collection seems to tap out. It starts to lose its meaningfulness to the faculty when they're looking at it too often and spending too much time talking about it when they have other things to do. So four data collections a year seem to move the needle, and three certainly did as well. That's what we've been recommending to schools for some two and a half decades now.
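
To make the idea of a school-wide engagement profile concrete, here is a minimal sketch of how a day's observation codes might be tallied into the three broad bands described above. The category names and their grouping are simplified assumptions for illustration, not the official IPI rubric or tooling.

```python
from collections import Counter

# Assumed, simplified grouping of observation codes into the three broad
# bands discussed above; the actual IPI categories and labels may differ.
BANDS = {
    "higher_deeper": {"engaged_higher_order_learning", "learning_conversations"},
    "lower_surface": {"teacher_led_instruction", "seatwork_with_teacher_support"},
    "disengaged": {"disengagement"},
}

def engagement_profile(codes):
    """Convert one collection day's list of observation codes into the
    percent of observations falling in each broad band."""
    counts = Counter(codes)
    total = sum(counts.values()) or 1  # avoid division by zero on an empty day
    return {
        band: round(100 * sum(counts[c] for c in members) / total, 1)
        for band, members in BANDS.items()
    }

# Hypothetical day of ~130 data points from a mid-sized elementary school.
day_codes = (
    ["engaged_higher_order_learning"] * 30
    + ["learning_conversations"] * 15
    + ["teacher_led_instruction"] * 55
    + ["seatwork_with_teacher_support"] * 20
    + ["disengagement"] * 10
)
print(engagement_profile(day_codes))
# {'higher_deeper': 34.6, 'lower_surface': 57.7, 'disengaged': 7.7}
```

A faculty would then compare percentages like these across the collections in a year and ask the reflective questions Dr. Valentine lists: are we comfortable with the higher/deeper share, and do we like the disengagement share?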

bryan-r-wright_1_04-11-2024_092148:

Jerry, when you say four times a year, is that quarterly?

squadcaster-6ia1_1_04-11-2024_092149:

Basically quarterly is what we suggest to the schools. Yeah, somewhere around the middle of the quarter is usually the best time to slip in.

mark_1_04-11-2024_092147:

How many data points are being collected in a particular day?

squadcaster-6ia1_1_04-11-2024_092149:

It depends on the size of the school, because what you'd like to have is all classrooms. We collect the data with a systematic, proportionate sampling process: the data collector starts at a point on the school map, goes down that hall, classroom, classroom, classroom, turns a corner, classroom, classroom, turns a corner, classroom, climbs the stairs, and repeats. And you repeat that pattern all day, from the beginning of the school day until the end of the school day. At the end of a school day in an elementary school, for example, a typical mid-sized elementary school of four or five hundred students, you're going to have somewhere in the neighborhood of 120 to 130 data points. In a larger school, let's say a large high school with 1,500 or 2,000 kids, you would have multiple data collectors collecting data, keeping distance from each other, because now you've got to have a pool rich enough to be valid, and that means you're going to need 250, 300, 400 data points from a school that size. I can remember in one of the largest middle schools in the country, we had five or six data collectors working at the same time because of the nearly 3,000 students that were in that school at the time, and they were collecting 1,000 data points. So you've got to have a large data pool. But for most schools it's 125 to 150 for smaller elementary schools, marching that on up to 150, 175, to 200 or more.
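
As a rough illustration of the scale Dr. Valentine describes, the sketch below simply encodes the approximate figures mentioned in this conversation; these bands are paraphrased from the episode, not taken from the formal IPI sampling protocol.

```python
# Rough bands paraphrased from the conversation above; the formal IPI
# sampling guidance may specify these numbers differently.
def collection_day_estimate(enrollment):
    """Ballpark of observation codes and collectors for one collection day."""
    if enrollment <= 500:        # typical mid-sized elementary school
        return "about 120-130 data points, one trained collector"
    if enrollment <= 2000:       # large high school
        return "about 250-400 data points, multiple collectors kept apart"
    return "up to roughly 1,000 data points, five or six collectors"

for size in (450, 1800, 3000):
    print(size, "students:", collection_day_estimate(size))
```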

mark_1_04-11-2024_092147:

Are we looking at middle school, elementary, high school? Where does the IPI come in? At what grade level?

squadcaster-6ia1_1_04-11-2024_092149:

We built the process in the pilot with elementary, middle, and high schools. We tested it with elementary, middle, and high schools. And we've been using it since November 1996, when I first did a collection with elementary, middle, and high schools, because the project it came out of was a support system for 10 elementary, 10 middle, and 10 high schools. We were going to be working with those 30 schools for the next two and a half years from my office at the university to do that comprehensive school improvement. So we found out it works across the grade levels, and you get very different kinds of results, too, when you start thinking about engagement and disengagement across those grade levels. But that's what we wanted to do, and that's what we were able to figure out how to do with validity.

bryan-r-wright_1_04-11-2024_092148:

Now, who are the data collectors again? I missed that one, Jerry.

squadcaster-6ia1_1_04-11-2024_092149:

We train teacher leaders to be data collectors: teacher leaders who are willing to put in just a little bit of time to be a part of this process. And as I tell them when they start thinking about whether they want to do this or not: look, if you pitch in and do this, and you help your colleagues look at how kids are thinking, and you do that for a few years, you're going to look back after those few years and feel pretty good about your contribution to where that thought process moved, from when you started to where it ends up whenever you stop and look back at it longitudinally. And the longitudinal data that we have supports that as well.

bryan-r-wright_1_04-11-2024_092148:

An add-on question: so this is not an evaluative tool? So teachers are pretty comfortable with them coming in and doing this?

squadcaster-6ia1_1_04-11-2024_092149:

No. We've told the principals: you need to know about this process and understand it very deeply, but you have to simply stay on the sidelines and support it. It has to be a teacher-driven process. And it's not about evaluation. It's not about supervision. It's not about administrative walkthroughs. It's totally about a colleague coming into your classroom to focus on the kids, not making any kind of code about you as a teacher. This is all about the kids and how they're thinking. A teacher's name is not written down. A room number is not written down. It's an anonymous pool of data about kids' thinking across the whole school day. So you really try to take the "I gotcha" kind of mentality out of it. I call it the "jazz it up" effect. We were trying to do our best from the beginning to take the jazz-it-up effect out of this process, because the more we can get teachers to relax and trust that these are their data, the more we're going to get what's typically happening day by day in these four data collections each year.

mark_1_04-11-2024_092147:

Why is it valuable, above and beyond assessment data? If we're doing summative or formative types of assessments with students, asking did they learn the standard or not, why this data in addition to that?

squadcaster-6ia1_1_04-11-2024_092149:

There are very few ways to get this kind of data, and very few people have gone after this and felt like they have done it with a real level of quality. You really have to train the teachers thoroughly on how to collect good data. Once you have good data, we get a picture of how much higher and deeper thinking is taking place in the school. Well, higher and deeper thinking is really critical to student learning. We've got decades of data from scholars who've said that as you increase higher and deeper thinking in the classroom for the kids, you increase academic success. It's a straight-line relationship in almost every study that you look at, and we've been looking at, and I've been trying to read, all of that literature for years. You know, I come from a background of school leadership and preparing principals, but to me, principals have to be instructional leaders, and that means they have to understand all of this. So that's what I've focused on throughout my career.

If we know that we want to increase higher, deeper thinking for the kids, then why is that so important? Well, number one, it leads to more positive academic success. Number two, you look at it as a way of understanding what is expected of our students once they've been with us for 12 or 13 years and they go out into the workforce. The national reports that come out of the workforce tell us consistently that, economically, they want certain kinds of skills from graduates to contribute to the economy. For example, let me cite some examples here from the National Forum and Forbes surveys that have come out in the last several years; in fact, they usually do these every four, five, or six years. Here are the top eight skills in the top ten: the ability to be analytical, critical thinking, creativity, curiosity, judgment, being innovative, complex problem solving, and continuous learning. All of those are linked back to higher or deeper thinking.

When we talk about the concept of higher, deeper thinking in the Instructional Practice Inventory, the IPI, we're talking about how we help kids to be more critical, more analytical, more creative, to be better problem solvers, to be more complex thinkers, to think more deeply, more critically, more reflectively, more analytically, more creatively. I roll that off of my tongue all the time, multiple times a day. It's what our definition has been since we started, and it's basically what the general definition of higher thinking has been, whether you go back to Dewey at the beginning of the last century, to Bloom in the middle of the last century, or to Hattie and others who have been contributing significantly to the literature in the last few decades. That's basically the essence of higher, deeper thinking, and that's what we're trying to codify with our observational processes in the classroom. How much higher, deeper thinking do we have? How do we grow it? How much disengagement do we have? How do we reduce it? And we need a certain amount of lower-order, surface thinking to build fundamental skills, recall, and practice, and to drive home those basic kinds of skills that are needed in education. So those are the three big pictures the IPI provides for a faculty to study: higher or deeper thinking, lower-order surface thinking, and disengagement.

bryan-r-wright_1_04-11-2024_092148:

As you're talking about training teachers, let's go back to the basics of getting the right people on the bus, as they say when you talk about Jim Collins. How do you get the proper teacher leaders, so you can reduce the subjective results that teachers might provide and get something a bit more objective about what you're trying to attain with students?

squadcaster-6ia1_1_04-11-2024_092149:

We ask the school leaders to select teacher leaders who have certain kinds of qualities: a really good understanding of the teaching and learning process; respect from their peers; a willingness to put in a little bit of extra time, not a lot required, just a little; demonstrated leadership among the faculty; openness to look at and consider something like this; and lastly, the ability to stand up in front of their colleagues and help them engage with some meaningful data.

To get them to the skill level where they can do that well, we're looking at basically an eight-hour workshop, eight o'clock to four o'clock. We start that workshop with the overview and background reading. We move straight into coding examples of kids in the classroom on paper, and by lunchtime we're finishing that process up. By early afternoon, we're out in classrooms doing guided practice and debriefing as we walk them through how to build the skills of getting in and out of that classroom. It's a lengthy process. Near the middle of the afternoon we also sneak in an hour on how you lead your colleagues in the study of the data, because, frankly, that's one of the toughest jobs for that team of teacher leaders: to engage their colleagues meaningfully in the data. Then, at the end of the day, they take a written assessment and code and code and code for about 35 to 40 minutes. I take the codes back to my office here, go through them, and give them feedback about their coding skills and their coding accuracy. That's how we get those people to that level.

Then, when we start them out on their first data collection, we recommend they pair with somebody. If the school is already using the process and they're being added to the team, we recommend they pair with a veteran data collector; if not, then two rookie data collectors pair together. They work like that for the first data collection day or two, and then they move off solo. As soon as they move off solo, we're off and running, because now we've got that many more heads out there to get deeper, richer pools of data from each school.

bryan-r-wright_1_04-11-2024_092148:

Okay. And then the second question: connect the dots, as it were, between the formative side, such as assessment-for-learning strategies you may want to use with kids, so the kids are getting part of what they're doing with higher-order learning and can define what they're learning in the classroom, and the summative evaluations, such as standardized testing. How does the IPI positively affect summative testing results in schools?

squadcaster-6ia1_1_04-11-2024_092149:

We've had the chance to do a lot of studies on that, including multiple longitudinal looks at the data. We've also had the chance to look at the data across multiple states. We have consistent findings, and we've gone to pretty sophisticated statistical treatments: hierarchical linear modeling, structural equation modeling. We're way past the notion of just simple correlations and simple regressions in terms of trying to look at all of this over the years. We've written our papers, we've presented them at national conferences, and we've put them up on the website, et cetera. So we've got the research grounding behind that.

What we found is basically that as higher, deeper thinking increases, so too does academic success in schools. We also found that the big tail wagging the dog for a lot of student learning is disengagement: the amount of time that kids are disengaged and not engaged with the learning process in the classroom. As that disengagement goes up, academic success goes down. And the ratio that we found in our studies has been about two to one: for every two percent increase in disengaged learning time in a school that school year, the school can expect one more percent of the student body failing the high-stakes test, meaning not meeting standards. So the two-to-one ratio, as we've called it over the years in our work with the IPI, basically shows you have to address disengagement, and you also have to address higher, deeper thinking; you have to do them both at the same time. And we've also found mathematically that it takes about four units of higher, deeper thinking to neutralize one unit of disengagement.

Now, all of that is good old statistical fancy stuff that we do, but it's informative, and we see it in the real world too. When we start looking at the schools, I've had some real fun with schools by plotting their IPI longitudinal data, their statewide assessment data, and any other kinds of assessment data that they have, drawing plot lines, going up and circling: where do you see a dip? Where do you see growth? Why is that happening? Let's have a faculty conversation about those kinds of things, beyond just our leadership team. So it's meant to be a real hands-on tool for schools to use, one that puts them in the driver's seat both in terms of collecting the data and in learning from the data.
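
To make the arithmetic behind those two rules of thumb concrete, here is a small illustrative calculation using made-up numbers. It only restates the ratios as Dr. Valentine describes them in this conversation; it is not his statistical model.

```python
# Illustrative only: applies the 2:1 disengagement-to-failure rule of thumb
# and the ~4:1 higher/deeper-thinking offset described above.

def added_failure_rate(disengagement_increase_pct):
    """Per the stated 2:1 ratio, every 2 points more disengaged learning time
    corresponds to about 1 point more of the student body missing standards."""
    return disengagement_increase_pct / 2.0

def higher_thinking_to_offset(disengagement_units):
    """Per the stated 4:1 relationship, neutralizing one unit of disengagement
    takes about four units of higher/deeper thinking."""
    return 4.0 * disengagement_units

# Hypothetical school: disengaged time rises from 8% to 12% of observations.
increase = 12 - 8  # 4 percentage points
print(added_failure_rate(increase))        # 2.0 -> ~2% more students not meeting standards
print(higher_thinking_to_offset(increase)) # 16.0 -> ~16 units of added higher/deeper thinking
```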

bryan-r-wright_1_04-11-2024_092148:

Okay.

mark_1_04-11-2024_092147:

So the conversation today was about the IPI, the Instructional Practice Inventory: how we collect data and what that data is used for. It's uniquely different data from our assessment scores, but your question, Bryan, about making the correlation between data like this and data like student assessments is that there's a link. The research says that if we're doing higher-order questioning and deeper thinking, there's going to be a correlation in our assessment scores. We have to have teachers having that dialogue. It sounds like what Jerry's saying is we've got to have teachers have dialogue around that data, and if they're having dialogue around that data, then they can make changes. Dr. Valentine, if people want to reach out to you to learn a little bit about this, they can go to ipistudentengagement.com. Would that be about right?

squadcaster-6ia1_1_04-11-2024_092149:

That's a good place to go and look at our work and our research and all. And then, of course, email is the best place to get ahold of me.

mark_1_04-11-2024_092147:

Okay.

squadcaster-6ia1_1_04-11-2024_092149:

Certainly.

mark_1_04-11-2024_092147:

Very good. We'll provide that for everyone as well. Anything else, Bryan?

bryan-r-wright_1_04-11-2024_092148:

I think for this session we're about done. I want to say thank you, Dr. Valentine.

squadcaster-6ia1_1_04-11-2024_092149:

Pleasure.

bryan-r-wright_1_04-11-2024_092148:

In the meantime, I want to say thank you all. Thank both of you for your participation today on Educational Relevance. Have a good day now. Bye-bye.