Hello World
Welcome to the Hello World podcast for educators interested in computing and digital making in the classroom. Join your hosts from the Raspberry Pi Foundation as we explore the exciting world of computing and digital making education and hear from educators, learners, and experts along the way. In each episode, you'll meet exciting guests, hear their stories, learn something new, and have some fun along the way. And you can always read more about computing and digital making education in Hello World magazine. Subscribe for free at http://helloworld.cc
Do kids still need to learn how to code?
This week we're exploring the big question "Do kids still need to learn how to code?" Our hosts James and Veronica pose the question to our guests who share their experiences from the classroom, voice their opinions and discuss the implications of AI within computing education.
Full show notes: rpf.io/hwp-s6e1
Do children still need to learn how to code?
Pete Dring:It's one of the defining questions of our age.
James Robinson:I still get to solve the problem and enjoy programming.
Chris Coetzee:Are we allowed to use this, is this cheating? I don't know.
Pete Dring:Am I gonna be replaced by a robot?
Chris Coetzee:Maybe teachers are not part of this equation as much as we would like.
James Robinson:Welcome back to Hello World, a podcast for educators interested in computing and digital making. I'm James Robinson, computing educator and aspiring AI aficionado.
Veronica Cucuiat:And hello, I'm Veronica Cucuiat, a research scientist here at the Raspberry Pi Foundation, and I've previously worked as a software engineer. I'm really excited to speak to our guests today about our question: do children still need to learn how to code? As ever, we really value your comments and feedback, which you can share at helloworld.cc/podcastfeedback. James, to start things off, what do you think about today's question: do children still need to learn how to code?
James Robinson:It's a really interesting question, and of course we're not going to have all the answers today. I'm going to kick things off by talking from a fairly personal experience of engaging with AI and trying to use it for some programming. In my work I've written a few short scripts and macros and explored APIs that I'm unfamiliar with. And I think where I've found AI really beneficial is as a tool that can help speed up my programming. I can ask it a question and it can give me a piece of code that probably won't work first time, but it gets me through the layers of an API that I'm trying to explore, so that I can get to the crux of the problem that I'm trying to solve. For me as a novice, kind of amateur coder, it's empowering, and I think it increases my efficiency. But I still think there is the requirement of programming knowledge in order to fix, to iterate and to improve, and to solve problems and integrate it into wider challenges. So my personal view, and it might change over the course of this episode, is that whilst the nature of what kids need to learn to code might shift, I think there is still a need for it. It's just that the type of knowledge we need might be changing slightly because we have this tool that can support us. That's my view, what's your thought Veronica?
Veronica Cucuiat:Well, I agree with you James. I think it is a really, really interesting question, and certainly from a research perspective as well. And I think it's nice to contextualize why we're asking this question: there's substantial progress that has been made in the area of large language models in the last year or so, and there are studies that have shown how large language models like ChatGPT can complete first-year undergraduate computer science programming tasks with a really high degree of accuracy. So my thoughts on this are, first of all, why do we program? And certainly, why do we think it's important that young people learn how to program? Because on one side there are the programs, the apps and the websites that we write; we want to get to the end product. But there's also that process of engaging with technology, creating with technology, that enables young people to engage with technology as creators rather than consumers, which I think is more meaningful. I think programming is a big part of that, and it is potentially even more relevant now with the increased prevalence and use of AI. And the fact that we have tools like ChatGPT that can generate sensible code solutions to some problems definitely does have implications
for what we teach and how we teach, I think: on assessment, on plagiarism, on the learning objectives we might have around programming. For example, here at the Raspberry Pi Research Center I'm working on a project looking at the use of large language models to explain syntax error messages to students, which is yielding really interesting results in terms of what educators value about programming. It's not necessarily just getting students over the errors and towards working code; it's a lot about the process the students engage in while debugging. So I'm really excited to explore this question with people who will know a lot better what they're talking about, our guests today.
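As an illustration only, here is a minimal sketch of the idea Veronica describes: catching a syntax error in a student's program and building a prompt that asks a large language model to explain it in beginner-friendly terms. The build_error_prompt function and the prompt wording are hypothetical, not the Foundation's actual tooling, and the step that sends the prompt to a model is left out.

```python
# Hypothetical sketch: catch a SyntaxError in student code and build a prompt
# asking a large language model to explain it, rather than just fixing it.

def build_error_prompt(source_code: str) -> str:
    """Compile the student's code; on a SyntaxError, return an explanation prompt."""
    try:
        compile(source_code, "<student_code>", "exec")
        return "No syntax errors found."
    except SyntaxError as err:
        return (
            "You are helping a beginner learn Python.\n"
            f"Their code:\n{source_code}\n"
            f"The error reported was: '{err.msg}' on line {err.lineno}.\n"
            "Explain the error in plain language and suggest what to check, "
            "without simply handing over the corrected code."
        )

# Example: a missing colon on the if statement.
student_code = "x = 5\nif x > 3\n    print('big')\n"
prompt = build_error_prompt(student_code)
print(prompt)  # In practice this prompt would be sent to whichever model you use.
```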
James Robinson:Just you saying that, Veronica, has made me think there's something about... you're right, that process, right? I really enjoy programming, except for when it's really frustrating. And so if I don't have access to a more knowledgeable other, like a professional programmer who can solve my specific challenge, well, actually maybe I can turn to AI as a knowledgeable other to help me have that kind of dialogue. I still get to solve the problem and enjoy programming, but I don't have to wait for a forum response and someone to understand the problem I'm trying to solve. But enough from us, we should get our guests in the room. Our first guest is Pete Dring, head of computing at Fulford School in York. He develops innovative teaching resources for computer science. So Pete, what do you think about this question of whether kids need to learn to code? Is it relevant? And what are your thoughts?
Pete Dring:Yeah, it's a great question. And I think it's one of the defining questions of our age in many ways. The big question in society is "am I going to be replaced by a robot?" And there's a lot of fear and a lot of misunderstanding. So do we need to learn how to code? Well, it's not a life-or-death yes or no. If a student doesn't know how to code they might lose out financially, they might lose out in terms of career options. But for me personally, the reason why I enjoy programming is that it's a creative problem-solving activity, and I want students to have every opportunity to thrive. I want them to be innovators, creators, the people who can solve the problems of the future. So yes, I think we're changing the way that we code. I really like the way that you said that, James, but if we want our students to have the best possible chance of thriving we need them to understand what the machines are doing for them, rather than just relying on something else, because that won't just make things easier, it will also make things less satisfying. I want students to get that creative buzz out of understanding exactly what's happening when they solve their own problems.
James Robinson:Excellent, no, thank you and welcome to the podcast. Veronica, would you like to introduce our second guest?
Veronica Cucuiat:I certainly do. Welcome, Chris. Chris has taught computer science for 24 years and is currently completing a PhD in computer science education at Abertay, Dundee. Chris, welcome and thank you for joining us. We are talking about coding and the impact of AI on coding and programming. What does that term, coding, mean to you? And how do you perceive that impact?
Chris Coetzee:Well, the word coding covers a multitude of sins here, because I think back to when I was a child and what coding meant then, and coding meant when your parents bought you a new magazine with some words in it that started with "10 GOTO 20". And you coded a game, which was basically typing words into a computer terminal that made the computer become this live thing where it suddenly reacted to a button press and there was a thing that flashed and there were lights, and it was interesting. But over time the word coding has sort of shifted to be slightly more than that, because if you were to ask the average 11-year-old here in England, "So do you code?", they would say "Well, of course, I use Scratch." But there's no typing there. They're dragging and dropping blocks that have similar meanings to what I did when I was a child. There's "go to this", "click that", lights flash. So I suppose nowadays you have to say coding is almost like speaking computer-ese. So I speak Python, I speak Java, I speak Scratch. I can communicate my ideas in a way that a machine can understand. And that, I think, is the broad idea of what coding is now. What coding looks like in the future, nobody knows.
Veronica Cucuiat:That makes a lot of sense Chris, and I really like the way you put it, "speaking computer-ese", and also in terms of what coding was originally intended to do, which was to communicate with computers. Right? So whether it's tags, low-level programming, high-level programming. And the reason why we say high-level programming is because it's maybe more similar to natural language communication. It's about that communication with computers, which takes different forms.
James Robinson:I think, yeah, I'm really interested in that explanation and definition. For me, when I think about coding I really think of it as a subset of programming. Coding is the bit where you take your intent and you turn it into something that the computer can understand, whether that's, you know, Python or a block-based language or whatever. But if we think about the levels of abstraction model, which includes design and programming, understanding the task and then the execution as well, the coding is like one layer within that, of taking your design and writing it in a way that the computer understands. The executing and debugging is a different set of skills. So that's where I position coding in this space. And I wonder whether, when we think about AI in relation to coding, the AI is purely doing the coding bit or whether it's also able to support the design, the other aspects of what we think of, or what I think of maybe, as programming. I don't know if anyone has any other thoughts on that? That's just my view of programming and coding.
Pete Dring:I really like what Chris said in terms of describing algorithms and the way that that's changed. With the ease of being able to generate algorithms and code with AI, I've seen a lot of memes going around and people worrying that "Oh, my job as a software developer is over, there's no purpose for me." But if we roll back the definition of what it means to write code, it's to unambiguously, without any room for disagreement about meaning, describe exactly the functionality of what a program is going to do. Yes, you can do that in Scratch. You can do that in JavaScript. You can do that in Python. But you can also do that by speaking in natural language to artificial intelligence, and you can do that very quickly. It's a low bar to entry, but actually, if a student wants to do that definitively, to pull out all of the edge cases, to make it crystal clear exactly how it's going to behave in a larger project, all of the detail that you'd end up putting into any programming language you'd still have to communicate to the AI. It's just a different way of representing, definitively and unambiguously, what that algorithm needs to do.
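To make that point concrete, here is a small, hypothetical example that is not from the episode: the same edge-case decision has to be pinned down somewhere, whether the behaviour is written as Python or as a natural-language prompt to an AI. The average function is invented purely for illustration.

```python
# Illustrative only: the edge case must be decided whether you write code
# or write a prompt for an AI to generate the code for you.

def average(scores: list[float]) -> float:
    """Mean of a list of scores; an empty list is defined here to give 0.0."""
    if not scores:        # the edge case: what should an empty list return?
        return 0.0
    return sum(scores) / len(scores)

# The equivalent natural-language prompt needs the same decision spelt out:
# "Write a function that averages a list of scores and returns 0.0 for an
#  empty list rather than raising an error."
print(average([4, 8, 10]))   # 7.333...
print(average([]))           # 0.0
```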
James Robinson:That's a really good point, and we've got a whole range of things we could talk about here. I'm really interested in both of your experiences of engaging learners with coding and AI. What kinds of activities and experiences have you seen? How are learners perceiving large language models and their utility when it comes to coding? And what experiences have you got? Chris, do you want to start us off with that?
Chris Coetzee:Yeah. So this fascinates me no end, because in the UK where I'm based, we have to do these sort of, what other countries might call homeworks or something like that, that cumulatively add up towards your final grade for the subject. And it's always been this really interesting thing where pupils are sort of pretending to be real-life coders: they have to meet with their client and they develop this solution for the client. And this process can take a couple of months, because they're going to have to sit there and code and they have to research how to code. And in walks ChatGPT in November last year, and a pupil puts his hand up and says "Sir, I think I can do this whole thing in a couple of hours." And I go, "Hmm. Well, that's a really interesting problem that you've just created for yourself." Because we're going to have to wait and see what the people that run the exams say, if this is okay. And they don't know... we don't know. So it's sort of turned what used to be, for 20-odd years, a very nice progression of "we have to do this, then this, then this, then this, then this" into an incredible shortcut, and now we're wondering "Are we allowed to use this? Is this cheating? Is this not cheating? I don't know." So it's really interesting to me.
James Robinson:I want to come in on that before I bring Pete into this, because I think there's something really interesting there about the artificial nature of learning to program in a formal education setting, particularly when it comes to examinations. It's been a long time since I was in the classroom now, but I remember back then we were doing, you know, controlled assessments and that kind of thing. And there's the artificial nature of a programmer sitting down in isolation and devising a solution for a specific problem, when actually, in reality, all the programmers and developers that I knew in the real world were writing collaboratively and using GitHub to do things like, you know, version control. So that examination was quite separate from how coding was actually being done in the real world. And I think now it's a similar kind of thing, but we have this access to other knowledgeable people who can support and collaborate with us. How do we as educators address that and help understand, you know, what the learner can actually do by themselves? That was slightly tangential. Pete, what's been your experience of this in the classroom?
Pete Dring:Really mixed, it depends on the age range. So a couple of memorable moments: with some GCSE students, so age 14-15, we were going through some theory. I teach a practical GCSE where students have to sit an exam in controlled conditions. They have to sit there for two hours and write some code. They're not allowed access to the internet. They're not allowed to use generative AI. They do have a syntax reference guide and they can use that to solve some problems. It starts nice and easy, debugging some code that's broken, filling in the gaps, and then the harder questions involve a blank canvas. So the end result for us at GCSE, for the qualifications, is that students have to be able to code independently. They've got to be able to debug, they need to problem solve. They don't necessarily need to remember the syntax, they can look that up, but they need the practical experience of writing code. So we have some lessons that are with Python, writing Python, and some lessons that are theory based. In a theory lesson we were using, it was Microsoft's Copilot rather than ChatGPT. We were talking about different sorting algorithms and we were implementing them in Python to consolidate the theory, and I did a really terrible job of live coding a merge sort. Fail. I would have failed the job interview at most software development companies, but then we talked through the concepts of the theory. We attempted it using the skills that the students had learnt by that point, to apply it; we tested it and it obviously didn't work. And then we went to generative AI. It really quickly generated some code that was actually quite similar to what the students and I had developed. And because we'd put the groundwork in to understand the concept first, they were able to say "Oh, that's the bit that we were missing, and we could use that to debug ours." Then on the other side of the spectrum as well, sometimes it helps to go straight to the AI and say, "Okay, well can you implement this for us?" and the AI does a terrible job. So known algorithms AI is brilliant at, but something like "can you write some code to draw a picture of a cat that is, I don't know, holding a sausage in its mouth" or something ridiculous like that, it will generate some code that draws something, but it's definitely not a cat with a sausage in its mouth. And at that point you've got a starting block to build on with your creativity. So it's a tool that students can use, but it's definitely not something that they can rely on in isolation.
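Pete doesn't share the code from that lesson, so as an illustration only, here is the kind of compact merge sort in Python that a class, or a code-generation tool, might arrive at, the sort of thing a hand-written attempt could be compared against line by line.

```python
# Illustrative merge sort, not the classroom code from the episode.

def merge(left: list, right: list) -> list:
    """Merge two already-sorted lists into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # one side may still have items left over
    result.extend(right[j:])
    return result

def merge_sort(items: list) -> list:
    """Recursively split the list in half, sort each half, then merge."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```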
Chris Coetzee:Another way this has sort of come into my awareness is that every year, and not only this time of year, pupils come up and they say "I'm interested in studying this particular course", and computer science is one of the courses, and in computer science there's coding. And for many years now, one of the first questions you'd ask kids, or they would even volunteer it and say, "Yeah, I really like coding. That's why I want to do computer science." But for the last year and a half, maybe two years, I have pupils coming into quite senior levels of computer science, so age 17-18, going "No, I don't really like coding, but I really like computer science", and you go "My word, these people exist!" You always put them together; you always think that, well, if you like computer science, you must like coding. And for a while in the UK at least we had this dirty thing called IT, which was sort of how to use computers, and that was a dirty sort of sidetrack from pure, proper computer science. It's fascinating, and I'm wondering if AI is possibly a hint of things to come in the way that pupils interact. If they think that AI is like another tool for them to use, then coding almost becomes annoying and "I'll just get the AI to solve that bit and I'll do the things that I actually enjoy", which might be writing about the ethics of whether something is piracy or not. And I'm wondering if we are becoming very obsessive about AI: "yes, but we should keep AI away from coding because, god, that might be the end of our subject and it would be terrible." But I'm wondering if kids feel that way.
James Robinson:It's really interesting. Pete?
Pete Dring:I find that actually really exciting, Chris, because I think it creates opportunities to add more diversity to the subject. I think it's great that a wider variety of personalities are finding different aspects of our subject to love. And I think the best analogy that I've heard is the invention of the pocket calculator in maths. Maths hasn't changed as a subject for what seems like millennia, although mathematicians would argue otherwise. But then this device came along that made all of the skills that were drilled into you from a very early age seem irrelevant, because something could do it much faster, much more accurately, much more reliably. People were saying "well, do we still need to learn times tables? Do we still need to know algebra? What's the benefit? Surely a computer can just replace all of those processes?" So then we look at the benefit of knowing general arithmetic and times tables in society, and there's still a role for that. I'm sure we've all been in a queue at the supermarket getting quite frustrated when people don't have those skills. But then also in assessment in schools and universities: we have one exam in maths where students do use a calculator and one exam where students don't, and different students will enjoy either, depending on the specific type of practical, mathematical skills, which are both valid.
James Robinson:I was going to make this point earlier on, but our question for this episode is "do students still need to learn to code?", and I think maybe that's quite a binary question. Maybe we could be phrasing it as "to what extent do our learners still need to learn to code?" Because for some learners AI is going to be the tool or the enhancement that means they can actually start to create the ideas that are in their head, and coding doesn't, you know, hold them back. And for others it's going to be this powerful source of sense checking and a collaborative partner. I think it's just about finding ways to deploy this tool in a way that is most useful for everybody. Veronica, I know you've got some questions, shall we move on to some other aspects of this?
Veronica Cucuiat:Yes, definitely. Well, I do have a question, and I do want to tie into what both Pete and Chris were saying about these nice consequences for broadening participation and broadening what computer science means and the way people engage with it and get involved in it, which I think is really interesting. On the other side of the coin, some might worry about the autonomy students might feel: if a tool can generate exactly what I'm being assessed on or asked to produce, then, yeah, they might feel a reduced sense of autonomy in the meaning of what they're being asked to do, or what they are learning. And I wonder whether you have some thoughts on that, whether you've seen any evidence at all in your students of this potentially reduced autonomy, and in what way we need to change or adapt our learning objectives to make sure that that autonomy still remains. Chris?
Chris Coetzee:I think it's interesting that we're speaking to two teachers here, because teachers are obsessed with measuring things, the same as you would be with athletes. We would feel very aggrieved if we were measuring our pupils to run a race and we were to find out that one of them had been taking steroids for the last three years. That would be a real problem for us because it's unfair. And so what we have now is this weird situation where fairness is now somehow more important than ability. The AI makes things possible that weren't possible before, but now we're so worried about "Is this fair? Is this right? Should we be doing this?" that we are saying, "Oh, let's rather not." And so, to go back to what Pete was saying, that's why for some of these exams they're saying "no calculator, because that's unfair." But with AI it's a bit difficult because it's not a singular box with the same 17 buttons on it. It's difficult. So for example, can I still measure a pupil that has access to ChatGPT-4 against a pupil that only has access to Bing Chat and some pupils that don't have access to anything? Is that fair? And I think it's not so much the autonomy that pupils struggle with, it's the fairness. And if you have brothers and sisters, the fairest thing in the world is who slices the cake, and this is exactly that fairness problem that AI creates, which is perhaps more accentuated in teaching and learning and that aspect of education generally, because we need to have a definitive answer: how good are you? Are you an A pupil? Are you a B pupil? And how do I know this? By testing and measuring you to the nth degree. And so is AI going to mean that you are just a better pupil if you can make AI do better things? Or are you a better pupil if you don't use AI in the first place?
James Robinson:And maybe that comes down to what our measurement of better is? Because if we are measuring "Can you write a piece of code that does X?", well then AI gives you an advantage in that arena. But if we're measuring something else, and I don't have an answer for that, but if we're measuring, say, "Can you explain what this piece of code is doing?" or something like that, it's a different question, it's a different skill set that we're measuring.
Chris Coetzee:It's a lot harder to measure if we're honest.
James Robinson:Right, it is.
Chris Coetzee:It's significantly easier to measure "Can Johnny say print Hello world" than "Can Johnny... create a cat with insert name of object in its mouth?"
James Robinson:Yes. "Can they explain what this code is doing?" Well, it depends as well, right, to what extent, because I can say "Well, it's going to display a message", or I could say "well it's a function called The Taste of this is perimeter...[inaudible]" It's a much harder thing to assess. Pete, did you have anything to add?
Pete Dring:Yeah, thank you. So if we go back to some of the building blocks of our subject, we've got abstraction and we've got decomposition. We teach from a very early age that computing involves hiding unnecessary detail to focus on the most important thing. It doesn't mean we completely take away that important detail; it's still important, but at different stages of people's learning and different stages of people's careers there are different things that we need to focus on as the most important thing. I was teaching Year 12s, so that's 16-17 year olds, recently, and we were introducing low-level computing, assembly programming. Some of the students said "Ah, thank goodness that we don't have to do this anymore because higher-level languages exist." But they also said, "Well, it's fascinating now that we understand it, because now we understand what's going on inside the CPU, what the registers do, how to control it." But we spend less time actually implementing that low-level assembly code and worrying about what the registers are doing, because we're empowered to use the high-level programming language. And I think that's what's going to take place with artificial intelligence. Most of the time, for most people, students won't have to worry about the specifics of certain types of syntax. But does that mean that there's no benefit at all in digging down into how it works? No, absolutely not. My wife's an artist and she admires other people's art. There's great benefit in looking at art that other people have done, but it doesn't compare at all to creating it for yourself. There's more creativity, fulfilment and satisfaction in doing that, but she's a stickler for actually wanting to understand the process: to work out what types of paint do different things, learning new techniques, sharing those techniques with other people. So I'd say you can see a game, you can play a game, you can choose and have autonomy about creating a game, or you can actually understand how it works. At each stage there's a different level of abstraction, and at each stage AI can help us with the creative process, but at each stage the detail of how it works is still absolutely crucial for students to be able to learn.
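As a small aside on that point about levels of abstraction, Python's standard library dis module lets you peek beneath the hood of high-level code and see the bytecode the interpreter actually runs, a gentler cousin of the assembly Pete's Year 12s were studying. This example is illustrative and not from the episode.

```python
# Peek under the hood of a high-level Python function with the standard
# library's dis module, which prints the interpreter's bytecode.

import dis

def total(numbers):
    """Sum a list the long way, so the loop shows up in the bytecode."""
    result = 0
    for n in numbers:
        result += n
    return result

# Prints the low-level instructions (LOAD_FAST, FOR_ITER, the add operation, ...)
# that the high-level for loop compiles down to.
dis.dis(total)
```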
Chris Coetzee:I think there's a risk here. And the risk is the same as with my electric car. My dad told me how to change a spark plug and that was really useful. I have no idea what's going on in my electric car. I plug it in and I drive and that's all great. And I have a concern that if we have pupils that completely disengage from the nuts and bolts, what's going on beneath the hood, we are going to have a generation that honestly does think it is magic, or that there's this artificial human, which it might be, I don't know, that creates things on your behalf, and then, crucially, that has to fix things when things go wrong. Because even the most amazing AI isn't flawless, so there will be a problem that pops up and then somebody has to have the skills to look under the hood. Now you might argue, well, get another AI to do that, but I'm concerned. It's almost on the same level, I suppose, as where people worry that if everybody rents an apartment, who's going to be the one that builds them? You know, it's... I don't know.
James Robinson:I think it's really interesting. It feels like we're at an inflection point, and we were at an inflection point about 10 years ago in the UK when we moved away from a very ICT-focused subject, because we felt as a community, as a profession, that it wasn't empowering. It was leading students to not understand the things that were going on under the hood. And so we added, or maybe actually shifted towards, pure computer science. We've been studying and focusing on and developing that for the last ten years or so, and now we're at this next inflection point where a disruptive technology has come along. It is potentially really, really powerful as a skill to understand and deploy, but it also has the potential to disenfranchise our learners. And I think that means it is imperative for us as educators to really think about what the subject that we teach is for. I like the way that Pete explained it earlier on: it's not just about making an end result or a product, it is about the exploration and the study of this wonderful subject, and sometimes that means understanding the detailed nuts and bolts. You did make me think, Pete, I wonder if AI can write good assembly code yet. I haven't tried that. It'd be an interesting experiment to see. But I think it is imperative that we are mindful that we don't reduce our knowledge of the subject, our discipline, down to what we can produce with a computer and therefore what AI could produce with a computer, whether that's code or media or whatever. Our subject is richer than that, and it is full of conceptual understanding that is valuable, intrinsically valuable, not just for a purpose. I'm going to get off my soapbox now and see if anyone else wants to come in.
Pete Dring:I think we've hinted at a couple of dangers with artificial intelligence, and one that rarely gets talked about is the environmental cost and the financial cost, because we're used to things like Copilot and ChatGPT which are, to a certain extent, freely available. Actually, when you try and implement your own large language model on your own computer and you see just how much processor resource and memory it requires, it's vast. And there are data centres out there consuming electricity, producing carbon dioxide, and the cost of that has to be taken up somewhere. But there are two aspects. One is, well, do we want students to have to rely on this hugely, potentially damaging resource just to be creative? Surely... sometimes I enjoy cycling and walking. I'm a stickler for, you know, wanting the latest gear, but I also like just going outside without a waterproof or without walking boots or anything like that. Sometimes it's nice to have the gear, but sometimes it's just nice to enjoy the thing in itself. And I think if our subject is to be creative we want students to not have to rely on all of this extra stuff that has a cost to the environment, and that also potentially restricts access to those with a reliable internet connection in a western country and access to artificial intelligence through the big companies that may or may not provide it for free in future. We need to safeguard the democratisation of our subject so we don't become reliant on a technology that can be controlled in the future.
Chris Coetzee:I'm concerned about the financial aspect here, because AI has the potential to be a little bit like a drug. I find myself now adding ChatGPT as a favourite on my browser and referring to it quite often, and if I'm doing it, I can sure as heck think my pupils are doing the same thing. And if at some point in the future it suddenly sits behind a paywall, or there's some financial thing to it, I can imagine it will be a little bit like streaming video services: you feel sort of morally obliged just to remain in your social circles, of course you have to know what's going on in this episode, so you will subscribe for £2 a month. But no matter what the financial barrier is, there would be a barrier, and that splits the planet into the haves and the have-nots, and unfortunately I think AI isn't one of these things where you can go, "Well, the have-nots will just be a little bit worse off." They will be substantially worse off. Think about the speed, going back to my original example, of that pupil developing the solution that is ultimately going to give him a grade that will allow him to go to a university of his choice. That will have a demonstrable impact on the rest of his life, just that access to this thing. And equally, if this was behind a paywall, and increasingly you get the idea that they might go in that direction, or at least, I don't know, you can use this as long as you say the word ChatGPT every 10 seconds, then that's great. I'm just concerned about where that might lead.
Veronica Cucuiat:Well, I think that goes back to that point of fairness that you made earlier on, Chris. And yeah, it is a really important aspect. I think you both mentioned... Pete, you mentioned your example of how you used it with students and how nicely you put it, I think you referred to putting the groundwork in to then go to the sorting algorithm and discuss it with students. But it was useful because you put the groundwork in. And also Chris, in your example with a student who took two hours to do something that normally you do over a few months, potentially. I wonder what you both think about what that groundwork is? And what are those fundamentals around programming that you think need to be there for students now, that are there for students now and need to continue being there as we teach programming, whether we have access to large language models or not?
Chris Coetzee:I think that depends on what the destination is. What do they have to be able to do with the coding? Because that massively influences the granularity of what you need to teach. If they have to put a message on a screen, then they have to know the coding for output. If they have to put a little pyramid on the screen, they might have to know iteration or a for loop or something. And to that extent, I think maybe teachers are not part of this equation as much as we would like, because ultimately teachers are teaching, for the most part around the world, to an exam. And so it's all about the measuring stick; that determines what they should know. So on a more meta scale, what should pupils know? It's a bit like being a doctor. Well, it depends what kind of doctor you want to be. I suppose you need to have a basic idea of human anatomy, but if you're going to be a neurosurgeon, do you really have to know how the bones regrow or something? I don't know. There's some parallel there for me in computer science, specifically with coding, because it used to be that we knew where we were going. They had to have these particular coding skills as a fundamental base level. So for example, with web development, they had to know what HTML tags are because they have to make web pages. But include AI in that, and that massively fast-forwards things. So is that still the first thing I have to teach them? Or is the first thing I teach them how to write a prompt in an AI that generates the thing?
Pete Dring:Can I jump in there? I think that's such a powerful question and a really good answer, Chris, thank you. I think depending on who you ask you'll get a completely different answer to what the groundwork is. But for me, from an outsider's perspective: whenever a relative or a friend gets introduced to me as a teacher, they think my job is to impart knowledge, to somehow get students to understand things. And there's an aspect of that, but the vast majority of my time is spent convincing students to want to care, and then teaching them how to ask questions rather than answer them. We can only impart information if students are receptive to receiving it. I think AI is brilliant at disseminating information in a much more accessible and personalised way than even the best teacher is able to do. However, I've not yet seen a way for AI to motivate students in computer science and programming and software development and problem solving, or any other subject, or to prompt students to come up with the prompts required for AI. That has to come from the curiosity, from the creativity, from the drive and the desire to think "I want to make something new and I know what I want it to do, and I just need to be able to find the tools, the language, the method of articulating that to make it a reality."
Veronica Cucuiat:Those are really nice answers from both Chris and Pete, thank you. And I also wanted to pick up on that point, Pete. Do you feel like the way in which you motivate students changes at all in view of these tools?
Pete Dring:You'd have to ask my students, and they'd probably take issue with the assumption that I do actually manage to motivate them. Different things work for different people and different personalities. But absolutely, yes. If there's something out there that can reduce the frustration, if there's something out there that can create more sense of achievement with less time and effort, then yeah, I want that. It's definitely a useful tool to add to the things that are available in the classroom.
Chris Coetzee:I think it's important to remember it's a tool, it's a tool. Just like a calculator is a tool, this is a tool. And I have yet to see a child that I teach that doesn't get fascinated when there's a new thing for them to play with. That's human nature. We like poking in the sand at the little animal bones, we like doing this. So I think the motivation is perhaps easier than what we would like to admit, because some of it could be quite self-serving. The most important thing to a teenager is him or herself. So they really want the thing that will help them get to their perceived finish line. If their finish line is academic achievement, great, but if it's just to finish the assignment so Mr. Coetzee can stop shouting at me, that's an equally good payoff in many respects. So, yeah, motivation.
Veronica Cucuiat:You made a really nice point, Chris, about engaging educators and teachers and perhaps not having done this enough in the past, and I wonder if you have any thoughts on how to do it better? It's something that we think a lot about at the Raspberry Pi Foundation, certainly in the research team. We try and engage with the kids as much as possible, but there are also so many logistical limitations to that, and time and so on. And the field changes so much and so quickly, and I think the topic will change so much, that I think we need to engage with educators better and quicker, and shorten that loop of the way we enact changes in, potentially, the curriculum or assessment. And I wonder if you have any thoughts on how to best do that?
Chris Coetzee:Well, some things change really quickly and some things don't change very quickly at all. At the start of the academic school year we went for some CPD training and it was all about how kids remember things, and I'm thinking, this is my 25th year and not much has changed; kids still remember in the same way. I think the bit that involves educators gets back to the "is this cheating or not?" question. We really have to come up with an answer for this, because as a society we have to decide how much credit we can give something that is produced by AI, and then is that a creative credit? Or is that an intellectual credit? Or is that a "you are now worthy of moving on to the next stage of your academic life" credit? So I think teachers need to be involved in those discussions, at least with the decision makers, at that point. I think that's probably a good start.
James Robinson:Pete did you want to add to that at all?
Pete Dring:Just to say that there was a consultation from the UK government about the opportunities and risks of artificial intelligence, and the opportunities were a short paragraph and the risks were, I don't know, 10 pages. And it just created a culture of fear: there's going to be more pressure, more accountability and more questions with very few answers. So if you want to engage with the educator community, let's have a rigorous discussion about how we check for plagiarism, what is acceptable, and how we support teachers so that we don't have to check multiple versions of students' work multiple times and sign off each one. Because there are huge opportunities, but also huge dangers of the profession feeling threatened, feeling put out of our jobs, pushed outside of our comfort zones. Even inside computer science, but certainly in other subjects, it's a real area of concern.
Chris Coetzee:That's really important. I think in computer science we're just slightly ahead because we think we know what the problem is, but this affects my wife's psychology lessons, this affects other subjects. I think the reason we're talking about this in computer science land is simply because computer science caused the problem, so we may be at ground zero. We know what we did...
James Robinson:Let's not, let's not take responsibility. I think we're coming to the end of our conversation now. We've had a really in-depth conversation, and we've kind of gone all over the place with our discussion. I think what would be nice, just to close the conversation out, is to ask whether anyone's perspective has changed or broadened. If we think back to the question of the episode, do students still need to learn to code, or to what extent, what are our closing remarks on that? What are our positions? And I'm going to focus on Pete and Chris. How would you summarise this conversation in your mind?
Chris Coetzee:I think somebody needs to know how to code, and if it happens to be the kids in front of me in a computer science classroom, somebody needs to know this, as a back-up plan for humanity. Because if we keep building on top of things we've already built, at some point, if something goes wrong, somebody has to climb down the ladder and fix the bottom. And if the person who has to do that doesn't know the difference between a for loop and a while loop, I think as a society we would regret that.
Pete Dring:No, I agree with Chris. I think if there's an opportunity for a student to become a more effective communicator as a politician, or to have a wider reach in the science research that they do, or to be more prolific in the inventions that they create, or to be better at communicating in the charity that they work for, there are creative solutions with code that can open doors for productivity and fulfilment in school and way beyond school, and we would be crazy to shut off those opportunities by denying students the chance to learn how to code.
James Robinson:What a fantastic conversation we've had today. If you have a question for us or a comment about the discussion that we've had, then you can email us via podcast@helloworld.cc or you can tweet us @helloworld_edu. My thanks to Chris and Pete for sharing their time, experience and expertise with us today, and we'll be back shortly exploring more topics to do with computing education. So Veronica, what did we learn?
Veronica Cucuiat:James, it's so interesting and I could have kept going for another three hours. I really enjoyed how quickly in this podcast we went to the basics of what coding is, and those fundamentals of why we teach coding, and even what teaching is and why we do it. And I enjoyed exploring and identifying that there are some things that do change and will change with respect to large language models and AI tools, and there are also those things that don't change, that have remained the same and will remain the same. I think finding ways of better communicating those, and engaging with students and educators to identify them and be clear about them, and having fruitful conversations, is the best way forward. How about you, James?
James Robinson:I think that, you know, AI is here to stay. It's going to have an impact on our classrooms, and I think most important is that educators need to engage: they need to think about that impact and be very deliberate in their choices about how they use AI to support their learners and challenge their learners. That would be what I took away from today. Also, I'm going to go and see if AI can write me any machine code; I'll test that out. Thank you to our guests and we'll see you soon!