EdTechnical
Join two former teachers - Libby Hills from the Jacobs Foundation and AI researcher Owen Henkel - for the EdTechnical podcast series about AI in education. In each episode, Libby and Owen ask experts to help educators sift the useful insights from the AI hype. They'll be asking questions like: How does this actually help students and teachers? What do we actually know about this technology, and what's just speculation? And (importantly!) when we say AI, what are we actually talking about?
(Short) Owen's Spicy Take on Hallucinations and Libby's Strong Disagree
This week, we're trying something a bit different and doing a short episode. The gloves come off as Libby and Owen engage in a lively debate about "hallucinations" in large language models (i.e. unexpected and hard-to-explain errors) and their impact on building educational products.
They spar on the nuances of model hallucinations, discussing their various forms and potential consequences. Owen presents a "spicy take" on the matter, advocating for the value of engagement and interaction even if it means accepting a certain level of inaccuracy.
Libby, however, expresses concerns about the accuracy of information in educational settings, particularly in K-12 schools. She emphasizes the high bar for factual correctness that traditional educational tools have set.
Who scores an ed-technical knockout? You, the listeners, will decide!
Join us on social media:
- BOLD (@BOLD_insights), Libby Hills (@Libbylhhills) and Owen Henkel (@owen_henkel)
- Listen to all episodes of Ed-Technical here: https://bold.expert/ed-technical
- Subscribe to BOLD’s newsletter: https://bold.expert/newsletter
- Stay up to date with all the latest research on child development and learning: https://bold.expert
Credits: Sarah Myles for production support; Josie Hills for graphic design; Anabel Altenburg for content production.