In this episode I use a recent statement made by Sam Altman, regarding the emergence of intelligence, to highlight the outdated way both laymen and many scientists view AI specifically, and complexity more broadly. I argue that, despite what we are told, a truly scientific and rigorous theory or decision does not demand a causal explanation, and that in fact such causal approaches run counter to doing good science today.
Sam Altman's excerpt: https://www.instagram.com/reel/C60dq1Oyw_r/
Tweet: https://twitter.com/sean_a_mcclure/status/1789315878544453977
Check out the video version: https://www.youtube.com/@nontrivialpodcast