Engineering The Future

Episode 31: Tech Stewardship (Part 2)

September 20, 2023 | Ontario Society of Professional Engineers | Season 3, Episode 31
In this episode of Engineering the Future, host Jerome James continues his lively chat with Mark Abbott, director of the Engineering Change Lab at the MaRS Discovery District in Toronto, on the value of tech stewardship.

Jerome and Mark expand the discussion to include the role tech stewardship plays in aerospace, healthcare and manufacturing, as well as what government can do to encourage and support the responsible use of technology.  

Join us for the conclusion of this fascinating conversation.

[Start of recorded material 00:00:00]

Jerome James:  This episode of Engineering the Future is brought to you by Cornerstone Law, the official legal partner of OSPE. OSPE members get 30 minutes of free legal advice with a lawyer. If you have a legal inquiry about your license, engineering practice, unpaid invoices, contracts or any questions about construction and engineering law in general, please do not hesitate to contact Cornerstone Law. For more information, please visit Cornerstone Law’s website at cornerstone.ca or call 416-591-2222. 

Voiceover:        This podcast is brought to you by OSPE, the Ontario Society of Professional Engineers, the advocacy body for professional engineers in the engineering community in Ontario. 

Jerome James:  Welcome to Engineering the Future, a podcast brought to you by the Ontario Society of Professional Engineers. I am your host, Jerome James. This is part two of our conversation about tech stewardship and how engineers are taking on the role of stewards to ensure that technology benefits us all. We’ve been speaking with Mark Abbott, professional engineer and director of the Engineering Change Lab at the MaRS Discovery District in Toronto. 

I was in the audience when you were on stage about five years ago. We were sitting at the now-defunct Google Sidewalk Labs or something. Do you remember the name of that? 

Mark Abbott:    Yeah. Sidewalk Labs was the Google-backed project down on the Toronto waterfront where, you know, there was all this initial hoopla about smart city technology and creating a test bed. And then, quite famously, I think that didn’t pan out very well and wound up folding, and now there’s a bit of a rebirth, I think, with a new path forward on kind of community development in that same area. 

Jerome James:  Right. Right. And there was a really great conversation about AI, emerging technologies, and it was more about the pitfalls of who is developing the technology, the types of datasets they were being trained on. You know, are they considering all types of people? For instance, are people with darker skin complexions going to be recognized when they put their hand under a certain dryer in restrooms? Yada, yada, yada. And then the impending general intelligence that is coming over the horizon. We didn’t talk about anything with regards to generative AI during that talk. Remember, like, this came out of nowhere, it seems. 

Can you talk about – give us a little bit of a sense of what this latest movement in the AI realm is, some of the things we need to be aware of and think critically about around this technology, and how tech stewardship can guide us down a path where we come out the other side not in a dystopian future? 

Mark Abbott:    So again, you know, when that awakening happened in the 1960s around the nature of our relationship with nature, there were seminal moments, right? Rachel Carson writing Silent Spring, the picture of the Earth from space, the Whole Earth Catalog. And these were sort of, you know, events and things that helped kind of drive the wider awakening. 

I would say today, you know, going back a few years, the original Facebook Cambridge Analytica scandal, a series of milestones around climate change, and most recently, as you’re alluding to, ChatGPT and the generative AI kind of step change that happened earlier this year – from my view, those are the kind of similar milestone moments that are happening in this larger awakening. Right? 

And so since ChatGPT kind of hit the broader consciousness earlier this year, there’s been, you know, a flurry of activity – both optimistic, how do we put this to use everywhere, and more pessimistic and cautious around all of the potential pitfalls. And there’s almost a whole new cottage industry around, sort of, you know, the debate on both sides. 

What hasn’t been in the debate yet is the natural kind of conclusion with generative AI, which is that it’s shining a light on that lack of underlying stewardship capacity that exists in society. So some really interesting organizations have been drawing analogies to nuclear weapons and saying, like, hey, we need to think of generative AI like nuclear weapons, and we need a – you know, we need a compact between governments that can kind of regulate all this. 

What’s missed in that analogy is that nuclear weapons were a technology that could be governed effectively by nation states. Generative AI, this powerful technology, has just been put in everyone’s hands all at the same time. The genie is out of the bottle, right? So you can’t regulate it. The dream of top-level policy and regulation is never going to work on its own. What generative AI in particular has really highlighted – something that’s always been true but is becoming more and more acute and hopefully more and more apparent – is that, as a society, there’s no quick workaround, no magic policy. We have to invest in the stewardship, the tech stewardship capacity, of society writ large if we want to be resilient, you know, as the continued technological developments happen around generative AI and quantum and all of the new technologies that are coming out. 

If we don’t build that muscle as we rise to each of the, you know, the current challenges – if we don’t build that underlying muscle to a similar strength to the muscle we’ve spent, you know, decades and centuries building to develop technology – then we’re going to be out of balance, and we’re going to kind of run ahead with these technologies in a way where the negative consequences are going to get bigger and bigger and bigger because of that imbalance between our ability to create and scale technologies and our ability to steward them. 

Voiceover:        We hope you’re enjoying this episode so far. At OSPE we’re here for you, making sure government, media and the public are listening to the voice of engineers. You can learn more at OSPE.on.ca.

Jerome James:  OK. So, what does that stewardship look like if it’s not, you know, government – if it doesn’t look like what the European Union is trying to do with regulations? Have we not been successful with regulating technological advances in the past? You know, we’re not seeing, you know, biological three-legged dogs walking around or the dogs with [no eyes? 00:07:10].

Mark Abbott:    Yeah. And there was just an article about someone actually trying to use CRISPR and gene editing to bring back woolly mammoths, which – sounds a lot like the beginning of Jurassic Park, right? 

So, you know, I think the short answer is, no, we haven’t been super successful at proactively regulating technology. Which isn’t to say that we shouldn’t. Like, regulation is absolutely part of it and necessary, and we can do better in terms of anticipatory regulation and things that move faster. But I’m of the belief that, as much as we need to do that, regulation is never going to be enough. And so what government needs to start doing is not just regulating – putting guard rails in place for specific technologies and challenges; they need to keep doing that – but they need to start massively investing in the tech stewardship capacity of society. Essentially, like, again, you know, in order to avoid living in one of these dystopian tech futures, we need the regulation, but even more we need this kind of more distributed capacity to steward our relationship with technology in all citizens, focused on all sectors and all challenges. And that’s the awakening I think that’s just starting to happen right now. 

Jerome James:  And the key to that stewardship is what, exactly? Knowledge? 

Mark Abbott:    Yeah. And I mean it’s, like, continually advancing our understanding of the nature of technology in general and the specific technologies that are coming up, making more values-based decisions, and actually having that translate into our behaviours. 

So think back to, you know, the environmental movement, right? Yeah, there was legislation and international cooperation on the ozone layer and on climate change. But there was also the blue bin, and there were, you know, people making individual choices about, you know, how they live and how they heat their homes. And – in fact, you know, I framed it as an analogy, like, the environmental movement in the ’60s and now the awakening to technology. It’s actually not an analogy; it’s the next chapter in the same story. Like, environmental stewardship. Like, why do we need to worry about the environment? It’s because the decisions we’ve made about our technologies and our socio-technological systems have led to an unsustainable relationship with our natural environment. 

Tech stewardship just takes that a step further, in that it’s not just the natural environment; it’s also our social environment. So technology is sort of shaping our relationship with all our environments: our natural environment, our political environment, our social environment. We need to become more aware and more skillful about how we shape, you know, how we shape our technology and how we kind of mediate that relationship with all our environments. 

Jerome James:  And that was a great example, the whole blue bin scenario, because I believe that started as a University of Waterloo kind of project or initiative.

Mark Abbott:    I didn’t know that. 

Jerome James:  And it was rolled out in Waterloo, the KW area, before many different areas in Canada and then the world. You know? KW has had a blue bin program since the ’80s, and that’s all I’ve known. So having grown up there and travelled to different regions, it’s so foreign to me –

Mark Abbott:    Again, I think it all starts with an appreciation of what’s actually going on, right? And, you know, to keep relating it to the environmental movement, there’s a great book called The Wizard and the Prophet that tells the story of almost the pre-environmental movement. So the wizard in the book is Norman Borlaug, who’s the person who invented dwarf wheat and is credited with, you know, saving I think a billion lives from famine in Southeast Asia. And so in the book he’s the wizard who was like, technology can save us; we can innovate; we can find a way to feed everyone. The prophet in the book is William Vogt, who is sort of a precursor to the environmental movement. He was talking about environmentalism before anyone else. He ran what a lot of people say was the world’s first environmental conference in the ’40s. 

And so the book kind of contrasts these two world views: the wizard, who’s going to kind of always technologically innovate their way out of a problem, and the prophet, who’s saying, wait a minute, we already have too many people, you know, on the planet, there’s no way to keep this sustainable, and new technologies are just kind of fooling us into squeezing more people onto a planet that’s already oversubscribed. 

And so imagine: William Vogt, later in his life, actually committed suicide because, despite all of the things he did to sort of bring attention to this, he felt like he had failed. He saw something before everyone else – way before everyone else. He had made heroic efforts to bring it to the public’s attention. And even though, you know, by any objective standard he did an amazing job of this, he felt like he’d failed because he saw how much more there was to do. 

I think our challenge today, for me and for you and for others who are seeing the awakening that needs to happen right now, is how do we learn from what’s been successful in the past and use new social technologies and digital technologies to accelerate the awakening that’s happening? Right? 

So in the environmental movement we can look at that analogy. The blue bin, for all of its kind of more tangible impacts, also had a big social impact. Hey, I have a role in this. We all have a role in this. Right? So if we’re thinking as change makers now, what’s the modern-day equivalent of the blue bin? And I don’t know for sure, but I’m wondering if it could be the cellphone. Like, we all have probably somewhat problematic relationships with our cellphone, you know, and how it’s been set up and what we’re doing. 

So maybe, you know, as we’re looking for one of those leverage points that actually drive this larger awakening, maybe, you know, there’s something like each of us kind of becoming more intentional about our relationship with our cellphone that might be the modern-day equivalent of the blue bin. 

Jerome James:  Right. Right. Oh man. I feel that if people learned more about the controls they have over being inundated with ads or notifications, they could really take hold and be more meaningful and mindful of their tech journey. I think everyone should take some sort of, like, course on how to actually use your phone wisely and turn off notifications and all that kind of stuff. Because it’s not intuitive. And it’s set up in a way for you to fail at the beginning. And maybe that’s something that needs to change in legislation. 

Mark Abbott:    Well, and ideally – the reason I talk about cellphones is because they’re so ubiquitous and everything. But kind of like the blue bin, ideally we find something that everyone can relate to, everyone can actually engage with. So, like, this is one small step in tech stewardship. But people don’t stop there. You realize that the same kind of dynamics that are at play there, those same value tensions – you know, my privacy, my convenience, my wanting to kind of connect with other people versus, like, numbing and sort of avoiding thinking about things because I’ll just watch another show on Netflix. Those, like, fundamental tensions, those same kind of patterns of tensions, show up, you know, in other parts of our day-to-day life when we’re deciding whether or not to buy an electric vehicle, or to, like, install a heat pump, or when we’re deciding where to live. And, like, you know, the various kind of components around that. 

It shows up in our workplace when we’re, you know, when we’re designing – like, maybe we work for a power company and there’s a tension between centralized, you know, power versus distributed power. The better we build that muscle of spotting those tensions in our own life, our work life, you know, our sort of, you know, our societal dialogue, the more we’re building the underlying capacity to deal with all the problems, not just one problem, right? 

Jerome James:  Exactly. Data privacy and security are critical aspects of tech stewardship. What strategies should organizations employ to protect sensitive data and ensure user privacy? 

Mark Abbott:    Yeah. You know, this is a great example of where there are so many best practices and principles and guidelines being put out there. I would say one of the great things about tech stewardship practice is you build sort of a habit, a rhythm, of socio-ethical reflection. And as you’re doing that, you can connect out to whatever the latest tools and resources are that suit your own context. So there’s no one answer to how to handle privacy and data stuff that I think is universally helpful and applicable. 

If you’re working in different contexts the answer is different and there’s probably already tools and guidelines out there that you could use. The problem is that those tools and guidelines get written and then they sit on the shelf. Right? And that’s why practice is where the real bottleneck is. 

It isn’t that people don’t care about these issues. It isn’t even that they don’t have tools in the toolkit. It’s that people aren’t, day to day, seeing, perceiving, finding those opportunities to put, you know, their concern – and all of the tools that are already in the toolkit, all the tools that are available – to use. 

Jerome James:  Interesting. And – [audio ends]

 

[End of recorded material 00:16:25]