Undisciplinary

Fireside Chat: Send us a text! Plus some random thoughts & announcements

June 15, 2024

Send us a Text Message.


**Below was generated by AI and doesn't really have much to do with what we actually talked about!**

Curious about how American English phrases like "it's been a minute" are shaping everyday language? Join us as we explore this playful debate and kick things off with an engaging discussion. We also introduce an exciting new feature that lets you send us anonymous text messages through Apple Podcasts and Spotify, making it easier than ever to share your thoughts. Plus, don't miss my shout-out to Pat McConville's new podcast, "Concept Art," and my experience of being featured on it, along with Jane's thrilling update on her new role as co-editor-in-chief of the Research Ethics Journal.

We then pivot to the ethical nuances of using ChatGPT for qualitative data, reflecting on how it mirrors everyday opinions shaped by the internet. Diving into the role of research ethics committees, we question whether their primary function is to protect institutions or to facilitate meaningful research. Drawing from personal experiences, we highlight the diverse cultures of various ethics committees and discuss H. Tristram Engelhardt Jr.'s critique of market solutions in secular humanist bioethics. Finally, we tackle the unsettling practice of using AI to recreate deceased loved ones, inspired by the documentary "Eternal You," and explore its broader implications for cultural death rituals and commercial interests.

Undisciplinary - a podcast that talks across the boundaries of history, ethics, and the politics of health.
Follow us on Twitter @undisciplinary_ or email questions for "mailbag episodes" undisciplinarypod@gmail.com


Speaker 1:

Okay, so welcome to... I don't know what we're gonna call this. We can't keep on having little subgenres, or can we? I guess we can do whatever we want. Fireside chat. What are we gonna do? Maybe we could build a fire, sing a couple of songs.

Speaker 2:

Huh, why don't we try that?

Speaker 1:

Oh, a campfire. Well, isn't that all snug and comfy? Fire?

Speaker 2:

no, good no.

Speaker 1:

All right, so we're just going to have a short little catch-up because it's been a while. It's been a minute, as some people say. What do you think about people who say it's been a minute?

Speaker 2:

Or "it's been a moment". I think they are American, Chris.

Speaker 1:

Yeah.

Speaker 2:

I've been railing against American English, and I say that with love to my American friends and family. I blame the internet, I blame ChatGPT, I blame... I just feel like...

Speaker 1:

Right.

Speaker 2:

I blame the youngins.

Speaker 1:

I don't know, man. American English infiltration has been around for quite a while.

Speaker 2:

Yeah, it has. Maybe it's... I don't know why it's bothering me this month in particular.

Speaker 1:

Mm-hmm. Is it because of...?

Speaker 2:

Oh, it's probably because I've been doing too many of the New York Times games. It's all on me.

Speaker 1:

Uh-huh, yeah. You can't blame Americans for speaking American.

Speaker 2:

But you kind of can sometimes when it's dumb.

Speaker 1:

Anyway, it has been a minute, or a moment, depending on your prejudices. So we do have a couple of updates. One is on the technological frontier, in that, depending on what medium you're listening to this on, I think we've tested it out at least on the Apple Podcasts app, or whatever it's called, as well as Spotify: if you go into the episode that you're currently listening to, you can send us a message, a text message. I've no idea if you're going to get charged to send one. I can't imagine you would, but the days of being charged for text messages seem to be a thing of the past.

Speaker 1:

But anyway, you can write a little message to us to say, I guess, whatever you want, and it is completely anonymous. It's not like it comes to our phones; it goes to sort of the host of the podcast, or the server, or whatever it is, we're very technical here, so we don't get your phone number. We don't even know the name of whoever sends it. So if you do want to be identified, you could either use an alias or your own name. But yeah, you can communicate directly with us that way if you would like. There are always the older avenues of X, or Twitter, which is slowly dying, or email: undisciplinarypod@gmail.com.

Speaker 2:

Probably also send a letter to our respective universities.

Speaker 1:

You could send a letter. So there is that. That's just a bit of an update. Do send us through a text or a question or a comment. It would be exciting to see how this works.

Speaker 2:

Just as a spoiler, Chris, I just sent one. So if you get one, it's from me.

Speaker 1:

Sorry, did you say who it was from? No, I didn't... Look, it's just come through. Someone has said "Undisciplinary is really cool". Would that be you, Jane?

Speaker 2:

I didn't remember having written "cool". That's a bit embarrassing.

Speaker 1:

Yeah, so it does come through reasonably quick. The other bit of news, and I haven't told you this, Jane, and this is, I guess, both self-serving but also a shout-out to a friend and fan of the show: Pat McConville. Pat, who has been a previous guest as well as a previous correspondent, he was the one who said we talk about going to the gym too much, has a new philosophy podcast out called Concept Art. He has interviewed a number of people, including myself in the latest episode. The sort of question is: how does art shape ideas? Concept Art seeks to answer this question through conversations with philosophers and thinkers about their scholarly work.

Speaker 1:

So get that wherever you get good podcasts and give it a follow. As well as myself, Supriya has been on there recently, and a number of other guests. I quite enjoyed it, but, you know, I guess I have narcissistic tendencies and enjoy talking...

Speaker 2:

You're a narcissistic man.

Speaker 1:

About myself. Yes, yes, I talked about my art. So yeah, check that out, send us a message. And what else, Jane?

Speaker 2:

Well, I've got some news, breaking news, and that is what I was just mentioning earlier: me and Bridget Haire at the Kirby Institute at UNSW, and Yves Aquino, who has been on the podcast and is a colleague of mine at ACHEEV at the University of Wollongong, we are going to be taking over the co-editorship-in-chief, or whatever that title is, of the journal Research Ethics. And we're super excited, because we think there's all sorts of interesting bits and pieces going on in research ethics at the moment, particularly to do with new technologies, how people interact with them, how we use them.

Speaker 2:

But I also continue to be interested in things like how we do ethics in emergencies and that sort of thing. It's one of those things where in Australia, you know, with the National Statement, we've got really strong and, I would say, good guidance, except there are all these bits that come up all the time where you're like, I don't really know about that.

Speaker 2:

So the three of us are excited to be able to do some fun stuff with research ethics.

Speaker 1:

Yeah, well, I mean, it is, as you mentioned with the new technologies, quite an interesting thing to do, the way that we can, you know, increasingly access people's lives through various media. I've also been thinking about ChatGPT, and whether you could use that as a source of qualitative data, because if it is an amalgam of the internet, is ChatGPT the man on the Clapham Omnibus? I think I may have talked about the man on the Clapham Omnibus before.

Speaker 1:

You know the everyday person. The pub test. Yeah, I know who the man is.

Speaker 2:

So you and Chris Degeling are, frankly, quite fans of the man on the Clapham Omnibus.

Speaker 1:

I'm an ironic fan of the man on the Clapham Omnibus.

Speaker 2:

No, because do Americans know what the Clapham Omnibus is? So here I am, back to my bashing. But I guess the internet, what shapes the internet, shapes ChatGPT. So it would give you a certain kind of man, woman, person, but they're unlikely to be on a Clapham Omnibus.

Speaker 1:

Yes, so all right. Well, congratulations on that position with Research Ethics. And I think it would be good to, you know, we've talked about research ethics in different ways here before, and the role of research ethics committees. I think it's an interesting one: do they have a defensive purpose, to just make sure institutions are protected, or are they about allowing good research? And who's in the room for the decisions? All of that.

Speaker 2:

So the three of us that I just mentioned are all on different ethics committees that have quite different cultures attached to them, and, yes, it's something that we're all particularly interested in: how those function, the work they do.

Speaker 1:

Well, in case the listener hasn't realised, we don't really have a specific show in mind here, and I have just picked up a book that was in front of me. But it touches on something, Jane, that we have talked about, I think, off air before, and I was going to send this through to you just because it's sort of smack bang on some of the stuff we've been talking about. So I know people are like: what book are you talking about, Chris? What book are you talking about? It's an old book by a guy who, if you know who he is, some people find a bit of an old conservative bioethicist, but I think he's actually quite interesting and nuanced.

Speaker 1:

H. Tristram Engelhardt Jr wrote a book called Bioethics and Secular Humanism: The Search for a Common Morality. This was written in the early 90s, particularly with a US context. Secular humanism is sort of a bit of an outdated term these days, I would say, but he's writing this in the context of the culture wars, of the latter days of the Moral Majority in the US, and how we can think of bioethics and healthcare without appealing to... you know, in a liberal, pluralist society. But anyway, that's not what I was actually going to draw attention to.

Speaker 1:

Oh, okay. You can comment on that as you please, but he has this little subsection towards the very end of the book on why market solutions are so unproblematic in secular humanist bioethics. And this is something that we've talked about in different ways: the lack of criticism among bioethicists, not so much among public health people, but I'd say among bioethicists. This lack of criticism comes, I think, from this function of consent: particularly in sort of liberal Western societies, if people consent to having a procedure done, selling an organ, being a surrogate, consent in the market guarantees freedom, and so there aren't many concerns. Whereas, say, a feminist bioethicist, a Marxist bioethicist, a Catholic or Christian bioethicist, an Islamic bioethicist and so on may have more brakes in place, if it is a secular humanist bioethicist, at least according to Tristram Engelhardt, and I would agree, the market is a tidy way of solving problems, so long as consent is there.

Speaker 2:

It's an interesting reflection. Can I assume that Tristram Engelhardt is a white American man?

Speaker 1:

He is. He's from Texas, I believe.

Speaker 2:

Oh, from Texas. No, I'm just thinking, you know, there's a particular kind of ethics that says that as long as we all get to do what we want, we're fine, basically. And getting to do what we want includes consent. So if we say that we're okay with something, then cool, those problems have been overcome. It's just such a small subsection of the population for whom that actually works in any kind of reliable way.

Speaker 1:

Yes.

Speaker 2:

Yeah.

Speaker 1:

So he's not a fan of it. He's not saying that this is good; he's sort of more descriptively saying this. No, no, no. Well, I mean, I only randomly picked up this book without any intention, but no, I mean, what you say is right. That's what he says exactly.

Speaker 2:

Okay, okay, okay. I thought he was saying, well, because I still feel like we come across that a lot, hey? The idea that some kind of justice is based on everybody having the same opportunities to do things, and if you've got the wherewithal to make them happen and, you know, make your good decisions and all of that, then that's justice, right. And to me, I don't agree with that, by the way, but to me that's what I thought you were suggesting there: that kind of way of thinking where everybody is expected to have the same sort of capacity to act in the same sort of way.

Speaker 1:

Yeah, and that includes consent.

Speaker 1:

Yes, yeah, and that's where he's saying the authority of the decision comes from. So he says the market "has a justification similar to that of democratic communal endeavors: it is justified by the authority of each and all who participate. When one seeks the authority of any transaction, presuming that the traders own what they trade and are not deceiving or coercing each other, the immediate answer is that the market derives its authority from the choices of the participants", and then later goes on to say "the market is a way of creating answers where proper answers cannot be discovered". So in the context of, I guess, bioethical dilemmas, the market does provide that sense of an answer, or at least a workable, pragmatic answer, if people are giving their consent, quote-unquote, freely. But yeah, he raises those sorts of objections that you alluded to, that not all choices are made under equal conditions.

Speaker 2:

Yeah, I actually heard somebody the other day. We were talking about consent, and somebody said something about, well, they agreed to it, it was in the fine print. It made me think about, you know, we were just talking a little bit earlier about social media.

Speaker 2:

With a Facebook group or a subreddit or something like that, presumably they have agreed somewhere, when they joined that network, I have no idea, that their information might be used in a certain way. And I feel like that's a bit of a common misconception, like, oh well, they agreed to it, so that's fine. I have no idea what I've ever agreed to on social media, and I think the idea of being informed gets missed in a lot of that.

Speaker 1:

What is the nature of informed consent?

Speaker 2:

Yeah, I wouldn't be the first person to ask that question, but the small print thing really interested me. I just don't think there's any way that we could think that small print can stand in for consent, like ticking a box because you want to be able to read something online.

Speaker 1:

Well, to continue this journey in unplanned podcasting, this informed thing is interesting, particularly when it comes to social media, because it's making me think of a course I'm revising at the moment that we'll be offering at Deakin University next trimester, Love, Sex and Death. In that, I'm wanting partly to bring in a week on consent and some interesting philosophical work being done around consent in sexual relationships. You know, can consent meaningfully be given in those sorts of relationships? The debates about that. But then also, and this comes back to social media use and the long life of social media, a colleague who will be co-teaching this, Patrick Stokes.

Speaker 1:

He's written a book, I've forgotten the title, sorry Pat, but it's about digital remains, digital death, and the way that we do leave these remains around on social media, you know, whether we're alive or not, and how they get used and how they can be used. So you can turn things into, you know, death bots, particularly because all of us leave these digital footprints around everywhere. And, you know, we certainly have, say, spoken a lot in the context of this podcast, so you'd be able to recreate an artificial voice of either of us.

Speaker 1:

Imagine that, just going on and on again, Jane's deathbot talking about Americanisms again.

Speaker 2:

Yes.

Speaker 1:

But anyway so.

Speaker 2:

Yeah, I know, I'll stop that now, but you've just reminded me about a movie called Eternal You. Have you seen this movie?

Speaker 1:

I have not. So, tell me about it.

Speaker 2:

Let me tell you what the internet says about it. I know about this because we had two really terrific scholars joining us at the University of Wollongong this week from the Netherlands, and they both work on the philosophy of AI, and they were talking about this documentary, which is called Eternal You. I guess it was just screened at Sundance this year or something. It's about how people can sort of bring back to life their loved ones and interact with them via sort of virtual reality technology, and it sounded to me horrifying. Anko, and I'm sorry, Anko, I don't remember your last name, but you're probably not listening, Anko was talking about a woman in Korea who'd lost her seven-year-old daughter, which is just an unthinkable thing.

Speaker 2:

Anyway, she had an AI version of her created and was on the documentary interacting with her dead daughter, which sounded like an incredibly difficult thing, but also incredibly difficult for the people watching. The people watching the documentary said that they felt sort of complicit in something that was really wrong, which I guess suggests some kinds of ways that we should never use AI, or ways that we're, at least at this point, really uncomfortable using AI, you know, bringing back somebody's dead child, for example.

Speaker 1:

Yeah, so this is the sort of stuff that Pat is working on, and I would be really interested to watch that documentary.

Speaker 2:

I just looked it up.

Speaker 1:

Yeah, so maybe we can get Pat on and watch the documentary and have a film review.

Speaker 2:

Hopefully it's better than The Whale.

Speaker 1:

Yes, I was thinking about The Whale recently, for some reason. Oh, that's right, I had someone who constantly said that the camera on their laptop was broken, and I was curious to know if there was something going on. Anyway, Pat was involved in a conversation on ABC Radio National called The Digital Dead. It aired in 2020, but they discussed that very case of the Korean woman.

Speaker 2:

Oh, okay.

Speaker 1:

It was also with Elaine Kasket, and Elaine Kasket actually brought this up to say that, with the case of the Korean woman, there is actually more of a cultural understanding that needs to be put in place. I'm just going off what I heard Elaine talk about, so send us one of those text messages if this is way off, but the AI recreation was in line with, or maybe a technological updating of, a Korean death rite in a particular way, for the girl and the mother. I mean, for me it was very hard to hear about anyway; from my perspective, I don't know why you would want to put yourself through such an experience. But the interaction was sort of otherworldly, like I think there were fairies and butterflies and unicorns. I don't know if there were unicorns, but you know, it was sort of magic realism a little bit as well in the interaction.

Speaker 1:

So sort of like inducing a dream-like interaction, which Elaine Kasket said is part of this sort of cultural practice around death in Korea. And so I think it would be interesting in this area, and obviously these people are studying and researching it, to look at the way that AI is used to reflect pre-existing cultural practices and understandings of death, and to modify those. But yeah, so that's really interesting as well.

Speaker 2:

Yeah, okay. No, I was just thinking that if we are judging, if you like, this practice based on a movie that we've heard about from our colleagues, that screened at Sundance, the decontextualising of that is really important, I think. Because we might think that it's horrific or whatever, and think about how it affects us as watchers and all of that, which just seems potentially really problematic, you know. Yeah.

Speaker 1:

Well, it goes back to, I mean, and this again, neither of us have seen this documentary, but I guess there's that history of, you know, particularly the colonial, Christian, British coming into different contexts and denouncing different death practices in various cultures as barbaric and inhumane. But yeah, this is not to say... like, the documentary and the impulse behind it, as well as, you know, what Pat and Elaine and others talk about, is that there are huge commercial interests in all of this sort of stuff. So I don't think we want to sort of be naive to that.

Speaker 2:

So this is the other thing, right: whoever made the documentary is getting a whole lot of publicity, even if we're not talking about the technology, if we're talking about the people who are telling the story and screening the film and getting talked about around the world and all of that. I went to a movie called The Flats the other day, which is about trauma, basically, in a block of flats in an area of Belfast, Northern Ireland, and it was to me ethically troublesome, because the documentarian had people sort of recreating their trauma, having other actors recreate their trauma for them, and having the person who had experienced the trauma directing the scene, and that sort of thing.

Speaker 2:

So it became a little bit meta, right, because all of that was depicted in the documentary. But then this director, who was not of that population she was making a movie about, then sort of gets flown to Sydney to come to the Sydney Film Festival and talk about her movie and so on. And there are other people who are living, I think, quite difficult lives with ongoing trauma from a lot of different things, who get to, what, like, recreate it for my entertainment?

Speaker 2:

I don't know. You know, it felt very exploitative. I love a documentary, and I have been feeling quite uncomfortable about that one since I watched it. I would be interested to see Eternal You and see how I feel about it.

Speaker 1:

But I mean, this comes back to research ethics. Like, how many documentaries would not be made if they had to go through a research ethics committee?

Speaker 2:

That's exactly what I said as I was leaving the cinema. Imagine trying to get that one through an ethics committee.

Speaker 1:

Yeah, yeah. One documentary that stayed with me for a long time is We Live in Public, about the sort of early days of the internet, mixed in with Y2K and all of that dot-com bubble and the excesses of New York. A fascinating documentary, but yeah, lots of ethically dubious practices throughout.

Speaker 2:

All right.

Speaker 1:

All right, all right. Well, it's been grand trying that out. So, this is Friday afternoon; go and enjoy.

Speaker 2:

Pouring with rain, a soda water.

Speaker 1:

Yeah, I noticed that outside the window there, it's pouring. It's not pouring with rain down here in Geelong; it is just bleak and cold. Good Geelong times. But yeah, so this was an extended discussion of: check out the text message feature, and I hope next time we're recording I'm not apologising because you're all charged $10 a text message via Spotify or Apple. But look, we've each sent ourselves a text message, so we're all in this together, all right. Oh, and check out Pat's podcast, not necessarily the episode that I'm on, they're all great. Bye.

Speaker 2:

Bye, thank you.

Updates, Podcasting, and Research Ethics
Bioethics, Consent, and Market Solutions
Ethics of AI and Death Practices