The Human Code

Ethical Risks and Opportunities in AI: Insights from Andreas Freund

Don Finley Season 1 Episode 38


Navigating the Intersection of Humanity and Technology with Andreas Freund

In this episode of The Human Code, host Don Finley delves into an insightful conversation with Andreas Freund, co-founder and CTO of Interweave. With over 30 years of experience in technology, Freund discusses the evolution of human needs in technology and the role of distributed systems like blockchain in fostering global collaboration. The episode also covers the ethical and existential risks posed by AI, the transformative impact of generative AI on society, and the emotional and technological readiness required to navigate the future. Freund emphasizes the critical need for communities and individuals to adapt and find sustainable ways to leverage AI while addressing the challenges of capitalism and resource control.

00:00 Introduction to The Human Code

00:49 Meet Andreas Freund: Tech Visionary

02:09 The Evolution of Human Needs and Technology

06:49 Blockchain and Distributed Systems

09:12 Generative AI and Societal Impact

12:06 The Future of AI and Humanity

24:05 Practical Advice for Navigating AI

26:16 Conclusion and Sponsor Message

Sponsored by FINdustries
Hosted by Don Finley

Don Finley:

Welcome to The Human Code, the podcast where technology meets humanity, and the future is shaped by the leaders and innovators of today. I'm your host, Don Finley, inviting you on a journey through the fascinating world of tech, leadership, and personal growth. Here, we delve into the stories of visionary minds who are not only driving technological advancement, but also embodying the personal journeys and insights that inspire us all. Each episode, we explore the intersections where human ingenuity meets the cutting edge of technology, unpacking the experiences, challenges, and triumphs that define our era. So whether you are a tech enthusiast, an aspiring entrepreneur, or simply curious about the human narratives behind the digital revolution, you're in the right place. Welcome to The Human Code. In this episode, we're excited to welcome Andreas Freund, co-founder and CTO of Interweave. With over 30 years of experience in technology, including expertise in Web3, cryptography, and ERP systems, Andreas brings a unique perspective to the intersection of humanity and technology, informed by his deep understanding of theoretical physics and his passion for teaching the next generation of Web3 builders. Today, Andreas and I will share his insights into the evolution of human needs in technology and how we're moving towards global collaboration for survival, enabled by distributed systems like blockchain; a thought-provoking discussion on the ethical and existential risks posed by AI, particularly in military applications and beyond; and the transformative role of generative AI in shaping the future, along with the societal challenges that come with it. Join us as we dive into these compelling topics with Andreas Freund. This episode is packed with valuable insights that will challenge you to think critically about the role of technology in our collective future. You won't want to miss it. Welcome. I am sitting here with a good friend of mine now, Andreas Freund. And I gotta say, Andreas, thank you so much for being on the show. It's a real pleasure to have you here. But the first question that I've got to ask you is: what has you interested in the intersection between humanity and technology?

Andreas Freund:

In part, my own personal healing and growth journey, which has always been based on relationships and community. So the work that I've done in blockchain technology, the things that I've built around self-sovereign digital identity and decentralized brand economies, the things that I've written about, have always been focused on community, and on being able to build sustainable communities at scale, which we as a race have been struggling with.

Don Finley:

I could absolutely see that. I think humanity functions really well at a tribal level, but as we've expanded out, the decentralized nature of our society has hit a couple of bumps. Where does your passion lie today?

Andreas Freund:

To answer that question, I would like to take a little step back and open the lens a little bit. So

Don Finley:

Please.

Andreas Freund:

Humanity evolves through different stages. We were initially subsistence cultures in Africa, so we competed for resources, and then later on collaborated for safety as we built small communities, small families that grew larger and larger. As our technology evolved, the needs that could be fulfilled changed. And as our needs change, we grow emotionally, as a child would. A child initially screams. Why? Because it needs to be fed, and the only way it can communicate that is by screaming. It doesn't have the means, the capabilities, to articulate anything else. So safety and sustainability of life are paramount for children, but as the brain evolves and it can articulate other needs, those other needs are often met through technology. You have games that you play when you're small as a child, and then those evolve; the technology evolves. As we start to evolve our needs, our technology evolves with them. So as we as individuals grow, we're using more and more complex technologies. As we as societies grow, we're using more and more different technologies, because our needs evolve as the technologies evolve, and we see: oh, these needs that I actually have, that could never be met, can now be met. The need for comfort; once comfort is met, the need for meaning of life, or personal meaning. But this is all 'I' focused. There is a framework for that called Spiral Dynamics that talks about a spiral with different memes, and they evolve from pure subsistence all the way up to evolving meaning through community. But all of this is 'I' focused: it's either I compete for a need, or I collaborate for a need. And with social media, you can see that. Influencers and people on social media are out there collaborating with others in order to generate meaning for themselves, not for the community yet. It's all still 'I'. So the next level of the development of our human consciousness is as a global tribe. And we have all the different memes existing at the same time: you have subsistence tribes, you have tribes that are warring, so you have all the different emotional development stages that a child goes through as it becomes an adult also present in the different societies. And the next meme, after the meme of I collaborate for meaning with community, is we collaborate for survival. So it starts a new spiral at the end, and it's all about survival. But as a global tribe, we can only solve our challenges, our global survival as a species, if we are predominantly in this new meme, and we all think about 'we collaborate for survival'. And you can see that we're moving towards that, because we're starting to develop technologies that are enabling that, that are enabling the 'we collaborate for survival'.

Don Finley:

Where are you seeing those technologies?

Andreas Freund:

Technologies around distributed systems. Blockchain technologies, for example. Blockchain is one of the first technologies that has the 'we collaborate' baked in. Terribly inefficient, and misused, but that's what technologies often are: their best use is uncovered later on. The efficient steam engine of James Watt, which up until this point has been the most impactful development in human history, was primarily used for economic advancement and war, not for the community as a whole. That came later, once you could build mass transportation that could be used by everyone.

Don Finley:

It's also interesting, because that 'we' mentality and developing technology for war are definitely not on the same level.

Andreas Freund:

That is right. War is about the I versus the we. So the technology is there, but our mindset isn't there, so we are not utilizing the technology in the most effective way. We're still using it with an 'I' mindset, hence meme coins, NFTs on OpenSea, and all that bullshit around it: pump and dumps, scams, get rich quick. Whereas there are a lot of people who are genuine; if you look into the communities around different projects, you can quickly discern which ones are genuine and really want to build something new, where there's a real meaning of 'we'. If you look at, for example, IPFS, which is a really selfless thing, or the Ethereum community at large evolving the Ethereum blockchain: they're truly democratic, a 'we'. A slow process, but very, very effective. And as we start to scale these things and experiment and learn to be comfortable with technologies enabling this 'we' behavior, this information flow, this consensus building, the alignment of intention and values: that is hard. And we're still not there. We're not emotionally evolved enough to have caught up with the technology. That's also a significant problem when we're talking, for example, about generative AI.

Don Finley:

I can completely see that, because we're creating for the self, we're creating for our small tribe, and yet at the same time, that ability to create is vastly different from what it was five years ago. But how do you see it impacting society?

Andreas Freund:

Our predominant meme is basically what you would call capitalism, and capitalism is all about resource control, so you can extract rent from the resources. OpenAI, Google, Meta, Microsoft, you name them. It's all about resource control, funneling the resources, human eyeballs, into the generative AI funnel. And they're openly talking about replacing search with AI, which would disintermediate a lot of companies. I, for example, use generative AI tools daily in my practice. I wouldn't be able to work properly, and at the speed that I do, serving different clients in my startup and in my own personal business, without that. Generative AI tools may take me from a mediocre developer at best to at least a good one, if not a really good one. And that is now. So look at what Boston Dynamics is doing with humanoid robots, the ability for them to peel an egg.

Don Finley:

Wait, has Boston Dynamics been able to peel an egg? Oh,

Andreas Freund:

Yeah, I think it's Boston Dynamics. It might have been someone else; I'm sometimes fuzzy on the source, because I see so much that it doesn't always get properly stored.

Don Finley:

Okay, we're both taking in so much on a daily basis that there is an issue of keeping it all straight. But just to let everybody else know, the real significance of that is that the dexterity of human hands has been a really difficult challenge for robots. One of the reasons why Amazon warehouses still have people packing the boxes is that there's nothing like a human hand for picking something up and moving it.

Andreas Freund:

That's right. Even so, they now have a hundred percent automated smartphone manufacturing facility, I think it's in Japan, that is fully AI driven and learns from process breakdowns and self-corrects. That's now.

Don Finley:

I mean, that is really awesome. We'll come back to the capitalism meme and that question, but there is this opportunity that we have to basically take away jobs that people may not want to be doing, and still allow our consumption to continue, or allow us to thrive in different ways. But what does a community look like? What does a society look like in this age of

Andreas Freund:

If we ever get to the age of we, and we don't go extinct before then, which I would put at about a 60 to 80 percent probability: that we will wipe ourselves out, or we are going to be wiped out by AI in combination with climate change.

Don Finley:

So climate change and generative AI being the kind of like

Andreas Freund:

Yes. There's a very simple Gedanken experiment that you can do. AI is already used on today's battlefields at an alarming rate. The Gedanken experiment is: what happens when you have basically two AIs battling each other by directing and redirecting human forces, technology, drones, drone swarms? What are they battling each other for? The goal is to win. And then what if safeguards are slowing one AI down compared to the other? What is going to happen? Will you remove the safeguards? There's the famous scene in the movie The Hunt for Red October, one of my favorite movies of all time, where there's a Russian hunter submarine going after Sean Connery's submarine, also a Russian submarine, which has escaped. They miss the first time they shoot at it, because the safeties were on and the torpedoes were not armed as soon as they were fired. So the next time, the commander removed the safeties and fired, and it ended up killing them, because the safeties were turned off.

Don Finley:

Ah, okay.

Andreas Freund:

If you think about it: one AI losing against the other, and humans seeing that and knowing that they're going to lose, what are they going to do? Will they also remove the safeties? What happens when there are no more safeties and you have AI models whose primary purpose is to win a war, to win a battle? They don't care. There's no concept of humanity, of ethics, of morals, unless they have been trained for that, which, quite frankly, is not going to happen in AI models that are supporting the military, because you will always lose. The ethical and moral AI will always lose against the one that has no ethics and morals, because it is hamstrung. So then, when one of the AIs without safeties wins, what's going to happen then?

Don Finley:

It kind of feels like you're walking down the Terminator timeline.

Andreas Freund:

A little extreme, but there are so many other things that can happen before that. Generative AI taking over critical infrastructure, because it's protecting critical infrastructure, and now it's being taken over by the winning side, so to speak. What then? There are many, many things that are really, really scary that can happen before we even get to Skynet, way, way before then.

Don Finley:

And I think that's a good point to make: Skynet is a great kind of visual representation of the drama that could be happening, but there are easier ways to take

Andreas Freund:

And we're doing it to ourselves already, because we can't even manage social media

Don Finley:

Oh yeah, we're really,

Andreas Freund:

in a humane way.

Don Finley:

No, but that is the challenge of social media. Do you see AI helping in that? Do you think AI could actually help to bring us towards each other instead of antagonizing each other? Because the challenge I'm seeing is that we're building AI, or at least we've built recommendation engines and news feeds, that are all based on maximizing attention and maximizing engagement, which is to the benefit of, let's just use Facebook, Meta: they gain value from us being in this heightened emotional state that keeps us on the application.

Andreas Freund:

Why are they doing that? Or why are we doing that?

Don Finley:

Well, my take is that it brings us back to capitalism. It brings us back to the end purpose of the business and, going back to Friedman, to the fact that we are a shareholder capitalist society right now. The end benefit of the Facebook platform is really for the shareholders and the customers, and we're not the customer, we are the product

Andreas Freund:

one step

Don Finley:

in this capacity.

Andreas Freund:

You ask why AI cannot be used to bring us closer together. The basic answer is that businesses will always choose the easiest path to monetization, because that is what drives shareholder value. If you can maximize revenue, if you can increase cash flow, if you can accelerate the increase in cash flow by using certain techniques, you will do that. So what is the most powerful attention grabber? It's our genetic programming that has been there for evolutionary purposes: the fear of the other. Fear is the twin of anger; fear begets anger. And therefore tribalism exists, to protect the small community against the other, because the other was killing the tribe, because it was so dangerous: animals, other warring tribes. So it's always the other. We're programmed in our DNA to always fear the other. And this base fear is very easily triggered. If it's easily triggered, that means attention, and attention to something means eyeballs, and eyeballs mean money. That's why the easiest path to monetization was by increasing tribalism, increasing fear that turns into anger. It's very, very simple. And that's why you will not see AI being used for bringing communities together. It's being used to create even more fear, therefore even more anger, therefore even more attention to the content that is stoking that fear, content that is therefore more easily monetizable.

Don Finley:

What's a path forward that doesn't drive us into the ground? How can we look at our usage of AI in a way that is collaborative, from that 'we' perspective?

Andreas Freund:

So the big challenge is that our innovative energy and capability far outpaces our emotional, and also our intellectual, grasp of things. Society is not trained to understand technology, only to use it. But to really know what to do with technology, you need to understand it. If you don't understand it, you can't even contemplate the dual uses. And the experts have been pushed to the side because they have been cast, again, in that framework of us versus them, of the other. So it's extremely challenging to change things unless there is a massive catastrophe. Things really didn't change at a global scale until after World War Two, until 70 million plus people died. There was a Holocaust where we looked at each other and said, this is so horrific, we as a species must be different and must do differently. And only then did we just barely get together. And that didn't really hold, as we can see now; this whole fabric is disintegrating because generational knowledge and generational emotional knowledge is disappearing. And we're going back to exactly the same thing that we had before, except that now we have much better technology to annihilate one another. It's a cycle.

Don Finley:

Who is it that said that World War 3 will be fought with nukes and World

Andreas Freund:

I'm not even sure that World War 3 will be fought

Don Finley:

sticks and stones?

Andreas Freund:

with nukes. It could just be fought with AI; that's sufficient. So the only hope that I currently have is that our infrastructure, the electric grid and our capacity to produce AI chips, is just going to hit an impasse. We can't produce enough infrastructure for AI to evolve as fast as it has been. Moore's Law is still intact for generative AI capabilities and chips, but the energy that's required is ginormous, primarily just for the training; once the model runs, that's fine. It's just enormous, and the capabilities that are required and the number of chips that people want are just insane. So building these capabilities will take time, and maybe this slowing down of the trend, because we don't have enough resources to move ahead fast enough, is going to save us, because we're going to recognize all the things that are going sideways, and we're starting to see that. That's my hope: that we don't have enough resources to grow the capability so fast that we'll never be able to catch up with it and make sensible decisions about it.

Don Finley:

I think that the opportunity is there. We're definitely in a hype cycle in some capacity, and investment definitely flows in heavily during that hype cycle. We've seen it a couple of times in the last 10 years for blockchain as well. But then we also see the winters coming, or the trough of disillusionment. And maybe that could also be a gating factor for how fast we're running

Andreas Freund:

If Sam Altman is

Don Finley:

You know,

Andreas Freund:

able to raise the 7 trillion dollars that he wants to, then too many people have too much to lose to allow for a winter.

Don Finley:

That's an interesting one, because you're also looking at use case adoption. I'm still seeing enterprises focused on proofs of concept. We're seeing rushed implementations that yield results that are very positive up front, but then I've also seen car dealerships that have been liable for what ChatGPT says, even after the person purposely jailbreaks it, or, I think it was New York City, where their chatbot was actually telling citizens to not

Andreas Freund:

And that, again, is stuff that is still controllable. We're at the, if you look at,

Don Finley:

Yeah.

Andreas Freund:

And that's my favorite metaphor and comparison, because this is still exponential development. At the beginning, if I make 30 exponential steps, I travel around the world 27 times. It's the lake with the lily pads: nothing happens for a long time, and once you realize that something is wrong, the whole lake is covered within a matter of three, four days. We as human beings, evolution has made us purposefully unable to recognize exponential behavior. Evolution made us such because we optimized for survival, and optimizing survival means reducing relative error, and the information that we gain from optimizing relative error is stored logarithmically. The logarithm of an exponential function is a straight line. So even if something is exponential, at most we perceive it as a straight line. Because in the savannah, the individuals who optimized relative error were like, oh shit, there are lions coming, let's get the heck out of here. Versus the ones that were optimizing absolute error, who were like, oh, there are lions coming. How many are there? Wait, one, two, wait, how many? Let's wait a little bit longer. They didn't make it.
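To make the point Andreas is gesturing at here concrete, here is a minimal sketch (not from the episode) assuming a quantity that doubles at every step, a 1-meter "stride", and an Earth circumference of roughly 40,075 km: in absolute terms almost nothing seems to happen until the last few steps, while on a logarithmic scale, which is roughly the "relative error" view he describes, the same growth looks like a straight line.

```python
import math

# Illustrative only: a quantity that doubles at every step.
steps = range(31)                      # steps 0..30
values = [2 ** n for n in steps]       # 1, 2, 4, ..., 2**30

# "30 exponential steps take you around the world ~27 times":
# treat each unit as a 1-meter stride and compare 2**30 meters
# with the Earth's circumference (~40,075 km).
earth_circumference_m = 40_075_000
print(values[30] / earth_circumference_m)        # ~26.8 laps

# Absolute view: almost all of the growth sits in the last few steps,
# like the lily pads covering the lake only at the very end.
print(sum(values[:25]) / sum(values))            # first 25 steps hold under 2% of the total

# Relative (logarithmic) view: the same series grows by a constant
# amount per step, so it is perceived as a straight line.
logs = [math.log10(v) for v in values]
print([round(b - a, 3) for a, b in zip(logs, logs[1:])][:5])  # each increment ~0.301 = log10(2)
```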

Don Finley:

Yep, that's absolutely fascinating. So what I want to ask you then is: for anybody who is either in the AI field, or impacted by AI in some way, or wanting to participate in this, what are some skill sets or opportunities that they have in this world today?

Andreas Freund:

Short term, there are two things that I would recommend. Number one is figure out how to use generative AI tools in the job you have right now, and figure out how to use them really, really well, so that you are ahead of the curve for a little while. That won't last; it's a mere stopgap, because the tooling will come out. It just makes it less likely that you're among the first ones to get the boot. In addition to that, you need to figure out what other skill sets you need to acquire while you're employed, so that you can pivot. Do you want to learn to become a carpenter? A plumber? Software development is probably not a really good choice unless you're in the AI space. Really carefully look at other disciplines, figure out how you can utilize AI tooling there, and retool yourself, because you will have to retool yourself pretty much every two, three years from now on, until most jobs are gone.

Don Finley:

I think you're hitting on a strong point, and that's that the ability to learn is incredibly important. The other side of this is that we need to widen the scope of what we think we would want to be doing from a career path, and open our eyes to the fact that white collar jobs will likely not be around for too much longer. Andreas, I gotta say, it's been an absolute pleasure having you on the show, and I really appreciate you taking the time and making this a great episode; I think we all do. I think we also need to keep our eyes open and aware of what the opportunity is versus what the challenges could be. So again, thank you so much. Thank you for tuning into The Human Code, sponsored by FINdustries, where we harness AI to elevate your business. By improving operational efficiency and accelerating growth, we turn opportunities into reality. Let FINdustries be your guide to AI mastery, making success inevitable. Explore how at FINdustries.co.
