RevolutionZ

Ep 286 Evan Henshaw Plath on Social Media, Digital Identities, and People's Platforms

June 09, 2024 Michael Albert Season 1 Episode 286

Ep 286 of RevolutionZ has Evan Henshaw Plath, also known as Rabble, a visionary technologist with personal roots in developing Indymedia and even Twitter. He replays the history, logic, and implications of social media, from its democratic and participatory roots to its corporatization and erosion of privacy and meaningful engagement. Plath takes us, as I suspect few if any others could, through the shift from social media's early, open protocols to the centralized corporations like Twitter and Facebook that came to dominate. He then explains his ongoing quest to reclaim the decentralized spirit of the web, including working on modern advancements like Nostr and his current adaptation called Nos.social. Reflecting on historical movements like Indymedia and Occupy Wall Street, Plath emphasizes the need for autonomous spaces that support radical change and envisages the potential of independent, decentralized, privacy-focused platforms. He also discusses possible sustainable funding of these independent platforms, underscoring a needed shift from owners and consumers to co-creators, and the vital role of community collaboration. RevolutionZ listeners will likely know the depth of my antipathy for social media as usually encountered. So I hope you will listen and wind up feeling, as I do, that Plath's new project, Nos.social, is well worth our attention and support.

Support the Show.


Speaker 1:

Hello, my name is Michael Albert and I am the host of the podcast that's titled RevolutionZ. This is our 286th consecutive episode, and it's even our second this week. As mentioned, while doing one episode per week based on the book in progress titled An Oral History of the Next American Revolution for about 12 more weeks to come, I will also sometimes do a more normal episode as well. This time, our guest is an old friend, Evan Henshaw Plath. So I guess this is a normal type episode, though with a not so normal type guest.

Speaker 1:

Evan, known also as Rabble, is a pioneering technologist and activist renowned for his work in social media and decentralized technologies. As the first employee and lead developer at Odeo, where they helped to create Twitter, Rabble has been at the forefront of digital communication innovation. Their commitment to user-centric, community-driven platforms is evident in their work with Nos.social, N-O-S dot social, that's the link, a decentralized social media app using the Nostr, N-O-S-T-R, protocol, and their role in launching and relaunching Causes.com to empower grassroots activism. They also helped create the Indymedia.org tech team, way back, advancing grassroots media. A former researcher at the MIT Media Lab Center for Civic Media and an Edmund Hillary Fellow, Rabble combines technical expertise with a passion for political organizing and social justice, envisioning a future where digital communities thrive independently of centralized control, focusing on building digital commons-based communities. So, Evan, welcome to RevolutionZ.

Speaker 2:

Thank you very much. I'm really happy to be here. I admit I have not listened to all the episodes, but a number of them have been really wonderful and it's great to hear your voice and see you in person.

Speaker 1:

Thank you, I appreciate it very much. You have a long and a very rich and varied history in the high-tech world, and particularly in the world of social media. So to start, maybe let's spend just a little time on what social media exists now, even on what it is, and on what's wrong with it. What about how social media is used, the culture it seems to generate, the habits it inculcates? What's wrong with today's social media?

Speaker 2:

What's wrong with it? Well, social media today is complicated because it's everywhere and everyone, and it sort of permeates this world. And when we started Indymedia, 25 years ago now, the idea was that if we just gave everybody a voice, gave them the ability to publish videos and photos and articles online in an easy way, to web stream protests, all those things, then all of a sudden, of course, everybody on the progressive end of the spectrum would come together in a bottom-up, grassroots way and we would be able to sort of topple the existing systems and replace corporate media. And in some ways we have succeeded and failed.

Speaker 2:

The mission of Indymedia back in the day, 25 years ago, was be the media: be the media that you want to see and have your own voice. That became social media. But what we lost was the part where we control and own it. So we got the facade, we got the ability for everyone to have a voice, but optimized not necessarily for community engagement, not for collaboration, not for social health or social change. Instead, it's optimized to try and make us as passive as possible, consumers of this thing, this sort of flow of information, because that gets us to click on ads. And so we have this really weird, messed up world where we got what we wanted 25 years ago as part of Indymedia and the anti-globalization movement, and didn't get it at all, because it became, you know, the new cable television with even better monetization.

Speaker 1:

Yeah, and it not only makes us into passive consumers of information, but makes us into passive consumers of everything, and it has. Before we get to the brighter side of things, it also has these effects, whether intended or not, that are sort of acceptable to the mainstream, on the culture and on the habits that it seems to produce in people, the way it's currently organized. Do you have sentiments about that, feelings about that?

Speaker 2:

Sure. It's transformed people into thinking that we are always projecting and performing. So you know, people's identity, their gender, everything, that's always performative, it's constructed. But now there's an idea that not just are you performing for the community around you, you're performing for potentially the entire world. Everything is recorded, every moment can be taken. You can become a star in a minute, or the devil, depending on what happens, and everything is pulled out of context.

Speaker 2:

And so people see the rewards of being the star and getting millions of hits and likes, and that's a tremendous dopamine rush, and there's easy access to fame. But that also means that we don't have privacy to have the weird conversations. We don't have privacy to mess up, because everything is recorded all the time. And so that creates a world where we are judging each other not so much for the work we do, but for how the work is perceived and communicated.

Speaker 2:

Optics, it's optics, it's all optics. So you know, when George Floyd was killed, instead of people saying I'm showing up, I'm volunteering, I'm at that protest, all these people put a black box over their Instagram feed, thinking that they were doing their part and expressing solidarity. But expressing solidarity isn't what changes institutions that have power. It doesn't undermine them in and of itself in the current system. And so we've got this world where the optics have become so powerful that we don't realize that we need to do the other work, and the optics are just a reflection of the reality and aren't the reality itself.

Speaker 1:

Yeah, and there's something about the speed of the whole thing and the flitting from thing to thing which seems to, I mean, I talk to people who teach, it seems to demolish people's attention spans. And now I don't think these things were intended. I don't think somebody sat around and said, okay, let's affect the minds of the next generation to have a short attention span, or to express themselves in 25 words at a time and thus never talk about anything that requires 50 words or 300 words. I don't think anybody thought about that, but it was a side effect that was acceptable. It's like a kind of weird Darwinism, in which the system allows those changes, unintended, but it tends to crush other changes which it doesn't like, like, for instance, interfering with the flow of political information, which is becoming more and more of a threat, I think.

Speaker 1:

Does that make any sense to you?

Speaker 2:

It does make sense, but the idea that it wasn't intended is maybe wrong.

Speaker 1:

Really.

Speaker 2:

There's a book called Hooked: How to Build Habit-Forming Products, and this book is incredibly widely read in Silicon Valley and among people who do product design on these things. Essentially, the book is a study of addiction and how you can leverage addiction in the digital products you create.

Speaker 1:

What's intended is to get their eyes, to get their attention, to hold them, but not, I don't think, the effect that it had on people's brains when they go out into the rest of the world. But that turns out to be useful also. So that's what I meant by the unintended consequence. It's like a side effect which turns out to be very useful for the powers that be. Absolutely, yeah. All right. So, looking at that, seeing all those problems, where did it start to push you? What's your reaction? To provide something positive.

Speaker 2:

So the first push, after working on Indymedia for a bunch of years, was into podcasting. Podcasting was just an emerging technology, and we're like, well, if we could democratize radio, it would be a powerful thing. Before that, when we set up protests, we would set up pirate FM transmitters, we would set up pirate radio stations, and you had to be close to it and listen to it to be able to be part of it. So the idea that anybody could have a radio program was a powerful shift, and that one, for the most part, worked. Like, we democratized radio, and it wasn't enclosed and it hasn't been exploited, and I think we should celebrate podcasting as an example of what worked. Then we worked on Twitter, and Twitter all of a sudden became this sort of ability to feel and participate in the zeitgeist of what's going on, and that initially felt amazing and was fantastic, and huge amounts of social movements were able to use it effectively. The one aside is, I don't think people know that Twitter itself was based on an activist project that we were using in 2004 called TextMob. That was just a text messaging system for communication during protests, so we used it at the RNC and the DNC in 2004. And because we were doing that and developing it as an activist project, we brought it into the company where a couple of us were working, Odeo, and that was what introduced the company to building a text message based social network.

Speaker 2:

So when people see Twitter and say, oh, twitter has a lot of activist roots and it has a way in which people are using activism, those activists aren't appropriating the commercial tool.

Speaker 2:

What's going on is it's an activist tool that got commercialized. And then, when it got commercialized, it started growing and getting funded and taken over by investors and market economics, which drive a different set of values, and they were searching for the ability to make money, make money via ads. And that process is what led down the path of needing bigger and bigger audiences, like searching for the stars and searching for the maximum amount of engagement. And that is what made the platform so effective, both for grassroots activists but also for people like Trump, demagogues, who used it as the new version of the radio that was used in the 1920s and 30s so powerfully. It became this way of intermediating beyond the world of professional journalism and directly connecting to people, and some people like Trump ended up being really, really effective at playing that game and were able to use it. So I don't think any of us in the initial stages thought that this platform of radical participation and radical democratization, of the ability to have a voice, would pull people in to such kinds of authoritarianism.

Speaker 1:

I'm sort of curious. You played a role in all that, yep. You even played a coding role in all that, am I right? Yep. So the early version of Twitter you coded? Is that correct?

Speaker 2:

I worked on it with some other people, yeah. It was not a dozen people in the whole company.

Speaker 1:

Okay. At that stage, did you have the word limit on messages, or the character limit? I don't even know. It's a character limit, yeah. And what was the rationale for that?

Speaker 2:

So the original way to use Twitter was similar to how we were using TextMob, in that it was mobile first, text message based first, and the limit on SMS messages at the time was 160 characters. And so what we did is we said there should be 20 characters: 18 characters for your username, and then a colon and a space, and then your message. 17, 18 seemed like a good length for a username, and then the rest was the message, and so that's how it got cut down to 140. It was not a design decision based on, oh, it's the poetic way in which we will constrain the way people speak, and you know, there's all sorts of justifications afterwards about, like, oh, this is why it sort of makes sense. Really, the entire system was built on SMS first and foremost, and SMS had these technical restrictions on how it worked.
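As a quick sanity check, the arithmetic Rabble describes can be sketched in a few lines of Python. Note the 18-character username cap and the colon-plus-space separator are taken from his account in this interview, not from any official Twitter specification:

```python
# Sketch of the original SMS length budget described above.
# Assumption: the 18-character username cap and the ": " separator
# come from this interview, not from any official Twitter document.
SMS_LIMIT = 160            # hard limit of a single SMS message
USERNAME_MAX = 18          # characters reserved for the username
SEPARATOR = ": "           # colon and a space after the username

message_budget = SMS_LIMIT - (USERNAME_MAX + len(SEPARATOR))
print(message_budget)      # 140, the famous tweet length
```

The 140-character limit falls out of the SMS constraint rather than any design aesthetic, exactly as described.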

Speaker 2:

And Twitter launched in 2006, and the iPhone only launched in 2007. So there was a whole year in which Twitter was around pre-modern smartphones.

Speaker 1:

Right.

Speaker 2:

So there was no mobile web app, and there were no mobile apps. You had to use it via these text messages or instant messenger. It was much clunkier. And then, once it became possible to send longer messages, everybody identified with the idea of sharing short messages, the idea of the constraints: no formatting, I can't say very much, it has to be sort of easy and fast.

Speaker 2:

That took off and people kept it. Too bad, well, you know, ironically, Elon Musk has actually removed it now and you can.

Speaker 1:

You can write really long tweets if you want, 10,000 characters or whatever. So you're coming through this process, involved in it at all different stages and in various ways, and now, later, you're working on a new system, project, etc., probably with some of the same inclinations that you had all the way back at the beginning, but in a different world, in a different time, and knowing what went wrong with what was done before. So what are you coming up with?

Speaker 2:

So at the time I was frustrated with the consensus process that Indymedia used. It felt like it was very hard to develop new software when we went to a larger group and had to go through a consensus process about all the details of the software, and it made developing new software, that creative process, impossible. And so there wasn't just myself, there was a bunch of us who were Indymedia activists on the tech side who went to go work at Craigslist and Flickr and YouTube and Twitter and all of these early web 2.0 companies, where, basically, we're like, can we take the activist stuff we were doing and use these resources that the companies have? And we knew that we were making a deal with the devil. We knew that we were walking away from open source. We knew that we were giving away the space we were creating.

Speaker 2:

But in those early days, from 2004 to 2010, 2011, there was an idea that these companies were actually small enough and well-behaved enough that they were pretty good stewards of the space. But the problem was, at some point they controlled all the keys, they controlled all the servers, they controlled all the software, and eventually they made the decision to turn off the openness and sort of enclose the commons. It felt like an open commons, everybody had it, but you had no legal rights. And then they enclosed it, and that's where a whole bunch of technologists and people working on it sort of went, oh shit, you know, we got part of what we wanted, but we also lost a lot. And that was the moment, starting in 2011, 2012, where a bunch of us started looking at it and saying, okay, is there a way to do this that's more of a protocol and less a software platform on a single server? Is there a way to do this using technology where it's not all held by one company?

Speaker 1:

Hold on a sec. Is there a way to do this that's more a protocol? I don't think most readers have any idea what that phrase means. So explain it a little.

Speaker 2:

So the internet is based on protocols. A protocol is the technical standard by which one computer system talks to another computer system. When you use the web, that is a protocol. That's the HTTP that sometimes shows up in the web browser. When you send an email, that's a protocol. It doesn't all just go into one server. You write the email to specific standards, and all these other servers know how to route your messages, and they all operate on an agreed standard.
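To make "agreed standard" concrete, here is a minimal sketch, in Python, of the literal text a browser sends when it speaks the HTTP protocol to a server. The host and path are placeholder examples, not anything from the interview:

```python
# A protocol is just a text/byte format both machines have agreed on.
# This is the literal text of a minimal HTTP/1.1 request, written out
# by hand; example.org and /index.html are placeholder examples.
request = (
    "GET /index.html HTTP/1.1\r\n"   # method, path, protocol version
    "Host: example.org\r\n"          # which site on the server we want
    "Connection: close\r\n"          # hang up after the response
    "\r\n"                           # blank line ends the headers
)
print(request.splitlines()[0])       # GET /index.html HTTP/1.1
```

Any web server in the world can answer this request, because both sides follow the same published standard; that shared convention, not any one company's server, is what makes it a protocol.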

Speaker 2:

The main organization that creates these standards is oddly egalitarian and anarchist. It's called the Internet Engineering Task Force, and they have this motto, which is, you know, we reject presidents, kings and voting; we believe in rough consensus and running code. And that is how the internet is designed. It's this international body of nerds and engineers that get together in meetings and define how all the computer systems in the world should work together. Now, what happened with podcasting is it became one of these protocols. It became a standard. Anyone can implement it. It's not just on a server. What happened with Twitter and Instagram and Facebook and TikTok is they took the ideas of those protocols and they put it all on a single server, and they said you access it via the web, but in there you're just in our little walled garden of space, and then they control everything.

Speaker 2:

And so the movement that started around 2010 was saying, oh, we need to reclaim and go back to the way the old internet worked, where we all have the rights to participate in this network and my data is more of mine and we can find technological solutions to this problem of enclosing the commons.

Speaker 2:

And that led to a whole movement of technologists and designers and users of these systems, and that's where you get things like the Fediverse and Mastodon, and chat protocols like Matrix, and all of the stuff around Linux, this attempt to build this technological world that the capitalist corporate sector will use but doesn't own and control. And it's a tricky balance, because those corporations, for the most part, are paying developers to do this work in the shared open source commons. You know, they're able to contribute back to the commons because they benefit more. And so when Twitter shut down this open ecosystem, we went, oh shit, we need to rebuild Twitter, we need to rebuild all of these social media platforms in a way that isn't closed down. And there were lots of us doing that, you know, a couple thousand engineers saying, how do we do it? And it's been 12 years of people trying stuff, and most of it fails.

Speaker 1:

But occasionally this stuff works. I have a question. When you approach that, the couple of thousand of you, let's say, is it in your head, we have to get rid of the ills that fucked up the past system, i.e. the centralization, the ownership, the control? That's clearly in your head, yeah. Is it also in your head to ask, well, okay, what, beyond getting rid of all that evil, of all that harm, are the positive things that we have to incorporate to really do something that isn't just not bad but really good? Two separate steps, in a sense: get rid of the bad, conceive the good, and implement it. Is that second part in people's thinking?

Speaker 2:

The second part is in people's thinking, but we don't have consensus on what that is or how it works. It's easy for everyone to agree more or less on what is bad with the existing systems. Like, you know, algorithms are okay as long as you have agency over those algorithms. It's fine to have algorithms that surface viral content, or surface just people who are close friends of yours, or surface just people who have like-minded content, or drag you into older conversations, or focus on the immediate. All of those algorithms are fine. The problem with algorithms is that you don't get a say over it. So what we should be asking…

Speaker 1:

Yeah, let me clarify something. By algorithm, another word that serves, I think, for it in the way that you're using it, is filter. Yeah, right. And in a sense, it's filters under the control of them, or under the control of you, exactly. And a filter that's under your control, that gets your friends, or gets everything about rabbits if you're interested in rabbits, that's okay. But a filter that is under their control and starts feeding you stuff with the purpose of getting your eyes on advertising or whatever, is not good.

Speaker 2:

Exactly.

Speaker 1:

All right. So everybody in that community of 2,000, or however many, understands that and, rebelling against that, wants to do away with it.

Speaker 2:

Yeah, absolutely.

Speaker 2:

Yeah, okay, and this community was, for the most part, brought together by the Internet Archive. And the Internet Archive believes in, you know, saving all books and all media and a copy of the internet. And they saw the centralization and the control and the effects of big tech companies. You know, the Internet Archive is in San Francisco, they're part of this world. And so they started organizing conferences and events and bringing people together and sharing ideas, to say, okay, let's figure out another way of doing this, let's figure out how to make the…

Speaker 2:

And the person who figured out the most was this very prolific hacker who had written lots of open source projects, named Dominic Tarr, who lived off the grid on a sailboat off the coast of New Zealand.

Speaker 2:

And, you know, is this alternative bearded, you know, hacker, sailor, you know, creator, who decided to opt out of the entire system, and because of that he wanted to build a social media protocol that worked on his sailboat and worked in a totally different way.

Speaker 2:

And he created this thing called Secure Scuttlebutt, and almost no one used it, but the ideas in there… You know, it worked great if you lived in a commune with bad internet connections, far away from everything, because when your phone went into town it synced with everybody, and then it brought it back, and it, like, peer-to-peer synced, like a mycelial network of stuff. But the concepts behind that, the way the technology worked, gave us the break from the way the protocols coming out of Silicon Valley were working to a new way of doing it: it used a bit more cryptography, it used a model for how the data could be shared and synced and work. And all of the newer protocols that we now see, Nostr, Bluesky, Farcaster, Lens, there's a whole group of them, they're all conceptually based on this thing called Secure Scuttlebutt, this, you know, this hacker on a sailboat.
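For the curious, a Nostr event, the basic unit of the Nostr protocol mentioned here, is just a small piece of signed JSON. The sketch below, in Python, follows the public NIP-01 specification for deriving the event id; the public key is a made-up placeholder and the signature step is omitted:

```python
import hashlib
import json

# Sketch of a Nostr event per the public NIP-01 spec. The pubkey is a
# made-up placeholder, and the signature step is omitted for brevity.
pubkey = "ab" * 32              # hypothetical 32-byte hex public key
event = {
    "pubkey": pubkey,
    "created_at": 1700000000,   # unix timestamp
    "kind": 1,                  # kind 1 is a short text note
    "tags": [],
    "content": "hello, decentralized world",
}

# NIP-01: the id is the sha256 of a fixed JSON array serialization
# of the event fields, with no extra whitespace.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"],
     event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
event["id"] = hashlib.sha256(serialized.encode()).hexdigest()
print(len(event["id"]))         # 64 hex characters
```

Because the id is derived from the content and the key belongs to the user, any relay can carry the event and any client can verify it; no single server owns the account, which is the design point being made in the conversation.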

Speaker 1:

So these are all protocols. A protocol, you described, is a framework, a pattern of activity that mediates computer-to-computer communications. Yeah, how can you have 10 of them? I mean, how can you have… go ahead.

Speaker 2:

There's thousands of them, and some of them are things we see every day. Some of them are, like, easy to engage with and see. There's the way in which a web browser loads, there's the way in which your email syncs. When you chat with someone on Signal, that's a protocol. But some of them are…

Speaker 1:

Wait, I'm still confused. Take the web browser thing. Yeah, it's one thing for me to have a computer which uses a protocol, let's say the web browser one. It doesn't do me any good if there aren't other computers that are also using it. Yes. So you're saying now there are a fuckload of protocols which are understood by all computers?

Speaker 2:

No, each computer has to choose which software they run to implement which protocols.

Speaker 1:

So when you said that Scuttle, whatever it was, existed, it wasn't communicating with everybody, because it wasn't on everybody's computer?

Speaker 2:

No, it was just communicating with the other people who wanted to use Scuttlebutt.

Speaker 1:

Okay, so now, just so I know, the thousands of…

Speaker 1:

How would my computer, right, yeah, become a computer which can communicate with others that are using, I think one of the words you used was Bluesky, yeah. So what happens to my computer that all of a sudden it can communicate with somebody who is using Bluesky in Japan?

Speaker 2:

So, no. Remember, some of these protocols… your computer speaks many protocols already. The way the clock gets updated to know what time it is on your computer, that is a protocol. It connects to a server that's connected to an atomic clock that's run by a government research agency and gives you back the time. Another one is time zones. There's a protocol to define where time zones work and when they change and what geography you're in. You know, your computer is using these protocols. It's designed to use them, because otherwise, when the time zones change, you'd have to go manually adjust it, and you don't think about it.

Speaker 2:

It's built into it. Now, with Bluesky, or Nos, or Nostr, or all these others, you have to either go log into a web page or go and install an application on your computer or your phone to use the protocol. Each one.

Speaker 1:

Okay. So when you invent a new protocol which is going to facilitate healthy, constructive social media, and it's sitting on your machine, that's nice, but now you have the problem of having it sit on everybody else's machine.

Speaker 2:

Exactly.

Speaker 1:

Okay.

Speaker 2:

You have this cold start problem, which is, like, the software is meant to be social, and so the value of the software increases with every person that joins it. And you get this network effect: a social protocol with five people using it is much less valuable than a social protocol with 10 people using it. That means that we have sort of a natural monopolistic tendency in these things. So as they start winning, they grow to be used by different communities, and, you know, often those communities end up at the boundaries of language or the boundaries of self-identity, and sometimes they cross languages. So you have WhatsApp, for example, that has about 3 billion people using it, and not all of them can talk to each other because they don't all speak the same languages. But, you know, if you're going to use one with just five people using it versus 3 billion people using it, you really have to be interested in those five people. Otherwise you're going to join the 3 billion person thing.
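One common way to model the network effect described here is Metcalfe's law, where a network's value tracks the number of possible user-to-user links. This framing is an assumption for illustration, not something from the interview:

```python
# Illustrative sketch of the network effect described above, using
# Metcalfe's law (value grows with the number of possible pairwise
# links) as an assumed model; the user counts are just examples.
def pairwise_links(users: int) -> int:
    """Distinct user-to-user connections possible among `users` people."""
    return users * (users - 1) // 2

print(pairwise_links(5))    # 10 possible links
print(pairwise_links(10))   # 45, well over four times the 5-user network
```

The quadratic growth in possible connections is one way to see why each new user makes the network disproportionately more attractive, and why these systems drift toward monopoly.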

Speaker 1:

Yes, I know this history well.

Speaker 2:

Yeah. To get people to use these things, we need to create communities of value, where people can feel identified and they join, and a reason to use the new thing, because no one chooses to use this one software or the other solely because of the technology. You don't use Instagram because you really like images over text. You don't use TikTok because you like video as the primary way of doing it. You use it because that's where the other people who are using it are talking, and that's where that community has coalesced.

Speaker 2:

And so the question is not the deterministic what does the technology lead us to do? It is sort of this symbiotic relationship between the culture of the users and the technology and how it works. The affordances of the technology, what it can do, how it can be reshaped and repurposed, matter a lot, but only in relationship to the people using it, because the culture of the users shapes it. And so the interesting thing is, the cultures and the technologies don't reproduce when they go from one to the other. So when everybody left Twitter, when Elon Musk took over, Twitter isn't going to reconstruct in the same way ever. That constellation of communities and dynamics and social norms is gone, and what you get is subsets of that and recombinations of that on other platforms.

Speaker 1:

Okay, so, backing up a little bit, you've gone through all this early stuff. You and others have rejected the character of it as it existed. You're beginning to embark on trying to, in essence, replace stuff with better stuff. What's better? What are you offering?

Speaker 2:

So what's better is the new stuff we're building is about agency. People have control of it. They can connect to each other, and they can do things that weren't possible before. So, for example, you can now have encrypted communications. You can have communications where only you and the other people you're communicating with have access, so you have privacy: that data isn't getting sold to a data broker, which is what happens in the United States; the data isn't getting used in machine learning to target you; the data isn't getting handed over to police on different records. But also you can then communicate about things that the platforms have decided are not okay.

Speaker 2:

And the really weird part about social media platforms is there's a particular kind of corporate California set of values that has been imposed on the entire world. So nipples are okay as long as you perceive the person with the nipple to be masculine. If you perceive the person with the nipple to be a little bit too feminine in their presentation, then all of a sudden the nipple is not okay. This is insane.

Speaker 1:

So is everything else, but go ahead. I get what you're saying.

Speaker 2:

So we should be able to form our own rules about what acceptable behavior is and how people should be able to do that, and a good example is...

Speaker 1:

Did you just say that? On a platform with a protocol, yeah, that's got lots of people? That's good, right? As compared to in the past, there's such a level of possible security of communication that it wouldn't matter that the person who wrote the whole fucking thing, if there was such a person, or the set of people, felt that such and such utterances were bad; they couldn't do anything about it.

Speaker 2:

Yes, because they don't control it. They've defined the terms of the communication, they've given people the language and the tools, but the actual communication is between people on this permissionless, open network. So if you want to share porn, that's fine. A great example: in social media apps, because there's AI that now looks at every word that's been said and transcribed and everything else, people can't talk about pornography, and instead they use euphemisms. So they talk about the corn industry.

Speaker 1:

And that's pornography? Corn? Or are you making that up?

Speaker 2:

No, I'm not making it up. But another one is, they've decided that it's bad PR if anyone does anything related to suicide. So you cannot talk about suicide on social media. People have to talk about unaliving themselves.

Speaker 1:

But what if somebody does talk about suicide? What happens?

Speaker 2:

Their content disappears.

Speaker 1:

Really.

Speaker 2:

Yeah, or their account disappears. And you know, to give you a sense, there was a guy who was a father and had a baby, and the baby had a diaper rash, so he took a photo of it on his phone. Google was doing image analysis of every image that he took privately on his own phone, and it decided, through the AI, that this image was of a naked child. So they immediately reported him to the police and then suspended all of his accounts. He lost his phone number, he lost access to Gmail, he lost access to Google services and everything else, and there was no appeal process; the AI just decided. It didn't decide you've committed some crime.

Speaker 2:

It decided the liability for us, google, the platform around this, is too high and we would rather destroy your entire digital life than risk right knowingly sharing some you know illegal content and the person, the guy, just lost everything and so like that's crazy.

Speaker 2:

It'd be like you buy a book, and the book has some content in it that you didn't even know about, and then the police can come in and take every book from your house, or they take the key to your house, because this is all the constructed world now. To lose your accounts and lose all of your content, and to have to engage in this Orwellian doublespeak about what we communicate, all in the name of these platforms having a good, safe environment for advertisers, is insane.

Speaker 1:

Okay, let me try and push back. So you create a new platform, you create a new protocol. You're concerned, rightly obviously, about the insanity and injustice and just grotesqueness of what you're describing. Yep. So you incorporate in the new system privacy, real privacy, security, people can't act, et cetera, et cetera. And you remove from the new system centralized control and ownership. Yep. And now somebody says to you: just like in the initial thing there were unintended consequences that led into a sewer, do you have unintended consequences that lead into a sewer? Okay, go ahead. I mean, I thought I was maybe going to run up against a little bit of a wall there, but I guess not.

Speaker 2:

So there will absolutely be unintended consequences. We build a new system; it is complicated, it is social, there's all this stuff going on, and we don't know how it'll all work. And one of the biggest unintended consequences will be: if we create the ability to have viral but private, engaged social communication around stuff that is encrypted, there's no central authority to control it. There's no place you can go to delete it.

Speaker 2:

That means that people who share content that we massively don't agree with will also be able to use it. You can use Tor to evade censorship in China, and you can use Tor to buy drugs on the dark web and share child pornography. But most people are not sharing child pornography on the dark web; most people are using corporate platforms that have chosen to look the other way, like Telegram. So, like the Proud Boys and those folks, they'll probably use this stuff, but we've got to deal with them in a similar way to how we dealt with them in the past. You know, antifa is famous for the protests in the street, but most of what antifa organizers do is track down and out neo-Nazis and white supremacists to their coworkers. They use social clues and engagement, and join these groups, and then get information about them and record copies. That work will still need to be done.

Speaker 1:

What happens if somebody says to you: okay, it seems fine because the power is decentralized, yeah, but this rich person comes along and has a gazillion dollars and a gazillion means to create a gazillion messages, and therefore he can predominate in your intended-to-be democratic, equitable, sensible project?

Speaker 2:

So that is also true of the current system. They have access to media outlets, they have PR, they have producers, they have the ability to craft messages. And what's worse is, it is also now true with AI that you can make believable messages, videos, photos, text, and you can do this at scale. You don't need the Internet Research Agency having buildings full of people in Moscow posting this stuff and having to learn English and learn enough about American culture. You can just run an AI to do it.

Speaker 1:

So let's not get too into AI, because I'm already thinking I want to have you back on for a session on just AI.

Speaker 2:

Yeah. So the point is, it is vulnerable to those things. But we can build tools that let you try to solve those things between different people. So in these new protocols you have this thing called web of trust. In Facebook's world, Facebook has a metric of how trustworthy you are and how viral your content should be and who should see your content, and they have all these metrics of what kind of person you are, but you don't know what those numbers are.

Speaker 2:

In Nostr and Bluesky and these other emerging protocols, you define your social relationships, and when you see someone new, you know how trustworthy they might be, because there are numbers and scores of mutual friends, and you know this person has a bunch of people I know who are blocking them, versus other people who like them.
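A hypothetical sketch of the web-of-trust idea, not the actual scoring used by any Nostr or Bluesky client: a newcomer's trustworthiness is estimated from how the people you already follow relate to them. All names, data, and the simple +1/-1 weighting are made up for illustration.

```python
# Your local view of the social graph: who follows whom, who blocks whom.
follows = {
    "you": {"ana", "bo", "cy"},
    "ana": {"newcomer", "bo"},
    "bo":  {"cy"},
    "cy":  set(),
}
blocks = {
    "you": set(),
    "ana": set(),
    "bo":  {"newcomer"},
    "cy":  {"newcomer"},
}

def trust_score(viewer: str, target: str) -> int:
    """+1 for each of the viewer's follows who follows the target,
    -1 for each who blocks the target."""
    score = 0
    for friend in follows[viewer]:
        if target in follows.get(friend, set()):
            score += 1
        if target in blocks.get(friend, set()):
            score -= 1
    return score

print(trust_score("you", "newcomer"))  # 1 follow (ana) - 2 blocks (bo, cy) = -1
```

Real clients weight this very differently (mutual follows, followers-of-followers, recency, and so on); the point is only that the score is computed from relationships you can see, rather than being a hidden corporate metric.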

Speaker 2:

You know, and that is visible to you. And so when someone comes in completely from the outside trying to do an influence operation, it's more visible. It's like the way people look at social media influencers who blow up all of a sudden: now when you see that happen, you're like, oh, was that an industry plant? Did they get scouted by some label, record label or producer or something, and then get a PR agency, and produce that? And that happens. But before social media, that was the only thing that happened. The only way to become a pop star was a music label signed you and paid for the recording studio and paid for the PR person and the producers, and produced the CDs or the cassettes and sent them to the radio stations. That artificial influence system has always existed. Now we get a bit of transparency into how it's working, because we can see it, and the opinion of it has gotten much more judgmental.

Speaker 1:

What if somebody says, okay, but if you lean that way, don't you increase rather than decrease the infatuation with, or the absolute worship of, appearance? That is, you're trying to get, what do you call it, thumbs up, clicks, something. In other words, your online persona, I don't know enough to know, your online persona is seeking, perhaps manipulatively, perhaps by lying, whatever, to be appreciated, as compared to being honest and pursuing what you really believe in. So what if somebody says it could lead to that, huh?

Speaker 2:

So for some personality types and some kinds of people, that is, you know, like crack cocaine. But if you look at how most people use social media, the majority of people on social media platforms have never published once. Most people are not part of this pay-attention-to-me creator-focused thing. We see those people because they're the ones producing lots of content, but the vast majority of people have never posted anything. And then there's a smaller minority that like content, and occasionally they do a reply. And then there's an even smaller group of people who share content; they're curators.

Speaker 2:

And then there's a teeny percentage of people, even though anyone can do it, who are doing that creative thing where they're putting themselves out there and organizing around being seen. And if you look at young people today, the new thing on Instagram is to never post any photos on Instagram. All you do is these ephemeral reels, where you can choose the scope of who they go to, and your block of images, which was the central part of Instagram, is empty. So they're opting out of this always-being-performative, always-on-stage mode. People have agency in what they do there, and different cultural norms are going to arise on how to handle it, and it's going to be complicated. But the solution isn't for everybody to do nothing or to not produce.

Speaker 1:

You want people to participate, right? We want to increase the level of participation.

Speaker 2:

We want to diversify the viewpoints and participants in the system.

Speaker 1:

But we want the participation to be healthy. We don't want it to be lies and manipulations to get hand clapping or whatever it is. Exactly. Okay, so back to your project. Tell people what it is that you and the other people you're working with are creating, and why they should pay attention and relate to it.

Speaker 2:

So NOS means "us" in Spanish and in Portuguese, and the idea is that it's our network. It is a social media app. You can install it; there's a web version; you can go on iPhone, Android, everything else. And when you publish there, you control your name, you control your identity, you control your ability to connect audiences, and you control your ability to create stuff. So you can go on there and write short-form things, you can write tweets, but you have all of the kinds of social media apps: you can do a live stream, you can do a Twitter Spaces or Clubhouse type thing where you get all of your contacts together and have a conversation.
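Concretely, on Nostr the name and identity you control are just a cryptographic keypair, and every post is a signed JSON event that any relay can store and any client can verify. Below is a minimal sketch of the event-id computation described in the protocol's NIP-01 spec: the id is the sha256 hash of a canonical JSON serialization. The secp256k1 signature step is omitted, and the placeholder key and values are made up.

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int, tags, content: str) -> str:
    """Compute an event id per Nostr's NIP-01: sha256 of the canonical
    JSON serialization [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),   # no whitespace, per the spec
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# A kind-1 event is a plain short text note, like a tweet.
event_id = nostr_event_id(
    pubkey="a" * 64,          # placeholder hex public key
    created_at=1700000000,
    kind=1,
    tags=[],
    content="hello from an open protocol",
)
print(event_id)  # 64 hex characters; any relay or client derives the same id
```

Because the id is derived from the content and key rather than assigned by a company's database, no platform can take the name away or silently alter a post without the id ceasing to verify.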

Speaker 2:

There's an open, decentralized version of Spotify where, instead of sending money to the labels, you can directly send money to the artists for the songs you're listening to. There's collaborative document editing that's like a wiki, except the versions of the documents are all subjective, from your point of view: you see the edits of yourself and everybody in your spectrum of people you're following, and you collaboratively create documents. You can do Medium-type posts of long-form blog articles, podcasts, all that kind of thing, all of these normal social media apps. And your name and your followers exist on this decentralized network that you control. And so in some ways, you join and it feels very normal, it feels very safe, it feels exactly the same, except in this one you can follow Edward Snowden, and when you're following Edward Snowden, you can see his content and his reports on people. So Snowden can say: I think this person is a spammer, I think this person is committing fraud, I don't like this person, I muted them. And you can be like: well, I trust Edward Snowden, I'm going to use his recommendations to shape my view of this social space. And then there could be someone else, like Jack Dorsey's on there, and you can be like: I hate Jack Dorsey, he screwed up Twitter, he's supporting RFK Jr, whatever, you don't like him. You could block him and not use any of his recommendations.
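A hypothetical sketch of that inversion (the names and data structures here are invented for illustration, not NOS's actual code): moderation becomes a feed-side filter assembled from whichever curators' mute lists you choose to trust, instead of one hidden central blocklist.

```python
# Published mute lists from curators; anyone can read them, nobody is forced to apply them.
mute_lists = {
    "snowden": {"spammer42", "fraudster"},
    "dorsey":  {"critic99"},
}

my_trusted_curators = ["snowden"]  # you chose to apply Snowden's list, not Dorsey's
my_own_mutes = set()               # plus anyone you muted personally

def visible(feed, trusted_curators, own_mutes):
    """Filter a feed using your own mutes plus the mute lists of curators you trust."""
    muted = set(own_mutes)
    for curator in trusted_curators:
        muted |= mute_lists.get(curator, set())
    return [post for post in feed if post["author"] not in muted]

feed = [
    {"author": "alice",     "text": "march at noon"},
    {"author": "spammer42", "text": "buy now!!!"},
    {"author": "critic99",  "text": "hot take"},
]

# Keeps alice and critic99; drops spammer42 via Snowden's mute list.
print(visible(feed, my_trusted_curators, my_own_mutes))
```

A different user applying Dorsey's list instead would see a different feed from the same underlying network, which is the point: the ranking of who to believe is yours, not the platform's.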

Speaker 2:

And so it's about inverting that agency and power and being able to decide what you want. And so when you join, it should feel similar to what you have, but you also don't have to be careful. You know, if you want to talk about suicide, you can just say suicide. If you want to talk about frustration with abortion being made illegal where you live, in your state, you can use the word abortion, which you cannot use on Instagram or TikTok; it's a banned word. That's why you have to say, oh, I'm going to cross state borders to go on a camping trip, as a euphemism for an abortion. Because the rules are constructed by your social community and not some corporation, you get to say what you want. And some people say horrible things, but you get a community process by which you can enforce that. So you get all the social apps that you're used to, but you get sort of self-governance of your own digital spaces.

Speaker 1:

In the existing, you know, in the mass social media. I don't know what to call it, but in social media, as we've criticized it.

Speaker 2:

We usually refer to it as like centralized social media.

Speaker 1:

Sometimes, corporate social media.

Speaker 1:

Okay, so in centralized social media, there is the center, there is this source of control. And if I ask myself, well, what are the motives that operate and contour what's going on? I know, well, shit, I should look at the motives of the center, of the place from which the power emanates. And then I'm going to quickly see that the motives are to maximize profit and maintain control, which are not always exactly the same. And I may even also see, say with X, that the motive is literally to elevate, I don't know, Trump, or you know. In other words, there are various motives emanating from that center.

Speaker 1:

You have this decentralized thing, but I wonder if there are also motives. So, for instance, in the construction of NOS, of the social system that you're producing, are there some motives, that is, instead of maximizing profit for the center, for example, increasing the distribution of and access to good sources, or other motives like that? So that it isn't only that you're not owned; it's also that that leads to substantial differences in what is in fact going on. It might look similar, but it's different. So what about that?

Speaker 2:

In 1997, Subcomandante Marcos wrote a letter to a media activist conference in New York, and in part of that letter he said: we need a media revolution to make revolution possible. And that is kind of what we were doing with Indymedia. That's kind of what we were doing with social media. That resonates for me with what we're doing with NOS and the social network that we're building and these protocols. We are building the tools that give us the social space, that give us the platforms by which we can organize for social change, and it gives us the ability to have autonomous spaces where we can organize ourselves, where we can own our own spaces. Just like Occupy Wall Street created this temporary liberated space where you could have a different model for social engagement.

Speaker 2:

The ultimate goal for these social media protocols and for these technology projects is to give tools, space and technology to activists, to people who are engaging in social change, so that they can do their work more effectively and organize people to change how society works.

Speaker 2:

So the real goal is: how do we change the underlying physics of communication and economy and everything else, in a way that opens up the possibility for, you know, non-reformist reforms, to quote you. Some of that stuff that you talked about, and that we sat down and talked about in Woods Hole many years ago, deeply shaped my thinking about this: that we need to build things that help us continue to build other things. You were talking about policy changes and things like that, but what we're doing in the technology space is building technology that doesn't end us up in a stuck endpoint, but continues to open up space for more radical social change and gives people space to organize. And so the ultimate question behind this is: how do we give ourselves the tools and the institutions and the infrastructure and the culture that continue to drive forward change, where we can learn and build a liberatory space?

Speaker 1:

I'm for it. We're at just about an hour, and I think you have another meeting. So I'm wondering, I mean, I could go on forever with you, and I suggested maybe we could: does it make sense to have you on for another session where we talk about AI and its implications?

Speaker 2:

Absolutely. Yeah, no, I actually don't have a thing for a little bit. I've got a little bit of time.

Speaker 1:

Okay, so let's do that on another occasion, because I have to admit I'm becoming concerned, seriously concerned, about it. Not just because it can destroy everybody, but because, working as it's supposed to work, as it's meant to work, as people will celebrate and cheer it, it's going to reduce humans to something less than human.

Speaker 2:

It's a different kind of angle on the thing.

Speaker 2:

There was a meme I saw the other day which was I wanted AI to clean my house. I wanted AI and robots to clean my house and do the dishes so that I could spend time hanging out with friends and reading poetry and making art. And what I got was AI that makes poetry and art and hangs out with my friends without me, and all I'm doing is the dishes and cleaning the house.

Speaker 1:

Whoever did that is right on the money. I think, regrettably, most people aren't seeing it that way. Anyway, is there something else about NOS that you want to bring up? And don't be bashful about it. I mean, it's not a huge audience, but there's an audience listening that can pay attention and visit and stuff. I don't think you're at the stage. Well, answer me this: are you at the stage where you're really trying to get users, or are you still trying to figure out how to contour everything, how to communicate about it?

Speaker 2:

A bunch of apps use the Nostr protocol, including NOS, and, you know, it has roughly 800,000 people using it today. But that's not NOS, that's...

Speaker 2:

Nostr. But that's the people you talk to on the protocol. And so we're running a program for journalists and creators who are interested in experimenting with new platforms, and we've worked on getting some of these outlets online. And if you go to NOS, N-O-S dot social, you can find information about these programs, where we're helping people get on board and understand it. And we're doing two things with that. One, we're learning from people what their real needs are and what their real desires are and how they want to use it; it's a process of co-design. And two, we're building up a breadth of kinds of content on the platform so that you can find different kinds of things. Everybody has a thousand different interests; we are a multitude inside of ourselves. And so it's about creating a space on this platform where you can connect to people and learn different kinds of things and collaborate. That's a lot of what we are doing right now: figuring out the technology, figuring out the community, trying to get people to grow it.

Speaker 2:

We want people who are interested in experimenting with something new, interested in trying something out, who realize that it will have bugs and warts and not work perfectly. It is incredibly hard to produce the quality of applications that competes with the billions of dollars that Silicon Valley spends, because that stuff is so well designed and so well optimized, and we are so used to things working like that; just making it like that is really hard. So we're building the technology out, and we're looking for people who want to build and design it with us together and help co-create it.

Speaker 2:

Now NOS is on this Nostr protocol, but it's important to know that inside this app you can follow and talk to anyone on Bluesky, anyone on Mastodon, and when Threads, which is Facebook's new Twitter-like app, joins, you'll be able to talk to anyone on it. So all of these networks, Mastodon, Bluesky, Nostr, Arcat, Threads, they're all actually connected in this larger world called the Fediverse, and there's, you know, a YouTube replacement and all these different things you can do. And the most important thing is not so much that you use NOS or Nostr, but that you start looking beyond the corporate platforms, beyond the centralized platforms, to something that is a commons that we create and own ourselves. Because then you're not a consumer of this, you're a co-creator of this. And that, I think, is the important part: people are realizing that they're in some ways joining a movement, and it's a movement to reclaim this space and reclaim ownership of the public sphere.

Speaker 1:

Yeah, the corporate platforms, which spend billions to make everything work perfectly and quickly, et cetera, et cetera. Not perfectly; quickly and perfectly from their point of view, I think atrociously from the public's point of view, but it feels like it works, and it's hard to compete with that. Okay, I get that. But now, where do your funds come from? I mean, a leftist listening to this has to be beginning to think: well, hold on a second, how the fuck are they doing this? All right, he's describing what he wants to do. Anybody can describe what they want to do, but he's also describing that he's doing elements of it. How are they financing it, and how are they going to finance it as it grows into the future?

Speaker 2:

That's a great question, and figuring out how to finance it has been a hard process. We spent time trying to figure out how to get donations and crowdsourcing. At one point I set up a multi-stakeholder cooperative, what's called a platform co-op. And where we're at right now is, ironically enough, very similar to the robber baron era, or the era of princes acting as patrons of the arts, things like that. I have done enough work in Silicon Valley that there are a bunch of Silicon Valley rich people who are upset about the world that they themselves created, and so they give me money.

Speaker 2:

I talk about the mission and the vision and what we're doing, but I do it in such a way that what we're building is open source, to try to not repeat the mistakes of venture capital, not repeat the mistake of building this like we did with Twitter, where we built this amazing collaborative platform that people felt like they owned, but really some venture capitalists owned it. And so the primary funding right now actually comes from a grant, not an investment but a grant, from Jack Dorsey, where he's put, you know, $5 million into my project and another 10 or 15 million into related projects on the same protocol. And it's remarkably hands-off at the moment; he's not telling us what to build, what to say, how to do it. You know, the whole point was he didn't want to repeat the mistakes that he himself made, so he doesn't have ownership over it.

Speaker 1:

But what happens if, let me just ask you, from the fundraising world: what happens if, six months from now, you're growing, right, and you need more money, and you need more money. And then at that point he says to you: hey, nice job, but the next batch of money comes with some strings. And you've developed in a fashion that needs the money to keep functioning and to grow, and so now you're faced with this difficult situation.

Speaker 2:

Yes.

Speaker 2:

And so the urgent need is to find a sustainable economic system by which these organizations and platforms are able to fund themselves from the users and from the operation of the network, and get enough economic remuneration to support it, so that we are accountable to the users, accountable to the people using the system, and not to some outside entity, be it a wealthy donor, a wealthy investor, or advertisers. And so, to do that, we've got a system of micropayments. And I think we need to solve the economic model; otherwise we're just going to create something that's very valuable, and someone with more money is going to come in and buy it out, even if it's all open source. Once you get to the point of "this could shut down, we can't afford to continue working, or we sell it out," it doesn't matter how good the agreements you have are and the legal structures you have.

Speaker 2:

If the options are, we have no financial viability independently, or we take the money, then you lose all the leverage. And so our goal in the next couple of years is to build alternative funding mechanisms, funding mechanisms that are fundamentally bottom-up and participatory. We haven't talked about it very much, but that is what I'm actually doing with Causes.com, which is an advocacy platform similar to MoveOn or Avaaz, where we are going to build out a social movement crowdfunding platform for activists. Because we need tools to fund this stuff that aren't dependent on the capital markets or wealthy patrons or foundations or, necessarily, the government. We can use some of those things; the EU does lots of good funding, for instance, and there's a few other places you can get it. But we need an autonomous system of economic support underlying what we do, or we're not going to be able to challenge the system.

Speaker 1:

Obviously, I agree, and it's a difficult problem, and you're working on it. The fact that you're working on it is the crucial fact. How many people? Just so people can get an idea, you said around 800,000. Now, what do you see five years from now, assuming we haven't blown the world up or incinerated it, et cetera, et cetera, but assuming things are still rolling along: where do you see it going in terms of scale?

Speaker 2:

the goal is to have this replace the existing social media apps and systems.

Speaker 1:

The goal is to have billions of people use this, and my goal in asking the question was to have you say exactly that.

Speaker 2:

But we don't get there right away. You don't build something for everybody on day one. You build it for specific communities that have specific needs, and you build it out and you grow it. But you know, when we were working on Scuttlebutt, it was thousands of people doing it. Now there's hundreds of thousands, and, depending on different parts of the network, Bluesky is about 6 million, Mastodon is about 10 million. So we're getting into the tens, twenties of millions of people. That's still teeny on the global scale of things.

Speaker 1:

Of course.

Speaker 2:

But it's enough to know how some of these things work at scale, and we need to keep building it in a healthy way that gives people tools to self-manage. Because if we don't give people tools to self-manage trust and safety and moderation and behavior within these spaces, if we don't give people the commons governance tools of saying these are what the rules are, here's the transparency, here's what happens if you break the rules, here's who decides, all of those things, then people will run back to the shopping mall of centralized social media.

Speaker 1:

Right. Maybe I would even go a step further, although it may be a step that you're already taking, I don't know. At one point earlier you mentioned that it's a kind of symbiosis between the people who are working on the technology and the structure and the rules, et cetera, which is a growing set of people, and the general culture out there.

Speaker 1:

Now, the general culture out there and the general habits that people have are a bit distorted, precisely because we live in the world we live in, and all the pressures that are on people yield these bad habits and these bad inclinations, like going back to the mall. To address that, in the sense of it's not just providing tools that let you self-manage in a neutral fashion, ignoring that context, but some sort of, not just solving the financial problem but also the, I don't know what to call it, the consciousness problem, the attitude problem, the habits problem. Is that part of the thinking that's going on? Or is it not necessary? Or am I right that it is necessary and you'll have to do it eventually?

Speaker 2:

I think you're right, it is necessary, and I think we don't have a solution for it yet. So, one: people learn a lot as they enter institutions. You learn the norms of the institution, and you find mentors, and you mimic other people's behavior. So some of that is the case. And if you look at organizations that have scaled in this sort of voluntary way, there's the digital ones, like the people who maintain Wikipedia or the people who maintain Debian Linux. These are many thousands of people who are, entirely cooperatively, democratically, without central control, managing vast economic resources. And so we have models from Debian and from Wikipedia and a few other projects and a bunch of open source projects on how to do this with the technology skills. But we also have models on how to run a network and self-govern that are older.

Speaker 2:

Alcoholics Anonymous is this oddly Christian anarchist network, in that there are no formal leaders and no formal structure. There is no president of Alcoholics Anonymous. All tasks in Alcoholics Anonymous are rotated on a regular basis; you have all the different roles, and you're supposed to rotate them. You know, it's a balanced job complex of mutual aid work. But people don't do those things unless they're taught, and they find someone, and someone mentors them and teaches them, and they learn how to do these things. And we will need to figure out how to communicate that and how to teach people the ways to do it.

Speaker 2:

But one advantage we have is that we have these models of these organizations that we can copy and learn from. We can learn from Alcoholics Anonymous, we can learn from Wikipedia, we can learn from Debian Linux. We can learn from all these other projects that have massively scaled collaborative work to build this infrastructure, and realize that most people using Linux don't have to engage in the governance of it. They don't have to engage in the maintenance of it. Almost everybody who uses Wikipedia doesn't have to engage in the politics and the social dynamics of its creation.

Speaker 2:

There are ways in which you can contribute, but you don't have to.

Speaker 2:

You don't have to do very much. So what I suspect will happen is we will have a small subset of people who voluntarily choose to engage in running the infrastructure, another group that's developing norms, another group building out algorithms and AI for the system, and another group of trust and safety labelers who help you find which content you should see and give you tools to filter it. So I suspect what we'll have is different communities of people that operate with a fair degree of transparency.

Speaker 2:

You can see what they're doing, and they're actively advocating and recruiting for more people to join, because they want more people to participate in that work. And in an open, permissionless network, multiple forks can exist, so if you have a conflict, you can fork it: you can split the community and have two with different sets of norms. You actually see this on Reddit. Did you know that there is a meth users subreddit, a community on Reddit just for people talking about how to most effectively use methamphetamines?

Speaker 1:

No, obviously I didn't know that.

Speaker 2:

But here's the crazy part. There was a group of people who decided that the meth users subreddit was too politically correct and inclusive, so they started an anti-woke meth users subreddit. Right? God.

Speaker 2:

And to me, that's brilliant. To me, that's: okay, two communities. Everybody was in this one community, some people decided they didn't want to work this way, and they were able to split. Some people can say racist, homophobic shit while they talk about why and how they're using meth. And the other group, which is actually much larger, says no, actually, we don't want the bigoted stuff while we talk about meth. Just talk about meth, not right-wing politics.

Speaker 1:

I have to admit I'm not in love with that. But the reason isn't just that one side's right and one side's wrong, I mean, they both have problems. It's because of one of the features of social media: the siloing of identities, so that your contacts all keep pushing and pulling you further and further in a particular direction and you never engage with the other directions. The discussion just doesn't happen. And this siloing happens all over the place. It's happening right now, right and left, in the United States.

Speaker 1:

Well, call it what you want: liberal and Republican, fascist and mainstream. On both sides there's no communication, and that, arguably, is the biggest problem. It isn't just that the views are fucked up, though they are. It's also that neither side understands that, in the eyes of the other side, it's just as fucked up as they think the other side is. It's incredible when you actually look at it. The Trumpers think the liberals are guilty of all the things, virtually all of them, that the liberals think the Trumpers are guilty of. It's just incredible.

Speaker 1:

It's like a psychiatric projection thing, I don't know. But in any event, it's a big project you're taking on. There was a point, I can't even remember the dates, when we decided, we being Z, mostly me honestly, that we were going to try and create an alternative to Twitter and Facebook, with some help from programmers. And we made something, and it had some potential. It's nowhere near as smart as what you're doing, but it had some potential. But there was the existing culture, and the simple fact that you could join ours and talk to N people, or you could join Facebook and talk to a million times N people, and you couldn't compete with that. But it seems like you're on the road to doing what's almost impossible: competing with something that has barriers to entry so high they can't really be climbed. You're simply going around them, just ignoring the barriers.

Speaker 2:

Yeah, and for years I was told there's no way we could possibly do this, that there's no way you'll convince people to join these other things.

Speaker 2:

And there were those of us off having these meetings at the Internet Archive, collaborating and building these things, happily talking to our dozens or hundreds of users. And we were warning people. We were saying: look, Facebook, now Meta, lies. They actively went in and intentionally destroyed the media and news industry. They said you must do videos, and they lied about all the video viewership numbers to get all these media organizations to switch their content to video, because it was good for Facebook's advertising model.

Speaker 2:

And no one left them, no one abandoned them over that.

Speaker 2:

And then, oddly enough, Elon Musk came in and took over Twitter. It wasn't even his politics, because his politics are terrible; it was actually just his breaking everything for everybody else and tearing up the community. And Twitter, now X, is still a viable platform. It's amazingly hard to break these things once they get going. But once that happened, enough people realized that this is a major problem, that we can build an alternative, and that we have a chance to do it. It still might fail. Back when a bunch of us switched from working on Indymedia to working on Web 2.0 and started helping build all these companies, they did democratize the media, but we didn't own it. So we have another chance to do that, and I think we may succeed. And I am sure that in a decade we can sit down again and talk about all the massive mistakes and structural problems of the things we're doing today.

Speaker 2:

I remember I went to a protest about Mumia Abu-Jamal back in 1996 or '97, something like that. At the end of the protest in Philadelphia, with tens of thousands of people, I got home and looked at the newspapers, looked for news coverage, and there was absolutely nothing. The only place you could hear anything about it was Pacifica, a few community radio stations, a few public access television stations, and that was it. We don't live in that world anymore. We have changed that dynamic, and we have new problems today.

Speaker 2:

But you know, you created a lot of the alternative media, the alternative news, the underground press, the whole world of media infrastructure for the movements of the 1960s and '70s. And it was so much harder to do that, to get that attention, to get those magazines printed, and how much more expensive it was, to get them distributed to different stores. And then you had to go into the right record store or food co-op to see it on the magazine rack and read that article.

Speaker 2:

And so we have sort of broken down that edifice and created a new thing. It's got new problems, but we have much more agency, more real voice, today than we did in the past. And we're going to try and change the underlying rules of the system again, to give us new technology, and I am absolutely certain that we will create new problems that we will regret and need to deal with in the future. But my hope is that it's a non-reformist reform: it gets us further down the path, and it creates space in which we're able to build the next thing and to build social movements on top of it.

Speaker 1:

I commend the whole effort, obviously, and we will try to help as we can, and I hope listeners will pay attention and lend a hand also. We'll see. But unless there's something in particular that I haven't asked about that you want to discuss, we should probably call it at this point.

Speaker 2:

Yeah, it's been a long conversation and an absolute pleasure. And I would just say, to the RevolutionZ listeners and to you, Michael: the work that you did, that the Z Magazine crew did, that the Z summer school did, and everything else, has been absolutely transformational for me in understanding the work in my life and the politics of it. I'm eternally grateful for that, because it helped me make sense of what's going on in the world. It helped me understand my role and other people's roles in social change. And you sent me an email in 1997 or something, saying you'd seen some article about some tech project I did and maybe we should get together, and that was critical to kicking off my political education. So thank you.

Speaker 1:

Thank you, I appreciate you saying that. I hope it's true. And hopefully I can be helpful in the future too. We'll see. Absolutely. But in any case, let me just tell you a little story.

Speaker 1:

You know Lydia a little bit. I may have trouble, I think I'm going to have trouble doing this, but in any event: she used to do theater, plays, all sorts of stuff like that, all political. And every once in a while she would come home on a Friday or Saturday night, having done the show she directed and performed in, and she'd be upset. She'd say, you know, there were 15 people, or there were 10 people. Some nights there would be 100 or 200. And what I would say is: on the night there's 200, or 20,000, the impact on the individuals might be such that it matters, but maybe it didn't do anything. And on the night when there were 10 or 15, maybe Rosa Parks is sitting there, and it makes a gigantic difference. And what can I say? You're my Rosa Parks. So thank you.

Speaker 2:

I'm just one of many activists who's been working on this stuff for a long time. Some of the things I've worked on have had a big impact, and I feel incredibly grateful for that. I feel very grateful for all the generations of activists, including you and Lydia, who trained the next generation, who communicated it, who brought people together to make sense of the world, of what we're doing and how it works. We never know who our work will touch, who will use our stuff, and how it will be used. I didn't know, when we started working on Twitter and on podcasting, that it would have that big an impact on changing the world.

Speaker 2:

I am not the people who organized Black Lives Matter on the ground. I showed up at the protests, but I wasn't the organizer. I wasn't the people behind the Arab Spring. I wasn't involved in Me Too. But I was able to help create the system that they were able to use. We built the system that allows hashtags to exist, and those hashtags ended up being a useful tool for social organizing in all of these movements. So I think that's something we all need to remember: you don't know what's happening until later. You do this work and you hope that the people you talk to, the people who use your stuff, the people who read your stuff, will find it useful and use it in their own organizing, and you pass it down, because this project of social change is multi-generational and it has to win. We have no other option.

Speaker 2:

I'm incredibly optimistic, and I've seen that optimism from you through all of those things, even when things go to shit. We keep on working, and that's the only way we move forward. So thank you.

Speaker 1:

Okay. All that said, this is Mike Albert signing off until next time for RevolutionZ.

Evolving Social Media and Cultural Impact
Reclaiming the Open Internet
Building Better Social Media Protocols
Impact of Social Media Platforms
Decentralized Social Media and Technology
Building Sustainable Funding for Independent Platforms
Building Alternative Social Media Platforms
Impact of Hashtags on Social Justice