Artificial Intelligence Podcast: ChatGPT, Claude, Midjourney and all other AI Tools

Using AI to Fight AI with Alex Fink

May 20, 2024, Episode 309
Jonathan Green: Artificial Intelligence Expert and Author of ChatGPT Profits

Welcome to the Artificial Intelligence Podcast! This podcast is your guide to using artificial intelligence to open new revenue streams and make money while you sleep, without the hassle of a traditional job. Our host, Jonathan Green, is a best-selling author who shares insights and strategies from a tropical island in the South Pacific.

Today's guest, Alex Fink, is a pioneer in the battle against the inundation of low-quality content and advertisements online. With a background in leveraging artificial intelligence for good, Alex has developed a platform that helps users sift through the noise to find the content that truly matters to them. His work is centered around the idea that the internet should serve us, not distract us, with endless, irrelevant information.

In this episode, Alex discusses the challenges of modern content consumption, where advertisements and clickbait have overwhelmed our digital spaces, making it difficult to find quality information. He explains how his platform uses AI to filter content, allowing users to access articles that match their interests and values without the clutter of irrelevant ads and articles. This approach not only enhances the user experience but also encourages a healthier online ecosystem where quality content is rewarded.

Notable Quotes:

  • "It doesn't matter if content is created by AI or a human, as long as it's good. The real issue is that AI makes it easier to create low-quality content en masse, overshadowing genuine, insightful work." - [Alex Fink]
  • "Ad blockers do a great job at filtering ads, but we're focusing on the quality of the underlying content. Ideally, ads should pay differently based on the content quality, encouraging better content production." - [Alex Fink]
  • "I view AI like alcohol; it amplifies your tendencies. If you aim to produce high-quality content, AI can help you do that more efficiently. But if your goal is to churn out volume without regard for quality, AI will enable that too." - [Jonathan Green]
  • "The future of content isn't just about creating more; it's about using AI to filter and find the content we truly want. This not only saves time but enriches our consumption with quality and relevance." - [Jonathan Green]

Connect with Alex Fink

Website: otherweb.com

  • Here you can explore Alex's platform, download apps for Android and iOS, and learn more about how AI can improve your online content experience.

Connect with Jonathan Green

Transcript

Jonathan Green 2024: [00:00:00] Using artificial intelligence to fight against artificial intelligence, with today's special guest, Alex Fink.

Today's episode is brought to you by the bestseller ChatGPT Profits. This book is the missing instruction manual to get you up and running with ChatGPT in a matter of minutes. As a special gift, you can get it absolutely free at artificialintelligencepod.com/gift, or at the link right below this episode.

Make sure to grab your copy before it goes back up to full price.

Are you tired of dealing with your boss? Do you feel underpaid and underappreciated? If you wanna make it online, fire your boss and start living your retirement dreams now, then you've come to the right place. Welcome to the Artificial Intelligence Podcast. You will learn how to use artificial intelligence to open new revenue streams and make money while you sleep.

Presented live from a tropical island in the South Pacific by bestselling author Jonathan Green. Now here's your host.

A topic near and dear to my heart is how cluttered everything gets between my news feeds and advertisements. Everything is surrounded by recommendations for things that I don't want. I'm not interested in them, and I've found them distracting for years.

I didn't know that Facebook ran ads 'cause I had such a good ad blocker running that when people would talk about them, I'd say I've never seen any. I blocked them out because otherwise I wouldn't use the platform. It's the reason I stopped using MySpace; it became more ads than profile. We're coming back to that era again. If you look at Google search results, the page has 50 results, of which maybe two are actual search results.

And it's known that even if it's not a paid search result, large companies that spend money on advertisements somehow also get pretty good organic search results too. So their thumb is absolutely on the scale; the original intention is gone. And what's interesting about your project, from what you talk about, is helping people to actually get the information they want and block out the information they don't want.

So can you tell me how you first got interested in that idea, and then how you're using AI to implement it?

Alex Fink: So actually, one of the things that you said is the key here, which is that the ads themselves can be blocked. But what we can't quite fix, or at least couldn't fix until a couple of years ago, is the way in which the underlying content has been changed to accommodate ads, right?

And that's the part that we've been trying to fight all along, which is: okay, you want to monetize using ads, do that, but why are you writing a different article? That probably means it's not done for us, the readers. It's done just to make monetization better. So I got into it, in a sense, because I've been an information junkie. I've been consuming a bunch of stuff for several decades, and at some point I started noticing everything is getting worse, and I didn't know why.

And so I started looking into it, and the hypothesis I came up with was that if everything is monetized using ads, and ads pay per click or per view, then somehow all content evolves to become clickbait over time, 'cause that's the single selective pressure. And so it almost doesn't matter if people have good intentions or bad intentions; the content itself will drift that way.

And maybe the people who are trying to fight the tide are just going to get left behind. So we started off by developing a bunch of filters that just tried to measure things. Try to figure out: is this a clickbait headline or not a clickbait headline? Is this content offensive or not offensive? Are these links affiliate links, or are they actual links to something that is a proper reference to what you're writing?

And then we accumulated about 20 of these. We created a nutrition label for every article based on them. And at some point, the users of this nutrition label product asked us, can you make your own platform? And that's how Otherweb was born.
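The pipeline Alex describes, roughly twenty independent quality checks rolled up into a per-article "nutrition label", could be sketched like this. Every function and field name below is an illustrative guess, not Otherweb's actual code:

```python
# Minimal sketch of a per-article "nutrition label": each filter is an
# independent scorer returning a value in [0, 1]; the label is just the
# collected scores. Names and heuristics here are illustrative only.

def clickbait_score(headline: str) -> float:
    """Crude proxy: clickbait headlines lean on curiosity-gap phrases."""
    triggers = ["you won't believe", "stop what you're doing",
                "this one trick", "what happened next"]
    text = headline.lower()
    hits = sum(phrase in text for phrase in triggers)
    return min(1.0, hits / 2)

def affiliate_link_score(links: list[str]) -> float:
    """Fraction of links that look like affiliate redirects."""
    if not links:
        return 0.0
    affiliate_markers = ("tag=", "ref=", "affid=")
    flagged = sum(any(m in url for m in affiliate_markers) for url in links)
    return flagged / len(links)

def nutrition_label(article: dict) -> dict:
    """Run every filter and return the label as a dict of scores."""
    filters = {
        "clickbait": lambda a: clickbait_score(a["headline"]),
        "affiliate_links": lambda a: affiliate_link_score(a["links"]),
        # ...in practice, ~20 such filters (offensiveness, sourcing, etc.)
    }
    return {name: fn(article) for name, fn in filters.items()}

article = {
    "headline": "Stop What You're Doing and Watch the Elephant Play with Bubbles",
    "links": ["https://example.com/story", "https://shop.example.com/?tag=aff123"],
}
print(nutrition_label(article))
```

The key design idea is that each filter only measures; nothing is blocked at this stage, so the same label can later drive very different user-configured thresholds.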

Jonathan Green 2024: So it's interesting to think about how content is driven, even the fact that we call it content now, right? People say, I'm a content creator. We used to say, I write news articles, or I create videos, but now everything is content. The biggest example of this I think of is when articles started having top-20 lists, and you'd have to click next page to see all 20.

Yeah. And the reason they do this is to get 20 times the page views. So that's worth 20 times more than a long-form article that you read on a single page, because each time the ads reload, they get paid a little bit more. And when you see this content race to the bottom, what I'm really interested in is exactly how we can extract ourselves from this. 'Cause even with ad blockers, it's a constant war between ad blockers and sites. And even sites like YouTube now say, oh, if you block the ads, then we'll ban you from our site. Like, you're not allowed to use the site. And you get those warnings all the time, and sometimes they're really tricky: oh, if you really wanna support us, let our ads run.

And then some of those sites actually have an infected ad, so the ad blocker is actually blocking a virus and other malicious things. I've even seen advertisements that will use your computer's processing power to mine for crypto coins. So they're doing all sorts of different strange things, and it's this constant race between ad-blocking and ad-creation technology.

So it sounds like you're taking a different approach, which is just looking at the content as opposed to looking at the code. So you're actually looking at the human side. Is that correct?

Alex Fink: And actually, if we run forward to what I'd like to do 10 years from now, I would like to have a world in which ads pay a different amount based on the quality of the underlying content.

So the same ad, same click, same everything, assuming they are normal ads and not the kind that mine Bitcoin on your computer. But if they are normal ads, they shouldn't pay the same amount on TMZ that they do on CNN. Not saying CNN is perfect, right? But it's better than TMZ. So from my perspective, there should be some differential to incentivize TMZ to become a little more like CNN.

Right now, everybody is incentivized to become those kinds of listicles that you mentioned, with the 20 times you click next, and there's a bunch of fake articles underneath that look like real articles but are ads too, and things like that, right? So everything is evolving in that way, and it is a race to the bottom, by the way.

You mentioned that these listicles, the top-20 ones, were the breaking point for you. The breaking point for me was autoplaying videos, right when I just noticed that first they appeared on one website, then they started appearing on others, and now they're everywhere, because you can't afford not to have them.

Essentially, if your competitors have them, their engagement is going to be better. So nobody wants it. It's bad for the users and for the outlets. It's basically just an arms race, but you have to do it because otherwise you will be beaten. Yeah, our approach has been not to focus on the ads. Ad blockers do a great job at that.

One day I would want to have an ad blocker that blocks ads differentially based on what the page is, but let's leave that for later. For now, I want to focus on: what is the quality of the underlying content? Does it match what you wanted to consume? So if you wanted to consume an entertaining article, let's say cute animals doing stuff, then an article titled Stop What You're Doing and Watch the Elephant Play with Bubbles is just fine, right? But if you are going to CNN and you want to read the news, and CNN publishes an article titled Stop What You're Doing and Watch the Elephant Play with Bubbles, which they did, then something's off, right?

This article is in their news section, and you know it's there because people click on it.

Jonathan Green 2024: Yeah, I'm thinking about a lot of the content people post, like on my LinkedIn feed; that's the social network I'm most active on. And in so much of the content, there's a huge spread in quality.

There's some really good content, high-quality, really well-written, thought-out articles, but they're buried, especially now, below a lot of AI-generated low-quality content. This is why I most wanted to talk to you about it. And I don't mind, this is me: I'll read a book that I'm pretty sure was written by AI, if it's good.

As an author myself, my line is different than other people's. I don't care if someone prompted ChatGPT, but if the end result of using the machine is that the book is really good, I go, I like the story. That's what matters to me. I feel the same way about content.

It's okay if you use AI-generated images if they look cool. But so much, and this is what I was thinking about: there's a huge spectrum in the quality of images, and some of the images now look worse than what we saw 20 and 30 years ago with clip art. People are generating really terrible images and going, it's what I made, I'm posting it.

Today I was working on a thumbnail for a YouTube video, an AI-generated image of me. I probably did 40 or 50 generations of different versions of the image to get down to six finalists, and then I picked three for the three thumbnail variations, right? To think of just using the first three images generated and going, these are the first three images, that's what I'm using. I don't think that way, but a lot of people do. So can your tool, your approach, Otherweb, the way your AI works, can it block social media content as well, based on quality?

Alex Fink: So right now we're only applying it to things that are written in the form of an article. So essentially news, commentary, Wikipedia, research studies, anything like that.

We don't scrape social media directly. We can tune the models and apply them to that as well, but we will have to figure out what the standard is, because on social media, people post with different intents. Maybe one post is intended to inform you; another post is intended just to entertain you, or to be sarcastic.

You don't want to apply the same filtering criteria to both. [00:10:00] It wouldn't make sense, right? The sarcastic post might be good for what it is. It would make for a terrible thing if the intent was to inform you. But I want to go back to one thing that you mentioned, which is that you don't mind if AI did it, if it's good. Our approach is the same. I've actually been talking to a lot of journalists and people on cable news lately, et cetera, and they always ask this question: can you tell if something has been done by AI or not? And my immediate response, which I don't always voice, but it's the thing I hear in my head, is: why do you care?

Is it good? That's what should matter, right? If AI created something good, then I'd rather read that than something bad a human created. And that's my approach to things. But the reality is AI makes it easier to create bad stuff. Yes, some people like you might use it to create a better book, but for the most part, if a journalist is substituting a large percentage of their work with AI, they're not doing it to create the best work they can. They're doing it to create more output. That's what it's mostly used for. And so instead of 10 clickbait articles per day, it will be 200 clickbait articles per day. It's not going to be a great work of investigative journalism where they use the AI just to edit the result.

Jonathan Green 2024: Yeah, I think of AI like alcohol. Alcohol brings out whatever your real personality is. If you have a bad personality, alcohol makes it louder. It just makes it a bigger version of who you are. And the same thing with AI: for people who wanna make low-quality content, or have that approach, it just gets bigger.

I always think about my experience. A lot of people are surprised, but I don't like the internet. If it wasn't my job, I would never be online. I just wouldn't. I'd be living my life in person, living out loud, and maybe that's because it is my job. Whatever you do at work, you don't wanna do at home.

One of the things I think about is, what if you could just block things out? I remember when going on Facebook was exciting. Remember those times, like 20 years ago? You're like, oh, I can't wait to see, and maybe this is because it was my young single days, if any of the cute girls I was flirting with had posted a new picture, or something interesting was happening, or what people were doing tonight. Everyone posted about, I went on vacation, here's all the pictures. Or, we went rock climbing, here's everything, and people would upload like a hundred pictures. And now my feed is not like that. You never see things that are interesting. Facebook's the worst for me, but it's the same thing for Twitter. On Twitter, I'm looking for specific things. I can never find what I'm looking for, and maybe it's 'cause I don't know how to search it, right?

But there's a lot of content I want to find, 'cause other people find the tweets I want, which are about, oh, I figured this out with AI, I had this development. I'm looking for up-to-date news and what people are working on, and it is really hard to find. And the positive energy is what I think about the most.

Social media seems to have made everyone less social. There's been this shift, and I just wonder... What I like about your concept, and I know it's still in an implementation phase, is the idea that you could just block out content that was AI-generated or that's not interesting. And maybe that's the thing: my line is, oh, I'd just love to see stuff that's only positive or happy or interesting, and that's a different line for everyone.

But the idea with an AI is that the more personal you can get, you can start to say... I use a news tool that aggregates like a hundred AI news feeds for AI news articles, and I tell it, I don't like this article, I like that article. And a lot of the ones that are wrong still slip through, but less and less.

Yeah. Because there's so much content, that's a volume issue. And so I spend one to three hours a day just going through news articles. But the thought of doing it more efficiently, I'm always thinking about. Yeah, the real value of AI is that you can do good things faster, rather than that you can do more bad things.

Alex Fink: Yeah. And by the way, we take a very similar approach to the one you described. We try to scrape, I think right now it's close to 30,000 articles per day, and then we try to pare it down until you get the best ones, according to your definition of best. One part of that is using things that the AI models learned about the article to filter with: that's the nutrition label and the different filters that you can configure. The other is creating a bullet-point summary for every article, so you can skim articles instead of reading the whole thing, and then decide which one you actually want to read. But even beyond that, you mentioned that you like things that are more positive and happy.

True, but some people like the opposite. And so one of the things that we decided to do is classify every article according to the emotions it's likely to evoke in the reader, and let you configure which emotions you'd like to receive. So if you like the hopeful, the positive, the educational, you can push those sliders to the right, and push the depressing and infuriating things to the left.

But maybe somebody will do the opposite. I'm not going to enforce happy content on you, right? If you like to be outraged every day, then there's content out there for you.
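The slider mechanism Alex describes, classify each article by the emotions it is likely to evoke, then let the reader weight those emotions, might be sketched like this. The emotion names, slider range, and scoring rule are illustrative assumptions, not the product's real model:

```python
# Sketch of slider-based emotion filtering: each article carries
# model-predicted emotion probabilities; the user's slider settings
# weight them into a single relevance score. Illustrative only.

def score_article(emotions: dict[str, float],
                  sliders: dict[str, float]) -> float:
    """Dot product of the predicted emotion mix and user slider weights.
    Sliders run from -1 (pushed left / suppress) to +1 (pushed right / boost)."""
    return sum(prob * sliders.get(emotion, 0.0)
               for emotion, prob in emotions.items())

def rank_feed(articles: list[dict], sliders: dict[str, float],
              keep: int = 2) -> list[str]:
    """Order articles by slider-weighted score and keep the top few."""
    ranked = sorted(articles,
                    key=lambda a: score_article(a["emotions"], sliders),
                    reverse=True)
    return [a["title"] for a in ranked[:keep]]

feed = [
    {"title": "Breakthrough in solar storage",
     "emotions": {"hopeful": 0.8, "educational": 0.6, "infuriating": 0.0}},
    {"title": "Everything is terrible, again",
     "emotions": {"hopeful": 0.0, "educational": 0.1, "infuriating": 0.9}},
    {"title": "How transformers actually work",
     "emotions": {"hopeful": 0.2, "educational": 0.9, "infuriating": 0.0}},
]

# A reader who pushes hopeful/educational right and infuriating left:
sliders = {"hopeful": 1.0, "educational": 0.8, "infuriating": -1.0}
print(rank_feed(feed, sliders))
```

Note the symmetry Alex insists on: a reader who wants outrage simply flips the signs of the same sliders, and the identical ranking code serves them too.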

Jonathan Green 2024: Oh, I know. There's certainly plenty of content for that. Now, AIs are getting smarter, and they keep releasing. This is why I have to read the news: there's a new update or new version or something every single day.

Every day someone asks about a tool that I've never heard of before, because we're reaching that point of just constant release of content. Are you finding that you're able to leverage these advances to make your algorithm smarter, to get people there quicker? Because the hard part is that I don't know what I like until I see it.

That's the standard a lot of people use: I know if I like it when I see it. Which is really hard for a computer to interpret, and is why artificial intelligence is that critical component. That's what's interesting to me, because something like this happens a lot when someone's hiring, say, an artist.

They go, I want a logo. What do you want it to look like? I don't know, I'll like it when I see it. And every graphic designer hates hearing that, but that's how a lot of people view content. They can't describe what they like and don't like. Are you finding that? Do you have people go through a process where they look at articles and go, I like this, I don't like that, I like this, I don't like that? Does that help to train the AI, and can you then use that knowledge for the next person so they get through it a little faster?

Alex Fink: That was exactly the idea of why we ended up with a Tinder-like interface, right? Because it essentially, I don't wanna say forces, but it incentivizes the user to give us feedback every single time, as opposed to only when they feel like clicking the thumbs up or thumbs down.

And so we learn faster what people like. And it often doesn't match what they would have told us if we just asked them point-blank, right? People sometimes don't know that a particular topic actually pisses them off, right? They've interacted with it a lot in the past, but when you ask them, swipe right or swipe left, they end up swiping left, and suddenly they are happier when it doesn't appear on their feed anymore.
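At its simplest, learning from that constant swipe feedback could be sketched as a running like-rate per topic. This is a toy illustration, not the actual model, and every name in it is hypothetical:

```python
# Toy sketch of swipe-feedback learning: keep a running like-rate per
# topic and use it to decide what to surface. Illustrative only.
from collections import defaultdict

class SwipeLearner:
    def __init__(self) -> None:
        self.likes = defaultdict(int)  # right-swipes per topic
        self.seen = defaultdict(int)   # total swipes per topic

    def record(self, topic: str, swiped_right: bool) -> None:
        """Every card shown yields a signal, unlike optional thumbs up/down."""
        self.seen[topic] += 1
        self.likes[topic] += int(swiped_right)

    def like_rate(self, topic: str) -> float:
        """Laplace-smoothed like-rate, so unseen topics start neutral at 0.5."""
        return (self.likes[topic] + 1) / (self.seen[topic] + 2)

learner = SwipeLearner()
for topic, liked in [("politics", False), ("politics", False),
                     ("science", True), ("science", True),
                     ("politics", False), ("science", False)]:
    learner.record(topic, liked)

print(round(learner.like_rate("science"), 2))   # learned preference
print(round(learner.like_rate("politics"), 2))  # learned aversion
```

Because every card produces a signal, the estimate converges after a handful of swipes, which is the point Alex makes about learning faster than with optional ratings. It also keeps the learned preferences inspectable, which matters for the configuration screen he describes next.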

So that's why we chose that interface. I think it works pretty well. But even beyond that, I still think that, okay, we've inferred something about a person's preferences. TikTok does that too, right? But I think it's fair to show a person everything we learned about them and let them change it if they disagree, right?

So we learned that you like a particular emotion, but maybe you disagree and you actually want less of it, and you were just swiping wrong. Or maybe you were swiping based on a gut reaction, an immediate reaction, but now that you're looking at the configuration, you want to optimize for the long term. And in the long term, you would like the ratio between the emotions you get to be this much.

Jonathan Green 2024: Yeah. Also, sometimes I'm working on different projects, so sometimes I'm just doing a certain amount of research for one of my side businesses, and I'm in a completely different mode, as opposed to when I'm in entertainment mode or relaxation mode.

We watch different content, we view different things. This is why YouTube sometimes gives you the wrong recommendations. No, I'm not at work; I don't wanna watch AI videos all day long. I watch a ton when I'm in research mode. When that's done, I wanna watch entertainment-type stuff, a different type of content. And sometimes it can't tell that delineation. So there's this issue people have, which is separating work from play. But also, when people create all of this content, it then gets filtered through your app, so now people are getting the content without seeing the ads.

Is that a problem for the content creators? Are they saying, hey, you're showing my content without the ads, and now I can't make money from it? How does that work?

Alex Fink: It hasn't been a problem so far, because we follow fair use. We don't show the entire content. We create the summary. We show the summary.

If a person actually wants to read the whole article, they click through to the original, and then they read it and see the ads if they are there. Now, one thing we did do is we eliminated essentially anybody with a paywall. We just don't collect content from them. We don't summarize it. We don't show it to you, because we think, first of all, if they have a paywall, it means they don't wanna share.

We respect that, but also from the user's perspective, if you click on a link after reading the summary and now you see the paywall instead of the article, that's a terrible experience. They came to an aggregator to avoid doing that, and so why would you give them links to something that requires them to do that?

I know that Google News and many others do that. We thought it's just weird. And so if any outlet has a paywall, we don't link to them. 

Jonathan Green 2024: I'm always suspicious of that, when a search result leads me to something that I have to pay to see. And I always wonder, I think it's Forbes who does it the most?

It always makes me crazy. I'm like, why are they always appearing in the [00:20:00] search results when they don't want to be in the search results? I also think, what kind of dummy would pay to subscribe to a blog? That just blows my mind as well. Sorry, Forbes, but the content is not that much better than any other site out there.

They just happen to be the one that does it, and I wonder why. I'm sure it's a financial thing. I'm sure there must be some type of arrangement for Google to constantly rank their articles, when their articles should be unsearchable, right? The spiders shouldn't be able to read the article if I can't read the article. So I know there's something, I dunno, nefarious going on, and I wish it wasn't. So I like that idea. It's a very interesting approach with fair use as well, because more and more, Google's moving towards, oh, we're gonna take the best part of your article and show it in the search results, so people never have to go to the actual article.

Yep. And I think the summary thing is very interesting, because I use summaries all the time. I use an AI to summarize a YouTube video before I watch it. I even teach people, when you're watching my videos, 'cause some of my videos are long, that you can jump to the section that matters to you. If I'm watching someone's Here's AI News of the Week and there's 30 articles, I probably know about 17 of them, so I don't wanna watch those parts of the video, but I will jump to the parts that are relevant to me.

So I don't see summaries as the same thing as where they pull out the answer to your question from the article and don't show the actual article. People are rightfully upset by that, because then no one has any reason to click through. Yeah, that's the trick they're pulling.

It's very interesting. What are you finding? What percentage of articles do people actually click through to? Are people still reading as many articles as they were before, or are they just reading summaries and jumping off?

Alex Fink: So I don't know what they were reading before, but I can tell you that the click-through rate under the summary is always going to depend on the length of the summary.

If the summaries are shorter, then people will read a larger number of summaries before choosing to read one long article, right? So in our case, we're trying to keep those summaries to three or four bullet points. Therefore the click-through rate per summary is going to be small, but the number of articles people click through is still relatively large.

It's just that they skim through a lot before they pick one they want to commit to. And that's the whole point, right? You don't want to read 2,000 words to decide whether you wanted to read this or not. You want to decide in advance, right? It's just like if you are shopping for food: you don't want to eat everything on the shelf to figure out if it's good for you. You want to look at the nutrition label, maybe look at the description and the ingredients list, and decide from that. So the summary kind of acts as that. It's almost like an ingredients list, in a sense.

Jonathan Green 2024: Yeah, that makes a lot of sense to me, because most of the time, for example, I don't watch entire movie trailers.

I'll watch just enough to decide, do I wanna watch it or not? 'Cause now movie trailers show you the whole movie, or they'll show too much. I go, no, just let me decide. I just wanna know if I want it. The same thing when I'm shopping for books. I'm a voracious reader. I'm just trying to find out what genre the book is, because there are certain genres I like and certain ones I don't.

So I'm just trying to get that basic piece of information, and so often that's my decision-making process. Like, I can't tell from your cover what genre you are; I can't tell from the description. I don't wanna find out the mystery. I just wanna know enough: am I gonna like it?

And one thing I've always found interesting is that in the romance genre, you have to say in the description if it has a happily-ever-after or not. Because if you don't, and this is something I learned 'cause I worked in this space, if you have a bad ending and you don't warn people, they get really mad. They do not like that. So it's the only genre I know of where you have to let people know that they don't end up together. If you're not gonna have a happily-ever-after, they call it an HEA, you have to put that in the description.

'Cause people want that warning. I don't even know how it ends; I just wanna know whether the bad guy's gonna win or lose. People in different genres want that specific thing. Like, I read science fiction, but I hate first contact. I wanna be in the book after that. Take me a hundred years past it, where the rebellion's now happening.

You know what I mean? And that's a very specific thing, 'cause most people hear that and go, it sounds the same to me. I go, no, it's completely different. And when I think of the future, people are surprised I have such a positive view of the future, because I think that AI, when used correctly, like I use it to bring me the content that I want.

Exactly like you do, I use a process to find and aggregate articles, then I go to those, and I end up reading the ones that I wanna read and watching the videos that I wanna watch. I actually consume more content, because I sift out more of the stuff I don't wanna read. For example, I don't wanna read an article about funding.

That's not what I'm looking for. I'm looking for articles about specific types of developments, and that allows me to go, oh, this article is not what I want, this is a different area of technology. Or this article's very technical, I don't want that. I don't read a lot of white papers about how AIs are built.

It's just not my focus. My focus is on usability, not the backend. Other people focus on that area. So the ability to sift actually allows us to get a better result, not a worse result. I think this is very interesting, 'cause a lot of people are so focused on using AI to generate tons of content rather than using AI to get to what we want faster. That acceleration, I think, is very interesting. So where do you see the future of content? Because one thing I think about now is that someone has an AI write a script and another AI make the video. Then another AI uploads the video with an AI-generated thumbnail.

Then my AI summarizes the video and I have another AI text-to-voice read me the summary, and it's like, maybe I should have just talked to the person, because there were seven AIs in the process of making the content.
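As an aside, the aggregate-then-sift workflow Jonathan describes above can be sketched in a few lines of code. This is a hypothetical illustration only: the article data, tags, and block-list terms are invented examples, not anything from Otherweb or the speakers' actual tools.

```python
# Sketch of keyword-based sifting: aggregate candidate articles,
# then drop the topics you never want to read (e.g. funding news,
# technical white papers). All data here is made up for illustration.

UNWANTED_TOPICS = {"funding", "white paper"}

def matches_unwanted(title: str, tags: list[str]) -> bool:
    """Return True if an article touches any topic on the block list."""
    text = title.lower()
    if any(topic in text for topic in UNWANTED_TOPICS):
        return True
    return any(tag.lower() in UNWANTED_TOPICS for tag in tags)

def sift(articles: list[dict]) -> list[dict]:
    """Keep only the articles that don't hit the block list."""
    return [a for a in articles
            if not matches_unwanted(a["title"], a.get("tags", []))]

articles = [
    {"title": "Startup raises $50M in Series B funding", "tags": ["funding"]},
    {"title": "New AI tool improves podcast editing", "tags": ["usability"]},
    {"title": "White paper: transformer internals", "tags": []},
]

kept = sift(articles)
print([a["title"] for a in kept])  # only the usability article survives
```

A real curation system would score articles against learned interest profiles rather than a hand-written block list, but the basic idea, filtering before reading, is the same.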

Alex Fink: It could be. That's dystopian the way you describe it, but I suspect what's more likely to happen is more like a farmer that can use a tractor and a weeder and a whole bunch of other implements to till their soil and get it ready to produce a very large yield, where it previously took maybe 150 farm workers on that same farm to produce one tenth of the yield. So ultimately there will be a farmer somewhere, and the same thing with content creation.

At least with good content, there will be a creator somewhere, but all the things that previously required a lot of tedious work from other people in that chain will be automated away, which is great for us, the consumers, and probably great for the world as a whole. And it's probably going to displace a very large number of people from their jobs and cause a painful transition in the process.

Using that farmer analogy again, somewhere in the 1920s, 30% of the American workforce got displaced, right? And not all of them have recovered even until today. You're still seeing large groups of people and locations that are basically poor, because they haven't quite recovered in four generations from the mechanization of agriculture. I think we're probably going to see something like that. I hope we handle it better this time, in part because the people getting displaced are white-collar, so they have more political will, more political power. Hopefully they can fend for themselves better than agricultural workers did in the twenties.

But for us, the consumers of content, it's going to be great. And I agree with you, curative AI is much more interesting than generative AI because ultimately all generative AI does, even in the best case, is save time for the creator. But what do I care as a consumer? I just don't. 

Jonathan Green 2024: Yeah. I always say to people, nobody who reads my book cares how long it took me to write it.

If it took me two hours or 2,000, the reader only cares if the book is good. Nobody cares about my journey, my struggle, how many rounds of editing I had to do. Nobody is gonna buy a book going, this guy edited 15 times, that guy only edited 10, I'm gonna read this one. Nobody decides that way, right?

We don't shop that way. We don't care, as the creator or the consumer. I think about that a lot. So my final question is: do you think this will lead to a democratization of content creation, as in smaller creators will have a better chance of being found? 'cause for a while that was the case: a really small creator could make content and a lot of people would notice it. And now the search results are a nightmare, and the quality suffers. If you write an article for the purpose of search engine optimization, you have to write a worse article. You just do. You have to write a longer article. You just don't write the same article.

'cause you're looking at how many keywords do I have, which terms do I have to use, do I have enough things in the tags, do I have enough images with the alt tags? You add in all of these elements that are written for the search engine, not for the reader, the end user. And there are even tools that help you optimize that.

I had to do that for a long time. Now, when I write articles and I drive traffic from social media, I go, oh, I can just write a good article. I don't have to worry about this other element of trying to get search engine optimization, because I can drive traffic in a different way. It allows me to just go focus on content again, which is an unbelievable feeling.

I go, oh my gosh, I can write stuff that's just good again, like I was 15 years ago. Do you see that with a tool like yours, which is going, okay, whatever's good rises to the top without looking at the source? As in, it doesn't matter if it's from a big news organization or a small one, it just matters if it's well written and it's good content. Do you think this will actually lead to a democratization and kind of go back to that era where the best content wins?

Alex Fink: If everyone were to use it, yes, but that's a big if, [00:30:00] right? And so it's hard for me to predict what percentage of people will follow this kind of model as opposed to trusting what the big search engines do. And one day, like you mentioned before, the big search engines are just going to change from giving you a list of links to other people's content to giving you the right answer, with no links to it whatsoever.

And that's the opposite of democratizing content, because that right answer might have just been produced in a handshake deal with somebody, right? It could just be the answer of whoever paid that company the most to appear as the right answer, maybe with some profit sharing on the backend.

"I don't know" is basically the answer, because it depends on consumer behavior, and I am not smart enough to predict consumer behavior. I can predict what technology will be available and how the technologies will compete and fare against each other, but I have no idea what consumers will do given the choice.

Jonathan Green 2024: Yeah. An example for me is when I'm looking at a review of a video game, I always look for a small creator, because I don't trust the big ones. A lot of industries are like this, but large video game reviewers make all of their money from ads from large video game companies. So they're notorious for it: you cannot write a negative review, or go below a certain score, for certain companies' games, no matter what.

So they're not allowed to be authentic. But small creators who aren't monetized yet, who no one's willing to spend ad money with, they're not worth bribing. So there's an authenticity there. It's a weird way to say it, but it's the same thing I say about my YouTube channel. My YouTube channel is quite small.

My personal YouTube channel is smaller than all my other channels, but I can write negative reviews of products. I don't trust people who only write positive reviews. I'm like, you've never encountered a bad product? Really? I can't imagine. I would love to have such a blessed life, where every product I bought from Amazon was as good as I thought it was gonna be.

Every time I got a haircut, it looked great. What a life to lead, where someone reviews a thousand video games and they're all five stars. That's amazing. I would love to live that life, what a happy universe, but I don't trust them. And that's, I think, where authenticity comes from. I look for people who don't like everything, so if they like something, it means they really like it.

And I think about the idea of having a way to filter things out. I get asked this all the time: people think I use AI to generate all of my content. I do the opposite. I have to be more authentic, because people are more suspicious of me. As an AI expert, a person who talks about AI, they go, oh, he must use AI-generated images of his family, he must do this.

And I have to really make it clear when I do AI-generated images. All of my images are pictures of me running from exploding helicopters or jumping off of an exploding submarine, because there's no way someone encounters that many exploding submarines. If you see enough of those thumbnails, you go, oh, that can't be real.

So the entertainment reveals itself, right? It's exciting imagery, as opposed to something that's really close to reality but 10% better, which is where deception happens. I stay outside the uncanny valley by going extreme. And I think that's the future for AI that I like: it allows smaller people to be more entertaining.

It allows artists to increase their volume of output while keeping it high quality, and it allows the consumer to filter out the things that they don't wanna see, because that's what we've been trying to do for so long. That started all the way back with TiVo, when you could record television and just not watch the commercials.

That was this panacea, right? I remember when I was in high school, every year the amount of commercials would go up. In a 30-minute show, it was like 27 minutes of show, then 26 minutes. Now I'm not even sure if it's even 20. Last time I looked, it was eight minutes of commercials for 30 minutes of television.

20 or 30% of everything you watch is commercials. So of course getting that time back is so valuable. If I don't watch commercials, I can watch twice as many TV shows. So I think that has me excited for the future. I really like the direction you're going, and I think it's a really positive use of AI, and I think that's really exciting.

Where can people find out more about what you're doing, get more of a sense of how they can filter in the things they want and filter out the things they don't, and improve the democratization of content on the internet?

Alex Fink: So our website is otherweb.com, or if you want to download the apps, they're on Android and iOS, called Otherweb, right?

And we keep adding more and more products beyond just the website and the apps, but you'll be able to find all of that on our website. Now, I do want to add a caution with regards to what you just said. Besides just production and consumption, there's also distribution and monetization, and how it's done and who handles it is probably what's going to determine whether content gets democratized or actually gets concentrated in fewer and fewer hands, like it has in some industries, where it's really hard to produce an independent film right now and have people see it.

Oddly enough, it's much easier to produce the film, but nobody's going to see it, because distribution is controlled by fewer hands than before. So I don't know if The Blair Witch Project could happen today to the same extent that it did in the past. So I think that we need to push as consumers.

Not just for tools on the consumption side, which is what we're trying to do right now, but for better tools on the distribution side as well. And Otherweb will be a player in that part of the industry at some point, but I urge others to join as well.

Jonathan Green 2024: That's very interesting. I think you've given us a lot of food for thought and this has been an amazing episode.

Thank you so much for being here for today's episode of the Artificial Intelligence Podcast. Thank you so much.



Jonathan Green 2024: Thank you for listening to this week's episode of the Artificial Intelligence Podcast. Make sure to subscribe so you never miss another episode. We'll be back next Monday with more tips and tactics on how to leverage AI to escape the rat race. Head over to artificialintelligencepod.com now to see past episodes.

Leave a review and check out all of our socials.