The neXt Curve reThink Podcast
The official podcast channel of neXt Curve, a research and advisory firm based in San Diego. Founded by Leonard Lee, neXt Curve focuses on the frontier markets and business opportunities forming at the intersection of transformative technologies and industry trends. This podcast channel features audio programming from our reThink podcast, bringing our listeners the tech and industry insights that matter across the greater technology, media, and telecommunications (TMT) sector.
Topics we cover include:
-> Artificial Intelligence
-> Cloud & Edge Computing
-> Semiconductor Tech & Industry Trends
-> Digital Transformation
-> Consumer Electronics
-> New Media & Communications
-> Consumer & Industrial IoT
-> Telecommunications (5G, Open RAN, 6G)
-> Security, Privacy & Trust
-> Immersive Reality & XR
-> Emerging & Advanced ICT Technologies
Check out our research at www.next-curve.com.
Silicon Futures for January 2025 (with Karl Freund and Jim McGregor)
Jim McGregor of TIRIAS Research and Karl Freund of Cambrian-AI Research joined me to recap January 2025, another action-packed month in the world of semiconductors and accelerated and non-accelerated computing on the neXt Curve reThink Podcast series, Silicon Futures. The trio also shares their thoughts on where the industry and the tech are going in 2025.
We parse through the key semiconductor industry announcements and AI headlines of January 2025, and share our thoughts on the year ahead:
➡️ CES 2025 - AI everywhere (1:53)
➡️ Nvidia's supercomputing at the edge (3:00)
➡️ Nvidia's vision for the AI PC challenging the meaning of the AI PC (5:49)
➡️ The compelling use case of AI on the PC (8:14)
➡️ CES 2025 impressions by Intel, AMD and Qualcomm (14:48)
➡️ Jim's sparse trek on the CES 2025 floor - MIPS, Synaptics, NXP (18:18)
➡️ Arm's reinvention and their ongoing battle with Qualcomm (19:10)
➡️ CES 2025's hidden wireless gem (20:20)
➡️ DeepSeek and the implications on the AI industry (22:30)
➡️ Reasoning and agentic AI changing the slope of AI (28:30)
➡️ DeepSeek's semi-open kimono (30:34)
➡️ A middle finger to U.S. AI diffusion rule, Stargate, and regulation (34:05)
➡️ The engineer who will design the first 8th generation tactical fighter (37:20)
➡️ The prospects and pivot of Quantum - Willow & Trillium (37:55)
➡️ Jim, Karl, and Leonard give their thoughts on AI and chips in 2025 (42:46)
Hit Leonard, Jim, and Karl up on LinkedIn and take part in their industry and tech insights.
Check out Jim and his research at TIRIAS Research at www.tiriasresearch.com.
Check out Karl and his research at Cambrian AI Research LLC at www.cambrian-ai.com.
Please subscribe to our podcast, which is featured on the neXt Curve YouTube Channel. Check out the audio version on Buzzsprout or find us on your favorite podcast platform.
Also, subscribe to the neXt Curve research portal at www.next-curve.com for the tech and industry insights that matter.
Happy New Year!
neXt Curve.
Leonard Lee: Hey everyone! Welcome to this neXt Curve reThink Podcast episode, where we break down the latest tech and industry events and happenings into the insights that matter. I'm Leonard Lee, Executive Analyst at neXt Curve. In this Silicon Futures episode, we will be talking about the crazy beginning of 2025. I don't recall ever witnessing a January so insane in my life. But anyways, I am joined by the scaled-up Karl Freund of Cambrian-AI Research and the scaled-out Jim McGregor of the famed TIRIAS Research. Gentlemen, how are you doing?
Jim McGregor: Very well.
Karl Freund:Happy. Well, confused, but very well.
Jim McGregor: Yeah, I think he goes out of his way to try to redo the introduction and make it grander, ever more grandiose, every time.
Karl Freund: Grander? No, no, the proper word, actually, Jim, is ridiculous. There you go.
Jim McGregor: Yeah, I told you I have the best words. You tend to make them up, though. You're rubbing off on me.
Leonard Lee: Right? So, yeah, this is just pure insanity. But before we get started, remember to like, share, and comment, you know, share your thoughts on this episode, and remember to subscribe to the neXt Curve reThink Podcast. I don't even know where to start, gentlemen, but why don't we start off with the beginning of the year, which always kicks off with CES 2025. I know, Jim, you were there. I was there. Well, let's start off with that. I think maybe we do things in order, because otherwise we'll be all over the map. So let's talk about silicon at CES 2025. What were some of the highlights that you think the audience needs to be aware of and continue to keep in mind as we go through the course of the year?
Jim McGregor: Well, I think you have to be aware of the broader trend, and that is AI everywhere. And obviously, we saw that at CES, with them trying to put AI in everything, all the way down to a walking cane. It gets a little ridiculous, but Jensen has talked about these supercycles or superwaves, and so has Victor Peng: how it's going to be the data center, then the enterprise, then the edge. I think we're starting to see that edge solution in 2025, but at the same time, the other waves are not stopping. The big news, especially on the silicon front at CES, was NVIDIA's introduction of Blackwell for desktop and mobile. New GPUs, AI GPUs if you will, for desktop and mobile PCs, as well as what they call Digits, this little 4x4 box where they basically crammed a DGX down into a mini box you can put on your desktop for doing AI stuff. And they worked with MediaTek on it. MediaTek actually did the SoC development and integration of it. It put the NVIDIA Grace CPU and the Blackwell GPU into what essentially amounts to a mobile SoC. So it's not stopping. We definitely see that in the fact that AI is just pushing the boundaries of everything at this point in time.
Karl Freund: I agree. The Digits platform, to me, was the highlight of CES. It didn't get the press it probably deserved, just because it doesn't yet run Windows. I underline yet. I'm convinced it will run Windows shortly. I don't know when, but why would you go to that expense if you're not going to tap into the larger market? And that larger market is anxious to have something like Digits. It's got like a hundred times the performance of any other AI PC out there, just in terms of AI performance. You say $3,000, that's expensive for a PC. And I say, not for a PC that thinks it's a supercomputer; it's not too expensive at all. So I think it's a game changer. I think it's NVIDIA's way of entering the x86-dominated market for PCs, and eventually even laptops, I suspect. And it was so small. I mean, it looks like a DGX, too. The face of it looks like a DGX.
Jim McGregor: The placard describing it, which actually had the name on it, was four times larger than the device.
Karl Freund: Yeah, I would imagine we'll hear more about it at GTC in another month.
Yes.
Karl Freund: I think it will steal the show, especially if they can announce Windows.
Leonard Lee: Yeah, I thought it was kind of funny that some folks thought that the shield that Jensen brought out was a new chip.
Jim McGregor: That was funny. They said, wow,
Leonard Lee: they have a wafer-scale accelerator now. I was like, no, no, no, that's not what he was trying to say. That was a joke. They didn't make wafers that big. It was definitely interesting to see them bring that whole supercomputing story to the edge. I think that's really what it was all about, and it physically drew a lot of attention. But then, Jim, both of you guys are making a great point about NVIDIA bringing their AI PC story. And we've heard this in the past. Jensen says, hey, look, we already have, like, whatever, a million AI PCs out there already. And this was even before AMD introduced the first quote-unquote PC processor that had what you might deem an NPU. They didn't call it that at the time, but that was like, what, almost three years ago now, going on two years, right? It's two or three.
Jim McGregor: And we've had three generations. At least three generations have been released running AI on them, capable of a hundred to a thousand times the AI performance of the processor with the NPU.
Leonard Lee: It's interesting to see NVIDIA bring that angle to the AI PC story. It's not novel, but it's definitely getting denser, obviously, and in a much smaller form factor, as represented by Digits.
Jim McGregor: Unfortunately, it all boils down to Microsoft support. So when Microsoft starts supporting AI on the discrete graphics cards and on the Digits platform and everything else, that really opens up everything. But it's not just that. It's also support on those platforms themselves. I mean, if you use ROCm today from AMD, it doesn't run on Windows yet. So you're having to use the Linux platform, which isn't a big deal for a PC, but still,
Karl Freund: It's a big deal for users. It's not a big deal for developers. Developers can just go and get Linux.
Leonard Lee: Yeah, I mean, that's a great point, though, right? And then it also begs the question of, number one, what do we mean by AI? Is it really the serious stuff, or is it the kind of answering-feature-type stuff that we're seeing, quite frankly, in Copilot+ PC, right? There's nothing there at the moment that's really compelling. I think what we're seeing in terms of serious AI is still cloud-bound for the most part, right? And then it's moving its way to higher-power edge infrastructure. But I think this is the first time we're seeing, at least from NVIDIA, that supercomputing capability or architecture brought down to a smaller form factor.
Jim McGregor: I would have agreed with you going into 2025 that there aren't really that many compelling use cases for AI on the PC at this point in time, because most of it's in the cloud. However, I now see that a little bit differently, and that is that the compelling use case isn't necessarily these applications that are using AI. It's AI itself. It's the ability to download Llama, to download DeepSeek, to download all these different models and be able to do development, modification of models, and integration of models on your PC. I mean, yeah, that's still a smaller community than the broader ecosystem. But still, I think that is the killer app today. I think the killer app is the developer. The killer app is a developer? Okay. The killer app is being able to do AI locally, for a developer, and just the power and the capability that we have to do that with.
Leonard Lee: Sure. I mean, developers developing applications. And there's that broader question of monetization that's still lingering out there. Oh yeah, it's only going to happen through these scaled-out and valuable applications, and I think that's really what I'm referring to. I don't argue with you in terms of the value for the developer, because that's where I think NVIDIA might actually be bringing some serious gains, in the approach that they're taking, which is what I consider top-down. But maybe we don't dwell so much on NVIDIA, although I think they made the biggest impression. There was a lot of other stuff that happened, right? Obviously, we still have sort of that bottom-up stuff that's happening with what Intel, AMD, and Qualcomm are doing, right? And, I don't know, what are some of your thoughts there?
Karl Freund: I think the world's still looking for the killer app. And to say it's the developers is true, Jim, but what I think people are looking for is: yeah, but I'm not a developer, so what does this mean for me? Right? And so that can go one of two ways. Either you make that person a developer, by giving them the tools they need to create real value-added applications for themselves, or you realize that the real killer app isn't an app at all. It's the data. It's the data that's on your PC. It does not exist in the cloud, except in a backup form. That's where the AI PC could find real traction, in making everybody more productive by helping them find and use and summarize data that's on their PC today. It's not dependent on the cloud. Qualcomm and AMD and Intel all talk about, well, it's all about being able to access data, and being able to, excuse me, process data on a free processor, because you already bought it. Yeah, but AI is becoming commoditized so quickly that cloud AI is becoming practically free. So I don't know if that's enough of an incentive for developers to create real value-add apps on the PC itself.
Jim McGregor:Actually, I think gaming is going to drive that, because I think gaming is going to drive it to a hybrid model, like what we've seen in the past, where some of it's done in the cloud and some of it's on device. So I think gaming is one of those applications that's going to drive more and more of that AI processing on device.
Karl Freund:You mean casual gaming or serious gaming?
Jim McGregor: Well, there's a fine line there depending on who you talk to, but I would definitely say serious gaming. I mean, we're not talking Android games. We're talking anything that is at least a AA or AAA game.
Karl Freund: Well, that's where I think NVIDIA has got a lot to contribute, right? I mean, Qualcomm's been very careful when they talk about gaming to talk about it as casual gaming, because they know you need an add-on, excuse me, a GPU card to handle serious gaming. But with Digits, you don't. You don't need a PC, you don't need a GPU card. You've already got a Blackwell on your desktop. Now, in terms of market size, it's a relatively small market when you look at the market for data center GPUs, but it is another differentiator, I think, for NVIDIA.
Leonard Lee: I struggle a little bit trying to figure out where else, other than, let's say, augmentation of non-player characters, generative AI in particular, in that form, is really going to make a huge impact. Because if you think about AI and how it's actually impacted gaming and game development, a lot of it has to do with DLSS and the rendering engines, and improvements there that tend to be step changes, but in the grand scheme of things they tend to be incremental improvements. It's the dynamism. How do you leverage these large language models, generative AI, to introduce dynamism into the experience, especially in the form of these non-player characters, where I think there's differentiated utility? That's always been the weak point of a lot of gaming formats, whether it's first-person shooters or open-world fantasy adventure games. So I think we'll probably need to do a lot more tactical thinking to figure out where generative AI really fits and where it's going to express this exponential or logarithmic value everyone is talking about. I just don't see it, even though there are a lot of great visual representations or presentations of the impact. At a tactical level, I don't see it as being that broad. It tends to be a little bit more narrow than I think a lot of folks assume. But anyway, that's just my take. Oh, what, did I stun you guys? And we need to have that, what do you call it, BS card, right? Like a little sound. We need a little sound. No,
Jim McGregor: No, no, we need the slap sound.
Leonard Lee: No, we don't. You really... I mean, I can add it later. Maybe we do use sounds, just, you know, like hand gestures for that, as cues, right? But I love you guys. So, yeah, the other announcements. Obviously, one of the headlines on the AMD front was the announcement of the Ryzen AI 300 Pro, and then Dell coming on board, going enterprise or commercial with these guys. I think that was a headline that really drew a lot of attention and excitement. Obviously, TIRIAS Research, you guys were at the Intel thing, right? Oh, yeah. So what's your take there? We have Arrow Lake coming to market.
Jim McGregor: Actually, the big takeaway there, obviously, Arrow Lake's coming to market, but the big takeaway was Panther Lake. They were showing off Panther Lake, which is the next generation, and it's on 18A, which says 18A is on track and everything else. And I've actually seen, over the past month, some of the yield numbers on 18A. It is definitely on track. So when we start thinking about next-generation Xeon, when we start thinking of Panther Lake, all those products, in terms of manufacturability, are on track for later this year. So that was the biggest takeaway for me, for us, at Intel's event.
Leonard Lee: Yeah. Okay. I thought they were pretty stealthy. They didn't get a lot of attention, but they were everywhere. I was at a bunch of OEM events, like Samsung and Lenovo, and you saw Intel folks everywhere. And so I think their go-to-market motion is a little bit stealthy. In previous years, they were pretty boisterous, right? But it's interesting that I heard quite often, hey, where's Intel? They're so quiet. It's like, no, they're there.
Jim McGregor: Well, and their Meteor Lake Core Ultra is doing exceptionally well in the market. You have to remember that every time they introduce a new product, they have more than 100 OEM design wins out of the gate.
Yeah,
Jim McGregor: They are just there, and they're cranking, and they're ready to go. And Arrow Lake, obviously the big thing there, which Meteor Lake lacked, is the fact that you can have a discrete GPU with it. So we're going to see those higher-performance mobile workstations, all those real STEM-type PCs and everything else, coming out using Arrow Lake as kind of the base CPU and using an AMD
Leonard Lee: Yeah. Qualcomm went the opposite direction with what you just mentioned here, with the introduction of, I think it's the Snapdragon X, right? The 600 tier and below, the lower tiers, when
Jim McGregor:they're building out their product family.
Leonard Lee: Yeah, I think they're going off of the smartphone playbook and trying to expand the footprint and support of Copilot+ PC to the lower tiers. And actually, one of the things that I noted in one of the reports that I am putting together is that they're diffusing Copilot+ PC, with Hexagon, which has the 40 TOPS compute, much faster than what we've seen them do on the smartphone side of things historically. So that's an interesting dynamic and strategy that Qualcomm is executing on, I think largely on behalf of Microsoft. It's interesting.
Jim McGregor: They're keeping that TOPS number, that NPU performance, up there and scaling the rest of the chip around it.
Leonard Lee:Yeah. And so, anyways, I think that's CES. What else is happening?
Jim McGregor: Unfortunately, I didn't get the chance to actually even walk the trade show floor, because some of the vendors took up entire days at CES with events. But still, there was a lot of really, really good information there. I did get to see some of the other stuff, especially some of the IoT stuff. Synaptics had their Astra platform, where they're able to put AI into just the smallest devices, some of the things that we're going to see coming out that are going to be leveraging companies like Synaptics and NXP. I was even talking to the MIPS guys, which are now based on the RISC-V architecture, about where they're going. They didn't announce anything at CES, but they are at Embedded World coming up. And they're pretty much supercharged around where they think RISC-V is going to go and how it's going to compete, especially in this new dynamic where people are viewing Arm in a little bit different light after the lawsuit with Qualcomm in December. So it's going to be interesting, and good for them. The one thing that I'll note for them is the fact that they're not trying to be a pure IP company, because I don't think you can survive as a pure IP company in our industry anymore. You have to provide complete systems; you have to provide the software and the platforms. Even Arm's going that direction, trying to provide chips for some of their key customers.
Leonard Lee: That's where I think a lot of these companies run into an identity crisis and the challenges associated with it, right? Because they're positioned in a certain way in the ecosystem. So how do you reinvent yourself? And you've mentioned the challenges of doing that, especially the impacts on customers or licensees. But speaking of the lawsuit, I know that you were there, right? I was at the trial. Yes. But it's interesting, Qualcomm filed what they call a post-trial brief on the 29th. That's an interesting development, or follow-on to the trial. I'm really interested to see how this particular motion plays out.
Jim McGregor: One of the things that I'm interested in is how companies are using wireless. And there's one company, and this was their second time at CES, called Morse Micro. They're out of Australia. And it's interesting, because they're using older Wi-Fi technology, or lower-band Wi-Fi technology, to get longer range. We're talking like a half mile, a mile of range, a couple thousand meters. I have one of their systems. I'm hoping to test it this week, but that might be a game changer in terms of wireless. Because right now, you're either using proprietary RF technology, where you have to have line of sight, or you're using Wi-Fi, where you're limited to a couple hundred meters at best. And having long-range Wi-Fi, instead of having to do cellular and everything else, could change the dynamics of the market. So I'm interested in testing this system out. It's kind of interesting to me to see what people are doing. And obviously, on the other end of the spectrum, there are other companies there that are pushing Bluetooth and Wi-Fi to lower power levels. So, a lot of innovation around wireless technology, even outside of the cellular spectrum.
Leonard Lee: So are they using, like, beamforming or something like that to be able to keep...
Jim McGregor: Pretty much everyone uses beamforming, but they're also using lower frequencies. They're using, I think, 900 megahertz up to, I'm not sure how high they go. You're limited to about, I think it's like 42 or 43 megabits per second, rather than broadband speeds. But you start thinking about security applications, you start thinking about coverage of events and stuff like that. There's significance there. So it'll be interesting. I'm just seeing that. And we're starting to see NTNs; we're going to see more NTN, the non-terrestrial networks, this year, especially as companies start launching new satellites and new satellite networks. So there's going to be a lot of innovation, I think, in 2025, around wireless technology.
Leonard Lee: Okay. And this is coming out of CES, right?
Jim McGregor: This is coming out of CES, and I think we're going to see even more going into Mobile World Congress.
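Jim's point that lower frequencies buy range is straightforward link-budget physics: free-space path loss grows with frequency, so at the same transmit power and antenna gains, a 900 MHz link reaches roughly 5800/900 ≈ 6.4 times farther than a 5.8 GHz one. A rough sketch in Python (the 200-meter baseline and the frequencies are illustrative assumptions, not Morse Micro's published specs, and real deployments add antenna gains and obstructions):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same transmit power and antennas: the link budget that just reaches
# 200 m at 5.8 GHz reaches much farther at 900 MHz.
budget_db = fspl_db(0.2, 5800)  # tolerable path loss for the link

# Solve fspl_db(d, 900) == budget_db for d:
d_900 = 10 ** ((budget_db - 32.44 - 20 * math.log10(900)) / 20)

print(f"Tolerable loss:   {budget_db:.1f} dB")
print(f"Range at 900 MHz: {d_900 * 1000:.0f} m")  # ~1289 m, about 6.4x
```

Under this pure free-space model, range scales exactly inversely with frequency, which is why sub-GHz Wi-Fi trades throughput for reach.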
Leonard Lee: Okay, well, let's now dive into the topic that I know Karl really wants to jump into, which is this DeepSeek stuff.
Jim McGregor: Does anybody here believe that they trained those models for $6 million?
Karl Freund:No,
Jim McGregor:no,
Karl Freund: No, absolutely not. But that being said, it does show that the U.S. engineering community has been so focused on just getting stuff out the door that they kind of keep all their programming at this level. The Chinese engineers didn't have the chips that would allow them to stay at this level and get the performance they need, so they came down a level. So it's a lot of hard work of optimization at the machine code level that they did, and a lot of innovation at the model level, to be able to get those costs down. I don't think they did it for $6 million, but they definitely got a lot of attention. All of a sudden, everybody, including NVIDIA, now supports it, and people are playing with it, and enterprises are trying to figure out whether they can use it or not. And if they do use it, are there any security concerns that they should be aware of and try to mitigate? So we haven't seen the end of this play out yet. But obviously, it's a dual-edged sword. You bring the costs down, I need fewer chips. But Jevons paradox says, if you bring the cost down, then more people will use it, so you need more chips. And we don't know where that dividing line is going to fall. We don't know whether it's net positive or net negative in the overall market. From a volume standpoint, from a use standpoint, I think it's a huge step forward to making AI more pervasive, and helping lower the costs will drive more use cases. Again, we just don't know yet. We'll have to see what happens.
Jim McGregor: And that cost is the big thing. At their introductory price, they were lowering the cost of an inference processing workload by 97% over using OpenAI. Now, that's just an introductory price. They're raising that up. But still, if you reduce that price by, I think it's going to be 80 percent or somewhere around there, but even 40 or 50 percent, you cut the cost in half. That really cuts the legs out from under the current business of AI so far. And that gets to a very important point, the fact that we have to make it more cost effective. And I agree with you, Karl. I don't see a change in demand. I don't see people stopping their GPU orders, because they still think they're going to need them, and hopefully this is going to spur more demand than anything else. But it puts a lot of pressure, even if they did illegally use other people's models, and that's still being decided, it still puts a lot of pressure on everyone else out there in the AI community, not just to get stuff out there, but to be able to offer that inference processing of workloads on these models at a much more efficient price.
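The "dividing line" Karl and Jim are circling is the price elasticity of demand: a price cut shrinks total chip demand only if usage grows more slowly than the price falls. A toy constant-elasticity sketch in Python (the 97% figure comes from the discussion above; the elasticity values and the assumption that total spend is proportional to chips needed are illustrative, not measured):

```python
def compute_demand_multiplier(price_cut: float, elasticity: float) -> float:
    """Relative compute demand after a price cut, under constant-elasticity demand.

    price_cut:  fraction of the old price removed (0.97 means 97% cheaper).
    elasticity: price elasticity of demand (Jevons-style rebound when > 1).
    """
    new_price = 1.0 - price_cut
    usage = new_price ** (-elasticity)  # queries served grows as price falls
    spend = usage * new_price           # total spend, taken as proxy for chips
    return spend

# A 97% price cut is net negative for chip demand if elasticity < 1,
# net positive if elasticity > 1: the dividing line sits exactly at 1.
for e in (0.5, 1.0, 1.5):
    print(e, round(compute_demand_multiplier(0.97, e), 2))
```

With elasticity 0.5 the multiplier is about 0.17 (demand craters); at 1.5 it is about 5.77 (demand explodes), which is why nobody yet knows whether DeepSeek-style cost cuts are net positive or negative for GPU volumes.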
Karl Freund: And it may actually drive increased demand for older GPUs. I've read recently that there's a lot of demand they can't meet, even for A100s, because they're cheap. And if you can get the job done using DeepSeek's model, or a derivative of it, then you're going to save a lot of money, but that will increase demand for older GPUs. I don't know that NVIDIA planned for enough wafer capacity to continue those product lines beyond how long they would normally last, given the normal evolution of going from H100 to B100 and on to the next generation beyond that. So it'll be interesting to see how it shakes out. We'll know a lot more in probably six months, maybe.
Leonard Lee: Yeah. Here's the thing that everyone needs to reckon with: the economic impact. I hear this whole Jevons paradox thing. I don't quite buy it. There's way too much association of empirical laws, scaling laws, all kinds of weird theories about what's going on and what will happen. I think what we have to do is really look at the reality of what's in front of us. And to your points, they brought the cost of inference down ridiculously, to a level that, for the current global ecosystem, primarily dominated by the U.S., is disruptive. It changes the economics, and I think that's what the industry is going to be grappling with, probably for the next three months. And in terms of the game theory here, whether or not they took the data, those accusations need to be proven out. The game theory is: if they trained a certain aspect of the model, or a phase in its training, whether it's pre- or post-training, at that cost, and comparatively it is orders of magnitude more efficient and cost-efficient, right, with quality parity, which apparently they've achieved, that is the biggest risk to the U.S.-led ecosystem. And I think that's the thing that everyone should be really concerned about. And I think there's a lot of deflection and denial going on around that. And that, frankly, is dangerous. If the Chinese company DeepSeek actually proved that you can do all this stuff, that we assume has to be done with these super massive data centers, right, that are going to create an energy crisis, in a fraction of that footprint, a fraction of the resources required, that's game-changing.
Karl Freund: There have been two developments in AI in the last six months that I think change the slope. One is agentic AI, and the other is reasoning. Both of which, when you read about them and watch demos of what this stuff can do, it's just mind-blowing. But what they don't tell you is: you can't afford it. You can't afford the power, and you can't afford the cost of doing it. I read an article today about OpenAI's new reasoning platform called Deep Research, which is kind of a web search capability.
Yeah.
Karl Freund: But it would put Jim and me out of business, just because it can do what we do, and probably do it better. The problem is, it's so expensive it's only available to the $200-a-month OpenAI users. And at that, I don't know that OpenAI is even making any money on it. If you apply the techniques that DeepSeek has applied, all of a sudden those kinds of earth-shattering applications become much more readily available. And that, to me, is the next phase of AI, where it's going to go, and it'll go there this year.
Jim McGregor: Yeah, there's no doubt. There are two key points here. One is the fact that we always knew that using recursive training and knowledge distillation and these other techniques was going to bring down the cost of AI and make it easier to make more focused models, more efficient models, and everything else. And obviously DeepSeek used some of these techniques. They had to. But also the fact that after they did that, they made it open source. And that changes everything. When all of a sudden they're willing to give it away, where you can download it, you can play with it, you can do your own research on it, you can do your own training or retraining and optimizations on it, that really changes the game. I mean, we've had a rush just to get stuff out there, but not a rush to really enable it. And that's really what DeepSeek has done: enable it.
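For readers unfamiliar with the knowledge distillation Jim mentions: a smaller "student" model is trained to match a larger "teacher" model's temperature-softened output distribution instead of hard labels, which is one of the standard ways to get smaller, cheaper models. A minimal NumPy sketch of the core loss (this is the generic textbook formulation, not DeepSeek's actual training recipe; the logits are made-up numbers for illustration):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; T > 1 flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    Softening with T > 1 exposes the teacher's 'dark knowledge':
    the relative probabilities it assigns to the wrong answers.
    """
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's soft predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [8.0, 2.0, -1.0]
print(distillation_loss([8.0, 2.0, -1.0], teacher))  # matching student: 0.0
print(distillation_loss([1.0, 1.0, 1.0], teacher))   # uniform student: larger
```

In practice this term is minimized by gradient descent on the student's weights, usually blended with an ordinary cross-entropy loss on the true labels.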
Leonard Lee:Yeah,
Jim McGregor:but,
Leonard Lee: I want to get your impressions on this. My observation is that they didn't open the kimono entirely. What they open-sourced was a fraction of what I think folks would have liked to have seen and wanted. They open-sourced the weights. They didn't open-source the data set, or even let people know what was actually used, even though there are references in their papers. And one thing that we do know is it was trained on data sets that were largely based on Chinese content or data. Yeah, go ahead, ask it about Tiananmen Square. Yeah, yeah. No, none of it has anything about Tiananmen. It's obviously biased. If it was trained entirely on, let's say, English Western data, maybe there would have been some injection of hallucinations undesired, at least on the part of the CCP. The other thing is that they didn't entirely describe what the entire system looks like and how it comes together, right? What I noticed is a lot of people are taking the model and just running it on standard H100 instances, and they think that that's replicating everything. My impression, just reading the reports and doing a little bit of deeper diving, is that those are things that haven't been replicated yet. And so I don't think the full picture of what the optimization actually was is visible, or even replicable. I think they've kept that close to the vest.
Jim McGregor:You're right, they have. And obviously there could be legal issues there if they are using somebody else's data set or models, so you've got to kind of expect that. And once again, believing that they did this for six million dollars? Nobody really believes that, at least nobody that has ever worked on these systems, so obviously there's information we don't have about it. I think it's funny, because it seemed like everyone thought, oh, this is going to crash the demand for chips. No, this is going to raise up all these AI companies. It drives the demand for chips, and it opens the door to hundreds or thousands of AI companies that are going to look at what DeepSeek did, theorize about what DeepSeek did, and try to replicate it.
Leonard Lee:Yeah, but you know what, I think we're in an information digestion period. I mean, all the indications that hyperscalers are continuing to invest, that's a lagging indicator of the event, right? What we need to see is what happens in the next three to six months as we start to learn more about what actually happened here and what DeepSeek is actually trying to do. But I think you're pointing out something really important, the fact that they open-sourced it. This is just really curious, especially in light of what the Biden administration issued before they left office, which was, what do they call it? It has a goofy name, that interim rule on AI...
Karl Freund:Diffusion. AI diffusion, that's the terminology they used.
Leonard Lee:Yeah. I mean, it was almost like open-sourcing it was thumbing their nose at that whole construct, you know.
Karl Freund:Absolutely. Well, the open-sourcing was designed to impact US competitiveness in the marketplace and really hurt the companies that are leading the research in AI, whether it's Google or OpenAI or what have you. So I think there were politics at play in that call, more than some people believe.
Leonard Lee:Jim, your thoughts?
Jim McGregor:No, I completely agree. I think that we've had AI diffusion, we've had the Stargate announcement,
yes,
Jim McGregor:and then we had the announcement of DeepSeek. The first thing I'll say is that thinking that any government, especially the US government or Western governments, is actually going to regulate, impact, or otherwise do anything in this segment is kind of foolish, when they can't even spell AI. And yes, it's only two letters. I believe they think AI means Apple Intelligence.
Karl Freund:One,
Jim McGregor:that's 1 token, by the way. The, the market, the industry, the technology is moving just so rapidly and it also kind of points to the fact that this is a global ecosystem and especially, we've spent the past 40 years making it a global technology ecosystem and thinking that we can segment this. And build barriers between countries and everything else to stop this. We can't. Matter of fact, if China locked down rare earth materials, our whole electronics industry would tank overnight. If we didn't have assembly in China. Yeah, our industry would tank overnight. We are a global industry and how many companies how many u. Companies have developer resources all over the world in china and india and Every part of the globe, you go where the talent is so to think that these tariffs, these regulations or anything else, or even, investment is going to have a huge impact on the direction of the market. The industry of the technology is absolutely foolish.
Karl Freund:I agree. You can't contain this innovation, and the regulations the US government has tried to implement just show that it didn't work.
Leonard Lee:Yeah,
Karl Freund:it shows that strategy is not working. So let's not double down on it. Let's figure out how we can get more talent in the U. S. not just through H1B visas, but by investing in our education. systems and getting more students trained. China's, blowing passage just in terms of the raw wetware that's being applied to AI and that's a long term threat to U. S. competitiveness. You're not going to stop that by passing regulations, preventing people from getting access to our technology. That's not going to work. We have to invest in the. intelligence ecosystem, which starts with people.
Jim McGregor:I completely agree. Matter of fact, we should be investing in education. We should be investing strongly in the next generation. I have two boys right now who are getting their master's in AI and robotics. So I'm doing my job. What are you two doing?
Leonard Lee:You know,
Karl Freund:I have a, my daughter's a veterinarian, so that's not gonna help.
Leonard Lee:Yeah. I have a five-year-old who I know is going to be designing the eighth generation of fighters of the future. Come on. He's already prototyping it right now; it's ridiculous. Yeah, I'm doing my job, Jim.
Karl Freund:Yeah. We're going to run out of time, but there's one area we should touch on briefly, and that's quantum. We saw a tremendous explosion in interest and investment dollars being poured into quantum in November and December. And then Jensen and Zuck said, nah, that's not real. And just yesterday, Bill Gates came out and said, no, actually, this is really going to happen, it's going to happen in the next three to five years. Meanwhile, IBM is quietly digging the trenches to fight that quantum battle, which is the next big battle that will come about. So if you take a look at what Google did with Willow, and you combine that with Trillium, which is really an impressive chip, you start to get a feel that we are approaching an inflection point where everything is going to change. You combine that with the portability that DeepSeek brings, and you end up with a very different world than we've enjoyed for the last two years.
Leonard Lee:Well, don't you think quantum would just completely change everything anyway, especially as you're looking at neuromorphic computing, and then maybe even having to go to a different architecture for AGI, quote unquote, if that's what you're trying to pursue? I mean, I can't imagine that everything is just portable the way, you know, I mean,
Karl Freund:Quantum will coexist with traditional computers. You need traditional computers to do pre-processing and post-processing and normal scalar operations. But you combine the concepts of AGI and quantum, and that's where the magic could occur.
Jim McGregor:Just the amount of processing you can do and the data you can process in quantum. Everyone right now is talking about how AI will benefit quantum, especially in terms of developing circuits and doing error mitigation and correction and everything else. But no doubt, eventually, quantum is going to benefit AI as well.
Leonard Lee:Yeah, you mentioned IBM. I know that they've always been kind of conservative about expectations around quantum. My concern is that there is this pivot to continue this AGI narrative, and it's just another hype cycle. I think the danger here is that you start to place expectations on this as you start to see the economics of scaling the current approach to quote-unquote AGI run its course and wear itself out. I think IBM, to their credit, at least seven years ago was the right one saying this is experimental. AWS, their head of quantum, said the same thing: this is research stuff, guys, we're not ready to go commercial with this, even though there may be some applications, like in the early days of supercomputing, where you might be doing weather models or molecular biology research and things like that, or maybe even drug discovery. These are supercomputing themes that will continue to find a new home in the next generation of computing. But in terms of revolutionizing things on your smartphone? Let's not get our hopes up too high, you know.
Jim McGregor:It's a data center play, but I think it advances the data center significantly, and NVIDIA would agree with that. I think by 2030, with the advancements we're seeing, especially with error correction and with qubit development. I was talking to Alice & Bob about their cat qubits this morning and how they're using a feedback loop to reduce bit flips; this is truly physicist-level stuff. And there are so many different qubit technologies under development and under consideration at this point in time. But I agree with Karl. I think by 2030 we're at a point where you have to be invested in taking quantum seriously, because it raises the bar again.
Leonard Lee:Yeah. I mean, no doubt, when it happens. I think it's always a matter of timing, right?
Jim McGregor:Okay. We ready for the
Leonard Lee:Okay, so here, let's do this. This is the first episode of 2025, and we're all just completely baffled, brain-fried. Oh, totally brain-fried. And yeah, unfortunately, we're not going to do the five-hour thing that Lex Fridman does; we're trying to do something much denser. But I'd like to get both of your impressions of what we can expect in 2025. I know it's so cliche, but let's just do it.
Karl Freund:Jim and I were talking earlier. We were both working on our annual "here's what to expect in the coming year" blogs, and I stopped writing mine because I have no idea. I really don't know what's going to happen this year. I do think we're going to see a shift in the return on investment of cloud computing companies and other challengers to NVIDIA, including Intel, which has just cancelled their next big AI chip. They cancelled it. AMD, we'll see. They've at least got a call tomorrow, so we're going to find out more then, I guess, but it's not clear what market they're pursuing. You've got NVIDIA and you've got all the cloud providers. Where's the room for another commercial semiconductor vendor to come in and say, yeah, well, I've got a better solution here? And that'll apply to Intel, to AMD, and to all the other startups out there. So they're all going to be scrambling to find a niche they can fill and own. I'm not sure where that's going to land. I think at the end of the year, NVIDIA is strong, even stronger, as probably the only viable commercial semiconductor provider of AI accelerators. What do you think, Jim?
Jim McGregor:I would agree with you, in the fact that they have the complete stack, from the Jetson platform all the way up to Blackwell, and they're looking at the whole solution. They're looking at the complete software stack as well. So I don't think anybody has the vision that they have. The entire industry right now is focused on going towards agentic AI, and they're already looking at physical AI. I think they're doing it right, because they're the only player in there. There's no one else there. And if you saw some of these humanoid robots at CES, they were impressive. I think 2025 is the year of practical AI. I think we see a lot of AI start moving into practical applications, whether that's chatbots or productivity tools for the enterprise, or embedded systems that add more intelligence, not necessarily generative AI, but even just traditional AI. We'll see practical uses of it throughout 2025. I think we're going to see more innovation around wireless technology that builds up to the next wave, which is going to be 6G. It's going to be WiFi, it's going to be Bluetooth, it's going to be all of these, and they're expanding as they go. So I am excited about some of that stuff. And quantum, let's put it this way: I don't know where the huge innovations are going to happen, because there are so many things going on, from silicon technology to software technology to physics. I would say the theme as we go into 2025 and beyond is that we're not just battling Moore's Law anymore. We're not just looking at Dennard scaling and stuff like that. We're battling the laws of physics, and we're trying to address the limitations we have with physics. And that's going to lead to a whole new wave of innovation, whether that's this year or beyond.
Leonard Lee:But hasn't that always been the case though? That's the thing, right?
Jim McGregor:I think we're going to a whole new level.
Leonard Lee:It's an entirely different technology, in a way, right?
Jim McGregor:I think that 2025 is also going to be filled with a lot of despair. We don't know what's going to happen with the sociopolitical situation we have between countries and governments at this point in time. And quite honestly, it could tank everything very quickly. So I'm very concerned about that. Very concerned about building up barriers between countries, especially when we have a global industry. I'm worried about the distribution of talent, which Karl brought up. Definitely worried about trade wars and tariffs.
Leonard Lee:Yeah, definitely.
Karl Freund:My company is Cambrian-AI. But what happened after the Cambrian explosion? A mass extinction. Five or six of them, depending on whether you believe we're currently in one or not. I don't know if there's going to be a mass extinction event of companies developing AI, especially companies developing AI chips, or not. And I think Jim's point about the current sociopolitical environment probably will have more impact on whether this is a mass extinction event or a continuation of the Cambrian explosion.
Jim McGregor:So there's your headline: Karl Freund predicts mass extinction.
Leonard Lee:Well, actually, what I'm really curious about is, are you going to be rebranding to Cretaceous? Devonian?
Karl Freund:That's what I want to know. The Devonian? The Devonian extinction...
Leonard Lee:of
Karl Freund:evolution.
Yeah.
Jim McGregor:What about you, Leonard? What's your prediction for 2025?
Leonard Lee:My prediction? Well, I have thoughts. Number one, I think generative AI supercomputing, or this AGI supercomputing, is going to have to reconcile with new economics. These are realities, real economic shifts that are happening, and it's going to create opportunities in certain parts of the quote-unquote ecosystem, but it's also going to be, maybe, that extinction event for others. The other thing: I think the hype is going to settle this year. We're going to find that the CPU and the principles of the xPU are probably going to dominate and be more relevant for AI in general, at the edge, than the GPU-oriented world of AI computing that the industry and the media and the broader public think of. The other thing: maybe a move toward custom training systems and inference, optimizing, because if there is credence in what the DeepSeek folks have done and there's this order-of-magnitude improvement you can get across the board, there's probably going to be focus on that. Especially when you look at the hyperscalers looking to operationalize a lot of these models and incorporate them into their recommender systems and agentic platforms, it has to be cost-efficient, because a lot of this is going to be given away for free, or it's going to be monetized through a freemium business model, like what Meta does, right? And then I think there's just going to be, on the enterprise side, simply a digestion issue. The systems, you guys know, I mean, we're on this friggin', what is it, one-year cadence at the supercomputing level? Think about what's happening across the edge environments, where systems are evolving probably way too fast for a lot of enterprises and consumers to really deal with.
And so I think that's a dynamic that the end markets are going to have to grapple with, and obviously it's going to have impacts across the ecosystem and up and down the stack. So those are my thoughts.
Karl Freund:It's gonna be a fun year. That's for sure.
Jim McGregor:Maybe we should be doing these every week. No, I'm just kidding.
Leonard Lee:No, I was thinking we should. Otherwise, we make this like a five-hour thing, and then we'll have to use AI to edit the whole thing, and then maybe hallucinate in between. So anyway, hey, it's great catching up with both of you. Hopefully we'll have a calmer February. I doubt it, but I'm almost ready to get back on for next week.
Jim McGregor:We have a lot coming up. I mean, Mobile World Congress is coming up, and Embedded World, followed by GTC. It just goes on and on. March is ridiculous. I don't think I'm home more than two days.
Leonard Lee:Yeah. Yeah. All the more reason to watch our podcast here, the Silicon Futures podcast. Gentlemen, thank you so much. Everyone, if you want to get in contact with Karl, make sure to reach out to him at Cambrian-AI Research, LLC. I think it's a .com, right? But he's on LinkedIn; reach out to him. And of course there's Jim McGregor of TIRIAS Research. Just go to www.tiriasresearch.com and, oh,
Jim McGregor:Jim at TIRIAS,
Leonard Lee:or Jim. Okay, there you go. Connect with both of these gentlemen; they are thought leaders in the field. And of course, reach out to neXt Curve at www.next-curve.com. And remember to like, comment, share your reactions to this podcast episode, and subscribe to the neXt Curve reThink Podcast on YouTube as well as Buzzsprout, and we will see you next week, or next month, right? Back here to recap the month of February, which we hope is a little bit more mellow. Never. Okay.
Karl Freund:All right. Thanks Leonard. Take care. Cheers guys. Thanks for watching everybody.