Tricky Bits with Rob and PJ

Why This Podcast + Rob and PJ's Origin Stories

January 19, 2024 · Rob Wyatt and PJ McNerney · Season 1, Episode 1

Enjoying the show? Hating the show? Want to let us know either way? Text us!

“What, another tech podcast? Why?!?”

Having been in the tech industry for multiple decades across a slew of different companies, Rob and PJ may not have seen it all…but they have seen a LOT.

More specifically, they have seen what works, what doesn’t and everything in between.

This podcast is a deep dive into the technical side, unafraid to take on the nitty gritty details. It also covers recent tech news with Rob and PJ’s honest (and sometimes brutal) take…

It is a podcast that isn’t afraid to ask “what the frigging hell is going on here?” Only they don’t say “frigging”.

In this inaugural episode, Rob and PJ talk about why they started this and dive into their origin stories for how they got where they are today…

Transcript

PJ:

All right. This is the inaugural episode of Tricky Bits with Rob and PJ. In this episode, we're gonna describe who we are, how we got here, and most importantly, why we're doing this. We've both been veterans of the tech industry for decades at this point in time. We've seen what works, we've seen what doesn't. We've seen what's fun and we've seen what isn't. And we're going to be your guides on where tech is at, where tech has been, and where we think tech is going. So we're excited to have you on the journey with us.

Rob:

What is the stated goal? Where are we taking this?

PJ:

The goal of the podcast is: we have seen a lot of fun stuff in the tech industry. We've been able to experience it, and we've been able to build some of it as well. There's a real joy that has been in the work that we've done. There's also been a lot of soul-sucking stuff we've experienced, in terms of how the business of big tech, or the lack of business in big tech, has really changed the landscape. So I think the goal we're aiming for here is really to accentuate the deep technical fun that exists in the things that can be built, that have been built, and that we think should be built, as well as to highlight where a lot of the gaps are, where these companies effectively shoot themselves in the foot or damage the industry because of really ass-backwards thinking. So ours is really this exploration between the stuff that's super enjoyable and beautiful and deep and technical and fun, and the bullshit that gets in our way.

Rob:

I would completely agree with that. There is a lot of joy to be taken from watching people play games. I think it's because we've both been in the entertainment space for the most part. Entertainment technology, whether it be games or movies or special effects or whatever it may be, is a consumer product. The thing that we do isn't the consumer product; we just contribute to the consumer product. There is a lot of joy to be taken from watching people enjoy the final product. And I completely agree that it's been easy and it's been hard and everything in between, and over time that balance has changed.

PJ:

One of the most wonderful things, and I think this is true both for games and graphics in general, is that we also had to solve what amounted to fun, deep technical problems. You're really getting at the guts of these things and asking these questions, not just taking it for granted. I won't even say C versus assembler at this point in time: you have folks programming in Java or Python, or something that is built on top of some kind of virtual machine stack at some level, and you're not even getting close to what the metal of that machine can do. And I think it is a lot of fun to dive that deep, especially in the areas where there are problems you're trying to solve, to make faster, to make more efficient, to push the boundary.

Rob:

That, to me, is the essence of Tricky Bits. We're not gonna be scared of going down to assembler, or even transistors, in order to explain something. I think we are gonna expect a lot of our listeners, and hopefully we can present things in a way that doesn't intimidate people who don't understand it; we can explain it in multiple ways. But I don't want to step away from the details simply because there's a handful of people who might not understand them.

PJ:

Absolutely. I'll fully admit, there were so many things that I took for granted, especially coming outta school, or even as a nascent programmer, that really got uncovered for me, especially at Insomniac Games, when it came time to figuring out how we were gonna make this as fast as possible, as efficient as possible, and as memory-conservative as possible. I found that life-changing, frankly, in terms of how I looked at all the problems that came after. It influenced me greatly at DreamWorks and beyond. So I think it's extraordinarily valuable that people have a holistic view and can reapply these lessons no matter what level of the stack you're at.

Rob:

Yeah, I agree. I think a lot of it comes down to understanding the problem that you are given.

PJ:

So Rob, how did you get into this whole computer thingy in the first place?

Rob:

It's the only career I've ever had. I started when I was a kid. I was fortunate enough to be born at the right time. I guess I could've been born earlier and it would've been easier, but I was born in the seventies and I grew up in the eighties. And at that time in the UK, which is obviously where I'm from, there was a great home computer scene. We had all these new machines coming out and people were all trying to find their own way. There was nothing right or wrong about what people wanted to do. We had the Commodore machines, the Acorn machines, the Spectrum machines, and then the UK had a whole bunch of other random machines like the Dragon and the Oric and a whole bunch of other computers that didn't go anywhere.

My first computer was a BBC Model B, which was a legend of a machine, still is to this day. There's a whole story there of how the BBC got involved and wanted to do a computer program for the nation, and wanted a machine that everyone could have that was standard. And they did; it was quite successful. Obviously the gem of Acorn was ARM: the Archimedes and the original ARM processor came out of that whole program. And I must say thanks to my parents for getting that, because they were not cheap and they must have sold a kidney or something to be able to afford it. Maybe my dad was a drug dealer, I've got no idea. But anyway, I got one and I very quickly learned BBC BASIC, and BBC BASIC was brilliant, still is. And it had a built-in assembler, which was the gateway to the 6502.

Like I said, everyone was doing their own thing, finding their own path, and I quickly found that media, audio, graphics, which led down the games and later the demo path, was where I wanted to do things. And in that day, assembler was the only way to really get there. You had to move things quickly on a slow machine, so you had to write assembler, which is where my love of assembler originally came from. I have lots of fond memories of making really cool things and my first commercial software products. I wrote audio trackers and MIDI players and FM synthesizers and software renderers, and all these things came later. Some of them were on the Archimedes, but always done in the mindset of: gotta do it as quick as possible, 'cause if I can do it quick, then I can do more, or whoever I'm selling this software to can do more without me taking up the whole machine.

I went to college. I never had a PC; my first PC was after college. I went to work for Krisalis Software in Rotherham, England, and they made all the good Archimedes games. They were ported from the Amiga: Bitmap Brothers games, SimCity 2000, all sorts of really cool games for the Archimedes were all done by Krisalis. By this point I'd got myself an Amiga, and I still think it's the best machine ever made as a programmer's machine. It was brilliant. And because we were porting Amiga games and some arcade games to the Archimedes, I was in heaven. It was like, this is awesome: I get to write assembler every day and I get to write these cool games.

I got my first PC while working at Krisalis, but I didn't really program the PC much until I got to DreamWorks, and this was a whole learning experience for me. I'd never done x86; I barely knew what it was. I never used any of the PC apps. I did all of my college stuff on the Archimedes.
It had word processors and printer drivers and laser printers and blah, blah, blah. I didn't need a PC, and it was much easier to program than a PC was. So when I got to DreamWorks, they were making Trespasser, and it was in a very sorry state: very low performing, very high memory use, and obviously a PC. This was my first time working on a big team, making a new IP. I'd done lots of ports, I'd done lots of small teams. Some of those ports you could do by yourself; some of them were two people working on the same port at the same time. And this was now a big team, and he did this and she did that, and that's how it was. And it was producers and long-term schedules and things like that. It was all new to me. It was my first venture into what today you would call AAA development.

The game was actually pretty good at this moment in time. It got worse before it got released, and it should ultimately have been a run-and-gun shooter. But that game had so much tech. It had the software renderer, and it had bump mapping and per-pixel lighting, and we encoded curvature into the bump maps to get smooth surfaces. It had wavelet dynamic mesh terrain, it had volumetric clouds, it had inverse kinematics. And it was done in this giant C++ template, and then one uber function that would just call into all these templates. That's where my hatred of C++ comes from, because trying to write that in assembler was a nightmare. But I did. I optimized it for Pentium, Pentium Pro, Pentium MMX; AMD showed up at some point and I wanted a K6 version, so I made a K6-with-3DNow! version.

The game was delayed because the gameplay was crap by this point, and at the same sort of time PCs were starting to get hardware and 3dfx was out, blah, blah, blah. Early games using the hardware natively were starting to come out, and those games were very fast at higher resolution, but didn't have the features we had. So then we were told to make a hardware version. So what do you do? We were kind of stuck between a rock and a hard place, and the solution was never great; it was never going to be great. They should have made that game into a Turok-style run-and-gun game with some cool software tech, called it good, and shipped it. Two years earlier, it would've been great. But that's not what happened. To this day, though, the tech in that game is awesome.

At the same time, Microsoft was making a thing called Talisman, and this is gonna predate most people listening. Talisman was a similar system, kind of a tile-based reprojection system, but it wasn't perspective-correct, so it was crap, and it ultimately got canned. But we did get the attention of Microsoft through doing this, and ultimately that's how I ended up at Microsoft. I worked on DirectX for a bit and then ultimately was one of the founding members of the Xbox team. I did all the system architecture for the original Xbox. At that point it was just lots of paper calculations. It was like: okay, if alpha blending does these operations on the bus, how fast can we alpha blend? How fast can we render? How fast can we clear the depth buffer, blah, blah, blah. And if the CPU takes this much bandwidth, assuming 90% of its accesses hit the cache, how much is left for the GPU? We went through this for a whole bunch of different GPUs and different processors, and the executives, in a typical Microsoft decision, picked GigaPixel as the people to do it.
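[Sketch: for a flavor of the paper calculations Rob describes, here is a minimal back-of-envelope in C++. Every number is an invented placeholder, not a real Xbox figure; the point is only the shape of the arithmetic.]

    #include <cstdio>

    // Back-of-envelope bandwidth budgeting, in the spirit of the Xbox
    // paper calculations above. All numbers are invented placeholders.
    int main() {
        const double bus_gb_s       = 6.4;   // assumed shared-bus bandwidth, GB/s
        const double cpu_gb_s       = 1.0;   // assumed raw CPU memory demand, GB/s
        const double cache_hit_rate = 0.90;  // "assuming 90% of accesses hit the cache"

        // Only cache misses reach the shared bus.
        const double cpu_on_bus = cpu_gb_s * (1.0 - cache_hit_rate);
        const double gpu_budget = bus_gb_s - cpu_on_bus;

        // An alpha blend reads the destination pixel and writes it back:
        // 8 bytes of frame-buffer traffic per 32-bit pixel (textures ignored).
        const double bytes_per_pixel = 8.0;
        const double pixels_per_sec  = gpu_budget * 1e9 / bytes_per_pixel;

        std::printf("GPU budget %.2f GB/s -> %.0f Mpixel/s of alpha blending\n",
                    gpu_budget, pixels_per_sec / 1e6);
    }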
GigaPixel was a tile-based renderer, similar to PowerVR today, or Apple's GPUs, or something like that. And back then it was rubbish. Today I may think differently about it, but back then it was terrible. We caused a huge stink and ultimately got it replaced with an NVIDIA chip. I think if we hadn't done that, Xbox would've gone the same way as all the other Microsoft consumer products they've made that were supposed to revolutionize an industry and never did. And the reason is because of decisions like that, where executives will just go, oh no, we need to do this.

It happened again on Xbox, 'cause we originally picked an AMD processor, and Bill Gates announced at GDC, without telling any of us, that it was an Intel processor. We didn't even know it had switched. Steve Ballmer switched it, and his comment was, "Don't fuck up the Intel account." That was literally his words. And we're like, okay, so all that work we just did is now moot. And this x86-compatible thing, it's true to a limit, but at the level we were working at, doing bus calculations, it's not even close to true. They're not even similar. So we had to go back and do all of that again, and do it again with NVIDIA as well. It was kind of a mess, but very typical Microsoft back then: some external person would come in and be like, well, do this, and they have the say, and that's what happens. So for a long time Xbox was on very, very fragile ground. It was almost just a bullet point on Microsoft's list of failed projects. Fortunately that didn't happen. So we made the Xbox, we did the demos, we made those big shiny Xs, which were kind of cool.

Ultimately I left Microsoft and first attempted starting my own company. This would've been about 2001. My goal was to help people make Xbox games, 'cause I knew more about it than anybody, 'cause I built the damn thing. And it became fairly obvious that it wasn't really a viable business. But in doing this I got involved with the PlayStation 2 quite a lot, 'cause everyone was making PlayStation 2 games alongside the Xbox games, and typically they were leading on the PlayStation 2 and porting to Xbox. And there were a lot of things that were not similar; in fact, nothing was similar, because of course they were two very different machines. So porting was kind of tricky. I'd done lots of ports, I got the hang of things, and that's what I was doing.

I only did it for about three or four months, and somehow, and I don't really remember how, I met Ted Price. I don't know whether it was by fluke or whether it was through the PS2 work I was doing; I honestly can't remember how it happened. But we got a conversation going, and it ended up like, why don't you come work for us? And it's like, yeah, why don't I come work for you? So my dreams of doing my own business went on the back burner and I went to Insomniac. This was about 2001, Ratchet & Clank 1, and I didn't go to the old office, because they were moving to the current location in Burbank, the old Lockheed building. There I was working on the Ratchet & Clank PS2 engine. Me and Al Hastings were basically the only engine programmers on that game. We did Ratchet 1, Ratchet 2, and Ratchet 3.

About this point, Mark Cerny came to me and said, we're starting this project called the ICE team, the Initiative for a Common Engine. Today it's just known as the Sony ICE team.
But that's its actual name. And he said it was gonna be at Naughty Dog's offices, they're in Santa Monica, and he'd like me to be part of it. This team was gonna get early access to the PlayStation 3, and we'd start doing common code, highly optimized code, for the PlayStation 3 for all first parties. So I ended up on this team, and we went to Japan a few times, and we had secret access to all of these really cool simulators and emulators, cycle-level-accurate stuff.

And they had this GPU that was in-house. It was called the Sony RS, but it only did pixels. It was envisioned that the SPUs and the processor would do vertices, and they had this thing called the LDPCU, which to this day I don't know how it works. All the instructions were in Japanese. Mark Cerny could read it, but even he wasn't quite sure how this thing worked. It was basically an asynchronous gatekeeper where you could let the vertex work run out of order and it would send it on to the pixel shader in submission order, so it would render correctly even if bits of it were done out of order. It never actually existed; it was just an idea as to how to connect A to B, A being vertices, B being pixels.

So they made this Sony RS, and it was kind of PowerVR-ish in that it had no fixed pipeline. Everything was in software: the depth buffer, the alpha blending. It basically made you do everything, which in some ways is great, 'cause you could do anything you wanted to do. So we spent a lot of time looking at tools, at how you would program this thing. And then we realized, in doing this, you had to schedule it yourself too. Where waves going through a GPU today are allocated automatically based on the resources that are available, here you had to do it yourself. If you've got a shader that uses more memory, you might get fewer threads in flight because you've run out of local memory. Likewise, if you use a lot of registers, you might not get the optimal number of threads, because you're gonna run out of temporary storage. You had to do all of that yourself, and you had to trigger the passes yourself too: go to here, stop, go to here again, then continue. It was this whole kind of nightmarish scenario as to how you had to write these shaders. I wrote bump mapping; it took ages to get it optimal, no stalls at all. And it was all done on this simulator, I believe it was called Olive, which was a cycle-level simulator of the RS.

So it's like, this is gonna be hard to program. What tooling do we need? Do we need a compiler, blah, blah, blah. And how does this integrate with the LDPCU and the vertices? We didn't even consider that. I was just like, give me three vertices in screen space and I'll start there; all that's a problem for a different day. So it was hard to program. And then Sony made the thing, or at least laid it out in silicon, and realized that this thing was not possible to make. It was huge, physically huge, in number of transistors. And Sony weren't GPU architects. They were already behind NVIDIA and AMD at this point, and Matrox and everyone else who existed in the early two thousands. So they did the first thing you'd expect: they started cutting things. They cut the things that took the most area, and that was all the good stuff. So now it was even harder to program and less powerful, and ultimately useless, 'cause it didn't do vertices. So there was a bit of a panic.
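[Sketch: the hand scheduling Rob describes is essentially the occupancy calculation that modern GPUs do automatically, where threads in flight are limited by whichever shared resource a shader exhausts first. A minimal illustration, with invented hardware limits:]

    #include <algorithm>
    #include <cstdio>

    // How many shader threads fit, given per-thread register and local-memory
    // use? Limits are invented; modern GPUs do this allocation in hardware.
    int threads_in_flight(int regs_per_thread, int local_bytes_per_thread) {
        const int total_regs        = 4096;   // assumed register file size
        const int total_local_bytes = 65536;  // assumed local memory size
        const int hw_thread_slots   = 128;    // assumed scheduler limit

        return std::min({total_regs / regs_per_thread,
                         total_local_bytes / local_bytes_per_thread,
                         hw_thread_slots});
    }

    int main() {
        // A register- or memory-heavy shader starves the thread count,
        // exactly the "run out of temporary storage" trade-off above.
        std::printf("light shader: %d threads\n", threads_in_flight(16, 256));
        std::printf("heavy shader: %d threads\n", threads_in_flight(64, 2048));
    }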
Sony started looking at, well, what if we put four SPUs in, or four Cells? Can we render in software? So the ICE team were looking at how you would render in software with a 256K buffer in each SPU. Yes, it was just 256K of SRAM, fast as hell, but that's all you got for everything: code, data, DMA, input buffers, output buffers, all of it. So we looked at software rendering, and you could definitely write a software renderer, and it would be pretty decently quick. Again, it fit me perfectly, because I'd written tons of them by this point. But now you're competing with hardware, so feature sets start to become really important, and shaders. How do you deal with shaders when you've got an optimized software renderer? How do you just drop in a shader and make it optimal? There has to be a boundary somewhere. And what happens if this SPU does this 32-by-32 block and that SPU does that 32-by-32 block, what do you do with the boundary, and things like that. It was possible, but it wouldn't be great, and it wouldn't compete with the Xbox 360, which had a real GPU. And that was ultimately the goal here: the PS3 was coming out at the same time as the 360 and it had to compete.

So then they looked at things like, well, let's put 128 PS2-style shader cores in it, and weird, esoteric things where we could have big blend units that you could just feed back into each other and do these multiply-add blenders, which is kind of how the original NVIDIA GeForce, the pre-DX9 one, had these blend stages. It didn't have shaders; it just had these blenders that you could program between each stage, and you could kind of get it to do various effects. So no was the answer. Yes, it'll work; no, it won't compete with the 360, which was a DX10-class GPU. The PS3 only got a DX9-class GPU in the end, because in the end they went to NVIDIA. I had some contacts at NVIDIA from doing the Xbox stuff, and we kind of pushed Sony in that direction. It was a very big no at the time, because it's Sony, and Japanese culture is not to go out and ask for help, especially from an American company. But it was the only way they were ever gonna compete. So they went to NVIDIA and they got the RSX, which was kind of an oddball; that exact chip was never used in any sort of graphics card. But obviously it had vertices.

So now what do we do with the SPUs? They don't need to do vertices anymore; now they can do anything they want. So the ICE team, going back to that, had this whole initiative now: what can we make these SPUs do? And the answer is, if it's first-party code, you can make them do anything you want. If you're porting code, like the Unreal engine, you're screwed, because everything was DMA-based. You had to DMA in, process, DMA out. There was no "I'll just access this memory." You could access it by DMAing it and writing a template to hide it, which I think is what Bungie did: they could just run code on the SPU and it would fetch with DMA in the background and cache things like that. We, being first party, took the opposite approach, where it was just: make it work, do whatever you need to do, it doesn't matter how ugly it is. And that was the saving grace of the PS3. It wasn't the NVIDIA chip; that was less powerful than the AMD chip in the Xbox 360.
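[Sketch: the "DMA in, process, DMA out" discipline Rob mentions is usually written as a double-buffered loop so transfers hide behind compute. This is the pattern, not the real SDK: dma_get, dma_put, and dma_wait are hypothetical stand-ins, simulated here with memcpy so the sketch compiles anywhere.]

    #include <cstddef>
    #include <cstdint>
    #include <cstring>

    // Host-side stand-ins; on an SPU these would be asynchronous DMA
    // primitives, with transfers grouped and completed by tag.
    static void dma_get(void* dst, const void* src, std::size_t n, int) { std::memcpy(dst, src, n); }
    static void dma_put(const void* src, void* dst, std::size_t n, int) { std::memcpy(dst, src, n); }
    static void dma_wait(int /*tag*/) {}  // wait for all transfers with this tag

    constexpr std::size_t kChunk = 16 * 1024;
    static std::uint8_t buf[2][kChunk];   // lives inside the 256K local store

    static void process(std::uint8_t* p, std::size_t n) { for (std::size_t i = 0; i < n; ++i) p[i] ^= 0xFF; }

    // Double-buffered "DMA in, process, DMA out" over a main-memory range.
    void spu_job(const std::uint8_t* src, std::uint8_t* dst, std::size_t total) {
        const std::size_t chunks = total / kChunk;  // assume total % kChunk == 0
        dma_get(buf[0], src, kChunk, 0);            // prime the pipeline
        for (std::size_t i = 0; i < chunks; ++i) {
            const int cur = int(i & 1), nxt = cur ^ 1;
            if (i + 1 < chunks) {
                dma_wait(nxt);                      // old put from that buffer done?
                dma_get(buf[nxt], src + (i + 1) * kChunk, kChunk, nxt);
            }
            dma_wait(cur);                          // current input has landed
            process(buf[cur], kChunk);
            dma_put(buf[cur], dst + i * kChunk, kChunk, cur);
        }
        dma_wait(0); dma_wait(1);                   // drain the final puts
    }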
The saving grace, indirectly, was NVIDIA, because that freed up the SPUs, and it could have been any GPU, to be honest with you. Having those SPUs for physics and audio and particles... in the end we were manipulating frame buffers and command buffers and all sorts of really questionable, dodgy things on the SPUs.

Initially we didn't have register-level access to the GPU, and NVIDIA were very adamant that we would never get it, as they always are. There was OpenGL; that's all we got, was OpenGL. And it's like, oh, this is rubbish. But they screwed up, because on the debug TTY they printed "address of command buffer is here." So it's like, oh, okay, I'll just point the debugger at that and see what you put in it. So we'd call GL and we'd look at the command buffer, and it's like, yep, that's actually the command buffer. And it's like, if I put my own data in there, it'll execute that too. We soon figured out lots of the structure of the command buffer, and I quickly realized it's the same kind of format as the Xbox was, so all the bits started to fit together.

So we went on a whole project here of reverse engineering the RSX, and we made this thing called IG Render, which was Insomniac Games' renderer, and also ICE Render, two projects doing the same thing, and we'd share as much information as we could find. We figured out basically the entire GPU, including instructions and instruction formats and everything. "Very quickly" being a year or two; it took a while. But enough that we could use it directly without having to use any of OpenGL. If I remember right, you'd call something like glFlush, and when it returned, in one of the volatile registers that the compiler doesn't have to restore, it left a pointer to the control registers. So you'd call this function, extract r12 or something like that into a variable, and that was the DMA controller's puts and gets, and you could now start and stop the GPU yourself. All of the stuff we reverse engineered and made inline in our own code ultimately, years later, became part of LibGCM. The CPU overhead of our system was minimal, and Resistance would not have been possible without that ultra-low-overhead CPU code.

However, we submitted the game and they're like, where's the render code? You can't do this; you can't render yourself. It's like, well, we are, and the game's not possible without it. So we had a whole fight with them; we felt it had to be this way. And in the end they're like, just make it all ICE Render, get rid of IG Render, make it all ICE Render, so it's kind of supported and we'll allow it. And the reason they wouldn't allow it before is they hadn't tested all the register combinations. Like, if you set a texture, you set registers A, B, C, D, E, you write the registers in that order. But if you set A and B, and then later on set C and D, and then reset A and then set E, they weren't quite sure what the GPU was gonna do. It ultimately didn't do anything bad, but they hadn't tested any of this, so that's why they wouldn't let us do it. So they started to test some of the vectors that ICE Render would kick out, and then we just used ICE Render. It was the same thing, it was all inline, it was fine. But anyway, that was a long time at Insomniac.

After that, I ultimately had a kiddo. She was born in 2004, and this is probably 2008, something like that.
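[Sketch: for flavor, here is what driving a GPU by its command stream looks like at the level Rob describes: write packets into a ring buffer, then bump a "put" register so the front end fetches them. The packet encoding and register layout below are invented placeholders, not the real RSX or NV values.]

    #include <cstdint>

    struct DmaControl {              // the control registers glFlush leaked
        volatile std::uint32_t put;  // byte offset the CPU has written up to
        volatile std::uint32_t get;  // byte offset the GPU has consumed
    };

    // Assumed NV-style header: a register ("method") plus a word count.
    inline std::uint32_t method(std::uint32_t reg, std::uint32_t count) {
        return (count << 18) | reg;
    }

    // Append one state change and kick the GPU. 0x0304 is a made-up register.
    void set_blend_enable(DmaControl* ctrl, std::uint32_t* ring, std::uint32_t& words) {
        ring[words++] = method(0x0304, 1);
        ring[words++] = 1;                          // enable blending
        ctrl->put = words * sizeof(std::uint32_t);  // GPU starts fetching here
        while (ctrl->get != ctrl->put) { /* spin until the GPU catches up */ }
    }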
I decided I was gonna leave LA and move somewhere else. I almost moved to Bend, Oregon, but ultimately moved to Boulder, Colorado, which is where I am right now, and brought the kiddo here, 'cause I didn't want to raise her in LA. I wanted her to have more of the childhood I had: outdoors, north of England, playing outside, all that. So we moved to the foothills of the Rocky Mountains, just west of Boulder, and she grew up as a mountain kid, and she loved it. But I had to leave Insomniac. So I came to Boulder and I became a consultant for Insomniac for many, many more years. I worked on all the rest of the PS3 games, the early PS4 games, the Xbox game they made, I worked on that. Basically the PS4 engine architecture was mine. The PS5 engine that they have now, as we've seen in the leak, is kind of the same engine they used to have, so it's also mine. So yeah, all those are mine.

When I got to Boulder and became a consultant, I actually did finally start the company I was gonna start ten years earlier. I did a lot of work for Insomniac, and did a lot of work for OTOY as well, which was doing early rendering research at this point: still doing lots of path tracing, still doing lots of novel camera things. They still own LightStage, and at that point they had the early, early LightStages, the first functional one and the first geodesic one. I did a lot of work on that: a lot of data processing, a lot of handling of how you synchronize everything. Now I guess it's just a package you could buy, and anyone can have a LightStage, and the new ones do specular and diffuse and all this. If you don't know what LightStage is, go look it up. It's what Paul Debevec invented at ICT.

In doing that, I did a lot of movie work, lots of special effects stuff, and it was always like, oh, special effects in real time would be great. But you look at how a movie does it, and they're not shy of modeling the whole scene. If they have to go and get an architect and build this entire building as a 3D model, precise, exact to what it is in the real world, they will. They're not shy of: it needs to be perfect, it'll be perfect. And if that moss on that building needs to be modeled, it will be modeled. So I was involved a lot with shadows: how do you put a real shadow on a fake object, and, even more difficult, how do you put a fake shadow on a real object? This is the art of special effects, how it all integrates together. Today it's more intuitive than it was, but back then it wasn't.

Anyway, all of this took me to Magic Leap. My view of AR initially was: there are a lot of technical problems to solve, a lot of things that have to be in place to make AR or VR work at all, like head tracking and prediction, and abusing command buffers to get that prediction data into the GPU at the last possible second. But once this is solved, it's a commodity product. So I did a lot of that work. It was all low level; I always loved that work. And then a lot of it was how to use AR, and this still hasn't been answered. My view of the ideal AR is a movie special effect: it's indistinguishable from the real world. It's fake, but it integrates with the real world perfectly. Yeah, not gonna happen. One day it might, but I think today AR is in like the mid-1970s of computer graphics: you can have pixels, you can have jagged lines.
And it has to be pink, apparently, 'cause everything in AR seems to be pink, so maybe it stands out, I don't know. Pink's everyone's favorite color. It's actually really difficult, and the type of AR matters too: whether it's mixed reality overlay canvas stuff, or whether it's truly looking through glass, putting pixels in the world, magically, Magic Leap and HoloLens style.

And once you've solved the technical problems, you start to solve the visual problems. Like, how do you do a shadow? If you've got an additive display, how do you render a shadow at all? You can't render black: add black, nothing happens. So how do you even put a fake shadow on a real object? Like, I have a coffee table, I have a virtual object, I want it to cast a shadow. Technically not possible with AR, because you can't render black; you can't render anything darker than the scene. You can't subtract the light from what's coming in. So you've gotta invent all these new techniques. Some crude ways of doing it were like: just render a gray polygon over the whole screen, and then you can subtract from the gray to make it look like there's a shadow. But that reduces the contrast and dynamic range of the rest of the scene, and there are all sorts of issues with that. And then you get into lighting: okay, I have an object that's in a room, and the light turns on. How quickly does that object in your glasses have to respond to the fact that there's now white light on it?

It was hard. It was a blast, and none of it worked great. I mean, none of it's ready for the consumer space, and I think that's true to this day. This was 2015 or so, so almost a decade ago. It wasn't great. The work was great, but the results were impossibly difficult, and so demanding. Your vision system as a human is so good at spotting defects. Technically your vision system's not great at all, which is why AR works in the first place, but it's incredibly good at spotting defects, and if there's something it doesn't like, you'll just focus on that. It's really difficult.

After Magic Leap, I went to Daqri and did more AR work, and Daqri was not meant to be. They're the ones who convinced me that AR is practical and viable in a very controlled environment, and enterprise industrial is that controlled environment, where you could say: these lights have to stay on all the time, that window has to stay closed. Consumers don't live by the rules, so AR in the consumer space is much more difficult. But Daqri did indeed have an enterprise system where you could wear glasses and it would show you: take two of these, one of those, put it here, do this, do that. And it would watch you and tell you what to do. It was kind of decent in its environment.

Something else Daqri was working on at the time was holographic displays. Seamus was doing these, and we had these crazy LCoS devices that would basically change the refractive index of a pixel. And by changing the refractive index of a pixel you can bend light, and if you can bend light, you can make holograms. So we had these real-time holograms. They were like this big, minuscule things, but you could put lenses there and make them bigger. It was a projected hologram from one plane, and you've got a hologram in space, and it worked great. We made a whole API, like a GL-type API, so you could render things in holographic space, and it was colored too.
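[Sketch: the shadow problem Rob walked through a moment ago can be written down in a few lines. On an additive display the eye sees world light plus display light, so the rendered term can never make a pixel darker; the gray-veil workaround trades contrast for the ability to carve shadows out of the veil. An illustrative per-channel model, values in [0,1]:]

    #include <algorithm>

    // What the eye sees through additive glass: real light plus display light.
    // 'rendered' can't go below zero, so nothing can get darker: adding black
    // does nothing, and a true shadow is impossible.
    float seen(float real, float rendered) {
        return std::min(1.0f, real + std::max(0.0f, rendered));
    }

    // The crude workaround: add a uniform gray veil everywhere, then dim the
    // veil where the shadow should fall. Shadows become possible, but every
    // pixel got brighter and the scene lost contrast, the drawback above.
    float seen_with_veil(float real, float veil, float shadow /*0=none,1=full*/) {
        return seen(real, veil * (1.0f - shadow));
    }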
The hologram tech got spun off when the company went down, into a company called Pacific Hologram, which Seamus runs, and I believe it still exists. Their website still exists; it's still in stealth mode according to the website, but maybe I've just spilled the beans.

And I actually missed a bit in all of this: there was the Ratchet HD Collection, which was before I went to Magic Leap. As I said, I started the company I'd always wanted to start, I worked for OTOY, we did all that. In all of this, Sony decided that they wanted to make a PS3 HD collection of the Ratchet PS2 games. They were looking at some companies like, oh, we'll just port it to C and do all this. And it's like, I don't think you realize what you're up against. The engine's straight assembler; there's no C in it at all. And it's all very DMA-driven and very asynchronous with all the vector units. I think it was Ted Price who said the only people who could do this are either us or Rob, because he's now a consultant, and he wrote the damn thing. So somebody came and was like, yeah, make this game. So I ended up making the HD collection for the PS3 with a company called Idol Minds, who were in Broomfield, which is next to Boulder, basically 'cause they had testers and DevNet access and they were PS3 developers, so it was easy to do it with them. But yeah, it involved reverse engineering the entire game. And there's a whole story there: we couldn't recover any of the assets or the tools, so we did it all from the retail disc. It was the best way to go in the end, but kind of a nightmare at the time. But anyway, that was before Magic Leap, then Daqri.

I still had my company in the background the whole time I was at Magic Leap and at Daqri, not doing much, and that stayed true. I did a bunch of contract work for many different companies and ultimately ended up with Apple. That's when I finally shut my company down, because Apple being Apple, there's lots of rules. And I worked on the Vision Pro. Yeah, indeed, worked on the Vision Pro. And again, more AR, but this time not real AR. This is now "projected mixed reality," as they call it. I call it projected AR, 'cause it's from a camera: you're not looking through glass which you project light into, you're now looking at basically a VR headset with cameras that capture the real world, render on top of it, and then present you the whole image, which is very different to real AR. It's much easier to handle latency in the Apple-style projected AR than it is in real AR, 'cause in real AR the real world's gonna do what the real world does, and there's nothing you can do to stop it. Whereas with camera-based AR, you can subtract light, you can do anything you want to do; but you have all the display problems and things like that.

So I did that, and I left Apple, oh, a year ago now. And Vision Pro ships next month, so we'll see how it is. From what I remember of it, it's a pretty nice device, and I still don't think it's consumer ready, no matter who makes it or what they make. The technical problems of living in the real world are far bigger than the technical problems of making the device work.
As we found out at Magic Leap, as we found out at Daqri. And I don't think Apple have the staff or the inclination to solve those problems. Apple's not a special effects company. They might solve the technical problems really nicely: they might have great displays, they might have super low latency, it might have great head tracking. But that only gives you AR. It's what you do with it then that defines the product, and there aren't any killer apps for AR. I think we should do a whole episode on AR and go into the details of this.

It's been a great ride. I do a lot less of this stuff now than I used to. I think passing the baton to a new generation is the same as for real-world problems: we've had our stab at it, we've set up these great frameworks, we've set up these great platforms, we've built understanding. We stood on the shoulders of giants when we started, and now it's time to pass it to the next generation. Let them have a go, and maybe they'll do better. Maybe they won't, but it's time.

PJ:

So, my origin story. I was born in the late seventies, so a lot of my early childhood really ran through the eighties and early nineties. And one of the things that I absolutely loved was the explosion of visual effects and visual effects development that was happening, and how well it connected with these stories. It really resonated for me. I love the movie Tron. It's one of my favorite movies, and it hits all the buttons: I really love that it is this wonderful combination of engineering and math and art all coming together at the same time.

It happened at the same time that my family had a Commodore 64. That was something my dad got, I think it was like '84, I wanna say. And what I started doing was, I learned BASIC, basically because I would load up games and play them and whatnot, toss 'em in the disk drive, but then I was really curious about how to actually do this stuff. My dad had a subscription to Commodore magazine, and I would read it as it came in, and I would look at all the code that was in there for little games or little fireworks displays or whatever. I was doing a lot of that while we were still in Florida.

Then after my parents got divorced, my mom and my brothers and I ended up moving initially back to New York, where I didn't have the computer for the longest time. And then we ended up in Providence, Rhode Island. In early high school I ended up getting a Commodore Amiga 500. It was a beautiful machine. I'll admit I didn't dive as deep into that one. What I liked doing with it was, there were a lot of built-in tools; you could just do a lot of very cool things with it. I was able to get a copy of Deluxe Paint II or something like that, which you could do some really primitive animation and 2D stuff in, and then I could actually even output that to VHS tape. It had a built-in speech synthesizer, so I was able to use that for movies we were shooting at the time in high school. It was a lot of fun just to play with it as a content production machine, which is what I was doing a lot with it.

I think this was a wild west time for consumer computer graphics, 'cause there was Caligari, there was LightWave, I think. Autodesk obviously had the high-end stuff at the time; 3D Studio Max was happening. Caligari's trueSpace is actually what I had.
The computer teacher at my high school had either gone to Brown or knew folks at Brown, and she was able to get a copy of trueSpace. This was like trueSpace 2 or 3. And for me it was amazing, because it was a full environment. It was basically like Maya plus a renderer, and again, a very primitive version of Maya, but still: it had animation in it, it had the set geometry where you could do intersection, addition, and subtraction of solids, so you could do some really cool stuff with it.

So I got together with a whole bunch of friends. I didn't have a PC at the time in high school, but my friends did, so I basically started begging, borrowing, and stealing time on each of their machines, or getting them to do stuff, so that we could cobble together all of these different scenes and assets of starships fighting each other. And then, in order to get it off of there: I got it all rendered, and I ended up getting friendly with Brown's computer graphics lab. This is '96. I was able to do stuff there and render it there, and they had, I think it was Media Composer at the time, where I was able to actually output the scenes onto VHS. 'Cause every single time it was: okay, I gotta do this; how are we gonna get it off? I don't know yet; we gotta figure out the next problem. So we had a little bit of stuff, and we figured it out, and then a little bit more, and we figured that out from there, and then we ultimately got it done. And that was exciting and fun: we didn't have to know how to do everything at the same time. What we needed to do was figure out just the next step, figure it out, and change it.

I ended up going to Boston University, because we couldn't really afford any other university, so I went there on scholarship. And the comical thing is, I was excited to do computer graphics type stuff, but I had misread the course catalog. The catalog had computer graphics in the computer science department versus the computer engineering department, and I actually ended up going to the computer engineering department. Boston University was unique in that it had a computer science department that came out of its math department, and a computer engineering department that came out of its electrical engineering department. So we actually had both of these things, which were similar but tended to have different foci.

In many ways as a precursor to stuff I would do in grad school, I ended up taking a lot of signal processing courses. We had some basic courses that we had to take, just understanding convolution, understanding digital processing. But then I took the graduate-level courses, 'cause you needed to take a couple of those in order to get your undergraduate degree. So I ended up taking the DSP course in undergrad, which was hilarious, because that meant I couldn't take it when I actually stayed there for grad school. At the same time, I was actually still shooting movies, I was doing some plays, I picked up some improvisational comedy skills. And then after I got a PC, and a board where I could actually pull in video, I started editing those videos in Adobe Premiere 4.2.

At the end of BU, my undergrad, I didn't really plot out stuff very well.
I tried to get into MIT's Media Lab; didn't get into MIT, so it was the second time they rejected me. But what happened was, BU ended up getting a couple of professors that were doing computer vision in the computer science department, and a lot of image and video processing in the computer engineering department. So I was able to swing it to take courses both ways, and finally got my graphics course there. And I'll tell a tragic yet comical story in a second of why I never properly learned rasterization.

At the same time, during my first year of grad school, I picked up a consulting job working for a financial company on Wall Street, 'cause my dad used to work there and he had some people who needed work done. It actually turned out to be some database stuff, which I didn't know anything about, but it was like, all right, let's just learn this stuff on the job, figure it out, make some mistakes. So I picked up a bit of SQL, did a bit of Java. I didn't know Perl at the time; it would've been really useful for the parsing I had to do. I did that for a year during my first year of grad school, which I was also doing as a teaching fellow, teaching algorithms. I helped teach the algorithms course, and I was the only person to volunteer for that one, 'cause I found it interesting.

Spear, Leeds & Kellogg was in Jersey City. My grandparents lived in Queens, New York, and I used to take the train from Queens to underneath the World Trade Center, the Twin Towers, and then hop another train to go to New Jersey. I stopped working there August of 2001. And the day that I was supposed to learn rasterization, in the course they still had, that day was September 11th, 2001. So I missed that class, probably for obvious reasons; it was just a crazy day at that point in time. I learned ray tracing fine, but never properly learned rasterization. It was always a little bit of a chip on my shoulder.

But I'll admit, I was a little bit listless in grad school. I couldn't quite figure it out. I did squeak by with my master's project at the end, which, this goes back to my movie thing: my master's project was trying to use stereo cameras to improve lighting for green screen. And doing stereo reconstruction at the time was a pain in the ass. It was basically a block-matching system with some MAP estimation on top of it to try and constrain it a bit. But there was some fun stuff, where I was able to take the two stereo images, and because it was a green screen, I was able to take advantage of the alpha value you can get out of it, and use phase correlation as an initial seed for the MAP estimation when you're doing the actual stereo stuff. So there were a couple of cool things there. It wasn't particularly practical in the end, but it squeaked me by.

After that I headed out to LA. A buddy and I took three weeks to drive out from Rhode Island, 'cause I knew a lot of friends, and I couch surfed the entire way. In LA I didn't actually have a job. I crashed at a cousin's place for a little while and worked a lot of odd jobs, from a marketing assistantship to technical consulting. And then, I was using Craigslist at the time, and I answered a posting for a position at USC's Institute for Creative Technologies.
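[For reference, the phase correlation PJ used as a seed is the standard formulation: take the Fourier transforms F1, F2 of the two (alpha-masked) images, whiten the cross-power spectrum, and the peak of its inverse transform gives the dominant translation between them:]

    R(u,v) = \frac{F_1(u,v)\,\overline{F_2(u,v)}}{\bigl|F_1(u,v)\,\overline{F_2(u,v)}\bigr|},
    \qquad
    (\Delta x,\Delta y) = \arg\max_{(x,y)} \; \mathcal{F}^{-1}\{R\}(x,y)

[That shift is a cheap, fairly robust initial disparity estimate to hand to a block matcher and MAP refinement.]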
ICT is where Paul Debevec was at; he was doing LightStage stuff there. I was not working with Paul, though. I was working offsite, in this little warehouse in Marina del Rey, on a project called FlatWorld. And the concept was, 'cause they were working a lot with the military: send a UAV through to collect point cloud data, reconstruct a 3D scene from that, and then have all these screens you could put up so that people could train in it really quickly. So you project the scene that you're rendering onto these screens, and then you can move through it. They had OGRE as their open source game engine. This was at the time when there were a bazillion game engines floating about. Unreal was there at the time, but at that moment it wasn't great for doing third-person stuff, which is what we wanted, and OGRE wasn't either.

So I ended up replacing it. And I didn't really know much at the time, Rob; this is really nascent stuff for me. They had used OGRE 'cause it was free, and then they bought a license for Gamebryo. And I was like, all right, let me figure out how to put this thing in; I had to figure out how to do all of this stuff. So I ended up rewriting the whole system with Gamebryo clients, 'cause the OGRE one, I can't remember exactly what happened, but it failed. There was a day that came where I had gotten everything up and running, but we just hadn't done the switch yet from the OGRE one to the Gamebryo one, and on this day they were having people come by and see it, so we had to switch it. It was a little trial by fire, but it worked. And then I augmented it with a few other things. I did one of those things where, like everyone, I wrote my own ray tracing engine at some point, especially in college. And I made the questionable decision of writing my own scripting language for this thing, which was a very bad idea. Later on they replaced it with Lua, which was a much better idea.

Around about the same time, because we had seen a bunch of success, and this is back in '04, there was this virtual soldier project they wanted to do. The concept at the time was that there were significant cultural differences between, let's say, the US and others; at the time the US was knee-deep in the Middle East for Gulf War II, and the idea was that there could be certain things considered aggressive by American standards that were actually just part of the culture. So what they wanted to do was have soldiers train with these sort of culturally appropriate modules, so that you could get used to the cultural engagement, and do this at scale, to effectively prevent bad things from happening, like shooting people.

So what we did is, we had this speech recognizer that we hooked up to a primitive AI, and then we took a graphics client I wrote, with a virtual human that we produced, and we actually had a couple of different versions of this. For the proof of concept we used, for a little while, Microsoft's speech synthesis engine. What it would do is kick out phonemes, and we would map those to visemes, which we would then use to animate the face in real time. It didn't look great, but it was cool that we could do it on the fly. And then we projected this onto this piece of plastic that, if you lit it properly, looked like a hologram.
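[Sketch: the phoneme-to-viseme step PJ describes is essentially a lookup, since many phonemes share one mouth shape, and the timed viseme stream then drives the face rig. A tiny illustration with an invented subset of the mapping:]

    #include <map>
    #include <string>
    #include <utility>
    #include <vector>

    enum class Viseme { Rest, MBP, FV, Open, Round };

    // Invented subset: a real table covers every phoneme the synthesizer emits.
    Viseme viseme_for(const std::string& phoneme) {
        static const std::map<std::string, Viseme> table = {
            {"m", Viseme::MBP},   {"b", Viseme::MBP}, {"p", Viseme::MBP},
            {"f", Viseme::FV},    {"v", Viseme::FV},
            {"aa", Viseme::Open}, {"ay", Viseme::Open},
            {"ow", Viseme::Round}, {"uw", Viseme::Round},
        };
        auto it = table.find(phoneme);
        return it == table.end() ? Viseme::Rest : it->second;
    }

    struct TimedPhoneme { std::string phoneme; double seconds; };

    // The synthesizer kicks out timed phonemes; the face gets timed visemes.
    std::vector<std::pair<Viseme, double>> to_keyframes(const std::vector<TimedPhoneme>& in) {
        std::vector<std::pair<Viseme, double>> out;
        for (const auto& tp : in) out.emplace_back(viseme_for(tp.phoneme), tp.seconds);
        return out;
    }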
I did that for about a year and change, and then I tried to get into DreamWorks Animation, 'cause their compositing group had an opening. I didn't get the job, but it was just after I'd interviewed with Insomniac Games, and I had gotten an offer from Insomniac. DreamWorks said I didn't have production experience, and it's always one of those things: how do you get production experience if you don't work at a production company? But I asked them, would a game company count? They said yes. So I immediately said yes to Insomniac Games. Which kicked my ass. Oh my God, man. I'll be honest: I did pretty well in college, and even in my first couple of jobs I was high flying. I thought I was hot shit, and oh my God, I did not know how much I didn't know. It's embarrassing at this point in time to look back and see it. You all were doing such high-tech stuff, and I felt like my knowledge was just gonna get totally ripped down, which it was, which was great. Honestly, it was one of those things that was actually highly necessary.

I will always joke, though, that for Resistance, Gavin was the one who suggested us using Anark. And I said to him, I don't think this is a good idea. And then he overruled me, and then he left being the lead of the tools team. So, just, what the hell? So I was the one left dealing with this relationship at that point in time and making it work. There was so much shit I didn't know, and I feel embarrassed about it now. The memory management was the biggest problem; Anark rewrote it so we could replace the memory manager, and I was able to replace it in Lua, 'cause that was the scripting language they were using. And that's really where I encountered you the most.

I'll be honest, the concept behind it was decent, because the idea was: is there a way for designers to directly get their stuff in without having to have a programmer program it? Could a designer basically use Lua and the art assets and the Anark tool, export it, and have it just work inside the game? And I do wanna comment: because I was working basically both in the tools and on the rendering side, it put me in contact with you. And you intimidated the hell outta me. It was like, all right, I don't wanna bug Rob, but I gotta bug Rob. Like, they messed up longjmp here, I don't really know what to do. This was like one of the versions...

Rob:

I do remember fixing that.

PJ:

Yeah. And I was just like, look, Lua needs this. But I looked at the assembly, and setjmp and longjmp were identical, and I was like, I've hit the limit of my knowledge. And you gave me a great patch that worked well, so I appreciate that. And we were both tea drinkers. I've become more of a coffee drinker now, but you introduced me to PG Tips.

Rob:

I still drink PG Tips. I have...

PJ:

Still do.

Rob:

...in the morning. I've got one in my hand right now, 'cause it's...

PJ:

Nice. I remember the last day you headed out from the office, and you dropped off your stash of tea with me. I was always touched by that. So anyway, I was there for the end of Resistance, and then, comically, I was not gonna be put on Ratchet; I was gonna be put on Resistance 2. And I was also talking with Sean McKay at the time, 'cause he was looking for a new audio guy, and I was helping him put together interview questions for audio because of my signal processing background. So heading over to the audio side was gonna be a possibility. But instead, there was a tools opening on the rendering team, the lighting/rendering team, at DreamWorks. So I tried for them again, and I got it that time. So I ended up heading over to DreamWorks Animation, which kind of was that dream I had of working in the movies.

I spent the first year and a half as just a programmer. I was on the lighting tool for three weeks making improvements, and after that they were doing what was called the mini farm, which was basically a distributed, interactive renderer. They would take a few machines and split the work up so they could have better iteration cycles before doing the full-blown render. I was the second person put on that, and I was able to figure out a bunch of stuff. After that, they kept me doing different rendering things, like speeding things up. The way the DreamWorks renderer worked was two passes: one which built a primary visibility cache, and then the second, which was the lighting pass. And it turned out that the primary visibility cache was nearly enough for both eyes. So you could actually just render a few more polygons for both eyes and then light both at the same time. And if you didn't really care about camera-dependent stuff, which a lot of the time you didn't, depending on what layer you're talking about, you could actually just light it once, and you get some pretty impressive improvements in rendering times.

Anyway, I did that, and then I got tapped to become the tech lead of rewriting the lighting tool, because DreamWorks had hit a scalability issue and a lot of the artists were actually not doing lighting anymore. Like 80% of their job was file management, because the tools were built with the notion that they could hold the entire context of the scene in memory at the same time, and as you started to get foliage and fur and giant crowds, that just was not possible anymore. So the idea was to rewrite these tools from scratch and also reinvent the workflow. It's funny, 'cause when I went into DreamWorks I was like, man, this code is nowhere near as efficient as what Insomniac was doing, and I had, again, a bit of an arrogant chip on my shoulder by that point in time. But then, having to rewrite tools of this magnitude and scale, I started to get hit with a lot of humbling lessons. I got in my head, and then we just kept figuring out problem after problem.

One of the things that happened: I spent a year and a half studying how the lighters worked, figuring it out and talking with all of them, and I came up with a system that I thought was really gonna be awesome. And they all agreed to it; they thought it was great. We implemented it. It was terrible. They hated it.
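[Sketch: a quick cost model of the eye-sharing trick PJ described a moment ago, with invented unit costs purely to show the shape of the win: the visibility pass is nearly valid for both eyes, and camera-independent layers can be lit once and reused.]

    #include <cstdio>

    int main() {
        const double visibility = 1.0;  // assumed cost of the visibility pass
        const double lighting   = 4.0;  // assumed cost of the lighting pass

        const double per_eye = 2 * (visibility + lighting);  // everything twice
        const double shared  = 1.1 * visibility + lighting;  // ~10% extra polygons,
                                                             // light once, reuse
        std::printf("per-eye %.1f vs shared %.1f: %.0f%% saved\n",
                    per_eye, shared, 100 * (1 - shared / per_eye));
    }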
So then that was one of those existential moments of the project. Some folks on the artist side changed, some folks on the engineering side changed, and ultimately they took some of the concepts that I had and reworked them. One of the fun things here was that when I looked at the stuff they wanted to do, I realized there was a version that would be inefficient but could get up and running fast, which was effectively to go back to my old database days. I created an API that was totally clean, but underneath the hood I used SQLite as the engine, because so much of this stuff was really just about managing how lights were attached to other things. So under the hood I had an implementation that used SQLite basically just to get the job done (see the sketch below).

The reason I mention this is that it's right around the same time we hired Mark over, and the API had none of that SQL stuff inside of it. Mark came in and was able to rewrite the entire implementation to be vastly faster and more scalable. So I do wanna give credit where credit is due. I was able to get it up and running, but by the time he did it, it was the third iteration we'd had on this thing. Each time we got closer: hey, it's on its feet now; it's really functioning well; now it's super fast.

It's funny, because right around the time I was getting that project done, the projects available at DreamWorks were starting to dwindle. DreamWorks had been great, but it was around the same time that Google reached out to me. This was actually after I had met Jen, who by that point was my wife (that's a whole separate story) after we'd known each other for six months, and I had my daughter on the way. Google had reached out a few times over LinkedIn. I kind of ignored them, and then Jen was like, you probably should just respond. I was curious at the time about whether I could make it through the Google interview process, just to see: was it possible? So after that fourth time, I did the phone screen, I did the interviews, and then I ultimately got an offer, which I said no to three times, 'cause I still really loved working in movies and what Google was offering me wasn't that great.

Then we figured out there was one possibility of going over to the Niantic team, Mark Doberman's group, which would've been interesting. I'm not sure if you've ever met him; I figure you might have. But they were only doing internal hires, so I headed over to Photos. It happened right around the same time as my daughter's birth, so I did a reset.

But I'll be honest: I spent a lot of years at Google in that first stint, across three different teams, kind of feeling out of place, because of that big tech culture of you gotta do it perfect the first time and write long grad-school-like documents to actually do the designs. And the stuff didn't really connect with users, but we were doing it because it was complex. All of that ran pretty anathema to how I had come up in the industry, which is solving problems for people, doing it multiple times, being okay with being wrong, and pivoting where you needed to. So I struggled quite a lot trying to figure out my place: just, what the hell's going on here?
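And here's that SQLite sketch: a hypothetical, minimal version of the approach, where the caller-facing API (attach, detach, lights_on) exposes no SQL at all, so the backend can later be swapped for something faster without breaking any callers. The class, schema, and names are illustrative, not the actual DreamWorks tool.

import sqlite3

class LightAttachments:
    """Clean API over a throwaway SQLite backend: the table is an
    implementation detail the rest of the tool never sees."""

    def __init__(self):
        # In-memory database: inefficient at scale, but enough to get
        # the tool on its feet.
        self._db = sqlite3.connect(":memory:")
        self._db.execute(
            "CREATE TABLE attachment ("
            " light TEXT NOT NULL,"
            " target TEXT NOT NULL,"
            " PRIMARY KEY (light, target))")

    def attach(self, light, target):
        self._db.execute(
            "INSERT OR IGNORE INTO attachment VALUES (?, ?)", (light, target))

    def detach(self, light, target):
        self._db.execute(
            "DELETE FROM attachment WHERE light = ? AND target = ?",
            (light, target))

    def lights_on(self, target):
        rows = self._db.execute(
            "SELECT light FROM attachment WHERE target = ? ORDER BY light",
            (target,))
        return [r[0] for r in rows]

rig = LightAttachments()
rig.attach("key_light", "hero_character")
rig.attach("rim_light", "hero_character")
print(rig.lights_on("hero_character"))  # ['key_light', 'rim_light']

Because those three methods are the whole contract, a rewritten backend can replace the SQLite one wholesale, which is essentially what the much faster third iteration did.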
And it was like, okay, maybe it's me, 'cause every single time I was like, maybe I'm not learning the right stuff. What I eventually realized was that my perspective on the business was vastly different from big tech's perspective on the business. When I was on Wall Street or at Insomniac or at DreamWorks, I got what the business was. Especially at Insomniac and DreamWorks: we sold content, and all the technology we built was in service of that. And I realized that Google had a whole slew of cost centers, but they weren't actually connecting the dots to the business model underneath. So I tried to propose some projects to see: can we make Photos money? We tried to do it through ads. It worked; there were some interesting engagements. But ultimately the product area wasn't that interested in it at the time. This is back in 2014, after my twins were born.

So I tried a couple things, and they got shut down. LA was not working out for me and my family at that point. Then someone said, why don't you check out Boulder? And I was like, that's crazy, I could never leave LA. And then I started looking at Colorado, and it was like, all right, this seems interesting. So we came out to Colorado. I set foot here for the first time on October 21st, 2015, and by December 17th, 2015, we had closed on our house. So it was just like, bam. And I had remembered that you were out here, so I was like, oh, let me contact Rob. After I got here we got to chatting again, and I got to have a beer with you every once in a while.

So I went over to Payments at Google, which was, again, a globe-spanning thing for Google. I was like, we should make this into an actual product. They didn't wanna do that. So then I went to a different ads research group that was doing some image stuff, which was the last straw, the last gasp for me. I felt very much like a cog in the machine and not really doing much of anything.

Then someone I'd worked with at Payments became the CTO of Major League Baseball, which had created an office in Colorado. So I was like, all right, let me try that. I headed over there because I wanted very much to do AR, 'cause they've got all these great scans of all the stadiums and whatnot. But it also happened to be the same time the Supreme Court struck down the law against sports gambling. So all of a sudden I became effectively the manager of a new team, which was to create a gambling-lite app. I still wanted to try and keep the AR thing alive, which is when I tried to get you involved, but they struck that one down, which was sad. Again, it's all about which way the winds of the business are blowing at that point in time, and sometimes it's just what's trendy and hot. So I learned a lot more about gambling than I ever expected to. I went to Vegas a couple of times, which is not my place, but I had to go there to actually understand the legal side of this.

And then around the same time, my wife was pregnant with my fourth child, some of my friends at Google said they were getting serious about the business side of things, and I let myself get fooled. I headed back over to Google again, this time as an engineering manager, where I had an absurd number of reports right before COVID hit.
So I learned a lot about how to manage a team remotely at that point and get projects done. I tried again to pivot more into the business side, but Google really wasn't having that even then, which is super bizarre, especially given where they're at now. Then I joined a risk team for a little while, and finally I was like, let me just see if I can get back into media. So I headed over to Twitch, until yesterday, when, after having reformulated a team last year, getting us all together, deprecating a product, doing lots of stuff, Twitch decided to have layoffs.

I think one of the big takeaways as I was going through all of this: I really started from this strong place of wanting to do content, graphics, computer animation, games, and having it be something impactful to people. And there's a lot of it where I lost my way, just by increments. A lot of what I'm excited about now is that I think there's actually a tremendous analogy between that and what happened with Big Tech, which went after a lot of really awesome stuff, legitimately, and then kind of got lost in the morass along the way. So what I'm excited about now is figuring out better pathways, not just to getting back to the content side, the visual side, for me, but to lessons more generally that can help out the industry, where the stuff we're doing here can actually have a positive impact so that people aren't as lost, so people have a vision and a goal that they're really going after. That is one of the meta things I'm excited about for all the stuff we're doing.