What's New In Data

Transforming Data Productivity with Automated Workflows and AI with Chris White

February 09, 2024 Striim

Embark on an intellectual odyssey with Chris White, CTO at Prefect, as he recounts his metamorphosis from a mathematician steeped in the abstract world of optimization theory to a data science maestro and software development virtuoso. Discover how Chris's mathematical prowess translates into creative and practical solutions for data workflows, and how this synergy is revolutionizing the tech landscape. Our conversation peels back the layers of data science, revealing the indispensable role of automated workflows in bolstering productivity and bringing clarity to the complexities that data teams navigate daily.

Feel the pulse of innovation as we delve into the essence of Prefect, the workflow management system that addresses the shortcomings of existing tools and aligns with the intuitions of Python developers. Chris lays bare the intricacies of integrating event-driven constructs that enhance processing efficiency, and how Prefect tailors to the needs of both budding engineers and seasoned code wizards. Looking ahead, we glimpse the potential of managed compute offerings and AI integrations that promise to transform mundane workflows into robust workflow applications, setting a new standard for configuration management and collaborative coding.

What's New In Data is a data thought leadership series hosted by John Kutay, who leads data and products at Striim. What's New In Data hosts industry practitioners to discuss the latest trends, common real-world data patterns, and analytics success stories.


Hi, everyone. Thank you for tuning in to today's episode of What's New in Data. Super excited about our guest today. We have Chris White, the CTO at Prefect. Chris, how are you doing today? I'm doing great, John. How are you doing? Excellent. Excellent. Super excited for our discussion. You know, first, Chris, tell the listeners a bit about yourself. Yeah. So, I guess my journey into tech began probably in grad school. So I got a PhD in math at UT Austin and started in kind of a very pure realm of math, then slowly made my way into optimization theory, still pure math, but definitely a more applied field. And so during that time, I started to pick up, you know, some software practices, started to build some stuff myself, test various algorithms, et cetera, et cetera. And by the time that I was wrapping up, I kind of had this realization that I'm much more of a problem solver than a research visionary. And so I didn't really want to set myself up for having to kind of define and defend and get grant money, et cetera, for, you know, a research program. I wanted just a fire hose of problems to be coming at me. And what better place to do that than in industry. So I kind of tested the waters with some consulting and then jumped right in as a data scientist. And then, I guess to go from there, it was like another phase transition, actually. So I started where I was, in fact, building predictive models, like that was my job. So the outputs were, you know, coefficients and these sorts of things. But I was at Capital One, and it was actually at the time when they were doing their big migration from on-premise managed servers to AWS. And so I just got really involved with a lot of the various tooling changes happening and ended up writing lots of different software packages for helping a lot of the people make that transition into, you know, both new infrastructure, but also in a lot of cases, new languages.
So a lot of the business analysts, especially, were coming from SAS, you know, the statistics software, not software as a service. So they had to, yeah, make an adjustment to get to Python, which is what everyone was moving to. And so then by the end of my time there, I was on a team that was responsible actually for a whole platform. So we were building and maintaining infrastructure, building kind of notebook-like interfaces for business analysts, et cetera. And then I met Jeremiah, who's the real founder of Prefect, and we kicked things off in DC five and a half years ago. Well, we'll get into Prefect's story, it's a really awesome one, but first I want to ask you, like you mentioned, you were in grad school and you have a math background. You were pursuing a PhD in mathematics at the University of Texas at Austin. So how does math relate to data science and the software industry in general? I think in a lot of different ways. So my field, right, optimization theory, has a very direct relationship, in the sense that pretty much any statistical, data science, machine learning, AI model is ultimately seeking to optimize some criterion over some known landscape of possibilities. And so optimization theory is just a huge driving force there. Like, how can we do this quickly? How do we know when we're at a good place, whatever that might mean based on the problem, et cetera? So that's a very direct relationship, but I do think there's a more kind of loose relationship. A lot of people, I think, think of math as being a place where there's one right answer, and there's a sense in which that's true, because it is based, right, on very hard logic. But in actually doing math research, there's a massive amount of creativity involved. I'm going to throw out this example, you know, and try to keep it really short. Basically, everyone knows the prime number theorem, or sorry, the prime factorization theorem:
every integer can be decomposed as a unique product of primes up to powers. And there's always this question, is one a prime number, right? There's a lot of ways you can approach that question, but one way is just through kind of a semantic argument, which is: if one was a prime number, then that theorem would be really annoying to state. And lots of other theorems would be really annoying to state, because you'd say, for all prime numbers except one. And so you have a choice. You say, well, I'm actually just going to exclude it, because everything gets cleaner and more beautiful when I do so. And in a sense, all of math is consistent if you had chosen it to be prime. And so, anyways, that kind of creativity of naming things, picking and choosing the scope of a statement, et cetera, I think carries over in a lot of ways to both data science and engineering more broadly. I love that. And like you said, a lot of people associate math with just being, you know, hard truths, or proofs, or, you know, arithmetic says one plus one is two and that's it. And the truth is that there's a lot of room, like you said, for creativity, for abstraction, for dealing with the unknown, the things that you can never possibly know, and also paradoxes, right? I mean, I think Gödel did some amazing work there with the incompleteness theorems. But at the same time, like you said, lots of room for creativity, and I think that's one of the unique things you're bringing to the software industry with the work you're doing here at Prefect. But before we get into Prefect, talk a little bit about workflows and why workflows matter to data teams. Yeah, so with workflows, you know, you can really start at a high level, and basically any repetitive thing that you do in a work context, you could argue, is a workflow. That's maybe not a helpful definition, per our previous topic. So, you know, what kind of scope do I care about?
I care about repeatable processes that are well suited to being codified in code. And so, you know, moving data into a spreadsheet: maybe every morning you wake up, you make a query to a database, you download it as a CSV, you do a little massaging of the CSV, you upload it back as maybe a Google Sheet. That probably could be automated with code, and that piece of code is the workflow. And now, right, how you define it, et cetera, that becomes kind of your tool choice, your language choice. But I do think workflows have this kind of unique characteristic that they always are right on that boundary of human manual work. And so it's very rare that you would have a workflow that is important and interesting to you that you wouldn't still want to interface with or check in on regularly. It's not like a true background job that you never really, you know, look at; it's something that has a lot of, I guess, semantic meaning to you. Yes, and would you say workflows are purely a technical construct? Or do they bubble up higher to processes that you could explain to someone who's on the business side? Yeah, I definitely think that they do bubble up to that level. I think the reason that, oftentimes, the non-technical people who think about these things don't act on them is they don't have a good way to model the benefits that they would get from codifying it in code. But I definitely think that there are numerous things, and I guess that kind of falls into a broader category of business process automation, which, you know, just sounds a little bit larger than workflow automation. Absolutely. You mentioned one example with CSVs. Would you mind giving the listeners other examples of cool workflows that you've seen? Yeah. They range from kind of, you know, the most standard ones that you can imagine. So data engineering: the caricature of this would be data in a production database that you need to put in a data warehouse.
The schemas aren't a hundred percent compatible, and you want to do things like track deletes, et cetera. So you have some kind of transformation layer: you make a query, you change the data and update it in some way, and then you put it in a data warehouse. You could do that manually, once a day, once a week, whatever the case may be. But no data engineer in their right mind does it manually, right? You automate it with some sort of tool. So that's one. Data science, I think, has a lot of examples of this as well. An easy example, just to kind of build off the one that I just said, would be: every time a certain table is updated, maybe you want to rebuild some predictive model that is important to the business or to your product. Once again, you could do a human handoff there, or you can automate it through an API trigger. The table's updated, you fire off some set of parameters that describe, you know, what the surface area of the update was, and you rebuild the data science model in some, you know, potentially expensive infrastructure somewhere. And then in a lot of other places, I think workflows emerge once you really start to get in the mindset of thinking about automating things. Probably every engineer listening to this, I'm sure, has had this experience where you start to see automation possibilities everywhere you look. And most of the time, those, I think, fall into the kind of workflow automation category. So, you know, I guess all the examples I was about to say that I've done for hobby projects actually fall into the data category, like downloading a bunch of buoy data from NOAA and doing some, like, surf predictions, things like that. Are you a surfer? I am a surfer, yeah. Not a good one, but I do enjoy it. Yes, you mentioned you're out in Half Moon Bay, California, so I think you're being modest. If you can surf there, you're probably in the upper percentile of surfers in the world. That's very fun.
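The manual routine Chris describes earlier, querying a database, massaging a CSV, and re-uploading it, is exactly the kind of thing that collapses into a small workflow script. A minimal sketch using only the Python standard library; the table name, columns, and file paths are hypothetical stand-ins, not anything from Prefect:

```python
import csv
import sqlite3

def daily_report(db_path: str = "app.db", out_path: str = "report.csv") -> int:
    """Query a database, lightly massage the rows, and write a CSV.

    A stand-in for the manual morning routine described above; the
    'orders' table and its columns are illustrative only.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT id, amount FROM orders WHERE amount > ?", (0,)
        ).fetchall()
    finally:
        conn.close()

    # Light "massaging": round amounts and sort by id.
    cleaned = sorted((rid, round(amt, 2)) for rid, amt in rows)

    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "amount"])
        writer.writerows(cleaned)
    return len(cleaned)
```

Once the routine lives in a function like this, scheduling it or triggering it from an event is a tooling choice layered on top, which is where an orchestrator comes in.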
I love hearing people relate, like you mentioned, getting surf data off a buoy, and using data in your day-to-day tasks or things in your hobbies, things that, you know, you're passionate about. And ultimately data can be used to enrich your lives, and it also can be used to enrich the way you work with other people. So how can data engineering teams with workflows, you know, think about this as a way to bridge with other teams? Yeah, that's actually a really important question to me personally, from kind of my seat at Prefect. In my experience, when data is really important to a business, no automated data process is quietly running in the background. Everyone wants access and visibility to it. And so being able to provide that kind of insight to all of the various stakeholders of whatever it is that you're doing, without, you know, having a weekly meeting, for example, is really important. And so having a dashboard that aggregates a bunch of the stuff that you're doing, you know, and, I don't know, the lateness of everything, maybe how much things are costing, how often you're having to do updates, et cetera, can be really powerful. And kind of speaking of how data can enrich your lives, metadata can do the same, right? You can start to say, okay, I've got all of this information about the processes that I have automated, and I can see that there's a big bottleneck, and it happens to be, I don't know, maybe with this tool or something. So maybe the tool is not doing its job well, or, you know, maybe you need to scale it up somehow, et cetera. But sometimes those things can be hard to identify, you know, without actual charts and graphs. Yeah. Yeah, definitely. And you know, like you mentioned working through APIs, I always like to say that even humans have APIs. It sounds like a terrible robotic way of talking about people.
But anyways, even like understanding people's working styles and their personalities, you know, it's almost like you can't use the same way of communication with every single person, you have to know them as individuals, and the same applies to data and collaborating across teams. It's like, what are the best abstractions? You know, we talk about data contracts in the industry, we talk about API specs, and there are right abstractions for different types of teams and companies to work with each other. And it's just understanding that handshake. And like you said, workflows can be a big part of that. Yeah, absolutely. Just knowing when something was updated, not to mention, right, all of the other things you might want to know: the data definition, the ultimate source for the data, et cetera. But seeing the runtime of the business that keeps that data moving, I think, is where workflows come in, right? Data that isn't accessed or updated in some way is probably just dead data. Absolutely. Absolutely. We don't want that. We don't want that. Yeah. And speaking of workflows and collaboration, so you're the CTO of a company called Prefect. Before we get into the technology, tell me the story of how Prefect started and how you got involved. Yeah, definitely. So Jeremiah Lowin, who's the CEO, definitely was the originator of the idea, and he was actually on the PMC of Airflow, which is a tool I'm sure most people have heard of, right, an Apache project, basically the original workflow orchestration system. Well, I don't know, maybe that's not fair to some others, Oozie, et cetera, but, you know, the one that really kind of made a big splash on the scene with Hadoop.
And so he was trying to incorporate the state-based model that Airflow has. So, right, this idea that a thing you're doing transitions through states, scheduled, running, failed; scheduled, running, success; whatever the case may be, is a really useful way to organize, think about, and gain observability into the things you're doing, especially if you have a lot of them. And he wanted to incorporate that into data science style workflows. The problem with that is Airflow is very monolithic and doesn't scale well, especially for that sort of activity, where maybe you have a data science model build that, I don't know, processes a thousand features and you want visibility into each of those feature processing jobs. Airflow is not going to be the tool for you. It's just not going to be able to keep up with that amount of information to track. And so he started to build his own thing just for his own purposes, realized the potential of it, and through a mutual friend found me. I was involved in the Dask community, Dask being an open source Python framework for distributed computing, and I had written some blog posts, et cetera. And I was in D.C. at the time, which is where Jeremiah is. So we started hanging out, and, you know, one day it was like, all right, let's take this to the next level. Let's not make it just be a little private repo that we're hacking on, and actually get some funding and start a real company. And so maybe six months later is when we first open sourced the, you know, original version of Prefect. And yeah, there's a lot of different evolutions that we've gone through, but the original purpose was twofold. So one, to bring those state-based semantics into data science workflows, and make sure that our SDK was culturally aligned with what a data scientist expects. So, right, we immediately were heavily decorator-based, a very Pythonic interface, I like to believe, first class Dask integrations, this sort of thing.
And then, oh, and dynamism was another huge part of this: just the ability to not pre-register every single bit of compute that you're going to want tracked, to allow at-runtime dynamic spawning of things. We've leaned into that tenfold since the first version of Prefect. But also this idea that data inherently always has stakeholders. Kind of a fuzzy thing, but this idea that we wanted to make sure to always display rich information back, that you would be proud to share with someone who depends on your work in some way. And yeah, there's a lot of other stuff in there too, about kind of the industries that we first looked at, et cetera, and some architecture things that we did, but we can save that for another time. Yeah, I mean, it's a great story, and it really sounds like it's, you know, not just orchestration in the traditional sense, but really doing so much more in terms of building scalability into the process, state management, being aware of, like you said, the things that the data creators will be proud to share with their business stakeholders. So it really does elevate the value quite a bit. And one thing I want to dive into, you mentioned event driven. So, you know, explain that and the value of it in the way Prefect delivers event-driven orchestration. Yeah, so event driven is an interesting category, right? As you definitely know, it can mean a lot of different things to a lot of different people. What it means to us is, first and foremost, events are tools for observability. We operate at, you know, the business logic layer, if you will. And so the event streams that you'll find when you first set up Prefect are creation of resources, when resources are updated, these sorts of things, not necessarily the sort of event data that you would get in, say, Datadog, right? Like really deeply sampled CPU-level time series or something like that.
And so, okay, observability: just like, a pod maybe in Kubernetes was rescheduled to another node, so that messed up your workflow. That's an event, and that helps you recover from failure quicker when you know what happened. But more interestingly, and kind of, I think, more to the spirit of your question, is using those events to then automate more things. So those things could be triggering workflows, right? A really common example here is: drop a file in an S3 bucket. Every time that file drops, you want to run a workflow with a parameter that references that file, and it does something, processes the file, and, you know, most likely produces some additional side effects somewhere else. So that's one thing. So trying to kind of keep up with whatever scale people tend to do those things at, which can be very extreme, is a fun scaling challenge. But then also, I think events allow you to automate at kind of a different level, which is like predictive failure. For example, both the presence of expected events, but also their absence, can be really powerful signals. So, for example, a batch workflow: you're basically saying, I expect, you know, my data warehouse to be updated once a day, because I have a job scheduled maybe at 8 a.m. Maybe I'm not running the job in an event-driven way, I'm just doing scheduled. But in doing so, right, there's this expectation on the other end that the data warehouse is in fact being updated. And so you could add kind of an insurance layer to that pipeline, which is to say: every time a table is updated, produce an event. And if a 24-hour period goes by where no such event is produced, send me a text, or run a different workflow that produces some debug report, or, you know, whatever the case may be. So automating through event triggers is another big component of event-driven work for us. Yes.
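That "insurance layer", alerting when an expected event does not arrive within a window, boils down to tracking when each event was last seen and flagging the stale ones. A small illustrative sketch of the idea in plain Python; the event names, window, and alert callback are hypothetical and this is not Prefect's actual automations API:

```python
from datetime import datetime, timedelta
from typing import Callable, Dict, List

class AbsenceWatchdog:
    """Track 'last seen' times for expected events and alert on stale ones.

    Toy sketch of an absence-based trigger: it only fires for events
    we have seen at least once and that have since gone quiet.
    """

    def __init__(self, window: timedelta, alert: Callable[[str], None]):
        self.window = window
        self.alert = alert
        self.last_seen: Dict[str, datetime] = {}

    def record(self, event: str, at: datetime) -> None:
        # Called whenever the expected event is observed, e.g. "table.updated".
        self.last_seen[event] = at

    def check(self, now: datetime) -> List[str]:
        # Alert on every event whose last occurrence falls outside the window.
        stale = [
            name for name, seen in self.last_seen.items()
            if now - seen > self.window
        ]
        for name in stale:
            self.alert(f"no '{name}' event in the last {self.window}")
        return stale
```

Recording a "table.updated" event at 8 a.m. and then running `check` more than 24 hours later would fire the alert, which is exactly the "send me a text" scenario described above.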
So, you know, one of the things that's valuable there with event driven is, rather than going purely time-based and running a bunch of compute, you're running the compute when a specific event is observed. And that does map pretty well to scenarios where, you know, let's say you're an airline and an aircraft just landed and there has to be this maintenance job around it. Or you're in some other domain like retail, where the event can be a purchase or a return request, a refund request, things along those lines. And that's what's triggering the downstream data and workflow actions, rather than saying, hey, every 30 minutes, let's see what the current state of the world is and run a bunch of compute to decide what to do, even if nothing happened, right? Which is, you know, as data teams are becoming more sophisticated, something that they want to avoid, just to make sure that their systems are optimal and more cost optimized. So it is a powerful construct. And yeah, as you mentioned, you know, I'm familiar with that space, but it's great to see Prefect really think about these enhanced workflow capabilities. Yeah. And, you know, because traditionally with things like Airflow, you're kind of boxed into this batch world, right? Everything is being done with batches. Exactly. And I think that's always kind of the first smell, if you will, for needing event driven: when you set up something that is polling something else to check whether work needs to happen. Really, that probably will be more efficient if you restructure it into an event-driven way. And, yeah, I see that pattern more and more. Yeah, and I'd love to hear your story in terms of, you know, where you're seeing that pattern bring value to data teams. Yeah, I mean, efficiency is just a big one, right?
So, once you have an event-driven pipeline, you can really scale it out in a way that is harder to do when you have maybe the poll-based or the batch-based type of workflow. You now have the option to, for example, process each individual, most granular thing that you're interested in with its entire workflow. Which is nice, because sometimes the types of failures that you get tell you that there's something wrong somewhere in a batch, and now you have to do more work to figure out where the problem was. So being able to tie it right to a concrete granular ID is just one huge benefit. But also, and this is kind of something we mentioned before, just the handshake between teams can be really powerful with event stuff. So, for whatever reason, the pipeline took twice as long today, but it's still totally fine. But if you were trying to, like, time it with schedules, to say, okay, at 5 p.m. I'm going to run this new thing because the 8 a.m. job should be finished by then, oftentimes you're going to just get these weird mismatches and problems. So wouldn't it be better, and more efficient, and probably get you your final result faster, if it was just strung along through an event system? And would you say that it makes the systems more resilient? I would say that, yeah, for a lot of reasons, but the one that I guess I see the most here is that granular level. So once you have your workflow or process designed in such a way that it can be parametrized at this level, then it's way, way easier to either ignore a failure, if it really was like, you know what, that one wasn't important, or reprocess it without having to reprocess a bunch of stuff. So the job itself might actually be a lot faster because you're not processing a batch, for example. Absolutely. And, you know, one of the things that I've seen is the event-driven construct also makes
replayability more powerful. You know, scenarios where, because every team's data processes end up becoming this super long DAG at some point, a directed acyclic graph, like the flow of components that are processing data. And if anything goes wrong, rather than, you know, just dropping everything, re-instantiating the tables from scratch, and then running your pipeline, you can just say, nope, that one event failed at this very specific component, we're just going to continue from there and retry it. You know, that's another thing that saves a lot of time and saves a lot of cost. So it's great to see some thinking along those lines to make it, like you said, more resilient, and also to make the retries more powerful. So, another thing that I like about Prefect, when I looked at it, is the way it empowers Python developers. Can you speak a bit about, you know, how a Python developer can use Prefect? Yeah, I used to joke a lot that Prefect is a Python ops framework. And so a huge design goal of ours from the beginning, and I think we've kind of perfected it in the last year, is to allow you to write Python code that is familiar to you, you know, without contorting your code into some abstractions that might not make sense to you, or just might not mirror the use case that you have. And so because of that, there's two things that have happened that I think are really exciting. So the first thing is we've been able to cater to more junior engineers in a lot of places. So we try to have, you know, highly sensible defaults that are also deeply configurable, so more advanced people can benefit from us in whatever way makes sense. But a lot of people that are using Prefect, you can tell, based on the questions they're asking, that they're definitely more junior to Python.
But, you know, what they'll say is it gives them superpowers and allows them to do things that they wouldn't otherwise be able to do. So just for example, you can take a Python script, maybe a script that you considered basic, that wasn't doing a lot. Drop one decorator on, let's say, the main function, you know, that you would use to run the script, and with only that decorator, the flow decorator, Prefect's flow decorator, that main function now has its own typed API. It has observability. You still can run it ad hoc. You do not need to schedule it with Prefect. You can even use a different scheduler if you want. Most people eventually, of course, bring it over just because the observability hooks become nice, but you don't have to. And with only that one code change, right, you have a front end for your script, and that, especially when you're kind of new to setting things up, can feel really exciting, frankly. And now you can hand it over to the next person and say, hey, hit this button and my script will run, you know, in some production environment. Isn't that cool? Absolutely. Absolutely. It is very cool. And, you know, Python is just such a fast language to work with; the elephant in the room is, you know, how do you scale it? How do you productionize it? And, you know, I think that's like your joke: Prefect does help you scale out and productionize the things you're doing in Python, so that's great. Definitely a lot of value there. And on that same point, I guess the second thing that we've really picked up on as a trend over the years, and this isn't new for us, is: because Python is such a popular language, if you are doing anything, not necessarily related to data, you could benefit from Prefect. So, right, take a script, turn it into what I like to call a workflow application: it has a front end, it has state that's configurable within Prefect, et cetera.
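The "drop one decorator" pattern Chris describes can be sketched conceptually in plain Python: a decorator that wraps an unmodified function with state tracking and a run history. This is a toy illustration of the state-based idea only, not Prefect's actual `flow` implementation; the in-memory `RUNS` list stands in for a real observability backend:

```python
import functools
from typing import Any, Callable, Dict, List

RUNS: List[Dict[str, str]] = []  # in-memory stand-in for an observability backend

def flow(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Toy stand-in for a flow decorator: record state transitions per run."""
    @functools.wraps(fn)
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        record = {"name": fn.__name__, "state": "RUNNING"}
        RUNS.append(record)
        try:
            result = fn(*args, **kwargs)
        except Exception:
            record["state"] = "FAILED"
            raise
        record["state"] = "COMPLETED"
        return result
    return wrapper

@flow
def main(n: int) -> int:
    # The script's existing logic, unchanged by the decorator.
    return n * 2
```

Calling `main(21)` still returns the script's normal result, but `RUNS` now records that the run completed, which is the essence of getting observability from a one-line code change.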
And expose that to other stakeholders. And so we see people increasingly away from data adopting Prefect as well. I mean, they're usually data adjacent, to be fair. But, you know, that's Python slowly making its way out. But that's also been really cool and exciting, to see and learn about some other, you know, industry trends and use cases that are not in my standard wheelhouse. Well, like you said, data adjacent. I think the modern professional has to be data adjacent now and be able to speak the language and know how to use data in their decision making and actions. So it's just such a vital skill now, and this is why the work we're doing in the data industry is so critical. I really do think we're scraping the surface right now. So what is Prefect doing next? Oh, that's a great question. There's a million things I could list. I always try to have a big list of things. I would say in the most immediate term, some things that, you know, an interested listener can expect from us: one thing that's interesting about Prefect is we don't offer managed compute, which might surprise some people. We can talk about the reasons for that separately, but we will be offering managed compute in the very near future. So that's actually one of the things. I'm currently at an offsite where we are building and pressure testing our managed execution layer. So that's really cool. Maybe not an explosive feature, but definitely it will be useful to some people. Because Prefect is this, you know, semi-generic Python framework, we've always been interested in AI and AI-type workflow applications. So we have a popular framework called Marvin, named, like Prefect, after The Hitchhiker's Guide to the Galaxy, and Marvin, with its 2.0 release, will actually be backed by Prefect primitives. So you'll kind of have a nice dashboard, observability, managing your prompts.
Things like that, you know, tracking the cost of your OpenAI tokens. So yeah, more AI integrations is another thing. And honestly, leaning into this framing for Prefect: elevate your workflows to workflow applications, and all of the things that you might think of that go along with that. So a little bit better configuration management for these things. I mean, we have a lot of that right now, but I think there's a lot of ways that we can add to it. Also, you know, thinking of an application framework, really leaning into observability. So we're going to have first class time series metrics that you can publish. So you can track basically any float that you publish as some piece of state from a workflow or a task, and you will be able to, after the fact, aggregate those things and, you know, perform interesting kinds of filters and charting over them. So you're just publishing these things without knowing how you're going to use them in the future, but then, you know, once you've got enough data, exploring that data is another thing that I'm really, really excited about. Absolutely. Well, Chris White, CTO at Prefect, so much exciting stuff coming up there. It was really great connecting with you as a person. I can see there's very innovative stuff going on, and there's no better person to lead that than you, clearly, bringing your depth in mathematics and also creative problem solving, and making it accessible and productized for those who want that elevated data workflow orchestration. Chris, it was so great connecting with you here today. Where can people follow along with your work? Yeah, thanks, John. LinkedIn; I'm still on Twitter, my handle is Markov gains with a Z, I know, I know, I made it a long time ago. GitHub, you know, Prefect has a number of open source repos, so definitely go check them out, go get involved. We also have an incredibly active Slack community.
You can find links also on our GitHub repos. So come hang out there. We've got a lot of channels that aren't, you know, strictly talking about Prefect, just data and data-adjacent things. And yeah, at any conferences that you see me post about, I try to be out in the world, you know, as much as is reasonable. Great. Well, we'll have the links down below to your LinkedIn and your Twitter, now known as X. I don't know anyone that calls it that, except for Elon, but, you know, we'll leave it there. Chris, it's been great connecting with you here. Thank you to the listeners for tuning in today. Yeah. Thanks again, John. This was really fun.

Chapter Markers
Math and Workflows in Data Science
Data Science and Workflow Automation Collaboration
Prefect