Picture Me Coding
Picture Me Coding is a music podcast about software. Each week your hosts Erik Aker and Mike Mull take on topics in the software world and they are sometimes joined by guests from other fields who arrive with their own burning questions about technology.
Produced by Neiko Will.
Logo and artwork by Jon Whitmire - https://www.whitmirejon.com/
Reach out to us here: podcast@picturemecoding.com
Picture Me Coding
The XZ Apocalypse
A week ago, a developer in San Francisco named Andres Freund found a backdoor targeting SSH, hidden in the latest versions of a compression library called liblzma, which would have granted some shadowy figure access to Linux machines running it. Even more incredibly, various semi-anonymous figures had been clamoring for inclusion of this compromised version of liblzma in the latest versions of various Linux distros.
This entire scheme had been underway for over three years before it fell apart under Freund's scrutiny and attention from the broader software industry.
This week Mike gives us a breakdown of the exploit and we talk about the fallout from this backdoor which took advantage of an overworked and vulnerable open-source maintainer.
As Mike puts it, the story is "bonkers".
To read more about it, check out these articles:
- The Verge: “How one volunteer stopped a backdoor from exposing Linux systems worldwide”
- Wired: “The Mystery of ‘Jia Tan,’ the XZ Backdoor Mastermind”
- TuxCare: “A Deep Dive on the xz Compromise”
- Timeline from Boehs.org: https://boehs.org/node/everything-i-know-about-the-xz-backdoor
[MUSIC] >> Hello, welcome to Picture Me Coding with Erik Aker and Mike Mull.
Hello, Mike.
>> Hey there.
>> Mike, we didn't chat last week.
We were both sick.
That's the story.
>> I'm sticking to it.
>> We called in sick to the podcast.
Have you been listening to any music for the last two weeks?
>> I have, yes.
March was a good month for music, and one of my favorite artists, Waxahatchee, came out with a new album at the end of March.
Waxahatchee is one of those tricky bands because I think it's really just one person, Katie Crutchfield, but in any case, I've been following Waxahatchee's music for over 10 years now.
>> Wow.
I didn't know that.
I loved St. Cloud. You pointed me to St. Cloud and I love that album so much.
It was like a soul healing syrup I needed.
>> Yeah, especially during the pandemic.
But the first album I've heard of hers was an album called Cerulean Salt, which came out in 2013 and was a big hit on a bunch of indie rock best of that year lists.
I really like that album and everything she's done since then has been pretty interesting.
Then, as you say, with St. Cloud back in 2020, she took a little bit of a deviation into more, I guess what you'd call Americana and almost country-ish music.
I loved that album and it was a godsend in the middle of the pandemic.
I really like this new one too.
It's very similar in mood, maybe a little bit more rock and roll.
I really enjoy it.
There's probably three or four songs on here, which are instant all-time classics for my taste.
>> I like it a lot too, actually.
>> I've probably listened to it 50 times already.
>> I'm probably a little bit behind you, but I think I'm going to be up there too.
I've been listening to a band called Daevar; the album is called Amber Eyes.
I've just started to get into it.
It's pretty heavy stoner doom type of stuff, but it was described as stoner doom plus grunge.
I thought, well, that's a funny term you don't hear much.
I don't have a lot to tell you about on the album.
I like it, but when I started listening to it, I noticed this is on Spotify.
They said to me, "Hey, people who like this album like these other albums."
One of the other albums was by a band called Earthbong.
I haven't heard Earthbong, but I was immediately like, "Oh, maybe I'm in the right place."
>> Yeah, that does sound like, I mean, you had to get there from Sleep, right?
>> Yeah, exactly, the stoner doom connection.
>> I have not listened to this, but I will check it out.
I have not heard a doom album, a doom metal album this year that I am in love with, so still searching.
>> I think I have another one for you next week, but we'll see.
This week, we wanted to try to discuss what has been huge news in our industry, and even outside of our tightly knit community of developers, software developers, software engineers.
This is the XZ-utils hack or XZ-utils backdoor.
Now, anybody listening to us has probably already found out that this thing exists, but it is a fascinating story, and you went a little deeper into the exploit, so I wanted to try to see if we could shed some light on what that exploit looks like.
But first, I wanted to talk about how this thing was discovered.
It was just about a week ago, it would have been March 29th; people are saying this was Easter weekend, I guess.
There's a developer in San Francisco who works for Microsoft, his name is Andres Freund, and he posted on Mastodon.
"I was doing some micro-benchmarking at the time, needed to quiesce the system to reduce noise, saw SSHD processes were using a surprising amount of CPU, despite immediately failing because of wrong usernames, etc."
"Profiled SSHD, showing lots of CPU time in liblzma, with perf unable to attribute it to a symbol, I got suspicious."
"Recalled that I had seen an odd Valgrind complaint in automated testing of Postgres a few weeks earlier, after package updates," and he ends with, "it really required a lot of coincidences."
So just to give a few details about what he's describing, he's saying, tried to log in remotely to a machine using SSH.
SSHD is the SSH daemon, which allows access over SSH.
And he's saying, it took a long time.
Now, from what I understand, this developer is not someone who regularly contributes to SSH; this is someone who contributes to PostgreSQL.
But he was suspicious that it took so long to log in over SSH.
Yeah, and it's interesting because, well, two reasons.
One is it gives you a sense of what "slow" means in the computer world, because the difference was something like 500 milliseconds, which, you know, is probably inconsequential to most people but is a huge deal in computer benchmarking terms.
The other reason why this is interesting to me is it bears a lot of resemblance to the famous Cuckoo's Egg story of the late '80s, where Cliff Stoll sort of stumbled across this minor accounting error and it turned into this huge global security hack thing that he wrote a book about.
A little plug there.
If you've never read the book Cuckoo's Egg, I would recommend it.
It's a great book.
I loved reading it.
Do you remember the thing he stumbled on?
It was time stealing, right?
But what was the actual issue that Cliff Stoll stumbled on that he wrote about in Cuckoo's Egg?
Well, I don't recall the details. I just remember that he was looking at some accounting information, you know, people being charged for computer time, and he found something like a 75-cent discrepancy.
And instead of doing what most people would do, shrugging their shoulders and saying, "Computers, am I right?"
He kind of went down this rabbit hole to figure out what the heck happened and it turned into this massive thing, which, you know, again, here it was sort of good fortune that this guy found this seemingly unrelated thing and it took him down this path.
It's incredibly lucky.
I can't imagine how many times I've ever used SSH to log in.
Have I ever thought, "Wow, this is taking a long time."
And if it did take a long time, I would probably just cancel it and think it's something wrong with my system or something wrong with the network, right?
I would just probably assume it's a network problem.
It's amazing that this is the thread that this person started unraveling.
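To give a sense of the measurement involved: the difference Freund noticed was on the order of 500 milliseconds, tiny in human terms but enormous in benchmarking terms. Here is a minimal sketch, assuming a hypothetical test host and the standard OpenSSH client on the PATH, of how you might time repeated login attempts and spot a regression like that.

```python
# Minimal sketch: time repeated SSH connection attempts to spot a latency regression.
# The host is hypothetical; assumes the standard OpenSSH client is installed.
import statistics
import subprocess
import time

HOST = "test-machine.example.com"  # hypothetical target

samples = []
for _ in range(10):
    start = time.monotonic()
    # BatchMode avoids interactive prompts; ConnectTimeout keeps failures quick.
    subprocess.run(
        ["ssh", "-o", "BatchMode=yes", "-o", "ConnectTimeout=5",
         f"nosuchuser@{HOST}", "true"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    samples.append(time.monotonic() - start)

print(f"median login attempt: {statistics.median(samples) * 1000:.0f} ms")
```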
But he also connected it to this Valgrind complaint.
You know, can you tell us about Valgrind?
No.
I'd never heard of it before this.
You've never heard of Valgrind?
I've heard of Valgrind.
No.
You hadn't heard of it?
No, I had not.
Oh, okay.
The performance... I just heard it associated with trying to understand and debug C programs.
I just, as of this bug, started looking at it.
That's amazing.
I just assumed it was a tool people have been using for decades.
It probably has.
Just not something I've ever encountered.
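For anyone else who hadn't run into Valgrind: it runs a compiled program under instrumentation and reports memory errors and other complaints. A minimal sketch of the kind of automated check that might surface an odd Valgrind report in a test job, assuming Valgrind is installed and "./my_test_binary" is a hypothetical locally built test executable:

```python
# Minimal sketch: run a binary under Valgrind in a test job and fail on complaints.
# "./my_test_binary" is a hypothetical locally built test executable.
import subprocess
import sys

result = subprocess.run(
    ["valgrind", "--error-exitcode=1", "--quiet", "./my_test_binary"],
    capture_output=True,
    text=True,
)

if result.returncode != 0:
    # An unexpected complaint here is exactly the kind of oddity worth chasing down.
    print("Valgrind reported problems:", file=sys.stderr)
    print(result.stderr, file=sys.stderr)
    sys.exit(1)
```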
Okay.
Here's my question for you.
I don't understand the exploit.
Do you think you'd be able to explain it to me in terms I could get?
I will make an attempt.
Okay.
My own understanding of it is still a little bit incomplete and fuzzy, but I will go for it here.
So ultimately, what happens is that the exploit gives the attacker a backdoor into SSHD.
If people aren't familiar with that, that basically means the system that most Unix distributions use to allow you to log into them via SSH, and in this particular case, it's logins via public key.
A specific public key?
Yeah.
Well, the backdoor uses a specific public key, but it's when you're logging in.
So you can log into SSHD using a username and password as well, but most people are going to be using a public key.
It's basically a way to get into, you know, get to a shell on a remote machine.
I would guess a lot of people listening to this are quite familiar with it.
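For a concrete picture of what key-based login looks like from a program's point of view, here is a minimal sketch using the third-party paramiko library; the hostname, username, and key path are hypothetical placeholders.

```python
# Minimal sketch of public-key SSH login using the third-party paramiko library.
# Hostname, username, and key path are hypothetical placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())

# Authenticates with the private key whose public half sits in the server's
# ~/.ssh/authorized_keys -- the normal, non-backdoored path.
client.connect(
    "server.example.com",
    username="deploy",
    key_filename="/home/me/.ssh/id_ed25519",
)

_stdin, stdout, _stderr = client.exec_command("uname -a")
print(stdout.read().decode())
client.close()
```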
The person who created this exploit, the part that's fascinating to me is that this person, going by the name Jia Tan, probably not a real name, became a contributor to the XZ repository, a project that was originally started by a guy named Lasse Collin, if I'm saying that right.
And he was very busy with it.
And so this person allegedly named Jia Tan came along and sort of got himself involved in the project.
And the interesting part to me is that he did not directly change the source, you know, the program source code.
So what happened here, as I understand it is this contributor added some binary test files to the repository.
Yeah, that sounds pretty harmless, right?
I'm just going to add some test files.
We're going to use them for testing, like fixtures or something, right?
Yeah, exactly.
You know, and this is a compression library.
So adding binary files to compress or decompress as part of testing was not looked upon suspiciously.
It seemed more or less like a normal thing to do.
But inside these test files, he had apparently introduced bytes that, you know, were essentially code.
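To illustrate the general idea only, and emphatically not the actual xz technique, which was far more elaborate: a toy sketch of how extra bytes can ride along inside something that looks like an ordinary compressed test fixture, to be recovered later by a build step that knows the trick.

```python
# Toy illustration only -- NOT the actual xz mechanism. It just shows the idea of
# smuggling obfuscated bytes inside a file that looks like a compression test fixture.
import lzma
import os

# A legitimate-looking compressed test fixture...
fixture = lzma.compress(b"ordinary test data for the decompressor")
# ...with an obfuscated payload quietly appended after the valid stream.
payload = bytes(b ^ 0x5A for b in b"pretend-this-is-object-code")
with open("test_fixture.xz", "wb") as f:
    f.write(fixture + payload)

# A "build step" that knows the offset and the XOR key can recover the hidden bytes.
blob = open("test_fixture.xz", "rb").read()
hidden = bytes(b ^ 0x5A for b in blob[len(fixture):])
print(hidden)  # b'pretend-this-is-object-code'
os.remove("test_fixture.xz")
```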
And the other part of this, which is really interesting to me is that there were some sort of scripts that were run on these test files using this M4 macro processor, which is, you know, a thing that's been on Linux systems for decades.
I remember using it back in the early 90s.
That's a thing I've never heard of.
It evaluates macros for C?
Yeah, and a lot of people, you know, I was using it in makefiles at the time, way back when.
And I think that's still a really common use is, you know, to put macros into build systems.
And apparently there's this M4 file that is used in, like, distribution builds.
He added this file to the repo, but also put it in the .gitignore file.
So if you were just, like, a hacker who pulled the repo and was building it locally, you probably wouldn't have seen it or used it.
But when places like, you know, Debian were building distributions, this was getting included.
So there was this M4 thing and then I guess a couple of bash scripts as well.
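One check people have suggested since is comparing what a release tarball ships against what git actually tracks, since the malicious build machinery was effectively invisible to anyone just reading the repository. A minimal sketch with hypothetical file names, run from a checkout of the tagged release:

```python
# Minimal sketch: list files that appear in a release tarball but not in the git tree.
# The tarball name and prefix are hypothetical; run from a checkout of the tag.
import subprocess
import tarfile

TARBALL = "project-1.2.3.tar.gz"   # hypothetical release artifact
PREFIX = "project-1.2.3/"          # top-level directory inside the tarball

# Files git knows about (tracked files only).
tracked = set(
    subprocess.run(["git", "ls-files"], capture_output=True, text=True, check=True)
    .stdout.splitlines()
)

with tarfile.open(TARBALL) as tar:
    shipped = {
        member.name[len(PREFIX):]
        for member in tar.getmembers()
        if member.isfile() and member.name.startswith(PREFIX)
    }

for extra in sorted(shipped - tracked):
    print("only in tarball:", extra)   # generated configure scripts -- or worse
```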
And the final result was this, it would create an object file.
It's C code.
So, you know, it's something that's compiling and then being linked.
And so, you know, in a typical C build, you compile the C code into an object file, a .o file, and then that all gets linked up into a library or program.
So there's kind of hidden stuff.
There's stuff in tests and there's this hidden macro evaluation and kind of hidden build scripts that end up producing executable code that gets linked in is what you're saying.
Yeah.
So it produces this object file, but it, you know, it's not doing it by compiling code.
It's doing it by extracting bytes from these test files and then running some, some scripts on it to sort of process it and decrypt it.
And then, you know, what you end up with is this object file on the, on the local disk, which eventually gets linked into the final library.
So anyway, I just, I find this whole process to be very clever and very fascinating.
I don't know if this is a, a type of exploit that's been tried in other areas, but it's the first time I've ever seen it.
Apparently he also, and we're assuming it's a he, we don't know.
This contributor made a change to something called the Landlock feature, which I guess, if enabled, would probably not have allowed this sort of rogue binary file to be linked into the program.
So he made this change to the code that essentially disabled that.
This was kind of passed around the internet because the thing that broke it was that he, like, added a dot at the beginning of a line.
I didn't understand that. I saw that dot before the "void."
Yeah.
I'm not entirely sure exactly what that does, but apparently it had the effect of disabling this Landlock feature.
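The general pattern at work is familiar from autoconf- and CMake-style builds: decide whether a feature like Landlock is available by compiling a tiny probe program, and treat any compile failure, even one caused by a single stray character, as "feature not supported." A rough sketch of that pattern, assuming only a C compiler named cc on the PATH:

```python
# Rough sketch of configure-style feature detection: compile a tiny probe program and
# silently disable the feature if it fails. A single stray character (the "." below)
# is enough to make the probe fail and the feature quietly turn off.
import os
import subprocess
import tempfile

PROBE = """
.#include <stdio.h>          /* the stray leading character breaks compilation */
int main(void) { return 0; }
"""

def feature_available(source: str) -> bool:
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "probe.c")
        out = os.path.join(tmp, "probe")
        with open(src, "w") as f:
            f.write(source)
        # Any compile error at all -- legitimate or sabotaged -- reads as "not supported".
        result = subprocess.run(["cc", src, "-o", out], capture_output=True)
        return result.returncode == 0

print("feature enabled:", feature_available(PROBE))  # False, because of the stray dot
```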
So anyway, this rogue object file gets linked into liblzma.
So far it sounds like quite a lot of knowledge went into this.
The random dot, this knowledge about how to hide code, a lot of knowledge about C and building C programs.
Yeah.
And apparently pretty specific knowledge of certain distributions and how distributions are built and distributed.
You're saying distributions like Debian, like Fedora, like Ubuntu.
Like Linux distributions.
The person doing this clearly was taking advantage of certain aspects of the way those systems build themselves and incorporate new libraries and so forth.
So did we get to the point where, if I have that private key, the one matching the embedded key? I think it said there's an embedded public key in there, because, as I said, you'd normally have your own public key sitting on the remote machine.
In this case, it's a public key.
How does it work where, if I have the right private key, I try to log in over SSH and I just magically get in because of the backdoor?
Do we follow that part?
Sort of.
So when you build XZ, you get this library, liblzma, which allows, you know, other programs to use the compression and decompression features.
SSHD, OpenSSH, does not specifically use this library.
So, you know, that was another interesting detail that this person apparently knew.
What did happen is that a lot of these distributions, they modify OpenSSH so that it can be started with systemd.
What does that mean?
They modify SSHD so it can be started by systemd.
It's not just a systemd startup script.
Apparently, there's some sort of patch that has to be made to OpenSSH to allow it to be started with systemd notification.
So there's, like, a libsystemd thing that needs to be incorporated.
So it can listen to signals from systemd, is what you're saying.
Right.
And apparently libsystemd does depend on this liblzma.
So that's why, you know, this doesn't work on every distribution, doesn't work on every implementation of SSHD.
It only works in these cases where it's a distribution that is starting the SSH daemon with systemd on particular types of machines.
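A quick check people were running on their own machines was simply to see whether sshd transitively links liblzma at all. A minimal sketch, assuming a glibc-based Linux system with sshd at its usual path and ldd available:

```python
# Minimal sketch: check whether sshd transitively links liblzma (via libsystemd).
# Assumes a glibc-based Linux box with sshd at /usr/sbin/sshd and ldd available.
import subprocess

SSHD = "/usr/sbin/sshd"   # common location; adjust for your distro

output = subprocess.run(["ldd", SSHD], capture_output=True, text=True).stdout
if "liblzma" in output:
    print("sshd pulls in liblzma (likely via libsystemd) -- check your xz version")
else:
    print("no liblzma in sshd's dependencies")
```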
Apparently, what it would do is that it essentially replaced some system calls that were normally made.
There was a function called _get_cpuid, which is apparently where the exploit was implemented, and because of the way it was incorporated into liblzma, it would get called by SSHD when it was starting up, and then it would do a thing and then call the real system function.
In the situation where it was checking for this particular private key, it would slow the connections down, which is what people noticed.
But if the specific public key was found, it would invoke this backdoor and essentially give the person with that key access to the system.
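As a conceptual analogy only, not the real backdoor, which hooked native code inside liblzma through the ifunc resolver machinery: the shape of the trick is "wrap a real function, peek at the input, and fall through to the original so nothing looks wrong." A toy sketch with entirely made-up names:

```python
# Conceptual analogy only -- not the real exploit code. It illustrates the pattern of
# replacing a function at startup, checking for the attacker's key, and otherwise
# deferring to the original so normal logins keep working.
ATTACKER_KEY = b"attacker-public-key-bytes"   # hypothetical embedded key

def verify_login(public_key: bytes, signature: bytes) -> bool:
    # Stand-in for the genuine check the SSH daemon would normally perform.
    return False

_original_verify = verify_login

def hooked_verify(public_key: bytes, signature: bytes) -> bool:
    if public_key == ATTACKER_KEY:
        # Backdoor path: grant access (the real backdoor could run attacker commands).
        return True
    # Everyone else goes through the normal, now slightly slower, path.
    return _original_verify(public_key, signature)

verify_login = hooked_verify
print(verify_login(b"somebody-else", b"sig"))   # False -- normal behavior
print(verify_login(ATTACKER_KEY, b"sig"))       # True  -- the backdoor
```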
Lots and lots of work went into it and lots and lots of circumstances were required to get this to actually work.
Yeah.
Well, I want to talk about the timeline in a second here, but it does sound like a huge amount of knowledge went into making this exploit work.
Yes, both a considerable amount of knowledge so that the person clearly knew what they were doing and also it took time.
We'll talk about that a little bit more as we go along, but it's interesting that the person had to warm their way into the project and then make somewhat what looked like legitimate changes to the repository and then slowly introduce this stuff over a period of time.
It wasn't just something where they hacked into the code base and added something that was in the next release.
It was really a, we've been using the term long con.
Yeah.
But it's also a lot of knowledge about Linux distributions.
So which distributions were affected that you know about?
I think basically anything that uses systemd, and I think it's more fuel for the systemd-hater fire, right?
Although those fires have kind of cooled over the years as they sort of lost that fight, I think, didn't they?
Yeah, I suppose.
Everything using systemd.
Yeah.
So using systemd, and then I think it's also limited to machines on particular hardware architectures, particularly things using glibc.
The distributions that seem to be affected are things like Fedora, Ubuntu, Debian, probably some others.
But not fully stable released versions of those.
This was like about to be released.
Yeah, these were still all pre-release versions at the time that the exploit was found.
Thank goodness.
Except for Arch Linux.
Arch Linux I think has a quicker release cycle and they had already released a stable version of Arch Linux, which included the compromised version of the library.
I think that's right.
So it's the 5.6.0 and 5.6.1 versions of liblzma that were the hacked versions.
And yeah, those were starting to be rolled into the various distributions.
What about OSX?
I heard a little bit about it.
When I have done Python on OSX, if you try to install Python via Homebrew, you can see it trying to install XZ first.
That's where I first started seeing XZ, this compression.
I remember thinking, what is XZ?
Why do I need that for Python?
Yeah, I saw some stuff about that too.
People were recommending that you back out the 5.6.0 and 5.6.1 versions that were coming in through Homebrew.
But as near as I can tell, even if this code were on your system, there's really no way for it to exploit that particular hardware and operating system.
So I don't think there's any way that this exploit is going to give anyone a back door into your Macintosh.
So not too worried about OSX then?
I'm not, no.
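For anyone who wants to check a machine anyway, a minimal sketch that just asks the installed xz for its version and flags the two known-bad releases:

```python
# Minimal sketch: check whether the installed xz is one of the compromised releases.
import subprocess

out = subprocess.run(["xz", "--version"], capture_output=True, text=True).stdout
print(out.strip())
if any(bad in out for bad in ("5.6.0", "5.6.1")):
    print("Compromised release range -- downgrade or update to a fixed version.")
else:
    print("Not one of the known-bad versions.")
```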
I want to talk about the timeline a little bit.
Timeline makes this even more interesting.
You hinted at this a little bit.
So there's a really good post that just has the timeline, on boehs.org.
I'll put a link to it in the notes here.
They've got a great timeline.
Notably in the timeline, there are at least three false identities involved in various stages here over three full years.
There's lobbying for new maintainers, there's lobbying for patches, there's submitting auxiliary PRs to assist the exploit.
Now, this starts all the way back in 2021.
I mean, it probably starts earlier, but the account which people consider to be the one that has been pushing for and merging the compromised code, the backdoor, the one the commits are attributed to, was created on GitHub in 2021.
So that's three years ago.
And initially that account merged a minor PR in a project called libarchive.
People now think, wow, that's probably hugely suspicious because you look at this early PR, it does virtually nothing.
Okay.
So that's 2021.
In 2022, the first patch is submitted to the mailing list for XZ.
And there are people who appear, and notably they disappear shortly after, who show up to lobby the maintainer of XZ to add a new maintainer to the project.
They're doing this lobbying that you see in the open source world, which is we need patches merged, we need to know this project is alive, we depend on this for our work, and they're pressuring the maintainer to add someone as a new maintainer to the project.
This Jia Tan merges their first commit in 2023.
In March, they are changed to be the primary contact for the code base.
And then finally this exploit that you mentioned, the one hidden in the tests, actually starts getting merged into the code base.
So that was 2023.
And interestingly, this was sort of in spring, early summer 2023.
And then in July, there's a PR on a project called OSS-Fuzz to disable ifunc fuzzing for XZ, which would help mask the exploit in this library.
So fuzzing is a security technique for automatically, programmatically searching for potential exploits or problems in programs like this.
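For a concrete sense of what fuzzing means in practice, a minimal sketch that mutates a valid compressed input and feeds it to Python's own lzma decompressor, watching for anything other than a clean result or a clean, expected error:

```python
# Minimal sketch of fuzzing: throw randomly mutated inputs at a decompressor and
# watch for anything other than success or a clean, expected error.
import lzma
import random

seed = lzma.compress(b"seed input for the fuzzer")

for i in range(1000):
    data = bytearray(seed)
    # Flip a few random bytes to produce a malformed input.
    for _ in range(random.randint(1, 4)):
        data[random.randrange(len(data))] = random.randrange(256)
    try:
        lzma.decompress(bytes(data))
    except lzma.LZMAError:
        pass  # expected failure on corrupt input -- fine
    except Exception as exc:
        # Anything else (crashes, hangs, memory corruption in a C library) is a finding.
        print(f"iteration {i}: unexpected {type(exc).__name__}: {exc}")
```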
So this would have been 2023 about a year ago.
And then about a year later, February 23rd through early March of 2024, they merge the final stages of the backdoor into the repo.
And then at the end of March, it's discovered.
And around this time, there's all these shadowy accounts pushing hard for the latest, backdoored version of this library to be included in various Linux distros, Debian and Fedora, Ubuntu; all these lobbyers come out of the woodwork.
They all appear, and then, apparently, they don't really exist.
So there's these ghosts in the story.
We've been talking about this all week, and it's the word we keep using is bonkers.
And I just can't think of a better way to put it.
But yeah, this social engineering effort is almost more impressive than the technical effort, both because of the timeline involved and the sort of exploitation of this open source maintainer and the sort of psychology of putting pressure on people, you know, it's generally regarded as a bad thing to put pressure on open source maintainers who are just trying to do their best.
But of course, the maintainers want to satisfy people.
And so just this campaign that seems to have been mostly fabricated, first to replace the maintainer and then to sort of rush these libraries into production, is just, it's bonkers.
Yeah, all the online accounts, the email addresses, the GitHub accounts that were associated with pushing these patches or lobbying for them, they have no activity outside of this, aside from the Jia Tan account, which has quite a lot of open source commits.
The fact that they all kind of vanish after they play their part in this tragedy leads people to believe, wow, this was a coordinated campaign.
This is fascinating.
So there's a great mystery here that I think has really seized the imaginations of people like us.
But one of the first sets of comments I saw was by people like Glyph, who's known in the Python world, he wrote Twisted and has been heavily involved in Python and open source and software engineering, and it was echoed by someone like Jacob Kaplan-Moss, who's involved in Django.
And they were talking about open source maintainer burnout.
So Glyph, very early on, writes, I really hope that this causes an industry-wide reckoning with the common practice of letting your entire goddamn product rest on the shoulders of one overworked person having a slow mental health crisis without financially or operationally supporting them whatsoever.
I want everyone who has an open source dependency to read this message.
And he links Lasse Collin's message from right around the time where this lobbying and pressure is happening to get this Jia Tan added as a maintainer.
So here's the message Lasse Collin wrote, "I haven't lost interest, but my ability to care has been fairly limited, mostly due to long-term mental health issues, but also due to some other things."
This is the developer who started this project, XZ, and has been the sole maintainer for it.
Collin continues, "Recently I've worked off-list a bit with Jia Tan on XZ Utils, and perhaps he will have a bigger role in the future.
We'll see.
It's also good to keep in mind that this is an unpaid hobby project."
So this first commentary that I saw was about open source maintainership, open source burnout, the famous XKCD cartoon where this huge stack of blocks is resting on this one tiny block, the idea being that everything depends on some random person in Nebraska who's been thanklessly maintaining this hugely important thing for decades.
What do you think?
Yeah, I've seen a lot of similar commentary.
I do worry that some people will use this as a open source as bad flag.
There's probably a solid argument to be made there, given what happened here.
But I think there's two ways to look at it.
One is that the long-standing open source claim that it's more secure because people can see the code and find issues and fix them and so forth, as opposed to closed proprietary code, is maybe a little bit of a dubious argument.
Wishful thinking.
Yeah, if there's really only one person working on the code.
It's pretty obscure, right?
It's not an easy project.
It's doing some fairly sophisticated things, but it's also something that...
So you probably can't really just jump into it and understand it, but it's also become sort of widespread.
It's a better compression utility than what we had before.
And so it's really used in a lot of different things.
And it's just a hobby project that this person started.
Yeah, it's just a hobby project.
And it's probably something that is more or less unique to the open source world.
The counter to that argument is, look, people did discover this and once they discovered it, they were able to see what happened and figure out a strategy to mitigate the problem.
Whereas with the SolarWinds hack of a couple years ago, you kind of had to wait on the company to announce the issue and wait for them to supply fixes for it.
I think that's what I find fascinating about this.
This is an exploit that happened in plain sight.
It's like a heist on stage where anybody may be watching at any moment in time.
It's some sort of magic trick.
I'm out in public.
It's misdirection.
I'm trying to direct your attention over here while I do this thing in the background.
It all took place in plain sight if people cared to look.
Yeah, exactly.
And it also makes you wonder, was this unique?
Is this just the one that people caught?
Yeah, exactly.
Maybe it was just going on in a widespread way.
And the people doing the other exploits just haven't been detected yet.
So the far greater portion of responses that I saw to this was about who did this?
So the GitHub account, GiaTan, people assume now that it probably was a group of people.
It took too long.
It took too much coordination, care, and planning.
It might have been one person.
It might be a state sponsored actor.
There's a lot of speculation about that, that this was too organized.
So I've seen a lot of questions, which boil down to who is this person?
In Wired, they have a pretty good article called "The Mystery of 'Jia Tan,' the XZ Backdoor Mastermind."
The Verge has a pretty good piece, how one volunteer stopped a backdoor from exposing Linux systems worldwide.
In the Wired article, they talk about the long timeline, the multiple shadowy identities, the lobbying and social engineering you mentioned.
They write that this inhumanly patient approach, along with the technical features and sophistication of the backdoor itself, has led many in the cybersecurity world to believe that Jia Tan must in fact be a handle operated by state-sponsored hackers, and very good ones.
And they do mention later on that this account made 6,000 code changes to at least seven projects between 2021 and February 2024.
That's a huge amount of effort in a three-year period.
Yeah, it's possible that one person could do that, but you might have people sharing this account.
And what they're ultimately trying to do is build trust in the account.
Over a long enough period of time that they can inject this backdoor, that is pretty fascinating, hard not to speculate about.
Yeah, it'll be interesting to see if anybody is able to investigate it to the point where they can actually pinpoint who is doing this.
My guess is it's probably more than one person.
It seems to me that maybe if this were their full-time job, somebody could probably do the coding end of it on their own, but all of these other campaigns of social engineering and creating false accounts and sending messages designed to pressure people into doing things, it feels like a different skill set probably was farmed out to a different person or persons.
There was a pretty funny post on Mastodon where somebody was like, "Well, take their perspective into account."
They've been working on this for three years, and right before it goes into stable Linux distros, some random person discovers it, all the work down the drain, that must be awful.
I kind of was imagining you go to work in your office job, you work for some state hacking agency in Russia, you get a request one day, "Try to get me into SSH," and then you start planning it out.
You start looking for random libraries that maybe get loaded, that are run by one person, where you could start contributing exploit code.
You can imagine the amount of ticketing and planning involved in this.
Organizations I worked in would never pull this off.
It's too long-term.
Yeah, you got to figure there had to be some sort of research project prior to the whole thing starting where they made an attempt to identify a particular project or a particular maintainer.
They must have known at the point where they started harassing Lasse Collin that they were going to attempt this exploit two years before they actually got it into production.
Right, they're looking for a mark.
Yeah, maybe they found several, but the visible timeline goes over two years, and it seems like it probably was an even longer project with more planning than that.
Yeah, a lot of time, a lot of money invested in that.
But you mentioned the maintainer Lasse Collin, and my heart goes out to this person because they posted way back in 2021, 2022 about their burnout, their mental health struggles, and there's a huge amount of attention that's going to be landing on this person's shoulders that's really undeserved.
Yeah, and I think you have to remember too that they created a very useful thing.
Yeah, I'm sort of like, what is that poor person going through now?
I really hope that person's doing okay.
I mean, I honestly feel that way.
Yeah, I don't think the community is probably going to put much blame on him, but it does seem like it's probably really difficult to accept that something like this happened.
Even if you didn't have people banging on your door saying, "Wow, you did an awful job letting this state agent gain entry into your project and gain your trust," even if you didn't have people telling you that, it would be hard not to feel that way.
Wow, I let everybody down.
Yeah, I can really only imagine.
I can't imagine that my feelings would be inconsequential.
It seems like it's really got to have a severe effect on your outlook about the industry and doing projects like this.
So the supply-chain-security worrying commences here, along with worries about open source.
What other projects would they have attempted this for?
Will this change how large companies use open source projects or scan them for vulnerabilities?
Probably not, right?
It's probably going to be business as usual.
What do you think?
I do think there probably will be more products and projects around looking at open source or looking at supply chain stuff in general.
I mean, there are already commercial products to do that, but as somebody who's been in the position of having to look at things to buy to secure my organization, my sense is that this probably still doesn't percolate to the top of your list.
If you've got 10 things on your list that you need to buy to secure your organization, you're probably going to start with two-factor auth and a SIEM system, and you're probably going to pay for SOC audits and stuff like that before you look too closely at supply chain worries.
Yeah, I don't know.
I don't think this is going to bring this issue to the forefront of people's security programs and security spending.
What about developers on teams who are asked to write software and are using open source tooling, and they'll find some arbitrary library and load it into their app, or they're just having problems building their app?
Maybe they're loading some build scripts or something and those are compromised.
There are a lot of places where this type of exploit could get into the systems people write on a day-to-day basis.
I think there's a little bit of a long tail.
The interesting thing here is someone was able to identify a library that was likely to be used in quite a lot of SSHD operations, even though it's not part of SSHD.
They were able to find this library; what others are out there that are widely used, with very few maintainers, that have this same potential?
Now, I don't think we can answer that, but that is a little bit of a worrying question, right?
It is.
Clearly, the people who are doing the attacks have gotten fairly sophisticated in identifying the vulnerabilities, because if you think about this particular one, say that you start off with the idea that you want to attack SSHD, you start looking at SSHD and liblzma is not in there.
It's not something that you would immediately identify as a vulnerability for that program.
They were sophisticated enough to see this one step removed attack.
I guess the other thing that I have not seen yet, and that I'd be interested in, is whether anybody has analyzed the potential impact of this had it succeeded. Obviously, being able to backdoor into a system with elevated privilege could allow you to do some terrible things, but I honestly don't know what sort of systems that gives you access to.
Is it a bank or is it the electrical grid or other important infrastructure things?
I just don't know.
Yeah, because network segmentation would make this more complicated.
You have to scan for open SSH ports and try to start poking around.
There's a haystack that you're looking for needles in potentially.
Yeah, but it seems fairly clear to me that the people who are doing this had some idea of what they were going to use it for.
I don't think they would have gone to all this trouble putting in a backdoor if they thought it was just going to get them into some small business that they could ransomware or something.
But it's crazy because, as you mentioned, the plan we saw underway was unfolding over three years, and that's just the surface of what we can observe.
There must have been a lot of research that went into it beforehand.
Maybe we're talking a five-year plan.
Over five years, you might imagine the stuff you're trying to exploit could change; maybe people pull it out entirely.
You're making bets that these core utilities are going to be there years hence.
I think the other thing that I find a little creepy about this is, if I'm a Debian maintainer or an Ubuntu maintainer, I now have to be aware that there are organizations, groups out there, who are really closely watching how we do governance, how we get patches into the distro and release them.
These people know how to lobby us, the Linux distro maintainers, the open source maintainers.
I find that a little creepy.
Here they are just trying to cobble together the next version of Debian, very large project.
It's like you're suddenly on a stage and the audience is filled with people who are potentially pretty nefarious.
They have pretty dark designs.
Yeah.
I think for the open source maintainers, it introduces a new concern in that when you start these projects, I think people imagine that it's going to go kind of like Linux has gone over the years.
You introduce something which is sort of the starting point and then people get interested in it and they contribute to it and they expand it and it gets better.
Now you're in the situation where every time somebody wants to contribute to your project, you have to consider the possibility that they may be doing it maliciously.
It just seems like a whole new concern that people who are already overtaxed have to worry about.
Yeah, that's fair.
It's the same way that spam and phishing have entered our lexicon.
When we receive text messages from numbers we don't recognize, we are suspicious.
We receive emails from people saying, "Act on this quickly.
Click on this thing."
We are suspicious, but we have been trained to be suspicious.
What you're saying now is as an open source maintainer, there's a new kind of suspicion you have to be trained for, aware of, wary of, I guess.
That's an additional burden.
Open source maintainers already have heavy burdens.
We did an episode about open source way back in the beginning when we first started recording this podcast.
We're now on episode 30 here.
A lot of the things we talked about are still not resolved.
It's amazing that so much of our industry relies on open source to perform the work that we want to do on a day-to-day basis to support companies that we work for.
They rely on the free labor of massive numbers of people.
There doesn't seem to be a lot of enthusiasm in industry for the idea of actually supporting these projects either.
I don't know how you go about putting the economic value on some of these things.
There's been this issue recently around Redis.
We've talked a little bit about how Redis is a thing that's widely used and used in a lot of different ways, but it's not something that people seem to want to pay for.
Yeah, they changed their license from BSD, I think, to something that's not quite closed source, but they're saying, "We want you to pay us."
And then there's a response of, "Well, we don't want to pay you."
Maybe we could take that to be the general statement about open source.
If every single open source library said, "I want you to pay me," then suddenly a lot of engineers like us who work for companies would have a lot of work to do, re-implementing all that stuff.
That's what you're saying.
Yeah, I mean, in the good old days, the sort of Stallman philosophy was that you make this software and you make the source free and you don't charge people for it.
The way you make money is that people hire you to consult on it or work on it or whatever.
That doesn't seem to happen.
I don't know what the magic formula is for making open source economically viable.
Well, Stallman always said, "Free as in speech, not as in beer."
It's a pithy statement.
But I got to tell you, the way in which the software is consumed is not the same way that free speech is consumed.
It never really held up under scrutiny as a philosophical opinion in my view.
Well, that's all we have time for.
This is a fascinating story.
I think we'll keep following it.
We'll put some links to some of the articles we talked about here.
Thanks, Mike, for describing the bug for me.
I think I do have a little bit more appreciation for it now.
Yeah, I'm looking forward to more analysis of it.
Also, I'm hoping that somebody will investigate the sources of this and try to understand who initiated it and what their ultimate intention was.
For my part, I hope someone goes and talks to Lasse Collin.
I hope we find out that person's doing all right, that they're not feeling the entire weight of the world, of our industry, looking at them harshly.
Instead, people realize, wow, that's a very hard position to be in and it's not that person's fault.
Yeah, totally agree.
Again, this is a really useful and interesting project; creating a compression tool that's actually better than its predecessors is a very significant accomplishment.
We should be kind to open source maintainers in general, I think, unless they are state actors who are trying to exploit our systems, I guess.
I agree.
This has been Picture Me Coding with Erik Aker and Mike Mull.
Thanks for hanging out with me, Mike.
Had a great time talking to you today.
Thanks.
We'll see you next time.
Bye-bye. (gentle music)