Cyber Security
The cyber security podcast from SE LABS Ⓡ
- Understand cyber security and other security issues. Practical and insightful, our experts have experience in attacking and defending in the physical and digital worlds. Peek behind the curtain with Cyber Security DE:CODED.
Interview: Brian Monkman (NetSecOPEN) | S2E11 Bonus Episode
Simon Edwards 0:02
Welcome to DE:CODED. This is a series two bonus episode featuring a full-length interview with Brian Monkman. Brian worked with testing company ICSA Labs before heading up the network security testing standards organization NetSecOPEN. We talked about the subtleties of assessing firewalls, why testing them can get very complicated, and how most challenges can be solved by a rare thing in the cybersecurity testing world: transparency. Brian, first things first. You know, we could get into really technical details about how to test network security appliances and that kind of thing. But at the very basic level, how would you judge the speed of a firewall? Is it as straightforward as just running lots of data through it and seeing how much it can handle?
Brian Monkman 0:54
No, not at all. There's an extraordinary number of variables to consider. I mean, the speeds and feeds is definitely one of the things to consider, but that really isn't the complete picture. You need to consider the type of traffic that is going through the firewall. Take, for example, a healthcare entity: a firewall that's handling traffic related to healthcare is going to have a very different traffic profile than, say, a financial institution or an educational institution. And so it's really quite important to take all of that sort of thing into consideration. But that said, there will be some basic things that you'll take a look at, regardless of what sort of enterprise the firewall is protecting.
Simon Edwards 1:52
So if I was running a healthcare organization, you know, would I be looking out for adverts for firewalls particularly suited to my kind of organization?
Brian Monkman 2:02
Not necessarily, because most of the firewalls out there today, and quite frankly I would be extremely surprised if it was anything other than all of them, can be configured to address the needs of specific enterprises. So you could get a firewall from vendor A set up in the environment of a large financial institution, and the same vendor's firewall could be used in a healthcare entity.
Simon Edwards 2:36
Okay, so I go out there and I can choose from any of the main brands, and so long as I've got enough expertise, I can probably tease enough performance out of it.
Brian Monkman 2:46
Correct. You know, that's a reasonable assumption to make. Of course, there are going to be caveats, but yeah.
Simon Edwards 2:52
There are always those. So let's say that we go shopping, and I need a 10 gigabit per second firewall, because that sounds like a big number. Is that the only number that matters, or are there other details on a data sheet somewhere I should be paying attention to?
Brian Monkman 3:10
You should be paying attention to a number of different things. In the standard that our organization has developed, we look at, I would say, around nine or ten different considerations, what we call KPIs. So the first thing we take into consideration is the application mix profile. That's important.
Simon Edwards 3:45
So when you say an application mix, what, in this context, is an application?
Brian Monkman 3:50
Well, the traffic profile for the enterprise. So, going back to healthcare, financial, you know, that sort of thing. Education.
Simon Edwards 4:03
Are we talking about protocols? Or are we talking about something more detailed than that?
Brian Monkman 4:08
Yes, you're talking about protocols, but it can be a little bit more detailed than that as well. You know, for example, it wouldn't be unusual at all for an educational institution to want to use a video streaming service, one of the commercial ones out there. And that's a myriad of different protocols and different types of traffic requests, and so on. So it's more than just simple HTTP, DNS, SMTP, that sort of thing. It becomes complex fairly quickly.
Simon Edwards 4:45
Right. So if we were setting up a secure network for a sales team, we might assume that they'll want to make phone calls, so maybe SIP, or something even more specific, like Skype. That's the kind of thing you mean?
Brian Monkman 4:57
Yeah, yeah. That's the sort of thing that we would include in a traffic profile or application mix.
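As an illustration of the application mix Brian describes, here is a minimal sketch in Python of what a traffic profile might look like. The protocol names and percentage weights are invented for illustration; they are not taken from the NetSecOPEN standard or from any real enterprise.

```python
# Hypothetical application mix for an educational institution.
# Names and weights are illustrative only -- a real profile would be
# derived from traffic captures or from the relevant test standard.
EDUCATION_MIX = {
    "https":           0.35,  # general web browsing
    "video_streaming": 0.30,  # commercial streaming service (many sub-protocols)
    "http":            0.10,
    "file_transfer":   0.10,
    "dns":             0.05,
    "smtp":            0.05,
    "sip":             0.05,  # voice calls
}

def validate_mix(mix: dict[str, float]) -> None:
    """Check that the traffic weights form a complete profile."""
    total = sum(mix.values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"weights must sum to 1.0, got {total}")

validate_mix(EDUCATION_MIX)
```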
Simon Edwards 5:06
Right, and a financial institution would probably want to be able to shift financial transactions with more priority than, maybe, email.
Brian Monkman 5:16
Yeah, yeah. And that's definitely one way of looking at it. And we're only just getting started on the specifics of the application mixes themselves, because it is extremely difficult to get parity between the various test tools out there.
Simon Edwards 5:35
So we want to buy a firewall, and we're going to Cisco's website and Palo Alto's website, downloading all these data sheets, and we're seeing 10 gigabits per second here, and we're seeing other bits and bobs there. Do the statistics that we see, or the marketing claims that we see in those data sheets, go down to that level of detail?
Brian Monkman 5:59
Sometimes. It all depends on who the target audience for the data sheet is. The first question anybody reading a data sheet should be asking is, what is the configuration of the firewall itself? Are all the security controls necessary to deal with the traffic turned on? Or, as is often the case, are some of the security controls turned off in order to improve performance? What we do here, when it comes to the standard, is that our approach to testing is fundamentally different, in that you decide what sort of security profile you want to set up on the product that's being tested. Then, once the security protocols have been addressed in the security policy, you verify that the product, the firewall, is operating as you would expect. And then, at that point, you lock the configuration in and make no changes. So all the traffic that is coming through the firewall, and being handled by the firewall, will have to go through the various policy engines that were set up prior to testing.
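To make the "lock the configuration" step concrete, here is a minimal sketch assuming a hypothetical test harness: a security profile is defined, verified, then frozen by hashing it so any later change can be detected. The field names are invented; real firewall configurations are vendor-specific.

```python
import hashlib
import json

# Hypothetical security profile for the device under test (DUT).
# Field names are illustrative; real configurations are vendor-specific.
security_profile = {
    "ips": True,
    "antivirus": True,
    "application_control": True,
    "tls_inspection": True,
    "logging": "full",
}

def fingerprint(profile: dict) -> str:
    """Produce a stable hash of the configuration so later drift is detectable."""
    canonical = json.dumps(profile, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

# Lock the configuration before testing begins...
locked = fingerprint(security_profile)

# ...and re-check it after the test run: any change invalidates the results.
assert fingerprint(security_profile) == locked, "configuration changed during testing"
```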
Simon Edwards 7:20
So does that mean that when you see the headline figure of 10 gigabits per second, that's maybe the figure you could achieve without the security settings actually turned on?
Brian Monkman 7:32
Possibly, yes. But not in every case. It all depends on how the vendors decide they want to represent it. A lot of the vendors who actually have the security policy turned on will document that in their data sheet. They may not go into voluminous detail, but they'll provide a pointer to how the product was configured, where the numbers are coming from, and what you're actually looking at. But that's one of the fundamental challenges for NetSecOPEN: to get the test tool vendors, the labs and the security product vendors together, and have everybody agree on what a reasonable standard should be, in order to provide apples-to-apples comparisons between products.
Simon Edwards 8:26
Yeah, we've seen that. So we look at data sheets, and we do testing, and in some cases you've just got that 10 gigabit per second headline figure. In other cases they will say, well, in real life it's going to be half that with the security settings on. And in some cases they even go into more specific detail and say, well, if you've got a lot of encrypted traffic, you know, SSL, the whole padlock-in-the-browser thing, then it's going to be even slower. So there's this whole range of transparency, and lack of it, and claims and counterclaims going on. I've seen that. So when we come to look at reviews of firewalls, you can take the marketing claims from the vendors as truth. Or you can, I guess, test it yourself. How would a big organization professionally test a firewall, to make sure that it was suitable for their own purposes?
Brian Monkman 9:20
Well, the first thing that I would suggest they do is shortlist the products that they want to take a look at, based on certifications against NetSecOPEN requirements. Of course, that's a shameless plug, but it's definitely a good place to start. And since the standard that we've developed is open, public and available to everybody, anybody with a certain amount of competence can certainly reproduce the testing. In addition to that, for any products that have gone through NetSecOPEN certification, the configurations of those products are available for anyone that would like them. The second thing to do is to use standardized test tools. So, in our case, we have a few test tool vendors that participate in the program, and some of the test tools that have been verified as being able to produce comparable results have been accredited by NetSecOPEN. The beauty of that is it provides everybody a common starting point. And so an enterprise is going to take a look at data sheets and say, well, this doesn't perfectly fit what I want, but it's a reasonable starting point. So then you could take that reasonable starting point, get a DUT, make modifications to the test tool that you've acquired to suit your own environment, and run tests again.
Simon Edwards 11:05
And a DUT is a device under test, or, I think, we also talk about systems under test these days, don't we?
Brian Monkman 11:10
This is true. The acronyms seem to be all over the place sometimes.
Simon Edwards 11:16
So at the moment, other than NetSecOPEN, because you guys are setting the standard which other testers, I guess, should aspire to follow, are there lots of different unbiased reports that people can go out and download publicly?
Brian Monkman 11:33
There are other reports available to download publicly. I do know of a number of labs that produce reports. Sometimes they charge for them, they're behind a paywall; other times they are freely available. The difference between test labs as they exist today and what we're trying to do is the open and transparent nature of it, because you don't really know how the test requirements have been developed in a lot of test labs. Whereas we are not only open and transparent, the standard that we've developed has been contributed to the IETF benchmarking working group and is going to be part of the public domain. So our goal here has been to be as open and transparent as we possibly can. And that's an often overused term, open and transparent, but our goal here is that if anyone wants to come to us and ask us specific questions, well, how did you come up with this or come up with that, we would be able to tell them.
Simon Edwards 12:50
And what we find as well is that being transparent, as a tester anyway, is a very good way to show your competence. And what we find is very large organizations, kind of Global 500 level, will look at reports, but they won't base a buying decision on a report that I've published about a particular firewall, even if I agree with two or three other test labs. What they will do is judge us as testers. And then, if they're about to spend a million or more pounds or dollars or whatever on firewalls, they'll probably engage the tester to do some work for them privately, because they've got very specific needs. And every test is always based on some assumptions about what people are going to want.
Brian Monkman 13:35
Correct. Yeah, there are very few enterprises that have the resources available to them to do a lot of the detailed testing that they would want.
Simon Edwards 13:45
I think one of the obstacles is the cost of the test equipment itself. But then also, and I'm not disparaging any of those companies, they're quite hard to use as well. And you do need, I think, a specialist team, maybe even calling in contractors to do that kind of work.
Brian Monkman 14:04
It's not trivial, that's for sure.
Simon Edwards 14:07
No. Thankfully for us! So when you do see one of these third-party tests, and you, Brian, yourself decide you're going to judge it and work out if it's valuable or not, what are some of the criteria you might use to form an opinion?
Brian Monkman 14:26
Well, first off, I'll look at the test methodology and how much detail they're prepared to provide. Second, I'll look at whether or not they've set themselves up to be governed by any sort of standards, in order to ensure that the tests they conduct are not only open and transparent, but are reproducible. Because one of the biggest variables out there is that you can take the same product, the same test tool, the same testing requirements, and come up with different results depending on who's actually doing the testing. So that's an important consideration as well.
Simon Edwards 15:17
Transparency. And, as you said earlier, being able to reproduce results is very important.
Brian Monkman 15:24
Right. And we not only provide the test reports and certification reports publicly and at no charge, we also provide, as I said before, the configuration files for the device that was tested, and also the configuration information for the test tool that was used.
Simon Edwards 15:50
Right. So this goes back to what we were saying at the beginning: if you don't have the config, then you really don't know what's going on. You could say this device runs at eight gigabits per second, but then when you look at the config, everything's turned off, so there's not really very much security going on. We're talking about standardization and reproducibility, but there's also reality. There's also realism in testing, which is something we pay a lot of attention to. And in the real world, things often aren't reproducible: malware and attacks and things all happen in different ways. So, in your opinion, how close can testers get to reality? And how close do you think they need to get?
Brian Monkman 16:34
They need to try. That's the first thing: they need to recognize that it's important. They also need to recognize that 100% comparable results will never happen. So you need to decide what's a reasonable delta. And for us, particularly when it comes to application traffic mixes and that sort of thing, we're setting a target of around a 5% difference. If you take a look at the standard that we've developed, you'll see that seven or eight of the test cases are set up to be a lot more of a standardized approach to testing; those comparable results are much closer when it comes to the various test tools. With respect to the CVEs, the exploits and malware and so on, what we are trying to do is develop standardized test sets that all the test tool vendors will have, coming from a single source, to ensure that the policy engines of the DUTs are having to deal with the same kind of malware and the same exploits and so on. Is that going to provide absolute reality? No. But it will certainly be a lot closer, given the number of variables there are across multiple test houses when it comes to sourcing things like that.
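The "reasonable delta" Brian mentions can be expressed very simply. The sketch below, using invented numbers, shows one way to compare results from two test tools against a 5% target; the exact comparison method NetSecOPEN uses may differ.

```python
def relative_delta(a: float, b: float) -> float:
    """Relative difference between two measurements, as a fraction of the larger one."""
    return abs(a - b) / max(a, b)

# Invented example: throughput in Gbit/s reported by two different test tools
# for the same device and the same locked configuration.
tool_a = 8.1
tool_b = 7.8

delta = relative_delta(tool_a, tool_b)
print(f"delta = {delta:.1%}")  # about 3.7% in this made-up case
print("within target" if delta <= 0.05 else "needs a closer look")
```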
Simon Edwards 18:16
Right. And I think in any kind of testing that I can imagine, whether it's medicines or car tires or whatever, all testing is trying to get as close to full reproducibility as possible. But for various reasons, including physics, it's just not possible. So my understanding, and you will know more than I, is that just the way TCP/IP and the other networking protocols work, the chances of getting two sets of network traffic looking exactly the same are, I think, unfeasible.
Brian Monkman 18:50
I think that's a reasonable statement. But, you know, close enough is the term that I like to use. And if someone pushes me on, well, what does close enough mean, which is a fair question, I bring out what our targets are. Our target is within 5%, and if it goes beyond 5%, then we start looking at it much more closely. But the important thing is what's behind all that: the configuration information and, like I said, the detailed information on how these test results have been generated. That's one of the things we check here before we grant certification. It goes through a certification review process. We have a certification body, and we require the test labs to provide us the raw data from the log files from the test tools. We use that to generate our own certification reports, but we also use it as a way to double-check what the labs are doing.
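As an illustration of using the test tools' raw data to double-check a lab's headline figure, here is a minimal sketch. The log format (per-second throughput samples in a CSV file with a `gbps` column) and the file name are assumptions, not the format any particular test tool actually produces.

```python
import csv

def mean_throughput(path: str) -> float:
    """Recompute average throughput (Gbit/s) from per-second samples in a raw log.

    Assumes a CSV with a 'gbps' column -- a made-up format for illustration.
    """
    with open(path, newline="") as f:
        samples = [float(row["gbps"]) for row in csv.DictReader(f)]
    if not samples:
        raise ValueError("log contains no samples")
    return sum(samples) / len(samples)

# Compare the recomputed figure with the value the lab reported.
reported = 8.0                                  # invented headline number
recomputed = mean_throughput("raw_test_log.csv")
if abs(recomputed - reported) / reported > 0.05:
    print("reported figure does not match the raw data")
```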
Simon Edwards 20:01
I think it's a really important point that you're making, that close enough is good. I can think of other areas of security testing where some vendors have been very resistant to being tested, because they want to just sell their product and they don't want any possible risk of being told they're not as great as they say they are. And one particularly well-known security company said, well, you cannot test us, because we are designed to stop spies and cyber criminals, and no tester is able to be a spy or a criminal. And it became a kind of game of words in the end. But ultimately, if you can do an attack, or send a stream of data, that is a bit like what happens in the real world, realistically, and I'm sorry to anyone who thinks that computers are as reliable as the movies would make you think, realistically you can get close, but you can't get exactly the same as what any one organization in the world would experience in real life.
Brian Monkman 21:02
I agree with that.
Simon Edwards 21:05
So we're talking about firewalls, but what about other devices? There are lots of things you can put on the network, and some of them, I think, talk to firewalls as well. Is that something that you guys at NetSecOPEN are looking at too?
Brian Monkman 21:17
Yes. I mean, we've taken a long time to get to the point that we're at today; the development period took somewhere between three and four years. We recognized right from the beginning that it was going to take a long time, because it was going to be a learning experience. This sort of thing had never been attempted before, and it's something that has been very difficult. But now that we have reached a place where we can see the light at the end of the tunnel, so to speak, I think it's fair to say that the standards that we've developed can be repurposed for many other product types. And we have also developed them, in some circumstances, with products in mind that aren't firewalls. Next-generation IPS is a perfect example of what we're looking at. It wouldn't be a great leap to develop security efficacy testing for web application firewalls, and we could repurpose most of the performance testing requirements that we have in the standard today. The same thing could even be said for cloud security. Not completely, because, for example, if we wanted to do something in the SASE environment, I think that would in many cases require almost starting from scratch. But for vendors who have firewalls out there and are repurposing them for the cloud environment, there could be a lot of overlap there as well.
Simon Edwards 23:09
What kind of big challenges can you predict for testers who want to assess cloud-based firewalls and other security applications?
Brian Monkman 23:18
Almost all of the major cloud providers out there have usage agreements that tend to embargo this kind of testing.
Simon Edwards 23:31
So they just don't want to be tested?
Brian Monkman 23:31
Well, it's not just that. They also want to have a significant amount of control over what their network is required to handle, because they're subject to many service level agreements, and they don't want to open themselves up to financial risk by allowing testing. So that's a significant issue.
Simon Edwards 23:58
Right. So if we're testing a firewall box on our own network, we're not going to bring down anything that Cisco or Palo Alto is running. But when we're dealing directly with a cloud service provider, we are touching their servers directly and using up their bandwidth.
Brian Monkman 24:13
Right, right. So controls have to be put in place to ensure that that sort of thing cannot happen. Another thing to take into consideration: performance metrics such as connections per second or throughput can very easily be dealt with in a cloud environment, just by throwing more instances of the firewall at the issue. So something like latency, which isn't as much of an issue in a traditional networking environment, becomes a significant issue in the cloud environment, while other things like throughput and connections per second become less of an issue.
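Because latency becomes the interesting metric in a cloud deployment, here is a minimal sketch of one way to sample it: timing TCP connection setup to a test endpoint. The host and port are placeholders; a real benchmark would rely on the traffic generator's own latency measurements rather than a script like this.

```python
import socket
import statistics
import time

def connect_latency_ms(host: str, port: int, samples: int = 20) -> float:
    """Median TCP connect time in milliseconds to a test endpoint."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return statistics.median(times)

# Placeholder endpoint -- substitute a server behind the cloud firewall under test.
print(f"{connect_latency_ms('192.0.2.10', 443):.2f} ms")
```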
Simon Edwards 24:58
I can also imagine that some testers will have better internet connectivity than others. And so it might not be fair for a small lab in rural England to do a cloud-based test versus one that's connected in, I don't know, Silicon Valley, right next door to one of the major data centers.
Brian Monkman 25:18
True, but at the same time, that's a relatively simple issue to overcome in this day of colocation. It's a little bit more technically complex, but somebody in, using our example, rural England doesn't necessarily need to be on premises to be able to execute testing from a lab.
Simon Edwards 25:45
No, they could set up a co-located server in rural China and probably get some of the best broadband available.
Brian Monkman 25:51
Well put.
Simon Edwards 25:56
VPN speed testing: is that something anyone's ever shown an interest in?
Brian Monkman 26:02
We've bounced the idea around with a few people, but haven't really dug into it in any great detail.
Simon Edwards 26:11
I think there's probably quite a stack of different things going on there, isn't there? There's endpoint software and other stuff.
Brian Monkman 26:16
There certainly is. And the test tools handle encryption in very different ways. It comes down to, are you using hardware, are you using virtual devices, how is it set up, and so on. So VPN testing isn't just a matter of measuring what the tunnel can do. It's also measuring how things are encrypted and decrypted, and so on.
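To illustrate why encryption and decryption, and not just the tunnel, matter for VPN throughput, the sketch below times AES-256-GCM on a block of data. It assumes the third-party `cryptography` package and an invented payload size, and it measures only the crypto cost on one machine, not real VPN performance.

```python
import os
import time

# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def aes_gcm_gbps(payload_mb: int = 64) -> float:
    """Rough single-core AES-256-GCM encrypt throughput in Gbit/s."""
    key = AESGCM.generate_key(bit_length=256)
    aead = AESGCM(key)
    data = os.urandom(payload_mb * 1024 * 1024)
    nonce = os.urandom(12)

    start = time.perf_counter()
    aead.encrypt(nonce, data, None)
    elapsed = time.perf_counter() - start
    return (payload_mb * 8 / 1000) / elapsed   # MB of payload -> Gbit

print(f"~{aes_gcm_gbps():.1f} Gbit/s on this machine (crypto cost only)")
```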
Simon Edwards 26:47
There are tests out there, and we've talked about some of the issues: there may be background relationships that are unclear, there may be a lack of transparency. Are there any other red flags that you could highlight for people, when they come to read a report, that might make them wonder if it's less useful than others?
Brian Monkman 27:06
For me, the biggest red flag is people who take reports at face value, or as gospel, I guess, is probably the best way of putting it. The more information a tester is prepared to provide you, from the point of view of how a test has been executed, how it was developed and so on, the better. If a tester uses things like, well, this is our proprietary environment, or it's our intellectual property, so we don't share that sort of thing, then you need to start questioning, well, what does that mean? There are some times when that's said and it's very legitimate, but there are other times when it's being used to really obfuscate things somewhat.
Simon Edwards 27:56
Yeah, it's a great excuse: we can't tell you, for various reasons.
Brian Monkman 28:01
Right. You know, I think one of the good things that we've stumbled on at NetSecOPEN is that the openness and the transparency of what we're trying to do, coupled with having not just the security product vendors weighing in on the testing standards, but also including the labs and test tool vendors, has really resulted in testing requirements that I think are very robust. And having the security product vendors involved, from the beginning to the end, will prevent unintended consequences in the use of the testing.
Simon Edwards 28:52
Please subscribe. And if you enjoyed this episode, please send a link to just one of your close colleagues. We also have a free email newsletter. Sign up on our website, where you'll also find this episode's show notes and bonus episodes featuring full-length interviews with our guests. Just visit DecodedCyber.com. And that's it. Thank you for listening, and we hope to see you again soon.