Season 1 | Episode 1 | Part 1
Welcome to the first episode of Hacking Exposed, Qwiet Edition!
With us today are Stuart McClure, Chris Hatter, Chetan Conikee, and Ben Denkers.
Part 1 is focused on the MOVEit exploit. Stay tuned for Part 2, where our hosts draw lessons from other recent exploits and developments in the field of Cybersecurity.
The discussion touches on:
- threat modeling
- the relationship between quality and security in DevOps
- third-party risk
- what we can learn from the humble slime mold
- and even a movie recommendation for those who grew up in the 80s
Resources for this episode:
Mandiant’s article on the MOVEit vulnerability.
A training resource from Microsoft on threat modeling.
The IMDb page for Air.
[00:00:34] We meet our uniquely qualified hosts and learn their backgrounds.
[00:07:29] Why your complex organization is like a simple organism.
[00:11:17] Ben emphasizes the importance of accounting for risk when using third-party products that could potentially access or move files across an enterprise.
[00:13:57] Chetan and Chris debate whether quality and security are the same thing.
[00:15:01] Stu points out the importance of integrating security into the education and culture of a coding environment.
[00:17:26] Stu makes a point about input sanitization that he REALLY wishes people would take seriously.
[00:17:40] Chris makes the case for aligning the terminologies used by engineers and security practitioners.
[00:19:28] Chetan draws an analogy from the movie “Air” to suggest that developers should get paid like iconic athletes. Kind of.
[00:00:00] Stu McClure: Hey, Hacking Exposed fans. This is your OG and founding author, Stu McClure, bringing you the Hacking Exposed Podcast, Qwiet edition–that’s Qwiet with a w. This episode ran a little long, so we’re splitting it up into two parts, so I’ll be back at the end to remind you about part two. All right, thanks everybody.
[00:00:34] Stu McClure: This is Stuart McClure here, I guess one of the OGs of cybersecurity, you’d say. Um, been in the industry for 30-plus years. I stopped counting at 30, so, um, you know, just to, uh, be kind to myself. But I’ve spent a lot of time trying to break down cyber attacks, how they work, and how to mitigate the risks as practitioners, and then building technologies and [00:01:00] platforms that help provide real automated solutions, really oriented largely to prevention, which is a four-letter word in cybersecurity.
[00:01:10] Stu McClure: So what we’re gonna break down today for you inside of Hacking Exposed, Qwiet Edition, is really all of that in a nutshell, based on four different perspectives. So I’m gonna introduce my co-hosts here. Real quickly, Chetan and Chris and Ben, why don’t you guys go ahead and get a little background on yourself, starting with Chetan.
[00:01:32] Chetan Conikee: Hi, all. My name is Chetan Conikee. I spent around 20 plus years writing software, sustaining software, and now protecting software. All of these fields are somewhat correlated to each other. And I pretty much consider all my experience irrelevant because my environment is changing around me, which means I have to constantly learn. I’d love to learn with you in this podcast.
[00:01:59] Chris Hatter: I guess I’ll [00:02:00] jump in next. My name is Chris Hatter. I joined Qwiet AI in November. Before that, I spent the last seven years as a practitioner CISO at a, at a Fortune 500 operating in 10 plus countries with large, global infrastructure bases in public cloud as well as on-prem. I’ve had the fortune, or lack thereof, of being able to see, feel, and experience cyber attacks and breaches from the inside.
[00:02:32] Chris Hatter: So I’m looking forward to kind of bringing that perspective to the group and just kind of going through our unique views of cyber.
[00:02:43] Stu McClure: Ben,
[00:02:44] Ben Denkers: thanks, Stu. Ben Dankers, I’ve spent the last 20 years doing security consulting, helping under, helping organizations better understand risks, vulnerabilities, threats to their environments, and ultimately, helping them identify what they need to do from a process and [00:03:00] technology perspective to be better prepared for that kind of ever evolving threat landscape.
[00:03:07] Stu McClure: All right. Well, thanks everybody. Thanks for coming together. I know time is precious, so we’re gonna jump in right away, as we’ve talked about, on some of the topics that we’ve seen in the last couple of weeks, and sort of maybe intimate a little bit about some of the topics that are coming up rather quickly.
[00:03:25] Stu McClure: So one of the more interesting, I guess, topics of the last couple weeks has been around this MOVEit vulnerability or exploit. I know Mandiant, I think, did the first report of it, way back in the beginning of June, and sort of said, well, it affects government mostly. Well, that might be true, but the customers seem to be pretty diverse here.
[00:03:48] Stu McClure: I don’t know if you guys have taken a look at it much, but it’s not just government, local, state, federal kind of stuff; it appears to be a lot of different types of organizations that leverage [00:04:00] this technology to move files, transfer files, basically. It sounds like a SQL injection vulnerability, at least at the beginning.
[00:04:09] Stu McClure: But I think there’s multiple steps to it. And it’s sort of the age-old problem of, okay, you’re building a product, you’re building code that doesn’t necessarily look for vulnerabilities by default. Developers are doing wonderful jobs and doing things very quickly, but might not be thinking about the cybersecurity impact of it.
[00:04:34] Stu McClure: So, SQL injection, deserialization attack, pushing up a web shell — that sounds like how this is working. What do you guys think about this attack? I mean, is it frustrating? It’s certainly frustrating to me, to no end, but, yeah, I mean, Chetan, what’s your thoughts? Yeah, Chris,
[00:04:57] Chris Hatter: I’ll jump in. I think what pops off the page at [00:05:00] me is probably, I’ll start with kind of like the most boring part of this attack. It’s a managed file transfer application. You think about, every organization, for the most part, especially government, large enterprises, everyone needs to move files from point A to point B.
[00:05:15] Chris Hatter: And so this type of technology is not uncommon. And what immediately pops into my head is wondering whether or not people had MOVEit, or managed file transfer, in their threat model. Did they have a threat model? Did they have it in the inventory of software that they’re using? Right. Have they thought about this as a supply chain style attack problem, and what kind of security went into securing that part of their supply chain?
[00:05:41] Chris Hatter: My guess is, for a lot of organizations, this is more or less an afterthought when compared to some of the most common techniques that you would use to prevent cyber attacks, like AV/EDR and firewalls. Like, what did they do around making sure that their file transfer infrastructure was in a healthy state, right?
[00:05:59] Chris Hatter: So it [00:06:00] comes back to: do we know our assets? Do we know what software we’re using? Have we considered how to secure that ecosystem within our threat model?
[00:06:11] Ben Denkers: Yeah, and Chris, how are they validating, right? Like that the organization of, whatever, this week it’s MOVEit, maybe next week it’s another, you know, it’s another platform.
[00:06:20] Ben Denkers: But as part of that risk profile, how are they actually validating that those organizations are doing kind of the basic blocking and tackling to identify some of these systemic issues, right? I mean, cuz we’re talking about SQL injection, which is not exactly a new thing, as everyone on this call is aware, right.
[00:06:35] Ben Denkers: And so, you know, how, as an organization, can you do a better job of validating that the enterprise solutions you’re leveraging are doing the right types of things to secure their own software?
[00:06:50] Chris Hatter: Yeah. Well, I mean, the other complex part of this is if you go invest in a file transfer solution, you don’t have access to that code base.
[00:06:58] Chris Hatter: Like you can’t, you [00:07:00] can’t analyze the code, for the most part. And so when it comes to supply chain security, what are you doing and how are you validating the secure code development practices of your vendors? Or is it a simple 30-question vendor review? Like, what are you doing in that space and what should you be doing?
[00:07:29] Chetan Conikee: Yeah. Let me bring a developer’s perspective, to this, you know, in my opinion, an organization is like a slime mold, your app, your applications are evolving, your staff is evolving, and your technology is evolving. And they’re all happening at different timelines. So if we examine this vulnerability, it’s almost like you have legacy software, modern software. Modern’s the facade to legacy.
[00:07:55] Chetan Conikee: And of course, all your sustainment effort has transferred to the modern, and there’s [00:08:00] less focus on the legacy. So like you stated, this is SQL injection at the foothold, and from SQL injection, it moves to a web shell, which is remote code injection. And from there on it’s command and control. So the question is, as a developer, how do we incentivize and draw focus to ensure that we don’t leave legacy behind?
[00:08:24] Chetan Conikee: Because when we do leave legacy behind, it brings forth this situation. So SQL injection is, of course, legacy, but it’s not that obvious with modern software.
[00:08:35] Stu McClure: Yeah. And at the end of the day, SQL injection is input sanitization, correct? I mean, at the end of the day. So if developers can only remember one thing around managing their code, it is, “Hey, what’s coming in?”
[00:08:50] Stu McClure: Am I filtering this or am I just allowing it to go unfettered? If we just look at it from that perspective, isn’t that killing the 80/20, I mean, [00:09:00] of SQL injection, or at least any kind of injection component?
[00:09:04] Chetan Conikee: Yeah, it’s the Pareto principle playing its game. You know, you still have input sanitization that you have to take into account, and also enforced principles while accessing your database, which is: rather than creating dynamic queries and letting anyone create that query, which could be for a good purpose or a bad purpose, you prepare that query so you have your expectations and boundaries defined.
[00:09:34] Chetan Conikee: To ensure that anyone, even if they try to abuse it, is confined within those boundaries; they can’t exceed them or proceed to make this an attack vector.
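The prepared-query approach Chetan describes can be sketched in a few lines. This is an illustrative example, not code from MOVEit or any product discussed here; it uses Python’s standard sqlite3 module, and the table, column, and payload are hypothetical.

```python
import sqlite3

# A toy in-memory database with one row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Dynamic query: user input is concatenated straight into the SQL text,
# so the payload rewrites the WHERE clause and matches every row.
vulnerable = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Prepared/parameterized query: the query shape is fixed up front and the
# input is bound as data, so the payload is treated as a literal name
# and stays confined within the boundary the developer defined.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # the injected OR clause returns the admin row
print(safe)        # returns no rows: no user is literally named that
```

The point of the sketch is the boundary Chetan mentions: with the parameterized form, even hostile input can only ever be a value, never new query logic.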
[00:09:41] Stu McClure: Yeah. And so, you know, Chris, you talked about threat modeling. I mean, let’s be real. I mean, how many companies out there outside of the Fortune 500–and there are a lot more companies outside of the Fortune 500, right?–how many of them really think about threat modeling as a [00:10:00] major part of their cybersecurity program and then validate that threat model, and the threats in and around the supply chain in particular, um, out there, in your opinion? I mean, I, I don’t see it all that much.
[00:10:11] Chris Hatter: I would argue it’s a very low percentage. I would argue also that for organizations where the data that they have, or the systems that they have, or the products that they sell, attract very sophisticated attacker interest, right — you think government, you think banking, you think healthcare, organizations that are well-funded and well-staffed — this is actually a very important part of their product security journey.
[00:10:43] Chris Hatter: Because it is, it is what really starts the process of making sure that you build secure software. It’s not only looking at the code or the servers and systems or vulnerability scanners. It’s sitting down and thinking about, what are all the different ways and types of attacks that could harm us, and what is the [00:11:00] logic in our app that could be exploited, which has a tendency to be very difficult to secure, as it relates to tools that are out there in the market. So I, I would say that overall the percentage is very low. I think in sophisticated security organizations at a, at a minimum for the crown jewels, they’re doing this.
[00:11:17] Ben Denkers: Yeah, I mean, I think — sorry, Chetan — I was gonna say, in short, right? Like, if you’re looking at it from a consumer-of-product, risk-matrix perspective, right, and you have a solution, whatever that solution might be, that could potentially access or move files across your enterprise, that’s something that organizations absolutely need to start thinking about accounting for from a risk-profile perspective, right? And I think a lot of organizations, uh, aren’t necessarily, uh, thinking about it from that point of view. But these types of incidents certainly highlight the reason why they should, right? And so, on a more conceptual level — what is my risk management process or [00:12:00] program, what applications or solutions am I using as a consumer of products like this, and what risks does that entail for me — I think that’s critical…
[00:12:11] Chris Hatter: I want to kick the baton here to Chetan in a second, but I wanna, I wanna prime him with something. Part of the reason, in my opinion, that threat modeling is not a more common practice is its ability, or inability, to scale, like how they’re conducted. I’m a very firm believer in application security and product security that the developers and engineers themselves need to be prepared by the security team to own their own destiny as it relates to securing what they built. And so what’s our path to engineers owning the threat modeling process and being self-sufficient in it without the cyber team?
[00:12:54] Chetan Conikee: Yeah. Chris, Ben and Stu, you sort of walked into a landmine here and, and you know, allow me to say why, [00:13:00] because, you know, I’m gonna bring in a developer perspective. You, as leaders of security, spoke of threat modeling, right? Uh, modeling is the design phase. And no design materializes until it’s implemented. And then sustained. So assume I’m a developer, I wake up in the morning. I’m in front of my terminal writing code. I’m incented to juggle all my priorities and show that I deliver on time. And now you folks are coming and telling me, write software, secure software. Help me understand how I’m gonna juggle this extra thing in addition to the many things I’m dealing with.
[00:13:44] Chris Hatter: I’ll counterpoint you really quick by saying, by saying you as an engineer, do you wake up every day, get to your terminal and commit yourself to writing quality code?
[00:13:57] Chetan Conikee: Quality could mean many things, right? The perspectives, [00:14:00] you know, if I take my apparatus and measure quality, I’d say performance. That is one thing that I’m perhaps incented to do better because, you know, all my bonuses are tied to performance, but is security tied to my bonus? And security could mean many things, right? So help me understand whether quality and security are the same thing.
[00:14:23] Ben Denkers: Well, I think you brought up a good point, Chetan, right? You need a top-down approach, right? To incentivize from security. And so the organization plays some role in — you call it incentivizing, but — prioritizing what needs to happen and how things should be done. And so, you know, if security is important, as it should be, to the organization, then proper incentives should absolutely be put in place so that it’s less of a question of what you need to work on, and more of what type of resources you need in order to enable, you know, that [00:15:00] desired outcome.
[00:15:01] Stu McClure: Well, I’m gonna jump in as a recovering coder myself, I do take some of that mindset myself. When I was coding, it was all about the cool hack I could do, not security hack, but the hack of speed or efficiency or of doing something that was sort of cool. That was what motivated me, incentivized me every single day. If I had a performance metric of some sort, I would try to maintain to that, but it wasn’t about security until I started to really learn how easy it is to hack. And I wonder if, you know, like you talk about Chetan, I mean, what, what’s the incentive when you’ve not been trained to care about security?
[00:15:41] Stu McClure: Why should you care about security? You know, you haven’t gone through university and taken 10 classes on how easy it is to hack code. Probably not even one class, right, to teach you that. And so I think it’s incumbent upon us in the industry, everybody: if you care about your third-party risk, [00:16:00] and especially in the supply chain, you need to demand to understand the degree to which your third-party providers take security seriously. Do they implement security as part of quality inside of their development cycles? Do they do threat modeling? And that’s something that nobody really does. I mean, I was on the board of, uh, a third-party risk supplier for the last seven years, and they’ve never incorporated secure code into the measure of third-party risk. Like, if you’re gonna do business with XYZ company, are they thinking about cybersecurity? Are they considering it? Are they educating their employees and their developers? Are they measuring it in any way? They don’t touch it. Not even one iota. I don’t know anybody that does. So it is a cultural thing as much as it is a training and education thing. I do believe though–and I’ll get off my soapbox now–that if you can educate a [00:17:00] developer as to how powerful those simple attacks can be, on sort of “unmonitored coding” principles, I think it would greatly empower them. And I’ve seen this in multiple companies prior, where we’ve had to go in to our developers that are building software solutions to expose how simple it is to hack into these systems. I mean, a SQL injection vulnerability, for crying out loud. I mean, I wrote about that in 2001 or 2002, in a book. So it just doesn’t seem to go away anytime soon. So it’s that culture thing that will just drive you nuts.
[00:17:37] Chris Hatter: So I wanna react to the, the quality, like, are quality and security the same? I am a firm believer that engineers wake up every day and they want to do a quality job and they want to do it securely. I don’t think they’re actively like, I’m gonna do this insecure because it’s faster. I think the vast majority of engineers wanna do an excellent job of [00:18:00] quality, an excellent job of security. The example I would use, you said you would measure quality by performance. I would equate performance with availability, as a security practitioner.
[00:18:10] Chris Hatter: Right? And so if your application has some sort of vulnerability or some sort of issue that makes it so not performant that it’s unavailable to its end users, or the group that you’re selling the product to, that’s a quality gap. And so I think that security practitioners, and even the CTOs of the world — it’d be really good to get on the same page around terminology, whether it’s quality, whether it’s vulnerability or bug; we should get the same terminology being utilized. And then the one thing that I would agree with you on fairly heavily, though, Chetan, is I think that the incentive structure is broken, right? Like, we talk a lot about security being incentivized to mitigate risk. Engineers are incentivized to [00:19:00] go fast and build things that can bring in revenue.
[00:19:02] Chris Hatter: And oftentimes those things are competing, right? And how do we kind of reduce the competing incentives? How do we better integrate these teams with shared objectives? And I do think it’s worthwhile to talk about: how do we actually change the incentive structures so that security teams, AppSec professionals, and engineers are working towards the same objectives and compensated on that same basis?
[00:19:28] Chetan Conikee: And very quickly, very quickly, I’ll just offer a rebuttal to Chris and Stu on this topic. Ben, sorry about that. At the onset, I used the word landmine. Let me draw some relevance as to why I said that. Top of mind is this movie that I caught up on Saturday called “Air.” Uh, it’s about,
[00:19:48] Stu McClure: Yeah, I watched it, yeah.
[00:19:51] Chetan Conikee: Jordan. It’s a beautiful movie. I’d recommend it.
[00:19:53] Stu McClure: It’s a good, really good, yeah. Especially if you grew up in the eighties, by the way. It was. Exactly.
[00:19:57] Chetan Conikee: And the reason I’m pulling this movie into our [00:20:00] conversation is there is a scene where Phil Knight’s resting on his couch, and he’s committed to this contract with Jordan, which is unprecedented: a rev share on every shoe sold. So I’m asking both of you, Chris and Stu — you are running point on security. Can you create this unprecedented situation where you commit some percentage of your budget to developers, saying, you know, if you write secure code, some percentage of your bonus is gonna be influenced by that? So that’s why I kind of took you to this landmine. I’m opening my forum to both of you.
[00:20:37] Stu McClure: I would absolutely implement that, but of course, I’ve been doing this way too long. I know the painful impact of not, you know, coding securely way too well. But I think that the masses wouldn’t, you know; they just haven’t seen enough pain, unfortunately, and we haven’t educated on the rewards of doing it yet. And so [00:21:00] anyway, I know we can talk all day on this topic for sure. And, uh, we might just.
[00:21:05] Stu McClure: Stu here again, thanks for listening to part one of Hacking Exposed podcast. We’ll release the other half of the episode in two weeks. Thanks so much for listening.