Season 1  |  Episode 3

Episode 3 of Hacking Exposed, Qwiet Edition! is coming at you from Black Hat 2023 . . .

Stu McClure, Chris Hatter, and Ben Denkers are back, this time with Arun Balakrishnan.

In this episode, our heroes tackle:

  • The ins and outs of openly pwning Minecrafters
  • Is there more to jailbreaking a Tesla than meets the eye?
  • The OWASP top 10 for LLMs might be a bit of a reach
  • Lessons from Zimbra
  • Being super late to AppSec
  • When is an incident not an incident?


Resources for this episode:

  • A write-up on deserialization in Minecraft
  • A write-up on the Tesla jailbreak
  • OWASP’s recent top 10 for LLMs (even if we think some of them are a little silly)
  • Associated Press on the SEC’s new rules for disclosing breaches


Show Notes:

  • [00:01:29] A little Java nostalgia
  • [00:03:09] Witnessing a pwning
  • [00:05:00] Explaining deserialization attacks
  • [00:07:18] Even our CISO would have fallen for this, back in the day . . .
  • [00:07:58] The stakes are higher now.
  • [00:09:38] A message for the kids.
  • [00:09:53] Shout out to Wolfenstein.
  • [00:10:02] But seriously, why pwn a gamer?
  • [00:10:41] Even hacking is an industry now.
  • [00:12:04] Jailbreaking a Tesla may be just the beginning . . .
  • [00:13:36] Stu says self-driving should be free.
  • [00:14:01] Arun reminds us that Tesla still knows who you are.
  • [00:14:48] Stu drops the (row)hammer.
  • [00:17:00] OWASP is raising awareness of LLM vulns.
  • [00:17:26] Ben says you should think about social engineering attacks on LLMs.
  • [00:18:45] Stu makes an easy call.
  • [00:20:14] Don’t let LLMs write your code.
  • [00:20:30] Cross-site scripting is still a thing.
  • [00:22:29] Chris and Ben on the real issue with vulnerabilities like these.
  • [00:24:11] The SEC has entered the chat.
  • [00:24:40] Semantics as compliance (?)
  • [00:26:09] What to say when you don’t know what to say
  • [00:28:10] Practice makes perfect


Episode Transcript

[00:00:00] Stu McClure: Well, welcome everybody. This is the Hacking Exposed Podcast, Qwiet edition. I have my trusted heroes with me to talk about the latest and greatest in threats, vulnerabilities, exploits, breaches. You name it. We love to talk about this stuff all day long. And at Qwiet, this is what we try to prevent, not in production but pre-production, while writing code.

[00:00:32] So this is what we’re working on, and here are some of the things I think are most interesting from the last couple, three weeks. One of which I wanna just get kicked off right away ’cause I love this stuff. My kids used to develop in this and play this game all the time, which is Minecraft.

[00:00:48] If you haven’t, you know, looked at it, you’ve probably been under a rock somewhere for the last 15 years. But they have a great hack that just came out, and it is what we [00:01:00] call a deserialization attack. And we’re gonna explain a little bit about what that really means, but it allows anyone to do, well, just about anything.

[00:01:10] And so I wanna talk about this a little bit. It is in the platform. So the platform itself, Minecraft, is based on and written in Java. Java, of course, is one of the many languages we support here. We also have an AI model for Java, so it can find not just the known threats but potentially the unknowns out there.

[00:01:29] But it’s written in Java, and there is the option of mods, is what they call it. So, modifications: basically you can add libraries of features, functionality, things of this nature. Well, of course, what do you think those mods are written in? Java. And I remember my kids back in the day, especially my youngest son, Cameron.

[00:01:47] He had a really good friend, Noah, actually, and I helped teach him Java to write mods inside of Minecraft, ’cause I thought, well, okay, big cool gaming thing, I wanna be the cool dad. So I decided to [00:02:00] get into it a little bit. And, you know, of course I’m not a Java developer, but it’s simple enough, straightforward enough, so I started to help him.

[00:02:05] And so it brings back a lot of these memories. So this is how the basic attack works. And I’m curious, you know, about your guys’ opinions or your experiences here. But with Minecraft, and with any Java, of course, there is this function called serialization and deserialization. Basically, you need to take something and transmit it across to something else.

[00:02:29] In the case of Minecraft, it’s client to server, server to client. And that’s what these mods largely do. And so these mods are written in Java, typically leveraging the Java libraries. There is a function inside of the Java library that allows for serialization and deserialization. Well, there was a vulnerability in one of those.

[00:02:50] And so what it allowed people to do, and by the way, I watched the video. Did you guys watch the video as part of this? Oh my gosh, it was hilarious. This poor kid, he couldn’t [00:03:00] be older than 19 years old, is live streaming his full Minecraft experience, you know, it’s like a Twitch feed kind of thing where they’re like, go on, everybody subscribe to everything.

[00:03:09] So he’s there. And you could see every step of the way where, okay, the bad guy gets on, but he doesn’t know he’s a bad guy yet. The information is shared inside of the chat, where he’s like, well, this is weird, he shouldn’t have access to this. And it just gets worse and worse, and it falls off the cliff

[00:03:27] really, really fast. You should watch the video. It is worth the hour and a half, even if you put it on 4x speed or something like that. It’s pretty funny. But the poor kid, your heart just breaks for him, ’cause you know his world’s going to, like, suffer for the next week or two, cleaning up this mess.

[00:03:43] But basically he started to discover that this one particular user on his server was having access to all of his assets inside of Minecraft, and he’s like, well, hey, how do you, how do you have access to this? [00:04:00] And the guy just comes right out and says it: oh, this is a zero-day, you know, RCE that we discovered, I’m a Russian hacker and this is what we do, and, honestly,

[00:04:11] Ben Denkers: you’ve been pwned and

[00:04:12] Stu McClure: you’ve been pwned, just straight up. I was like, okay, that’s cool, man, when you can just come straight out with it. And he didn’t quite believe it. He’s like, ah, whatever. And then he’s like, oh shoot, this is bad. This is really, really bad. So, all right.

[00:04:25] So there are two things that were really obvious that you can do with this vulnerability, and then we’ll talk about the deserialization itself. One, you can take on the persona of anybody else. So you can get the session cookies. Mm-hmm. You can get tokens, basically whatever assets you have in the game, based on those elements.

[00:04:44] Well, of course, then someone else can get ’em, because they can take the session and run with it into their environment. So that’s first. Second is, yeah, you could do a full-blown RCE, so remote code execution, on a client and on a server, and any client connected to any server. So it’s, like, near infinite.[00:05:00]

[00:05:00] So a really, really ugly one. And these Russian guys, which, you know, I wouldn’t put it past ’em at all, it seems like part of the fun, took advantage of this core vulnerability in this library, which is deserialization. Now, what is deserialization? So the whole concept is you have to, as a programmer, send data from one place to the next, and then receive data from one place to the next.

[00:05:25] To do that, you can’t just send it sort of as is. You need to serialize it, and then you need to deserialize it. You can think of it as almost like you hash it and unhash it, although hashing is the wrong analogy ’cause it’s a one-way thing, but you’re basically encoding it in some form or fashion. And so when you receive the encoded message, you know, you have to know how to deserialize it, or decode it.

[00:05:43] Well, if that deserialization, that decoding part, is vulnerable to an attack like an overflow or something of that nature, and you’re receiving raw data from the outside world, it will then trigger that vulnerability to do [00:06:00] whatever you want to do. I mean, is that about right? You know, you’re the developer programmer here.

[00:06:06] You should be kicking me under the table if I’m saying something wrong. But that’s the basic gist of how those attacks work. Right?

[00:06:11] Chris Hatter: A tangible example of this is, in video games, loading a previously saved game. That, to me, when I was looking at this, was the example that connected best with me.

[00:06:23] So, in a video game, when you’re playing, the game is occurring in RAM, right? And when you save it, you need to save the state, and you save it onto your hard drive. When you reopen the game, you gotta get it back. And so the serialization and deserialization is happening.

[00:06:37] And that’s, that’s like a tangible example that I took.
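Chris’s save/load analogy maps directly onto Java’s built-in object serialization, the same mechanism at the heart of the Minecraft story. A minimal sketch, where the `GameState` class is purely illustrative and not anything from Minecraft itself:

```java
import java.io.*;

// Sketch of the save/load round trip: a "game state" object is serialized
// to bytes (the save file, or the network payload between client and server)
// and deserialized back into an object.
public class SerializationDemo {

    // A toy saved-game record; implementing Serializable opts it in to
    // Java's built-in object serialization.
    public static class GameState implements Serializable {
        private static final long serialVersionUID = 1L;
        public final String player;
        public final int level;
        public GameState(String player, int level) {
            this.player = player;
            this.level = level;
        }
    }

    // Serialize: object -> bytes.
    public static byte[] save(GameState state) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(state);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bytes.toByteArray();
    }

    // Deserialize: bytes -> object. This is the dangerous direction: when the
    // bytes come from an untrusted peer, readObject() instantiates whatever
    // serializable classes the stream names, which is exactly what
    // deserialization exploits abuse.
    public static GameState load(byte[] data) {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return (GameState) in.readObject();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        GameState restored = load(save(new GameState("steve", 42)));
        System.out.println(restored.player + " " + restored.level); // prints: steve 42
    }
}
```

The round trip itself is benign; the attack surface is calling `readObject()` on attacker-controlled bytes, since the stream, not the receiving code, decides which classes get instantiated.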

[00:06:39] Stu McClure: Sure. Actually, yeah. You can serialize to the disk, you can serialize to another networked PC or app or service or whatever. So it’s that process. And so it’s the age-old problem of input validation, right, at the end of the day. And you either trust a library to properly handle input from the [00:07:00] outside world, or you don’t.

[00:07:00] Mm-hmm. And in this case, you trusted it and it was vulnerable, so you got poked. And so that one’s a fun one. A fun little homage, I think, to the kids out there.

[00:07:10] Chris Hatter: It got me thinking. So I was a big video gamer, you know, when I was in middle school, high school, even in college to a certain extent.

[00:07:18] I played Counter-Strike a lot, and there were a lot of mods in Counter-Strike. And I remember just not even considering whether or not I would open up my game to download mods, right? And so I would go into a server that had mods, and, you know, if you actually looked in the console, you would just see the downloads happening, like a whole list of them, right?

[00:07:37] You just want the cool stuff, and you think nothing of it. I just wanted new skins on my guns, and, you know, new maps and all the features of the mod, and I thought literally nothing of it. And it’s actually kind of like a microcosm of kind of users, even in the enterprise environment, of just . . .

[00:07:52] Sometimes you just want the best, the fastest thing, and you start to not think about what might be happening behind the scenes. And this is an example of that.

[00:07:58] Arun Balakrishnan: I think back then you had [00:08:00] fewer assets to lose as well, right? Like today,

[00:08:02] Chris Hatter: true

[00:08:02] Arun Balakrishnan: in this attack, I think I read, people were taking Steam session keys, like you said.

[00:08:07] Mm-hmm.

[00:08:07] Chris Hatter: Discord creds and everything.

[00:08:08] Arun Balakrishnan: Discord creds and everything, which are on the machine. So, I mean, you can lose those, you can be impersonated, and, absolutely, there are many more things to do today,

[00:08:16] Ben Denkers: This, this hits home, because I have kids that play Minecraft, right? Yeah. So I still have kids of the age that are at home playing Minecraft.

[00:08:22] I think it’s a good lesson, you know, to highlight–

[00:08:24] Stu McClure: Yeah. So what’s the fix, right? The fix is, you know, A, try to just use your own private server. Don’t open it up to anybody. That’s one way. I mean, a lot of this is security through obscurity, of course, and we will get to the real core fix. Another one is, you know, don’t allow mods. Like, just keep straight-up Minecraft and just play, you know, play with your friends.

[00:08:44] Stuff like that. The true root-cause fix, well, that’s, you know, real simple: you have to fix it inside each mod. And there are dozens and dozens of mods that use this specific vulnerability. So you really have to update all these mods [00:09:00] basically, at the end of the day. But it doesn’t prevent a zero day.

[00:09:02] I mean, that’s really dependent on, you know, the original vulnerability in the Java library.

[00:09:08] Arun Balakrishnan: Yeah. And then the private server thing, I think it’s tricky. I mean, if you want your friends to play, you need to expose it. And then in the article I was reading, there are active attackers out there who are scanning servers to see if they have these mods installed.

[00:09:22] Yeah, for sure. Yeah. So keeping it private, I think, wouldn’t work. I think fixing it is the only way to go. Yes. Yeah.

[00:09:28] Stu McClure: Fixing it is the way to do it. Yeah. And so my kids will love this story. All of them, especially Jillian, my youngest daughter. Man, she was nuts about this Minecraft stuff

[00:09:38] Chris Hatter: I never got into mine–

[00:09:38] Stu McClure: Still plays. She’s in college, and she still plays it. Love her to death. So, honey: be careful. Don’t use mods.
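For what it’s worth, modern Java also ships a built-in mitigation for exactly this class of bug: since Java 9 (JEP 290), an `ObjectInputFilter` can allow-list which classes `readObject()` may instantiate, so an attacker-supplied stream naming a gadget class is rejected before it is ever constructed. A sketch, with an illustrative `SafeState` class standing in for the one message type a mod legitimately exchanges:

```java
import java.io.*;

// Sketch of a JEP 290 deserialization filter (Java 9+): only the classes on
// the allow-list may be deserialized; everything else is rejected with an
// InvalidClassException before any of its deserialization logic runs.
public class FilterDemo {

    // Stand-in for the one message type a mod legitimately exchanges.
    public static class SafeState implements Serializable {
        private static final long serialVersionUID = 1L;
        public final int score;
        public SafeState(int score) { this.score = score; }
    }

    public static byte[] toBytes(Object o) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(o);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bytes.toByteArray();
    }

    // Deserialize with an allow-list filter: SafeState is permitted, and any
    // other class appearing in the stream ("!*") is rejected.
    public static Object loadFiltered(byte[] data) {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            in.setObjectInputFilter(
                ObjectInputFilter.Config.createFilter("FilterDemo$SafeState;!*"));
            return in.readObject();
        } catch (IOException e) {
            // Filter rejections surface as InvalidClassException (an IOException).
            throw new UncheckedIOException(e);
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        SafeState ok = (SafeState) loadFiltered(toBytes(new SafeState(7)));
        System.out.println("allowed, score=" + ok.score);
        try {
            loadFiltered(toBytes(new java.util.Date())); // not on the allow-list
        } catch (UncheckedIOException rejected) {
            System.out.println("rejected: " + rejected.getCause().getClass().getSimpleName());
        }
    }
}
```

The class names here are assumptions for the sketch; the point is the allow-list pattern, which inverts the default of trusting whatever class name arrives on the wire.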

[00:09:44] Chris Hatter: I’ve seen streamers play. I never personally got into it.

[00:09:47] Stu McClure: Yeah, I know, I know.

[00:09:49] Chris Hatter: it’s huge with kids.

[00:09:50] Stu McClure: It gives me migraines. Is that how old I am? Like, if I do that 3D thing anymore. I played Wolfenstein when it first came out and I started to get these migraines, and I realized, okay, 3D’s not for me.

[00:09:59] My brain was not [00:10:00] developed that way, and even to this day. Minecraft, I just dropped it.

[00:10:02] Chris Hatter: I want to, just before we get off the topic though, get into, like, the mind of the attacker a little bit, ’cause we were talking stealing Steam credentials, stealing Discord creds. I get the impersonation angle. Like, is there a big prize for doing this on the side?

[00:10:18] Arun Balakrishnan: I mean, the remote code execution part. You could install a botnet, you could install a malware network. I mean, imagine the number of machines that are connected to all these servers, and imagine that becoming a botnet. Yeah.

[00:10:29] Ben Denkers: Or they’re on their parents’ computer. Right. Which has access to who knows what, right.

[00:10:35] Oh, yeah.

[00:10:36] Stu McClure: Credentials left and right.

[00:10:36] Chris Hatter: I like the, you know, the botnet angle. I mean, you can get a lot of resources very quickly.

[00:10:41] Stu McClure: Well, there are teams that all have their function out there in the dark world, right? Like, they’ll have a whole team on nothing but initial access. That’s all they do. They just sell initial access, and they get as much as they can. It’s a numbers game for them, and then they sell that access to those that are gonna go mine it.

[00:10:57] I know there’s a whole hierarchy and structure, and each one [00:11:00] has a different sort of series of costs and expenses, stuff like that. But it’s very professionalized now, and it’s, like, it’s nuts. So, yeah. All right, well, we learned our lesson, I think, on this one, and hopefully, kids out there, you’ve learned one as well.

[00:11:17] All right, the second topic I wanted to cover was the Tesla, supposedly the Tesla jailbreak, as they call it. There have been some hints of other work in the past, but I think what they’re gonna show here at Black Hat may be one of the first, like, true legit ones: jailbreaking a Tesla infotainment system, basically, to get access into features that are typically charged for.

[00:11:43] Right. Is that basically the gist? Mm-hmm.

[00:11:46] Chris Hatter: In a nutshell?

[00:11:46] Stu McClure: Yeah. Yeah. And so, I mean, I don’t know if you guys are watching it or going to that session, but it’s tomorrow, I think, on Wednesday.

[00:11:53] Chris Hatter: Pretty sure it’s not gonna be very easy to,

[00:11:55] Stu McClure: it’s not gonna be easy to get in there now, but hopefully you can hack it up or somebody, one of our,

[00:11:59] Chris Hatter: we’re gonna [00:12:00] get plenty of press on it, for sure. There was a lot of press leading into it, so we’re gonna get plenty on the back end.

[00:12:04] Stu McClure: Yeah,

[00:12:04] Ben Denkers: it’s interesting, ’cause they could write code, right? To the infotainment center itself. Which means, like, for me, this is like the base of what’s to come, as opposed to

[00:12:13] Stu McClure: you hit it right on the head. Right on the head.

[00:12:15] Like I, I, when I was reading,

[00:12:16] Ben Denkers: it’s the beginning guys,

[00:12:16] Stu McClure: when I was reading through the details on this one, I have a nose for this kind of stuff. Yeah. And if I had a few of the guys that used to work for me, like Barnaby Jack, and a few others, Riley and those kind of guys, I would’ve said, dude, there’s more in this.

[00:12:30] Like, go deep on this hack. Yeah. Because there will be the ability, I think, if I read it correctly, to run anything I want on the damn thing. Yeah. Yeah. So if you can run anything on that, then you can get on the CAN bus. If you can get on the CAN bus, then you can typically, okay, it does depend on the architectures, et cetera, et cetera.

[00:12:50] And, you know, I come from a long history of IoT and especially vehicle security. But I can tell you the likelihood is high that you could basically control the [00:13:00] Tesla itself.

[00:13:01] Ben Denkers: It was a unique co– it was a unique compromise, right. By manipulating voltage, I think, which is something you don’t typically see.

[00:13:07] Stu McClure: Yeah. Voltage manipulation,

[00:13:07] Chris Hatter: I mean, and the only fix is to get a different processor.

[00:13:11] Arun Balakrishnan: Yeah. Yeah. That’s the thing.

[00:13:12] Chris Hatter: And, by the way, you need physical access, too,

[00:13:14] Stu McClure: you do need physical access, which is important to know,

[00:13:17] Chris Hatter: but it is a precursor to where we’re going. Right? So Tesla’s been one of the biggest players in basically making features software in their vehicles, right? And some other players are doing this too, but this is gonna be the way of the world, I think, in the future. Heated seats, software over the air; the Autopilot software, over the air.

[00:13:36] Stu McClure: Yeah. Full self-driving. I like this one. Yeah.

[00:13:39] Chris Hatter: I’m a Tesla owner and I paid for Full Self-Driving. Yeah. Which I think is pretty amazing technology.

[00:13:43] Stu McClure: And you overpaid. It should be zero.

[00:13:46] Chris Hatter: candidly, I think, I think I did overpay. But this vulnerability is leading us to a place where, if these jailbreaks are out there and they’re in any form or fashion made simpler, you’d save [00:14:00] 20K by just using

[00:14:01] Arun Balakrishnan: one thing there is, it’s like the whole PS4 hack and the Xbox hack and stuff, where, like, you are phoning home.

[00:14:09] So the Tesla network would know that you are doing this. Right. So, I mean, your terms and conditions could effectively, like, stop you from using the car.

[00:14:18] Ben Denkers: Well, and we just talked about the dangers of adding mods in Minecraft, right? This is no different. If you jailbreak the car, think about what happens the next time you plug in your phone: now your phone potentially could run code on the infotainment center, for example.

[00:14:31] Right? And so, like, to me, the setting of the stage of being able to have access to the features is one thing, but the long-term implications of all of the potential downfall of it being compromised, somebody doing something malicious, I think it just opens it up to a much larger discussion.

[00:14:48] Stu McClure: Well, it does, but I think the technique is something interesting. It reminds me of Rowhammer. Do you remember Rowhammer?

[00:14:55] Ben Denkers: I do not.

[00:14:56] Stu McClure: So, so on Intel CPUs, and I [00:15:00] was an Intel employee, so I’m gonna be very careful not to disclose any information that’s inside there. But basically, Rowhammer was a public exploit exploiting

[00:15:12] how the Intel CPU read memory. So, the way that memory in DRAM chips was read from the CPU, effectively, as you continued to write into memory, there was an offset, just a smidge of an offset, whereby when you placed something on purpose into memory, you could trigger the EIP, the execution pointer, to point

[00:15:40] to the place where you had already placed your code, your executable code, something of that nature. And it was a timing issue, a sequencing issue, something like this. Now, I desperately tried to build an exploit on that sucker at Cylance. I had the brightest minds I’ve ever touched on planet Earth look at it.

[00:15:57] And we could not get it to consistently [00:16:00] work and exploit, but I’m sure someone else, you know, with a lot more resources could have gotten it done. But it does remind me: these physical kinds of hacks are very, very real. We sort of forget about that. Yeah, yeah.

[00:16:12] Chris Hatter: Because we’re all... I just think they’re gonna be more common, right? The more software-oriented vehicles become, the more desire there is to actually execute this type of attack. Right. And there’s always been a desire to do this remotely. We’re at Black Hat, a long history of hacking cars, right? But the desire to do this stuff over the air will increase dramatically as well.

[00:16:32] Stu McClure: All right, we gotta move on. We’re running short on time, as much as this is a fun topic. I think we also wanna talk about, let’s say, the AI/ML large language model threat landscape. There’s a lot of talk right now, especially at Black Hat, and I’m sure it’s only gonna increase. I just did an interview with a press guy that actually used to be a Cylance employee, so a really smart guy, and he was very attuned to all the latest and greatest threats,

[00:16:58] mostly centered [00:17:00] around injection stuff, although there are a lot bigger issues, I think, that are potential there. But this is gonna be a story that I think we’re just gonna have to keep tracking. There is now an OWASP top 10, I think it just came out. Yeah. A top 10 just came out for LLMs. Yeah. Now, if you read it all... I mean, I don’t know if Qwiet had any role in this, but it seemed a little forced. Like, there were a few in there

[00:17:21] I’m like, okay, like, master of the obvious. Are you really calling that a vulnerability? Like,

[00:17:26] Ben Denkers: to me, the first thing that I thought about was, you have to train LLMs against social engineering techniques, right? Because, like, generally speaking, most of those would be basic social engineering attacks that you would have to worry about with any individual, right?

[00:17:39] Stu McClure: What do you mean?

[00:17:39] Arun Balakrishnan: like asking it to disclose things that it shouldn’t.

[00:17:42] Ben Denkers: Oh, right. Yeah, yeah, yeah. Absolutely. Right. It’s, it’s around how you’re communicating with the model itself.

[00:17:47] Stu McClure: Yeah. It seemed like a stretch to fill 10, okay, I gotta be honest. But God bless ’em. Get it started, get the conversation started, get the discussion going.

[00:17:54] Chris Hatter: Yeah. I mean, I think that that’s the point, right? With what we [00:18:00] saw with OpenAI, and how quickly LLMs are becoming embedded in production software and enterprise software, there hasn’t been enough thought, I think, on this subject, and on what all the ramifications and implications of using LLMs are.

[00:18:14] I’ve been talking to a lot of CISOs over the last couple months, and I would say virtually everyone that I talk to has basically dropped the hammer on this, like banning the use of LLMs, because they can’t control what people are putting into them. They don’t want trade secrets, right, or confidential information.

[00:18:27] Mm-hmm. I think just starting with the awareness, and what the OWASP group did, is a step absolutely in the right direction. We’ll see where the list of 10 goes, but there hasn’t been enough thought, I think, into how we’re gonna make sure these things are safe.

[00:18:41] Stu McClure: This will probably be a consistent topic on this podcast.

[00:18:45] I’m predicting, right here, right now: we’re gonna have to be talking about this every time we meet, ’cause there will be greater and greater and deeper and deeper, not just theoretical vulnerabilities, but actual exploitation. Yeah.

[00:18:56] Ben Denkers: Especially as integrations get more and more complex.

[00:18:58] Chris Hatter: Now, our company, I mean, we’re [00:19:00] looking at how to use LLMs in application security and code suggestion, code fix. Yeah. I mean, this is a very important topic for us in our own organization. I think what we have seen, though, is that it’s not only in the AppSec space but other spaces. People, I think to latch onto the marketing buzz, jumped in immediately and said, well, we have LLMs embedded in our product. And now we’re seeing just how, I’ll use the word vulnerable, just how susceptible they are to various types of attacks. They’re susceptible to just outright lying. I think you did some early queries and it said you were dead. Here you are.

[00:19:39] Stu McClure: Oh my gosh. My famous, my famous story. Yeah. On my birthday, my CFO decides to ask ChatGPT for a joke that I might find funny on my birthday.

[00:19:48] And so, sure enough, it responded with: I’m sorry to inform you, but Stuart McClure passed away two years ago, in May of 2021. My heartfelt [00:20:00] condolences to you and his family. And she said... so, an AI hallucination story.

[00:20:05] Chris Hatter: I mean, imagine these hallucinations in a very serious context. Yeah. Something as serious as application security code fix suggestions.

[00:20:11] Yep. This is not something you can really mess around with, and it’s not,

[00:20:14] Stu McClure: You do not want to take recommendations from ChatGPT on how to fix code. Mm-hmm. Just, please, God forbid, don’t do this. Okay. But anyway, I think this will be a common discussion point, so let’s hold off maybe on the bulk of the meat for next time.

[00:20:30] All right. A couple more to cover. One is, there was a recent cross-site scripting vulnerability inside of Zimbra. Zimbra is collaboration platform software, which has all kinds of email, chat, things of this nature that allow for easy collaboration. And sure enough, there was an age-old cross-site scripting vulnerability, a reflected cross-site scripting.

[00:20:52] I know there are a bunch of different types, stored and so on, and, I don’t know, mm-hmm, you probably know them better than me. But this one I like to call [00:21:00] out simply because, again, it is a web-based vulnerability. It is a very simple input sanitization vulnerability, and vendors, manufacturers, software vendors, are simply prone to this all the time.

[00:21:13] But it’s super, super easy to catch, and we’re not catching these things. I mean, Arun, why are we not catching these things? Not we like Qwiet, but the industry. Why aren’t we? You know, I’m sure Zimbra, and I have no dog in the fight there, I wish them the best, and they did address it quickly, so I think that was positive.

[00:21:31] Yeah, but why aren’t we finding these things?

[00:21:33] Arun Balakrishnan: I mean, I think if you look back at the OWASP top 10 for the last two, three, four revisions, cross-site scripting and SQL injection are always at the top. Yeah. That doesn’t change. Yeah. Right. Because these are vulnerabilities in operations that you have to do within an application.

[00:21:48] Stu McClure: Right.

[00:21:48] Arun Balakrishnan: And you cannot say, my application will not communicate over the web, will not talk to a database. Now, when you do millions of these operations, then yes, mistakes happen. Yeah. Right. And now I think [00:22:00] I’m seeing more of access control and all of that coming up in the list, because you have more and more web applications coming out.

[00:22:06] Yeah. So, yeah, I think it’s the nature of the beast: when you do a lot of these, mistakes happen.
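The recurring mistake the group is describing, reflecting user input back into a page unsanitized, has a small canonical fix: encode HTML metacharacters before output. A hand-rolled sketch for illustration only; production code should reach for a maintained encoder (for example, the OWASP Java Encoder) rather than this:

```java
// Minimal sketch of output encoding, the standard fix for reflected XSS:
// any user-controlled value echoed into HTML has its metacharacters
// replaced with character entities, so a script payload renders as inert text.
public class HtmlEscapeDemo {

    public static String escapeHtml(String input) {
        StringBuilder out = new StringBuilder(input.length());
        for (int i = 0; i < input.length(); i++) {
            char c = input.charAt(i);
            switch (c) {
                case '&':  out.append("&amp;");  break;
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#x27;"); break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // A classic reflected-XSS probe, neutralized before it reaches the page.
        String query = "<script>alert(1)</script>";
        System.out.println(escapeHtml(query));
        // prints: &lt;script&gt;alert(1)&lt;/script&gt;
    }
}
```

This is exactly the kind of mechanical, well-understood transform that static scanning is good at verifying, which is what makes the "we're simply not looking" point sting.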

[00:22:12] Chris Hatter: I’ve got a quick take on this. Yeah. So what is the code coverage of security-by-design techniques across all these applications? Sure. Things like this, if you’re looking for it, you’re generally gonna find it.

[00:22:29] It’s not super complex stuff. If you’re scanning right, if you’re using a product like ours, you’re gonna find this type of thing. I would argue that in the vast majority of cases where it’s an input validation related issue, you’re simply just not looking.

[00:22:42] Ben Denkers: Yeah.

[00:22:42] Chris Hatter: And so, to me, it’s like, we keep coming up with these things. Is the penetration of these products, that make such a difference in security by design, where it needs to be? It doesn’t feel to me like the penetration is where it needs to be.

[00:22:56] Ben Denkers: Like the governance around the AppSec program.

[00:22:57] Chris Hatter: Right. And we too, we talk to prospects on a very regular basis, [00:23:00] large organizations, that are green-fielding AppSec programs in 2023.

[00:23:08] Ben Denkers: Which is mind boggling.

[00:23:09] Chris Hatter: Right? And I mean, it’s just the reality. It

[00:23:11] Stu McClure: So, are they scanning, is what it is. But are they scanning and not finding, or are they finding and not fixing?

[00:23:16] Chris Hatter: It’s both.

[00:23:17] Stu McClure: You think it’s both.

[00:23:18] Chris Hatter: It’s both. No questions about that.

[00:23:19] Arun Balakrishnan: I think you said they’re not even scanning.

[00:23:21] Chris Hatter: Well, what I’m saying is you have a contingent that doesn’t do anything. Yeah, yeah. Yes. So obviously you’re gonna have these kinds of issues. You have another contingent where you’re scanning and just not fixing, and then you have a third contingent that’s scanning and fixing, and that’s typically not the type of player that you’re gonna see here,

[00:23:36] Ben Denkers: or, or even the partial. Right.

[00:23:37] Chris Hatter: The other issue is, I mean, you may have a player in that third category that just simply hasn’t gotten to legacy apps, or hasn’t gotten to some, you know, part of their application base. Yeah. To me, this is all about coverage, coverage and execution. A big deal.

[00:23:55] Stu McClure: I know, it’s a sad state of affairs. I mean, I’ve been doing this a long time, but

[00:23:58] Chris Hatter: it never, it never, [00:24:00] it’s not, it’s,

[00:24:00] Stu McClure: honestly, that’s what I said in 1996 and ’97, like, when I first started getting into this field. So, like, it’s a sad state if we’re still there. All right, let’s move on.

[00:24:11] The last topic, I think, on the docket: the SEC has come out with the, man, the potential mandate in December, given certain timing, et cetera, for public companies, thank goodness, not private and VC-backed, PE, non-public, to disclose a breach, or an “incident” as they call it, within four days now,

[00:24:36] Ben Denkers: gonna be interesting.

[00:24:37] Stu McClure: Now this is gonna,

[00:24:38] Ben Denkers: it’s gonna be interesting.

[00:24:40] Stu McClure: I was talking to the CISO of a very large company this morning. Chris knows who I’m talking about, and he was really clear. He was sort of like, yeah, that’s not gonna be fun. But it just means that everything is simply under investigation.

[00:24:57] Chris Hatter: There are all kinds of creative ways.

[00:24:57] Stu McClure: Nothing is gonna be called an incident. [00:25:00] Yeah. And so that’s how they’ll get away with it. I mean, what do you guys think?

[00:25:02] Chris Hatter: I look at this overall as a positive thing. On the breach notification timeline: when you’re talking about large global multinationals, these organizations already have pretty tight breach

[00:25:16] notification periods. If you’re operating in Europe, you have GDPR; you’ve got CCPA here in the US. There are a lot of regulations out there that require you to report relatively quickly. It’s tight, but it’s unlikely that you’re not already dealing with a regulation that’s got some strict requirement.

[00:25:32] Yeah. From a macro perspective, though, I think this continues to elevate the security conversation in the boardroom. I think it gets the CISOs having more dialogue with their executive management team about how they’re gonna engage here. At the end of the day, I look at this as a macro positive.

[00:25:49] Yeah, it’s complex. But I would say lean on the incident response procedures that you have, leverage experts, and then the one thing I will call out is you have to be extraordinarily clear [00:26:00] between your team, your management, and your board about what materiality is. What is material for you? Because that is a big linchpin of what needs to be reported.

[00:26:09] Arun Balakrishnan: Yeah, so that’s my question: is four days even enough to understand what exactly happened? Or are you announcing something that will only lead to more issues and complexities?

[00:26:20] Chris Hatter: I would say, and Ben’s got even more experience than I do on this, it’s hit or miss. I mean, four days.

[00:26:29] Typically, you don’t know the full story, or you may not know the full story. It’s definitely not the case that the expectation is you must communicate the entire situation in the four-day period. But if you know enough that it crosses the materiality threshold, where it’s relevant to your shareholders and other parties,

[00:26:49] that’s where you need to report it now. Details are gonna continue to come out. When I had reporting obligations in my previous role, we [00:27:00] were actually very communicative. And that was even a risk on our own side, where we had to figure out: does that make sense?

[00:27:06] Is that responsible for us and for the regulators? We were super communicative, and we always had good luck doing that. We would say outright, I don’t know all the answers right now, but this is the situation.

[00:27:18] Stu McClure: I know, but this is fairly clear. They define it as an incident, specifically, not an investigation.

[00:27:25] So you could keep an investigation open almost in perpetuity before you’d classify it as an incident. Now, the moment it triggers from an investigation into an actual incident, that’s a real question mark. How do you define an incident versus an investigation?

[00:27:38] Ben Denkers: That right there I think is the big, big question.

[00:27:39] Stu McClure: I think so. It’s gonna be important to define that, or to not define it on purpose so you have some wiggle room, ’cause once you do define it, well, now you’re bound to it, beholden to it. But at the same time, you don’t want somebody coming in and saying you never reported this incident when your definition is totally different. So it’s gonna be a little tricky, I think.

[00:27:59] Chris Hatter: Yeah, these [00:28:00] definitions, I would always say, define it. You need to define it and gain alignment across your organization. If you do that, then yes, you should follow your own internal procedure consistently. And I would say this is something you have to drill.

[00:28:10] You guys probably did hundreds of tabletop exercises, thousands even. This is something where you put the definitions in place and you practice.

[00:28:20] Ben Denkers: Yeah. And you’re gonna be held accountable to your policies, right? In a lot of industries, depending upon what your operating plan looks like, if you’re not doing what you say you’re doing, you have a problem anyway.

[00:28:32] Right. Especially when it comes to an incident or something else from a security program perspective. And this idea of reporting in four days is gonna be tough for a lot of organizations, for sure. But I think it will drive something good in the long run.

[00:28:47] Stu McClure: Right. It could be a full employment act for professional services.

[00:28:50] Ben Denkers: Hey, I mean, you know, it is what it is, but,

[00:28:53] Stu McClure: Well, I remember my first reporting at a public company was when I was with Intel, McAfee Intel, when [00:29:00] they disclosed the Operation Aurora hack in their 8-K. And I think it was one of the first, if not the first, public acknowledgements of an incident.

[00:29:09] And they did that without any mandate or requirements whatsoever. I was pretty proud of them for doing it, and they didn’t get a lot of credit for that. I was obviously hands-on for that operation, which was largely, we believe, China-attributed individuals getting into Google and Intel and all kinds of stuff.

[00:29:30] But I was really happy that they did that, because you do need to disclose that kind of stuff. I’m not against any of that disclosure. Full disclosure of what’s material is fine, right?

[00:29:40] Chris Hatter: Yeah, I think that’s what I said earlier.

[00:29:40] Stu McClure: It’s gotta be what is material now.

[00:29:42] Chris Hatter: That’s why I say you gotta define that and it’s gotta be super clear and everyone needs to understand it from top down.

[00:29:47] Stu McClure: Well, or you just make it super amorphous, so no one can say you screwed up.

[00:29:52] Chris Hatter: I personally wouldn’t, right,

[00:29:53] Stu McClure: Like, come on, let’s be honest. Okay. Well, let’s see. Any other thoughts on that [00:30:00] SEC rule? You think it’s good, bad, indifferent?

[00:30:03] Chris Hatter: I think it’s a good thing.

[00:30:04] Ben Denkers: Yeah, I think it’s good. Time’s gonna tell how difficult it’ll be for organizations, but overall I think it’s a positive thing.

[00:30:10] Stu McClure: Yeah. Okay. All right, cool. Well, of course we’ve got a lot of extra-credit topics, but we’re gonna have to push those off to the next session. Hopefully you guys will get out into the Black Hat conference a bit, see a couple of cool things we can take back to the ranch and start talking about.

And hopefully everybody listening to this also gets a chance to come by the Qwiet booth while you’re here. And look out for us in a couple more weeks; we’ll be bringing you more content. All right, thanks everybody for joining.


About ShiftLeft

ShiftLeft empowers developers and AppSec teams to dramatically reduce risk by quickly finding and fixing the vulnerabilities most likely to reach their applications and ignoring reported vulnerabilities that pose little risk. Industry-leading accuracy allows developers to focus on security fixes that matter and improve code velocity while enabling AppSec engineers to shift security left.

A unified code security platform, ShiftLeft CORE scans for attack context across custom code, APIs, OSS, containers, internal microservices, and first-party business logic by combining the results of the company’s static analysis and Intelligent Software Composition Analysis (SCA). Using its unique graph database that combines code attributes and analyzes actual attack paths based on real application architecture, ShiftLeft then provides detailed guidance on risk remediation within existing development workflows and tooling. Teams that use ShiftLeft ship more secure code, faster. Backed by SYN Ventures, Bain Capital Ventures, Blackstone, Mayfield, Thomvest Ventures, and SineWave Ventures, ShiftLeft is based in Santa Clara, California. For more information, visit:


See for yourself – run a scan on your code right now