Cybersecurity Mentors Podcast

Lessons Learned From the Australian National University Breach with Suthagar Seevaratnam - Part 2

Cybersecurity Mentors Season 5 Episode 4

In Part 2, we discuss how a routine firewall rollout at ANU accidentally severed the attackers’ C2, forcing them into noisy, rushed activity that revealed their tradecraft. Suthagar explains the balance between observing for intel and acting to minimize harm, and how transparent, tightly controlled communications—culminating in a readable public breach report—helped rebuild trust. We also unpack why stolen databases without a data dictionary were hard to weaponize, and close with career advice: resilience, empathy, and people-first communication matter as much as tools.

ANU Breach Report

Suthagar Seevaratnam’s LinkedIn 

Send us fan mail via text

Suthagar:

So I think the best cyber people I've met and helped train haven't always come from technical backgrounds. And where you start is not where you end. And the more breadth you have, because you know different walks of life, the better you are.

John:

Could you teach me?

Steve:

First learn stand, then learn fly. Nature rule, Daniel-san, not mine.

Speaker 5:

I know what you're trying to do. I'm trying to free your mind, Neo. But I can only show you the door. You're the one that has to walk through it. What is the most inspiring thing I ever said to you? Don't be an idiot. Changed my life.

Steve:

Welcome back to the Cybersecurity Mentors Podcast. Today we're diving into part two of our conversation with Suthagar Seevaratnam about the Australian National University breach.

Speaker 5:

Yeah, if you missed part one, definitely go back and give it a listen first. We talked about how one of the most advanced university breaches unfolded, from the initial compromise to the incredible persistence the attackers maintained.

Steve:

Exactly. In this episode, we pick up right where we left off, with the ANU team making a critical firewall change that temporarily kicked the threat actors out of their network.

Speaker 5:

Yeah, you'll hear what happened when the attackers got noisy, how the team handled the data breach, and how they managed communications under that kind of pressure in a stressful situation.

Steve:

We also talked with Suthagar about transparency and public accountability, how they chose to communicate openly about the breach, and what that meant for institutional trust.

Speaker 5:

Yeah, plus toward the end, we dig into the lessons learned and some powerful leadership insights, again about communication, and communicating with compassion when everyone's under fire.

Steve:

And if you're an aspiring cybersecurity professional, stay tuned, because Suthagar closes with some great advice on building a meaningful career in the field. But before we jump back in, here's a quick word from our season five sponsor, ACI Learning. Check out ACI Learning at acilearning.com/simplycyber.

Speaker 3:

We aren't here to waste your time with buzzwords. In IT and cybersecurity, what you know and what you can do makes all the difference. We are ACI Learning. Training built for novices and pros who need measurable results. Hands-on labs, real-world checks, courses that get you certified and ready for what's next. Build confidence. Strengthen defenses. Achieve more. Visit acilearning.com/simplycyber to learn more.

John:

And then at one point in the report, it talks about a firewall change that killed their access, and it didn't seem like it was intentional. It was just like, hey, we updated the firewall. Can you talk about that a little bit?

Suthagar:

So we had some firewalls in there, and they were kept up to date and everything, but they didn't have universal coverage. Yeah. And it was one of the first things I found out about, before the threat hunt or anything else. Somebody mentioned it to me. In fact, it was the vendor that rang me. I knew them from a previous job, and they said, Suthagar, you've got to help us. They're not using my firewalls. I think his exact phrase was, they're abusing my firewalls. And I had this image in my head of, what, are they in the data center kicking it or something? Like, what do you mean, abusing? And it was because they hadn't gotten traction in getting them installed, because it would be disruptive. It's really hard to think about now, but at the time there was this real reticence to do anything that might block internet access or anything of that nature. So it wasn't that they didn't have firewalls, they just didn't have full coverage. We were already doing some routine change, and I came in and said, hey, we've got to get these firewalls in place. We can sort out the rules later, just do the firewall change. And there was a bunch of really good people who were trying to push that at the time. To be fair, it wasn't just me; I guess I just added a bit of weight to saying we should do this. So it was a routine firewall change. It was just making sure we had the right spread. And unbeknownst to us, in a very Keystone Cops type moment, we cut off the bad guys, right? But it wasn't some grand plan on our part to cut them off, I assure you. It was just really fortuitous. So I guess the luck was cutting both ways, a little bit for the threat actor and a little bit for us.

John:

Yeah. Well, after that, based on what you guys found, it seemed like they had to work a little bit to get back in. It wasn't like, oh, we can just easily get back in. They had to struggle a little bit.

Suthagar:

They did. They struggled. And what we saw in the logs that we ended up pulling out, I think they were getting a lot more desperate, because they were getting a lot more noisy. Up to that point, their OPSEC was really good. Really, really methodical, very well timed. When they would exfil a file, they would zero it out, wipe it, remove the date timestamp. They had a lot of really good OPSEC involved, which we detail a little bit in the report. Clear discipline involved in that operation. But towards the end, they started to get really noisy. And again, I'm speculating, right? But I get the impression maybe they were running against a clock, or whoever was paying the bills for them said, give me some results. They started to get a lot more noisy and try a lot more things. They were being a lot more vigorous about it. And when we cut them off, it was probably at a really bad time for them, because they thought they'd gotten into somewhere good. Where the enterprise systems were is actually a separate domain, and it's not very obvious, unless you're inside the network, that it's a separate domain. And somehow they got through. There was a bridge and they got through. But they were struggling. They were clearly pulling out things. We know that they downloaded, and I'd love to know to this day what it was, they obviously downloaded some code of their own, some sort of exploit, and tried to run it. We only saw that they compiled it on one of the VMs, so we only see traces of that. We don't know what it is. They tried to run it, and it didn't work. They were obviously trying to get into that enterprise services domain.

John:

Yeah.

Suthagar:

Really, really trying hard. And eventually they got through. They found a password; someone had put a credential somewhere, and they were able to use it. And that's when they were getting really noisy. It was about that time we happened to put in those firewall changes, and of course it just blocked everything, so they didn't have their C2 pathway anymore. So that's what happened. And then they tried really hard to get back in by some other means. And that's when a very sharp-eyed IT person said, these emails just don't look right. I think there was another reason too, though. Again, speculation on my part.

John:

Yeah, yeah. And they were listening. I mean, that was the other thing that I think stood out. Whenever they would get some access, they were listening on the wire, sniffing traffic. Is that correct, based on what you saw?

Suthagar:

They were doing packet captures, sniffing for credentials on the wire, looking for weak protocols. All of these are very basic tools that you can use, and in fact a lot of them were freeware tools. They were just sniffing out any credentials that they could find. But when they got to the enterprise services domain and got access to the underlying databases of these systems, they didn't go through the web interface that would have let them see what this data was. What they got, and it wasn't encrypted at the time, was the actual databases themselves for these enterprise applications. And it wouldn't have made any sense to them without a data dictionary. For once, security through obscurity actually worked. I looked at some of the stuff they would have been seeing and thought, it's not encrypted, but it might as well be, because you'd have, say, a student ID, and then it would concatenate with your date of birth. It was our own weird, hyper-customized data structure, and if you didn't have the data dictionary to go with it, you would have had no chance of reading it. Even us looking at it, with inside knowledge, we were sitting there thinking, what is this code? What does this code actually mean? And somebody in some other division would go, oh yeah, that's this code, that means this. Oh, gotcha. How would I know that? So I think they got that data and looked at it and went, hmm, these guys are good. I think they then wanted to find a way to get into a user that actually had GUI access, that could actually do something a little bit more, like a power user or something.
And honestly, that's why I think, even in the press when I was talking about it, my genuine sense is they didn't get what they wanted, because they didn't know what they had. They just took a bunch of data and then couldn't do anything with it.

John:

And I don't remember if this was in the report, but if you believe they're still in your network at the time, you've got to be pretty careful and cautious about tipping them off.

Speaker 1:

Yeah.

John:

How did that play out? It's like, okay, if we do see them still, we're eventually going to try to kick them out, but we've got to do it carefully and not let them know that we know until we kill that access, right? Is that how it happened?

Suthagar:

Yeah. At the time, because we were still seeing some things, we weren't sure if they were active or if it was something residual that they'd left running. Their activity at the time we cottoned onto them was them trying to get back in. They had some purchase in some other parts of the environment, in one of the research schools. That's what we picked up. So we weren't seeing them in the core of the network where we were most worried; those were more historical traces. But to your point, and certainly in other circumstances that I've had to deal with, when you're up against a live actor, you are trying to balance the intelligence value of looking at what they're doing and how they're doing it, and therefore some of their motivations and the tradecraft they're using, which is obviously very useful to you and to the authorities. It helps you defend better the next time, and it helps you contain them. You've got that. But there's also the harm minimization component, and at some point that way outweighs the intelligence gain. You just have to act. You're there, ultimately, to protect. You try to hold that balance as much as you can, because there's so much valuable stuff you can find out, not just to protect yourself the next time, but to protect others as well. But at some point it becomes too dangerous, too much of a risk, and you just have to act. Okay, if you tip them off, that's fine, good, but get out of my network. But what we really wanted to know, and that's why we split into two teams, as I said before, was how they got in. We wanted to understand those vectors so we could shut them off.
And we wanted to know what they were going after, so that we could put other measures in place to protect that target, whatever it might be. So that's how we did it. We teased it out that way, and we were very, very gentle in our probing. We were trying really hard to maintain our own OPSEC. But at some point you make the call. It doesn't matter what you do, they're going to find out. They're going to know that you know. What we didn't know at the time was that they already thought we knew, because we'd cut them off with the firewall change.

John:

Right, right.

Suthagar:

So no one had told us that we knew. That was the only difference. But yeah, you're absolutely right. And if they had been much more live, there would have been a lot of caution around what we do and what we don't do. But there will always come that tipping point, that tipping point of, okay, it's time to act.

John:

Sure. Yeah. Just a couple of other things. You guys came out with this report, or at least the notification first, maybe not the full report, but the notification of, hey, this is what's happened, and you were pretty open about it, sharing it with the community. That communication is very important, as you said. You've got the stakeholders, you've got students, parents of the students, faculty, and it's on top of a previous breach. Can you talk about communications in general and how that played out? I think you had great support from your vice chancellor. But if you could speak to that a little bit.

Suthagar:

Look, one thing I'll say before I get into the detail of that: he set the benchmark. He really did. While we were advising him and helping write what was ultimately put out, he had a lot of input into it, and it was his voice at the end of the day. And he wanted to own it. He owed it to his constituency, his students, his fellow academics, the professional staff members. He owed it to them, and he felt very strongly about that, as did I. We have to own it, and we have to be humble about it. We have to say we're sorry. And, as I said earlier, you have to always be in control, and being in control is also being in control of the narrative. Because if you're not controlling the narrative, somebody else is controlling it for you.

John:

Right. Right.

Suthagar:

To your earlier point about not tipping off the threat actor, we made sure that even the media wasn't tipped off. There was so much control for those two weeks as we hardened, because what we thought was going to happen, and it did, was that as soon as we went live, it might create a reaction from that threat actor, likely, but more likely from others who might just be opportunistic and see if they could get in while we were still down and out. That's why we tried to keep the time from when we first knew to when we went public short. It was about a two-week period. One week to marshal our resources and get started, and one week to just batten down everything that we could. We didn't want to let it go beyond two weeks. And people called us out, saying, why did it take two weeks? Well, in the grand scheme of things, it wasn't actually that long. But people talk out of anger, and I appreciate that and I acknowledge that. Two weeks is a reasonable time, especially when you're trying to make sure there's no further harm. And the day we released that first comms, the very first day, we were attacked again. So we were exactly right to have done what we did, because we would have incurred more harm had we not. From a harm minimization perspective, I think we were dead on the money. But the comms was really aimed at saying, we are sorry. We own it, we will do better, we will fix this, and we'll do everything in our power to help make sure that your digital identity is protected as much as possible. And just a couple of hours before we went live with those first comms, we got a very angry phone call from the media, because we had told our ministers, told the politicians, that this was going to happen.
And that was the first the media had heard about it, and they were really angry: you didn't tell us, and so on. I said, well, we had no reason to tell you. You had no need to know. And they were a bit stunned that somehow we had kept it so compartmentalized and still managed to do all of this.

unknown:

Yeah.

Suthagar:

Especially in a town like Canberra, which is not that big, that was a proud moment for me, that we managed to make them so annoyed, right? But it drew a line in the sand saying, we are in control of this narrative. It is us who will tell you what happened. You can speculate all you like, but there is an authoritative voice to go with this, and that authoritative voice is the vice chancellor. He speaks for all of us. And he was so deeply apologetic. Prior to that, I hadn't seen anyone, certainly in Australia, come out and be that humble and that open. But subsequent to that, I saw so many other people adopt exactly the same language, the phrasing, the humility. No one showers themselves in glory from a data breach, right? But you should be humble about it, and you should own it. So that was the first set of comms, and we were trying to keep people apprised, doing updates as much as we could, and we gave people practical advice. It was the first time anyone had heard from me officially, so it was, who is this guy? Oh, we've got a CISO now. It was my first foray into this, and then we ran some town halls where we fielded a lot of questions, and we answered them as best we could with all the information we had at our disposal. We were trying to be open and transparent. And I still remember the other really interesting moment. We were about to shut down the crisis management team, and we were in the main hall, this very large conference room that the university has, its main boardroom. And we're all sitting there around this massive, august table.
All of us who'd been involved, and the person who was leading the crisis management team, she said, right, okay, we can close this down now. We've done everything we need to do. Okay, job well done, everyone. And I put my hand up and said, there's one more thing we need to do. She said, what is it? I said, we need to do a breach report. And everyone was like, yeah, yeah, we definitely need to do a breach report, absolutely, very important. And I said, yeah, and then we need to make it public. And you watch 30 people in the room, it was like something out of an ad; they all turned around, looked at me, and went, you're nuts. And Brian, who was sitting next to me, said, he's absolutely right. He's absolutely right, we must do this. Because, for the same reason I said before, we live in daylight. We're the victim, and we made mistakes, we own those mistakes, but we owe it to all of you, our community, to tell you what we're going to do next about it. You have every right to know. And in fairness, and with all due credit, some of my inspiration for that actually came from what the Singapore government had done after a breach of their environment that happened probably a year before ours. I was so amazed at how accountable that government held itself for that breach. Having watched so many other organizations in Australia not do that, here was this government whose incident response review panel included the Prime Minister. Wow. And they were writing this report about, this is what happened. I think it's probably still up there somewhere, right?

unknown:

Yeah.

Suthagar:

And it blew me away that they were so willing to do that, and how open they were. As much as I'd love to take credit for us releasing our report, I was inspired a little bit by that. I thought, that is the right thing to do. Kudos, that's absolutely the right thing to do. The only difference we made was that their report was very technical; you'd have to be a real cyber boffin to follow it. We didn't want to do that. We wanted to write it for everyone, for a broader audience. And we ended up using that breach report as training and presentation material, on courses and in outreach. It had a life of its own. People have come to me from other institutions saying, we use that as training material, as education material. That wasn't our intent, but the reason for it was that it was written to be read, written so that anybody could pick it up and understand it. That's fundamentally what that report was about. It was our compact, our social contract with our community, to say: we own this. We're giving you all the details we possibly can, which is the vast majority of it. The only bits we left out: we didn't name and shame, and because we were still in a fairly fragile state, fixing things up, we didn't want to release certain details. We didn't want to be attacked again. So that was fundamentally why that report was released. And I was really blown away by how well it was received. I was very interested to see how people responded, because we kind of hoped others would follow. And I remember one organization ended up in front of our Parliament, and the senators at the time were saying, well, are you going to release a breach report?
And this poor person was saying, oh, we have one, but we're not going to release it. And the senator said, oh, but ANU released their report. This poor, hapless person was like, yes, we all applaud what ANU did, very good, excellent, but we're not releasing our report. And I thought, oh well, everyone has to make their own call. But we're a public institution. We're taxpayer funded. We have a very open, democratic environment, dare I say, full of academic freedom. And we owed it to the spirit of that place, to ANU, to do that. Otherwise we would have done a breach report, shared it with various stakeholders, and then it would have become shelfware. To this day it has its own life; years later, people are still talking about it. And I'm pleased that people got some value out of it. I really am. My only regret: straight after I wrote that report, and it had been a long month or two, I took a couple of weeks' leave. I said, I need to decompress. I need to go fix myself. And I just stayed at home, not doing anything other than being at home and relaxing. And I made a rookie mistake, John. Rookie, rookie, rookie mistake. After that report was released, I read my own press. And I read it on LinkedIn. I didn't have much of a presence on LinkedIn back then, and all I can say is, predatory self-promotion is how I can best describe it. People were having a go at the report as though it was fake and we were using it for deflection. There was all this, you know, nobody would do this unless it was a fake thing, and someone was being really narky, saying, I'll check it for viruses because it probably has a virus. It was all this really narky stuff.
And I just sat there thinking, oh my God, what have I done? I've led poor ANU up the garden path. Oh dear Lord, what have I done? This is terrible. I thought I was doing the right thing. It worked for the Singaporean government; how come it didn't work here? I have completely misjudged this. And then Richard reached out to me, Richard Gold. He said, hey, I've done this analysis of your report. What do you think? He was the only person I made actual contact with. So many other people reached out to me on LinkedIn seeking comment and all the rest of it, and I didn't want to talk to them, partly because I just wanted a break, right? But Richard actually entered into a dialogue, and I said, I cannot thank you enough for doing this. And I will confirm for you that you got 98% of it correct. And the other 2% you couldn't have gotten; there's no way, because it wasn't in the report. You did it brilliantly. And just when I thought I was losing faith in anyone with cyber in their title, you come along and do this. You've restored my faith in my industry. Thank you so much. And I said, but, you know, I really love you and I hate you at the same time. He said, why do you hate me? And I said, could you have done this two weeks ago? If you'd done this two weeks ago, I would have been a happy person. I would have enjoyed my holidays. I ended up meeting him in London a year or so later and took him out to dinner. I said, I owe you this dinner, because you restored my faith and you did such a good job. And I'm so, so thrilled that there are people out there who genuinely gravitate to the right message of this.

Speaker 5:

Right.

Suthagar:

And who have the expertise to unpack and reverse out the detail that we had to, not hide, but turn into lay language. And you were able to do that. That doesn't just speak to your ethics or your drive to do this; it speaks to the incredible professional skill that you've got, because only someone with that kind of skill could have done it. I really do tip my hat to him. He's one in a million as far as I'm concerned. If you ever want someone on your podcast, find him. He's definitely someone I would listen to, someone I'd want to mentor me. He's just absolutely brilliant, and UK-based. So that was our report. As I said, it got used a lot. I still have people reaching out to me saying, hey, do you know I use this in my course? Do you know they teach this at this particular place? No, I didn't. I didn't even put my name on it, because it belonged to the ANU. It belonged to all of us. It wasn't my report. So I was very touched by that. Brian went on to speak to his peers in the university community, the other VCs, the federal government. In every presentation I do, I have a slide from his remarks. He started off by saying, I want my pain to be your gain. He said, I want you to take advantage of what we have gone through to protect yourselves. And so many people from the rest of the sector came to me afterwards and said, your report helped me start a conversation with my VC, or whomever, so that we could get more funding. If that report didn't do anything other than that, I'm so pleased. So incredibly pleased that it helped others in a way that we may not have intended originally.

John:

Yeah, it's just lessons learned. And before I get to my questions, I will say it helps to relate the stories from other institutions, especially higher ed institutions, that are similar. We have similar challenges. When you're communicating, hey, this is what happened, this is how it happened, this is how they were affected, here's what they learned from this situation, then we can learn from them. Let's not go through the same thing if possible. If there's one thing we can change that makes us better, makes us stronger, it relates. So your report being that example, right? Other universities can look at that and say, look, everybody has similar challenges, so let's work together to try to prevent this from happening here. I've done that with other breaches, just trying to help each other, because the bad guys really are working together, right? And we can help each other in ways like this.

Suthagar:

Yeah, I think, you know, that's going back quite a way, six or seven years now. But I think things have changed a little, for better and for worse. There have been so many breaches that it's not as salacious in the press anymore. And I think it's pivoted a little from, hey, I don't want to disclose that this happened to me because it's shameful, to, hey, this has happened to me, I have to report it in a regulatory way, I have to be transparent, I have to be apologetic, and all that kind of good stuff. But I haven't seen it tilt into full transparency. I haven't seen a lot of that yet. I'm still hopeful that will happen. But I have seen a more accepting dimension to it. Having said that, I've also seen class actions and legal actions come much more to the forefront. I was reading a report a couple of weeks ago about a mapping of class actions here in Australia against missing controls, going through the controls list of what an organization didn't have and should have had. So wow, this is the level at which the legal response now operates, right? And I sat there wondering, is that going to aid transparency or not? Because from a defender perspective, exactly to your point, John, we should be working together and sharing this so that we're all safe. One saying that I have over and over: harm to one is harm to all. And if we don't share, we're going to be weaker. We're much stronger together when we share, when we're open, and when we're not judgmental of one another. Because I'm sure a lot of people read that report and went, there but for the grace of God go we, right? Yeah. We shouldn't judge one another when these things happen. We should be helping to make the whole ecosystem more resilient.
It's probably not a view everybody shares, but it's one that I hold very strongly.

John:

Yeah, I agree. So, a couple of questions. Just in general, looking back, you've talked about a few things you might have done differently, but are there any lessons you would share from living through that breach? Because if you're in security, you're going to experience incidents. You're going to have a bad day. You're going to have a bad thing happen. Yeah, you will.

Suthagar:

You definitely will.

John:

So, anything you would share with those folks? I think some of it is just the things you said, the compassion and understanding and strengthening each other, like, hey, we're going to get through this. But is there anything else you would do, or say, or look at differently?

Suthagar:

There are things I think we could have been a little more on the front foot about. You've got to think about the comms to the public really carefully. It wasn't just the first wave of comms, which I think we did reasonably well, it was the subsequent set of comms. For example, we brought in an entity called ID Care, a not-for-profit organization that can help people restore their identities. We should probably have done that right at the beginning. There should have been some sequencing to that messaging, putting ourselves in the shoes of an affected member and asking, okay, how would you feel? How do we tailor that message a little more to you? How do we make sure that process isn't too clunky? Because unfortunately, some of that we were doing a little on the fly. Sure. Looking back on it now, it didn't go badly, but it could have been smoother. We set up a call center and surge capacity and all that kind of thing, but some of the comms in that aftercare period I would have done a little differently with hindsight, and with knowing the constituency a lot better as well.

John:

Yeah.

Suthagar:

At the time I was only six months in, so I didn't know them well. So there are things I would have done a bit differently. The advice that I gave, you know, here are some controls you might want to apply at home, strong passwords, password managers, all that kind of stuff, was a bit stilted. Somebody said to me afterwards, it sounded like somebody from corporate wrote it, not you. Funny thing is, I actually had sent it to corporate. And they said, that doesn't sound like you at all. And I thought, oh, you're right. That authenticity needs to be there all the way throughout, that warmth, that authenticity, always reflecting that there's a human being who's been affected at the other end. Right. Particularly when there's PII involved. It's their personal information, it's theirs. You're a custodian that, beyond your control unfortunately, has allowed that to go out into the wild. And someone at the other end is going to be very angry because they trusted you, and they will see it as a breach of trust. You need to factor that into your comms. So my reply to anyone is, be really mindful. I do worry, because we've seen so many breaches now, every day there's another breach. I think there was one for Tenable today. Oh, oh no. You need to go look this up. It's really sad, right? It's really hard. But you don't want to get so normalized to it.

John:

Yeah.

Suthagar:

That should never happen. There is always a human story behind it. So that's a reflection of what we could have done a bit better. I think we could have been talking about it a little more. I did a lot of presentations, and maybe it was just a sign of the times, to be honest, because I think things have moved on a little since, but I didn't find a willingness in a lot of people to sit there and talk about it. One thing that we did reflect on and do something about: we had journalists come down looking for outraged students. They wanted quotes from outraged students, and all they got was a meh. And I thought, oh my goodness, what have we done to these students that they think a breach is so passé, right?

Speaker 1:

Yeah.

Suthagar:

Oh my goodness, what have we done? There's a cynicism to this that we didn't think about. We subsequently started putting that into our cyber training, which we did ourselves, and we started using humor to cut through the cynicism. Yeah. I still have copies of them: the students were making memes about the breach while we were trying to do comms. They kept me suitably entertained. I've got to thank all the students for doing those comms, doing their own personal comms. I thought they were hilarious, but it just shows that we didn't understand them, and the messages we were putting out were a bit stale. If I had my time again, it would be a little different in that respect. Know your audience, I guess, is the lesson there. Know them really, really well and tailor compassionately to them, because that's the difference between them walking and them sticking around, because they understand that, you know, you're the victim.

John:

Yeah. No, that's great. Last question. Any advice for those looking to get into cyber, or early in their careers in cybersecurity? A lot of the folks that we talk to are trying to break in. Sure. And there's a lot here that we've unpacked, and thank you for that. It could be scary, like, hey, this is not every day, fortunately. But there is a lot that can happen, and some of the things that I think about and try to communicate are, you have to understand that you're going to have those days that are tough. And we in security need to be able to work together, have good communication, all the things you talked about. But is there any advice you would give to those that are early in their career or looking to get into cybersecurity?

Suthagar:

So two things I would suggest to people. Number one, it is a job that demands resilience. Some of that comes from compassion to oneself as well as to others, but some of it is also an internal fortitude, because you're going to have those bad days, and it feels like a marathon interspersed with bursts of 100-meter sprints. It can be tiring, and not everything is inside your control. And you're going to be frustrated, because you know what to do. And most of the time you're not in incident response. Most of the time you're not in that world.

Speaker 1:

Yeah.

Suthagar:

Most of the time you're trying to strengthen things to stop incidents from happening, right? You're in resiliency mode, not response mode. And you're going to come across people who still don't have MFA on their account, or something ridiculous like that. Like, why can't my password be "password"? Right. And you sit there thinking, have you been living under a rock for the last decade? Do I really need to do this? Is this some kind of stitch-up? And it's really deflating, because you know that something so simple can stop a lot of harm from happening. Yeah. An analogy I use, and it's probably the same in the US: on beaches in Australia we have two flags, and you're supposed to swim between the flags, right? But that doesn't stop people from swimming outside the flags. To me, cyber is like a swimmer has managed to get caught in a riptide because they swam outside the flags, and you're the lifeguard who's had to go in and drag them back to shore, kicking and screaming all the way. And then finally you get them up onto land, and the first thing they do is pick up a handful of sand, throw it in your face, and say, how dare you save me, sir? And then walk off back into the water, and you think, what just happened? You're going to have days like that, because that is normal. Yeah. You're dealing with human beings. And if you're going to be a good cyber person, be a student of human beings, right? Really understand the psychology that goes into turning the dial. That is a far more powerful weapon than EDRs or NDRs or SIEMs will ever be, right? It is a cultural thing. At the end of the day, what is a breach other than the outcome of a sequence of bad decisions? So stop people from making bad decisions, right? You've got to guide them.
That's the whole premise. You are a guide. Yeah. And you've got to guide them. And that means you've got to know the mindset of the person you're dealing with. It might not work the first time, and you might sit there thinking, hmm, I'm not sure we're speaking the same language. Yeah. I've got to try again. If you don't get the response, you change the message, right? You've got to change the message to find something that works for them. Get good at that. Get really good at that. So I think the best cyber people I've met and helped train haven't always come from technical backgrounds. And where you start is not where you end. And the more breadth you have, because you know different walks of life, the better you are. You know, one thing I used to do with my SOC team at ANU: we used to stage visits to other parts of the university, so they'd tell us about all the fun things they were doing with their science experiments and so forth. And it was ridiculously cool, because nowhere else in the world could you get that, right? But I did say to them, you're never going to understand this place looking at your screen. That does not tell the real story. It doesn't tell the why. So if you think tool proficiency, what I call buttonology, is enough on its own, you're on completely the wrong path. So I highly recommend to people: be a student of human nature. Get out there and talk to the people who are most affected by the decisions you make, by the things that you see. We talk about pattern of life as a mechanism to understand the data. Well, go and see the life. Go and see why this person is doing what they're doing.
Maybe they don't want to do MFA because they're in a lockdown environment and they can't take their phone in, right? Yeah. Oh, okay, well, let me go and help them. You need to have that mentality. They're not doing it because they're silly. They're doing it because they don't understand, or they need to be guided to a place that's free from care. You know, no one truly wants to be unsafe. So I think it's about having that mentality. And the last thing I would say is, if you haven't done something other than cyber, take the time out to do something different. Sounds really weird. I've seen a lot of conversations around whether cyber should be your first job, and there is a very compelling case as to why maybe it shouldn't, because there is so much intersection and so much broad thinking involved. It's such a broad and ecumenical church, right? There are so many avenues, technical and non-technical, from which you can start. That's what I love about being a CISO: it's so multidisciplinary, so transdisciplinary in the thinking that you've got to bring to bear. My recommendation is, if you do want to be a CISO one day, take a step out and go work in the business, whatever it might be, to see security from the outside in and go, hmm, okay, my job is not to increase friction, my job is to reduce friction. My job is to make these people's lives easier. Even just stepping out for half a year, a year, your perspective will broaden so dramatically, and you're going to be that much better a cyber professional, especially if you're going into leadership roles. So I highly, highly recommend that.

unknown:

Yeah.

John:

Yeah. I totally agree with all of that. And that's great, because we've said similar things in previous episodes about how important the human side is, the soft skills, understanding people. Speak human. If you can't speak to people, it's going to be challenging, and for a lot of technical people, it is a challenge. They're very, very smart, they know the ins and outs, they know the ones and zeros, but they have a hard time, I would say in IT in general but also in security, communicating, especially to leadership. Like that report: putting it in layman's terms so that a student could read it and say, okay, I get it, right? And I think, too, that's a superpower you can bring, and it's a skill you can learn. That's a good thing, right? It is a skill. Some people have it more, it comes easier to them, but you can definitely get better at it and keep getting better at it. And that's a great thing, because it's not just a, well, I'm not a great speaker. You can get there with practice and training. Absolutely. Yeah, definitely. Well, Suthagar, thank you so much. This was great. I really appreciate your time going through all this, and your perspective on everything was great. There are lots of lessons from this, but also the concepts and the mindset behind how you all approached it, and behind making the report public. That's why I found it, and I've used it and talked about it with my team, and I'll continue to use it. Now I'll even have more data and your stories behind it to help me in my role. So thanks so much.
Thank you for having me here. It's been a pleasure. Yeah. All right, everybody, that is the end of our episode. Thank you.

Steve:

And a huge thank you to our sponsor for season five of the Cybersecurity Mentors Podcast, ACI Learning. You can check out ACI Learning at acilearning.com slash simply cyber. Thank you for tuning in to today's episode of the Cybersecurity Mentors Podcast.

John:

Remember to subscribe to our podcast on your favorite platform so you get all the episodes. Join us next time as we continue to unlock the secrets of cybersecurity mentorship.

Steve:

Do you have questions or topics you'd like us to cover? Or do you want to share your journey? Join us on Discord at Cybersecurity Mentors Podcast and follow us on LinkedIn. We'd love to hear from you.

Speaker 5:

Until next time, I'm John Hoyt, and I'm Steve Higaretta. Thank you for listening.