The intersection of Security and Human Behavior ft. Cassie Clark
TLDR;
Thanks, Cassie, for the fun conversation. Here's a summary:
- To enhance security program effectiveness, infuse security into the daily practices of engineering and other teams. A simple example is doing a security review as part of the code review process in the SDLC.
- Training and Awareness programs should be continuous and actionable with a focus on behavioral risks in addition to technical controls.
- Choice architecture plays a major role in implementing security programs at organizations. Examples include offering a choice of password managers rather than mandating a specific one, or supporting multiple MFA providers versus requiring a single one.
Transcript
Host: Hi everyone, this is Purusottam, and thanks for tuning into the ScaletoZero podcast. Today's episode is with Cassie Clark.
She's passionate about bringing humans into security. She has worked in security awareness programs for eight years at organizations ranging from small businesses to mid-size companies to enterprises, evangelizing the power of behavioral change.
She believes the most impactful security culture efforts use behavior design methodologies, and she can often be seen holding a cup of coffee. I'm super excited to have you on the episode because I'm curious about how humans play a role in security. So thank you so much for coming on the podcast.
Cassie Clark: Thank you for having me, and I do have my cup of coffee intact and ready!
Host: So before we start, for our audience, do you want to briefly share your journey? Like, how did you get into security?
Cassie Clark: Sure, happy to. So I think like many people in the industry, I have a very unusual, circuitous route to security. So I have many years of work in nonprofits, mostly. So things like community management, program management, some administrative, some marketing communications, things like that. I have a grad degree in women's studies, and so I did some teaching and things like that.
And over time, I was ready for a change, for many reasons, and decided, for whatever reason, to look into working in tech. So I looked at what skills I had that were transferable, and one of those was community management. I got very lucky that the first company that hired me, Salesforce, was looking for a security community manager. They were very clear that they could teach security to somebody, but they could not teach community management skills. So I was able to make that jump, and it brought me into a whole world regarding security and humans and culture. I didn't even know these were jobs before. It's been great.
I've been in this now for eight years. It has evolved wildly, and to be fair, the entire industry has evolved over the past eight years as well, both the security industry and the subset of security awareness, security culture, human risk management, whatever you want to call it.
But I really started at sort of that community management level, working with developers and thinking about how I could help them be more secure, how I could get them training, things like that. That slowly became all-employee engagement, and then it became all-employee culture, plus a little bit of internal communications. And then I decided that I wanted to try to do everything, and I ended up at two different midsize startups.
At both of those, I built the awareness program from scratch, so it was everything from training to engagement to communications, but also some opportunity to pull in behavior, which, as we get to talking, you'll see is sort of the tenet of how I operate. Then it started to include things like user enablement: how can I help our technical teams be more successful, since I'm the person who understands how all of our other employees operate in the day to day?
It looks like thinking about how to apply behavior first and then considering some of the traditional inputs we have, like training, instead of the traditional approach, which has been: we have to do a training, so what behaviors do we want to look at? Trying to flip the script a bit to be, frankly, more impactful and more useful. I was laid off in January. I'm very excited that I have a new opportunity coming up, but it really gave me an opportunity to spend a few months focusing on how I wanted to dig deeper into behavior and make that the primary focus.
And so really looking at how we could use behavior design in particular and focus on catching people within, and I'll get into this, their automatic thought processes, instead of asking them to stop and remember a thing they may not be primed to remember in the moment, and then thinking about how to apply that to the security landscape. I'm very excited to be here and share more.
Host: So first of all, you have a very unique journey into security, right? Most folks start as pentesters or other hands-on people; they focus on the technical side and slowly move toward the human behavior side. But you started with human behavior, which is often the key in any engineer's life, right? Security engineer or non-security engineer alike.
So I'm… I'm curious to learn from your experience as part of today's episode. But before we start, one question that we ask everyone, all the guests, and we get unique answers to it, is what does a day in your life look like? So what was your day like when you were working with organizations and helping them build the right security culture or doing the behavioral analysis so that you can implement those in the security programs?
Cassie Clark: Yeah. What I was doing in my last job, what I've done in the meantime with consulting, and what I'm about to do are all a little bit different. So it could look like chatting with technical teams about some of the data they receive, or it could be setting up partnerships with various teams so that I can enable them to be more secure in a way that has less friction, frankly. It could be something as simple as content creation.
You can't get the message across if you are not coming up with compelling and actionable content, for example. And all of those things will translate well across all of those different experiences, regardless of whether you are just starting in awareness, you are running an awareness program like I was in my last few jobs, you're doing consulting work, or you're taking on this next opportunity, which is very focused on behavior. All of it will require some form of partnership engagement, really carefully crafted communication, and now a lot more of the behavioral data analysis.
Host: So communication, the cultural part of security, is a very vast topic. I hope we can cover a few of the areas today. Today our focus is security and human behavior, and how behavior plays a role in that. You have often written about humans and their impact on security, and about security's relationship with humans and their behavior.
When you speak with security practitioners, they often say that humans are the weakest link in security, right? So we would love to learn your point of view on this topic. In your experience, what are some of the biggest human behavioral factors that contribute to the risks or the attacks that we see nowadays?
Cassie Clark: Yeah. So it has to do with the way that humans operate, and this is true for almost every human. I'm going to summarize as quickly as possible, but if anyone is curious to learn more, there is a really fantastic book called Thinking, Fast and Slow. It's written by Daniel Kahneman, and it's incredible research.
And it says that up to 95% of the way that we think and act is driven by our automatic brain. This is the brain that does not think about the action we're taking; it's subconscious, it's reflexive, it operates based on habit building, etc. And as little as 5% is deliberate, methodical, conscious thought. So when you put that into the framework of security, where we often tell people they need to slow down, they need to take a training and remember this thing, but their context, their environment, is not set up to help them succeed in this area, it's no wonder that we see people who feel that humans are the weakest link.
On the flip side, if we were to enable them, if we were to bring security to them, if we were to infuse security into already existing processes, if we were to nudge them at just the right time, if we were to essentially make security easier for people, they could actually be one of the strongest lines of defense that we have.
And the challenge is that attackers have manipulated our behavior for years. They've understood this since the beginning. We're just not catching up.
Host: Yeah. So the book that you highlighted, I read it, and it is amazing. Often you don't think about how your brain acts when you're in a situation, but that book highlights some of those things really well. So we touched on how humans think, and so a slight bit of psychology, right? So
how can organizations leverage that knowledge of human psychology to understand, or even predict, security incidents that could occur if we don't train people the right way?
Cassie Clark: Yeah, and sometimes it's not even about training. Sometimes it's about how can we take a decision away from a human, for example, if they don't need to make that decision, why would they need to? Why should we expect them to take that on? And sometimes it's about making sure that we can enable them as much as possible, but they still have to make that choice. And so for me at least, being able to learn about even the basics of human bias has been really helpful. And all of this is related to psychology and neuroscience.
Things like habit formation, how long it takes to build a habit, and what it requires are really helpful, even just as a base knowledge. Honestly, one of the most impactful things I've learned is something I wouldn't necessarily expect organizations to learn on a deep level, but even a base-level understanding of it would really change the way they approach some of this.
And that is that the majority of how we behave is driven by our context, not by our individual choices. So what I mean by that are things like our time. So are we pressured for time in other areas? If I'm a salesperson, how many competing deadlines do I have? And again, these are decisions that are often not made consciously. Or our physical environment.
So for example, if we're in an office and we want people to shred documents with personal information on them, how far away is the shredder? Do I sit on the second floor while it's on the 13th floor, and I need a special access code to get into the elevator, versus is it just down the walkway from me?
And those factors, unfortunately, often come into play subconsciously. It's not that people want to make those bad choices, those less secure choices; it's that that's the easier thing. And that's what humans have evolved to do: be lazier, conserve energy.
And so thinking about some of those things from a contextual perspective can probably significantly enhance the way that we approach security and make it a lot more effective for other people to be able to engage in security in the way that we want them to.
Host: I've never thought about the shredder example that you gave: that if the shredder is close to you, you will shred often, versus it being on a different floor. But it makes a lot of sense. As you said, humans are designed to be lazy, so they're like, yeah, maybe I'll do it once a week instead of every day. That could become an attack vector in a way, right?
Even though it's not intentional. Based on what is available close to us, we make some of these decisions subconsciously rather than deliberately.
And some organizations invest a lot in security awareness programs, things like, hey, you have to shred every day, I'm just using that as an example, and set up practices around it.
And then there are some simulators as well, like Secure Code Warrior or Hack The Box, which often lead to fatigue for the employees, and even for the security teams.
Can you shed some light on security fatigue and how it impacts employees?
Cassie Clark: Yeah. And I think this gets back to how we can enable people, which is sort of my goal with bringing some of this information out. There are some really great programs out there. Secure Code Warrior actually is one of the better ones in my experience. I know there are a lot of newer ones I haven't had the chance to test yet, but I've heard great things. There are also, frankly, a lot of very bad vendors, and I won't name them. We probably all know them; they pop into our heads when we think of a bad vendor.
And those are the ones that really lead to security fatigue, because if you're not presenting somebody with content that is actionable, clear, and relevant to the work they do or the company they're at, and this is where a lot of the very neutral vendors, unfortunately, are going to be less impactful, not only are they not absorbing what you want them to learn, they're also taking it on as time away from the job they are currently trying to do. And frankly, despite the fact that security doesn't often want to hear this, security is not everybody's job.
Everybody has a responsibility to practice secure behaviors, but that's not their job. Their job is whatever their main job is. And so sometimes the more we add on in a given timeframe, the more we add to this feeling of pressure, of, that's the word I'm looking for, obligation.
And we don't want them to feel obligated per se. We want them to feel excited, or at least, you know, bought in. Finding that right balance, I think, can be really tricky, and it comes down to understanding what those specific employee needs are.
And as much as we're passionate about security, we don't need them to do every little thing. We can offer optional things if they're interested, but really they should be doing the core of what they need to do.
So let's say, for Secure Code Warrior, if developers need to learn some sort of coding challenge, you send them the ones they need to learn. They can obviously do all the other ones, but we can't expect them to want to come to a whole month of events in October for security awareness month, right? So it's thinking about the different ways we can offer things to people without mandating buy-in, how we can make everything very clear, very actionable, as concise as possible, and really show that we respect their time as well. I think that's a big part of it.
Host: Yeah, makes a lot of sense. One of the things that you highlighted is the attitude that if I'm not in the security org, maybe security is not my responsibility, right? We got this question from Brooke from Rad Security. She asked: how can security teams make other teams aware that it's a shared responsibility model, that it's not only the security team that is responsible for security? And in order to make this happen,
How can security leaders get board or leadership support? What are your thoughts on that?
Cassie Clark: Yeah, I think this is such a great question. I'm super happy Brooke asked it.
So I think the funny thing is that everybody talks about building a security culture, and my approach actually looks at how I ensure that behaviors are happening. Because then, when you reflect that back to the people who are doing them, suddenly they realize they've been security-minded all along, and that's how you've built a security culture. It's sort of an add-on effect. And it's similar with the shared responsibility model: you find ways to infuse it into the processes they are already doing.
Then they're already doing them, and you can call that out, and they realize it's a shared responsibility. That's not to say it's not important to use that language, or to get buy-in from the leadership of those teams. And certainly we want to showcase the risk in a very relatable manner, a way they can understand. But we can also find ways to, well, here's a great example:
if you want your developers to do security reviews, embed that into the pre-existing software development life cycle process. They're already doing code reviews; just make it as easy as possible and they will naturally start to do those things. Something like a vulnerability management program can be a little trickier depending on how it's set up, but again, there are ways to facilitate that as simply as possible, whether that is through automating messages that go directly to the right person.
Or maybe it's taking on more of the responsibility as the security person in the beginning to build a partnership, whatever that might look like. Facilitating it as an opportunity for them to get their work done while also doing the security things is what we want, as opposed to throwing a bunch of security work at somebody, which is often how they perceive it.
There's a little bit of a play that goes back and forth there, but it's a lot more about taking the time early on to essentially take more steps toward them.
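As a concrete illustration of embedding a security review into the code review flow Cassie describes, here is a minimal sketch of a pre-commit hook in Python. The patterns and the blocking behavior are hypothetical, one possible instance of the idea rather than a tool discussed in the episode:

```python
import re
import subprocess
import sys

# Hypothetical patterns a team might want flagged during code review.
RISKY_PATTERNS = {
    "possible AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private key material": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "hardcoded password": re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
}

def staged_diff() -> str:
    """Return the diff of changes staged for commit."""
    result = subprocess.run(
        ["git", "diff", "--cached", "--unified=0"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def main() -> int:
    findings = []
    for line in staged_diff().splitlines():
        # Only inspect lines being added, not context or removals.
        if not line.startswith("+") or line.startswith("+++"):
            continue
        for label, pattern in RISKY_PATTERNS.items():
            if pattern.search(line):
                findings.append((label, line.lstrip("+").strip()))
    for label, text in findings:
        print(f"[security review] {label}: {text}")
    return 1 if findings else 0  # non-zero exit blocks the commit

if __name__ == "__main__":
    sys.exit(main())
```

Saved as a pre-commit hook or wired into CI, a check like this runs inside the workflow developers already use, which is the "infuse it into existing processes" point: the review happens without anyone having to remember a separate step.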
Host: Yeah. I like how you use the word infuse, right? Infuse security into their day-to-day activity. Like the example that you gave, adding security reviews as part of the SDLC itself, rather than doing it as a separate activity altogether. You are slowly improving the security culture that way.
And the other thing that you highlighted is the communication and messaging part. How do you work with other teams and frame the message so that they see value, rather than just sending a report saying, hey, there are 200 vulnerabilities and they need to be fixed?
Cassie Clark: Even something as simple as not expecting a developer to understand the priority order of those vulnerabilities. And so if you have 200 vulnerabilities, can you showcase that in order from most critical to least critical and help them come up with an action plan for some of these things? That goes a very long way to showcasing empathy and respect for the people that you're trying to partner with. And they'll be more likely to want to help out in the future.
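To make that ordering concrete, here is a minimal sketch of turning a raw list of findings into a prioritized action plan; the fields, severity buckets, and identifiers are all hypothetical:

```python
from dataclasses import dataclass

# Hypothetical severity ordering, highest priority first.
SEVERITY_RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}

@dataclass
class Vulnerability:
    identifier: str     # e.g., an internal ticket ID
    severity: str       # "critical" | "high" | "medium" | "low"
    component: str      # where it lives, so owners know it applies to them
    suggested_fix: str  # a concrete next step, not just a finding

def action_plan(vulns: list[Vulnerability]) -> list[Vulnerability]:
    """Order findings from most to least critical before sharing them."""
    return sorted(vulns, key=lambda v: SEVERITY_RANK[v.severity])

report = [
    Vulnerability("VULN-0142", "medium", "billing-api", "upgrade the XML parser"),
    Vulnerability("VULN-0143", "critical", "auth-service", "patch token validation"),
]

for v in action_plan(report):
    print(f"{v.severity.upper():<8} {v.identifier} in {v.component}: {v.suggested_fix}")
```

The sorting itself is trivial; the point is that the security team does the prioritizing and attaches a suggested fix, rather than handing the receiving team 200 undifferentiated findings.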
Host: Yeah, absolutely. I want to double-click on the security awareness programs you touched on briefly. Often the trainings are presentations or MCQs: you watch a video, and at the end you have some multiple-choice questions to answer. I have done those in the past, and you might have seen them as well.
They often feel disconnected; you might just switch on the video and go do something else, and since you have multiple attempts, you can keep trying until you get the score the organization wants.
So what suggestions do you have, or what tactics can organizations use, to develop better awareness programs?
Cassie Clark: Yeah, there are a few different things. There's taking the tactic of: we have to do training, but we still want that training to be useful for people, because again, we don't want them to feel security fatigue and then have to fight against that with our other work. So think about what is the bare minimum they need to know, and then really stick to that.
We can do things throughout the year; we don't have to pack it all into one big annual training. So make it as concise as possible, respect their time, and then make it very actionable, very clear, and very specific to them.
And so again, you get a challenge here: how do you use a vendor and still achieve that end effect? There may just not be a way to do that without investing a little bit more, or at least bringing in a consultant to create some content. I actually created the content at every place where I owned the program, because it was just easier for me to write it all out and work with an instructional design vendor who knew how to put it together into an interactive, compelling package.
And it was worth it, because for the amount of time I spent and the money we spent on that vendor, the results we got from our data were incredible. The rating was 4.9 out of 5. Every single time, the comments were things like, this is the best security training I've ever had. And then I would have people follow up with me individually throughout the year to say things like, your training really helped me because I was able to help my dad avoid being scammed by some form of bad actor.
So I was able to save him $5,000. Those kinds of things happen when you make it very specific and very relatable. You have scenarios in there so they can see themselves in it, that sort of thing. Taking the time to do that really is important, especially since most of us have to do some version of these.
But then it's thinking about how we can develop an awareness program on a grander scale. From the content perspective, could we have tiny, bite-sized "micro-learnings," as they're called, throughout the year instead?
And is there an opportunity for prompts to come up if someone is about to engage in a behavior we'd rather they didn't? We can still either let them engage in it, if the risk is an acceptable trade-off, or block it but redirect them to the behavior we want them to do. Or are there ways we can facilitate learning in simpler ways?
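As a rough sketch of that prompt-or-block mechanic, here is what such an intervention policy might look like; the behavior names, actions, and redirect messages are all hypothetical:

```python
from enum import Enum

class Action(Enum):
    ALLOW = "allow"                # low risk: don't interrupt at all
    WARN = "warn, then allow"      # risk is an acceptable trade-off
    BLOCK = "block and redirect"   # point to the behavior we want instead

# Hypothetical policy table mapping risky behaviors to interventions.
POLICY = {
    "download_to_personal_device": (Action.BLOCK, "Use the approved shared drive instead."),
    "share_doc_externally": (Action.WARN, "Double-check the recipient before sharing."),
}

def intervene(behavior: str) -> tuple[Action, str]:
    """Decide how to respond the moment a risky behavior is attempted."""
    return POLICY.get(behavior, (Action.ALLOW, ""))

action, nudge = intervene("download_to_personal_device")
print(f"{action.value}: {nudge}")
```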
So part of an awareness program is a lot of the planning and investigation of how we could do this, and then, frankly, some testing to see what works and what doesn't, so there's iteration over time. The other aspect of awareness programs that is still very, very new is coming up with a list of the behavioral risks we have, which is challenging for some people as well, because they don't necessarily identify the behavior. There's a risk, there's a behavior, and then there's how you apply an intervention.
And that framework is very new to people. But it gives you: this is my risk, and that's how I report up to leadership so I can show whether there's been movement; and this is the very specific behavior I'm looking to change, and then maybe there are some sub-behaviors. A great example I love to use: the risk is compromised credentials.
So the behavior we want is for people to use a password manager consistently. The sub-behaviors are all of the different things they have to do in order to use a password manager consistently: enroll in the password manager, use it once a week, whatever the steps would be. And then we have data for each of those things to show whether it's effective. Those types of framing and planning are such a huge component of this work, and I think most people either don't know they need to do it, or, if they don't do this work themselves, don't realize it's a huge component.
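As a sketch of what measuring those sub-behaviors could look like, assuming hypothetical per-employee event data (an enrollment date and a last-used date), here is one way to compute the enrollment and weekly-use rates that would be reported up:

```python
from datetime import date, timedelta

# Hypothetical per-employee password-manager events;
# enrolled_on is None if the employee never enrolled.
employees = {
    "alice": {"enrolled_on": date(2024, 1, 10), "last_used": date(2024, 3, 4)},
    "bob":   {"enrolled_on": None,              "last_used": None},
    "carol": {"enrolled_on": date(2024, 2, 1),  "last_used": date(2024, 2, 5)},
}

def adoption_metrics(records: dict, today: date) -> dict:
    """Turn raw events into the two sub-behavior metrics described above:
    enrollment, and consistent (weekly) use."""
    total = len(records)
    enrolled = [e for e in records.values() if e["enrolled_on"]]
    weekly_active = [
        e for e in enrolled
        if e["last_used"] and today - e["last_used"] <= timedelta(days=7)
    ]
    return {
        "enrollment_rate": len(enrolled) / total,
        "weekly_active_rate": len(weekly_active) / total,
    }

print(adoption_metrics(employees, today=date(2024, 3, 8)))
```

Each number maps to one sub-behavior, so movement in either metric shows whether the intervention on that specific step is working.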
Host: I was going to ask you for an example, but you have already given one with the password manager, which makes a lot of sense. Password managers have exploded; there are so many available now. They're easy to set up and available across browsers and even on laptops, so they have become very easy to use. But even then, some folks are not aware: either they don't use them, or they save passwords in the browser, or something like that.
So I totally see your point. For some of these, bringing awareness to employees that, hey, we want you to use a password manager and this is how, and building gamification around it so that they see the value, will go a long way. So, we spoke about behavior early on; I want to go a little deeper into it.
There is a battle between human behavior and security, right? It's a fact that the majority of security leaders or engineers often think in terms of technical controls: hey, I want to have MFA, or I want to have a password manager, as we just spoke about.
How can human behavior be factored in when we are designing the security architecture for an organization?
Cassie Clark: So I think there's a couple of different aspects to it. I think it is amazing that people always think there's a tug-of-war between the two when really they are both important and they're ones that can be beautifully coexisting. Human behavior can actually really help security in a lot of ways as well. But figuring out how to do that is obviously a big challenge.
One, I would always try to have somebody who is focused on the people. So traditionally these are your security awareness, security culture, human risk manager type of people involved in that process because they're the people who know that side or they should know that side. Right.
That person is sort of your user adoption person, your bridge between the technical team, who is very often very smart and very capable, and the people you're trying to give this solution to or get to onboard to it.
And then the other factor, I think that's really important. We've already talked about, which is how can we make it as simple as possible for people to adopt whatever this would be? I am a huge fan of technical controls. I mean, I already mentioned a couple here in terms of things like password managers, because again, if somebody doesn't have to make the choice to be secure, if it's not impacting their day-to-day, great, why wouldn't we?
And then even for the ones where they are expected to make that choice, the ones we can't technically mitigate away, like phishing, for example: how can we better protect them? That's things like MFA. So I am a huge fan of technical controls, but we will see far more success if we bring our people into that process as well. In these instances, that usually means taking a little bit of extra time to make it as easy as possible, which truly, in the long run, takes less time for the technical team, but it's hard for people to wrap their brains around.
Host: Yeah, we often sort of think in short term, right? That, hey, I have a sprint, I need to complete something, so I'll just do whatever is quick and easy to do, rather than thinking long term and maybe spending some more cycles in the current sprint and then getting bigger rewards in the future.
So, user behavior analytics is a very valuable tool, right? How can organizations use it to identify potential security incidents based on anomalous or odd user behavior?
Cassie Clark: Yeah, actually, my last company was really great on this; their detection and response team was whip-smart, and they really focused on things like: how do we set up alerting? How do I say this, in the same way that security is probably running really well when you don't see it, that's sort of how we think about user behavior as well.
So if, for example, somebody has done the same thing every day for 167 days and then on day 168 they're logging in from, I don't know, Venezuela, these are the types of alerts that are very simple to set up. Then you can decide which ones are higher risk and which ones are lower risk and prioritize from there, because you'll get a lot of alerts. But those types of things are really easy.
There is certainly a respect-and-privacy conversation, one that has happened a lot, especially in the detection and response space, for quite a while. Obviously that is company-dependent, and you want to make sure you are affording as much privacy and respect to people as possible.
But think about that in terms of alerting: what information can we see? What information do we choose to see? Things like that. Honestly, I love data, so the more behavior data people will give me, the more I will take, because you never know how it could be used in the future and how it can be combined in different aggregates to come up with an interesting solution that's very specific to a subset of the organization, for example. So I'm a big fan, but being mindful about how we set that up is important.
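To illustrate the kind of simple alert described above, here is a minimal sketch, on made-up login records, that flags a sign-in from a country an account has never used before:

```python
from collections import defaultdict

# Hypothetical login records: (username, country), in chronological order.
logins = [
    ("dana", "US"), ("dana", "US"), ("dana", "US"),
    ("dana", "VE"),  # day 168: a country never seen for this account
]

def new_country_alerts(events):
    """Flag any login from a country not previously seen for that user.
    A real pipeline would layer in time windows, travel feasibility,
    and risk scoring, but the core rule is this simple to express."""
    seen = defaultdict(set)
    alerts = []
    for user, country in events:
        if seen[user] and country not in seen[user]:
            alerts.append(f"ALERT: {user} logged in from new country {country}")
        seen[user].add(country)
    return alerts

for alert in new_country_alerts(logins):
    print(alert)
```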
Host: Yeah, with data, I agree. With more data, you can often draw patterns and then implement more behavioral controls. But at the same time, you have to keep privacy in mind: what type of data you're looking at, and what type of data you don't want to look at.
Cassie Clark: It can also, frankly, identify some of the human risks we have, where people, by and large, are operating with security as something they want to do, but not necessarily something they always remember to do. So let's say you have 30 different incidents of people downloading information directly to their device, and that goes against policy. Maybe one of those is an insider threat, but the rest of them are not.
And that gives people like me a lot of insight: okay, so we have some work to do here to help educate people about where they can safely and appropriately store things. Some of that is really important from our perspective for guiding behavior, but it certainly also helps people catch, let's say, insider threats, where maybe that person is about to be off-boarded, for example.
Host: Yeah, absolutely. That's a good point. One of the things that you touched on slightly earlier is simplicity, right? How you build simple security solutions for your employees. One of the good examples is MFA, with hardware keys like YubiKeys, which we use for the second factor.
Can you maybe say a little more about the simplicity of security, or user-friendly security, and how it can be leveraged to design security programs?
Cassie Clark: Yeah, this has actually become one of my favorite things to work on, because it's so interesting to get insight into both sides, the technical side and the employee side. I think MFA is a great example. There are people who are not comfortable with things like Touch ID, for example, or Face ID or any of those things, so having a YubiKey as another option is great. There's also a certain amount of choice architecture; it's actually one of the contexts I mentioned early on in terms of what influences our behavior.
Part of that is that people feel more bought in if they feel they've been able to make a choice. A great example I like to use: a company rolling out mobile device management. A lot of people do not like mobile device management, right? A lot of people are really against giving a company access to their phones.
So if you give them an option, if this is in your budget, of either enrolling the MDM service on your phone, and this is exactly what we will see and nothing more, or having a company-issued device, they are a little bit more bought in because they were able to make that choice. Designing solutions like that can be incredibly user-friendly, and it will actually help you in the long run with buy-in, which can be tricky, especially with something like this, where people feel like you're monitoring them.
You are going to see certain things, etc. And it's not always because they're doing things they shouldn't be doing; sometimes it's literally just that they don't want you to see anything. For other examples, anytime you can provide self-service documentation, in a myriad of formats, and also have somebody available to walk a person through the challenges they're facing, that is a huge step. For example, let's go back to the MDM example, actually.
So let's say the person chose to install the MDM on their personal device. Having very clear self-service documentation, with both written text and screenshots, is hugely helpful, and it should be linked in the messages you're sending out, so it's very clear. Then, if they run into a technical issue, make it very clear how they can immediately contact somebody who can walk through it with them.
That is incredibly user-friendly. And it, frankly, reduces the time it takes to adopt, because if we run into a barrier, we tend to stop and not come back to it.
Host: I went through the same experience at my previous company. We had to choose between using our own device or getting a new one. I was absolutely OK with it, but some of my colleagues were not. They said, since I have the option, I'll get a new company-issued phone, because I don't want the company to see what I'm doing on my phone. When you said that, it clicked for me: we had that option, and I never thought of it from that perspective, but it makes a lot of sense now. Giving that option to your employees can put their minds at ease, instead of forcing them to install it on their phones.
I think another example I have seen with some vendors is, again, MFA. For app-based authentication, some organizations force you to use a specific app rather than giving you the option to use any app, as long as it does the second factor. If you need to download a specific app just for work's two-factor authentication while you have another app for everything else in your life, that's also friction for the employees.
Cassie Clark: I think that's such a great point too, because this actually gets at the other side, which is that sometimes we want them to have friction, because that forces them to use that 5% more. How do I say this? Often we expect people to be able to access that 5% on their own, but the friction actually works to force them to use their 5%, if that makes sense. We're not asking them to just draw upon it.
I have this conversation a bit with product designers or product managers who want to move really fast. That's important in many contexts, and in many contexts I do want to reduce the friction that security introduces, because we can build it in a way that does that. But there are times, and this is how I say it to the product managers, when friction is actually a good thing, and security is usually one of those instances. It doesn't need to be this huge elaborate thing most of the time, but we do want them to slow down just enough.
And we can help them understand why we're asking them to slow down. And I think this is actually a great example too, in terms of multifactor authentication apps.
Usually the reason that companies choose to mandate one app is that it's much easier to troubleshoot from the back end. If somebody gets locked out of MFA, if they need a reset, etc., it's much easier to handle it from the account on the back end, as opposed to trying to figure out which of various authentication apps the person is using and help them troubleshoot, especially if it's on a personal device.
But being able to consolidate things for somebody so they're not scattered all over the place helps. Going back to the password manager, for example: if you offer them the ability to store their personal credentials in there as well, and help them understand that we can't see those, that can be an easy way for them to keep everything in one place, which also frequently builds habits. So it's a difficult balance, but it's a balance. Yeah, for sure.
Host: Yeah, so I see your point. You have to find the right balance between simplicity and also at the same time being able to enforce some controls as well from an organization standpoint.
The next question I have, again staying on the human and security aspect: some security analysts claim that human error is the biggest factor in data privacy breaches, and there have been many phishing or social engineering attacks to steal employees' data and, through that, maybe customers' data as well.
Recently, we have seen attacks on Twilio and Cloudflare, and even Okta as well.
What would you recommend to organizations so that they can prepare for such attacks?
Cassie Clark: That's a great question. And the tricky thing is that, especially if it's a third-party service that gets attacked, it's really, really hard to help your employees prepare for something like that.
And this is why attackers are so smart: they know exactly how to impersonate the right people. I will say that in our training, one of the number one scenarios we use is somebody calling you pretending to be from IT, because it's so common and so easy; they're using authority bias. The other thing I do in all of my trainings is take some time to talk, at a high level, about the most common ways attackers manipulate our behavior. The reason I do that is that when people have a chance to understand how and what is being manipulated, they're actually more likely to recognize how to protect against it.
So I'm not giving them a specific example for every single context; I'm more training them to be aware of the types of things that might elicit a response from them. This is a great example: if somebody receives a call from IT, I would hope the people at my company would know that my IT department would never call me. It would never call me. And that's something I can reiterate periodically throughout the year; it can be in a micro-learning, for example, or a little scenario.
We can use the attacks that have happened, both internally at a company, if that has happened there, and externally at other companies, to our benefit through these little things. That's really the most effective way. And then of course, as most of us already have, having a good level of MFA is really important as well.
The ones that, frankly, I'm the most terrified about coming up are the AI voice-based attacks, because they're just so good. There are little cues, but it's much harder to get somebody to recognize a little cue, especially if the attacker is hitting something fear-based or authority-based, all the things attackers are so good at manipulating.
Host: Yeah. So let's say you prepared for an attack. You have done the awareness programs, the micro-learnings, and all of that. But we cannot be 100% sure; it's still possible that attackers will get through. So
how should organizations react when they find out that they are under attack?
Cassie Clark: Yeah. I think it depends on what stage they're at. Obviously they need to take whatever action is necessary to lock down the situation. So let's say somebody has gone after a specific employee and that employee has given them some level of access; I think it's perfectly fine to cut that access. You can follow up with the employee afterward.
What I never want an organization to do is engage in a blame-and-shame kind of thing. It's often a situation where you were manipulated. And if we go back to context: even with phishing, we train people on phishing more than anything else, frankly, more than passwords, more than privacy, but in a very particular context.
When I'm at my computer doing my computer things all day, great, I am super aware. Then I go home, and I'm looking at my phone, and I see an email pop up at 8:30 at night while I'm watching TV, somebody is talking in my ear, and my food is burning. Very different context. So it makes no sense to blame and shame somebody for a context we have never trained them to recognize these things in.
That's the thing I want organizations to avoid. Work with the right parties to take the action you need to take, but then be really empathetic toward that person and help them, maybe with some sort of training. Usually it's better as a conversation, quite frankly: helping them understand the risk and how they were impacted, but also really helping them understand that they're not a bad person, they're not going to get fired, they're not going to get penalized, things that, frankly, a lot of people have fears about when it comes to making a security mistake, and really helping them change moving forward by having those conversations.
If it's later on, toward the end, let's say information has been leaked and there has been an actual breach, that's a very different scenario, of course, and the mitigation work on the back end is much more complex. But it's a similar approach: I want the organization to do what they need to do to lock down, but I also don't want them to take punitive action against someone we could have enabled better, and I want them to recognize that this is, unfortunately, a risk we will keep facing, because attackers are very smart.
Host: Yeah, I totally agree on the blaming and shaming part. Because if, let's say, I was attacked and because of that the company is going through an incident, I'm already vulnerable in a way, right? And if, instead of taking the human out of the attack, you start blaming or shaming folks, then in the future I would hesitate to even report some of these things. So what you highlighted makes a lot of sense.
Cassie Clark: Exactly. Absolutely. Yeah, that's a huge piece. And quite frankly, I mean, there's a reason that everyone says that you never waste an incident.
That's true, you never waste an incident. So find ways to make improvements based on what happened. Are there other controls we should have had in place? What kinds of little trainings can we offer people? Are there times when a prompt would make more sense? Those things should happen regardless of the level of the incident.
Host: Yeah, absolutely. So we reached out to some common friends and got some questions; I've already asked you one. The other one is around burnout. This year at Gartner, it was highlighted that around 73% of CISOs feel burnout at some level. Brooke also asked something similar: how can security professionals deal with burnout?
Cassie Clark: That's such an important question, and a tricky one, because it really depends on your leadership. But I am personally a huge proponent of working hard and fast and then taking a lot of time off. It's just the way I've always worked; it's not something I've had to consciously attempt.
But that time off is really essential. There's also a certain amount of learning to manage your own time and hold very clear boundaries when that time is not being respected from the other side. And there are times when it makes sense that you're going to work more hours. For example, I once found myself both creating an entire security training and putting together that whole October event concurrently, because I did that to myself, and I learned from it. That creates burnout.
But there are also times you get a lot of that pressure from other people. So structuring the work we're doing in a very organized manner, planning it out, allowing buffer time, things like that, will help with maintaining those boundaries.
Taking time off, doing whatever you need to do to refresh, thinking about what types of things do help you keep from getting into burnout. Because once you're in burnout, it takes a very long time to get out.
Host: No, I agree. I like how you highlighted it: find your way to disconnect from work and manage your time better so that you can avoid getting into some of these scenarios, and also tune out of things so that you can relax and start again.
Cassie Clark: Yeah. And for CISOs in particular, it's obviously challenging, because there's only so much time they can take off, for example. So for them, and for anybody in more of a leadership role, I think it's much more about delegating properly. That is something I did not do well when I first started managing. I wanted to control everything, not because I didn't trust my people; I was just so used to it. So for CISOs, being able to really delegate, and I mean, potentially the number one skill you need as a CISO would be adaptability or flexibility, that I think is very different.
Host: And I feel that's a challenge with most transitions from an IC to a leadership position. You are so used to doing it on your own that transitioning to a delegation mode takes time. I can totally relate to what you're saying.
Early on, when you transition to a leadership role, you still want to be hands-on. But your primary purpose is to enable your team rather than doing it all yourself. It takes a little time to transition into that role and do it well.
So another question I got, from Dustin Lehr: you have been very active in the community, you do a lot of talks, and you have a lot to offer. As a fan, he wants to know, what are your plans next? Feel free to share whatever you can, but that's something we got from Dustin.
Cassie Clark: That's very thoughtful and very lovely! I actually take it as my responsibility to my industry to share more about this perspective, because I don't want to be the only one increasing in how impactful I am; I want my whole industry to succeed.
And so I would love to do more talks. I gave my first keynote talk this year and that was super exciting. It was with the wonderful Marissa Fagan and I am working to explore more of those opportunities.
So I'm working on building a website, which I've been told is a predominant way to help with that, and once that's built, I'll reach out to some speaker agencies. But if anybody wants a keynote, there are also other opportunities; this was a lovely one I was super excited about. I also want to focus more on content creation, because being able to share some of these ideas with people in a variety of formats is really important to me. I love getting this conversation going, and I love having conversations with people.
Being able to promote that type of opportunity is really important to me. And from the job perspective, I'm actually about to start a new job that is focused solely on behavior within security. I am incredibly excited. So, yay!
Host: We'll see an update soon, I guess, on your LinkedIn. So yeah, thank you for sharing that. And that brings us to the end of the security questions.
The next section is focused on rating security practices.
Rating security practices
The way it works is I will highlight a security practice. You can rate it from one to five, one being the worst and five being the best, and you can add context for why you have given a particular rating. So the first one is:
Use strong passwords that contain a mix of uppercase and lowercase letters, numbers, and symbols. Change them frequently and avoid using the same password for multiple accounts. What do you think about that?
Cassie Clark: It's funny, because if we broke these out into individual components, each would get a different number. Aggregated, I would give it three out of five. I am of the camp that you don't need to bother changing your password frequently. I am very pro password manager, because again, if we're looking at human behavior, enabling people to do the most secure thing is going to lead to the most secure outcome.
And I understand that people have their own concerns about password managers, but that, frankly, is much more secure than doing it yourself. If we're being perfectly honest, there are maybe one in 10,000 people who will create unique, complex, strong passwords that are different across every account.
And every time you ask somebody to change a password, they're going to take their preexisting password and add a number or a character or change a letter. They're not going to change the password at its core. So they're not actually being more secure; you're actually making them less secure, because of the way humans operate: they're going to make it easier for themselves.
So yes, please do use strong passwords that contain that mix. And yes, please avoid using the same password for multiple accounts, but please do not ask people to change their password frequently. It does not help them. Ask them to use a password manager. The password manager can create it for them. Then you don't even have to have them do that part. Like, great. Bye.
Host: Yeah, so we use Bitwarden, and you can define whether you want symbols or not, numbers or not. You can also define the length of the password, all of that. So you have that flexibility, right? It makes a lot of sense to use password managers.
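For a sense of what a generator like that does under the hood, here is a minimal sketch using Python's standard secrets module; the option names loosely mirror the kinds of toggles Bitwarden exposes and are not its actual API:

```python
import secrets
import string

def generate_password(length: int = 20, symbols: bool = True, digits: bool = True) -> str:
    """Build a random password from a configurable character set,
    using a cryptographically secure source of randomness."""
    alphabet = string.ascii_letters
    if digits:
        alphabet += string.digits
    if symbols:
        alphabet += "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())                        # letters, digits, and symbols
print(generate_password(length=16, symbols=False))
```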
The next one is granting users unrestricted access to systems and applications so that they can move fast and roll out new features quickly.
Cassie Clark: For very tiny companies, unfortunately, restricting access is often not feasible, so I will move the rating up a little just for those really tiny companies where basically everyone is an admin, because everybody accesses everything and needs everything. But really, at most companies there's very little reason that everybody should have access to everything, especially at the admin level. It's a huge amount of risk with very little trade-off, in my opinion. I do appreciate moving fast and rolling out features quickly; there are other ways to build that in.
Host: Yeah, love that. The last one is DevOps practices are needed to move fast and deploy code to production. Security practices are not the most important right now for us.
Cassie Clark: That's a conversation to have with a partner, as a partner. But obviously that rates low on the list. I'll be really empathetic when I say it, too, because I do understand them wanting to move fast. However, helping them understand the risk to whoever is consuming that work, whoever their audience or customers really are, is a huge part of it.
And then the other part is: how do we enable them to do that? Maybe not quite as fast as they would have otherwise, but certainly much faster, by building it into whatever process they may already be using.
Host: Yeah, so this again goes back to what you said at the beginning: infusing some of these security practices into the day-to-day rather than running a very separate program, so that teams are doing it naturally rather than thinking about security as a separate step altogether.
Cassie Clark: And that partner who says security practices are not important is probably seeing security at the far other end of the spectrum, and you can help them see that the two can actually coexist really beautifully.
Host: Yeah, yeah, absolutely. So yeah, thank you so much. That's a great way to end the podcast. But before I let you go, I have one last question. Any reading recommendation that you have for our audience?
Cassie Clark: Yeah, can I cheat and say two? Technically three, because I'm going to include Thinking, Fast and Slow, the one I mentioned earlier. So definitely that, and the CliffsNotes version is fine. It's very dense, so it's not one people need to feel they have to read cover to cover, but certainly get a good sense of the core of what that book is.
There's also a fantastic community; I'm in it on Slack. They also have a weekly newsletter, so there are different ways to get involved. It's called Habit Weekly. It is actually not a security-specific resource; it is a behavior-specific resource, run by behavioral scientists, etc. It's a whole bunch of different content about things like biases, habits, patterns, or new research that's coming out, so it can be really helpful just to get a sense of what behavior work looks like. It's where I picked up a lot of what I've learned over the last five years or so.
And then the other resource I would recommend that is security-specific is anything by Dr. Jessica Barker or Perry Carpenter. They recently did a masterclass together. They are fantastic people in this space, incredibly smart, and they've been doing this work for a long time. Like most people, I don't love to use the phrase thought leadership, but they are absolutely the thought leaders I think of. They have books; I have some of their books. So I would recommend their work for good insight into security awareness or security culture in particular.
Host: Thank you so much, Cassie, for joining and sharing your learnings. This is a new area even for me, so I'm also trying to learn. There was a lot for me to learn personally.
Cassie Clark: Excellent, I'm so glad. No, thank you, Purusottam. This was great. It was lovely to be able to share some of the stuff I've been thinking about for quite a while.
Host: Absolutely. And to our audience, thank you so much for watching. See you in the next episode. Thank you!