In this episode of our “Overshadowed” series, Nudge Security CEO and co-founder Russ Spitler welcomes Ira Winkler, CISO of CYE Security, to discuss principles for designing a security program that engages employees in positive and effective ways to help mitigate risk.
Russell Spitler:
Hello, and thank you for joining us for episode six of our Overshadowed series, discussing how modern IT and security teams are dealing with the risk of shadow IT. I'm Russ Spitler, CEO and co-founder of Nudge Security, and with me today is Ira Winkler, CISO of CYE Security and author of You Can Stop Stupid, Advanced Persistent Security, Security Awareness for Dummies, and five other books. Welcome, Ira, really great to talk to you today.
Ira Winkler:
Yeah, thank you for having me.
Russell Spitler:
So today we'll be talking about how to design a security program that engages employees positively in the process of mitigating risk, a topic that has been one of the founding principles of Nudge Security and obviously one that you've researched extensively. Can you give us a little bit of history about the work you've done and how long you've been working in this area?
Ira Winkler:
Yeah, I'm going to be dating myself, but way back when, I can't remember the year, I was working for a government defense contractor, and they said, "Instead of going to the Pentagon, can you make a few phone calls?" Anyway, three days later I had control over one of the world's largest investment banks, doing what was essentially social engineering. And I didn't even know what the term social engineering was at the time; they just basically asked me to call up and lie to people and try to get access to computer systems. That was just like growing up in New York, that's how you survive, pretty much.
So anyway, I did that. Then I wrote a paper about it, and people called it the seminal work in social engineering, and I had to look up, again, what seminal meant, what social engineering meant. And people just started coming to me to do more and more of these social engineering things. I called them espionage simulations, because I'd combine it with not just calling people up but doing onsite work, black bag operations. I brought in special forces and intelligence operatives to help me with this, so I did a bunch of espionage simulations, essentially. And the companies would have me do that, and then they'd have me come back and give people presentations on how I essentially screwed them over. And people loved it, and the great irony was that it was so ineffective.
I mean, it's like... I would keep getting all this work, showing people how I robbed banks and everything, and it's just so unproductive at the end of the day, and I'll talk about that probably later. But anyway, so I would do this stuff. I started creating awareness programs and things like that. I started a company, Secure Mentem, which focused on the human aspects of cybersecurity. And then over time, getting into what you're talking about: awareness is important, but if you're just telling funny stories, you've got to consider what's known as the Forgetting Curve. The Forgetting Curve basically says that when you tell somebody something, their retention of it decreases over time. I think the statistic is something like they'll forget 70% of what you say by the first day, and by the time you get to the end of 30 days, they've forgotten pretty much 90%, if they remember anything by that point.
And just giving people random information isn't going to work. For example, a lot of people say you need funny videos and things like that. Funny, in some cases, trivializes things. You never see a funny awareness video for sexual harassment, for example. And even setting the trivializing aside, what does a funny video actually do in practice? At best, it extends the Forgetting Curve a little bit.
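A quick aside on the numbers Ira quotes, roughly 70% forgotten after one day and 90% after 30 days: those figures are consistent with a simple decay-toward-a-floor model, sketched below. This is purely an illustrative fit to the quoted statistics, not Ebbinghaus's original data, and the parameter values follow only from those two points.

```latex
% Retention R(t): fraction remembered t days after a lesson,
% decaying toward a long-term floor a.
\[
  R(t) = a + (1 - a)\,e^{-t/s}
\]
% Day 30: R(30) \approx a = 0.10                  (about 90% forgotten)
% Day 1:  R(1) = 0.10 + 0.90\,e^{-1/s} = 0.30     (about 70% forgotten)
% which gives e^{-1/s} \approx 0.22, i.e. s \approx 0.66 days:
% in this toy model, most of the forgetting happens almost immediately.
```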
Russell Spitler:
Right, right.
Ira Winkler:
And so what you want to do is figure out what you can do, not just with "awareness", and I'm doing my Dr. Evil air quotes when I say awareness, but what you can do to modify and improve behaviors throughout the course of your entire security program. And that's where I began to realize, again, you mentioned my book, You Can Stop Stupid, where I essentially lay out a practice that I call Human Security Engineering. What you're doing there is embedding proper security before anything gets to the user, trying to lead the user into better behaviors, and once you have the user in better behaviors, you can then guide them, but you simultaneously expect failures anyway.
I keep telling a story, sorry, I don't know what your question was anymore, but...
Russell Spitler:
I'm not going to stop you.
Ira Winkler:
I tell the story of how I was at an event once and they were giving away stickers that essentially said, "Don't click on SH*T." Sorry, I'll try not to curse for you guys. But anyway, it basically said, I'll use the word stuff, "Don't click on stuff." And this guy in front of me was an admin type, and he's like, "I need a whole bunch of these stickers." I'm like, "Really?" And he's like, "Yeah, my users keep clicking on all this stuff." I'm like, "Wow, you must be giving your users lots of stuff to click on." And he's like, "What do you mean?" I'm like, "Well, if they're clicking on stuff, how are they getting it? You must be giving it to them." And he's like, "Well," and he's mumbling. And then I go, "And by the way, if you know you're going to give them all this stuff and you know they're going to click on it, why aren't you doing anything to stop the impact of them clicking on it?"
Russell Spitler:
Right.
Ira Winkler:
And so you've got to look at everything as a system. The problem is when people talk about awareness, they talk about awareness, but awareness is a tactic. And frankly, even though I wrote Security Awareness for Dummies, I still think security awareness is a tactic, and maybe the least efficient tactic in many ways. For example, a secure email gateway is going to filter out 99.99% of actual malicious attacks. Security awareness, don't get me wrong, is still critical, because at the point where an attack can be realized, you still want to ask, "How can I reduce the likelihood that the person will activate the attack?"
And unfortunately, we keep too much focus on phishing. It's not just phishing. It's, for example, lost computer devices, giving away passwords, allowing shoulder surfing, leaving facilities unattended, leaving sensitive information exposed, and so on. You have to have awareness for all of that, but at the same time, you also want to put technical controls in place as well. So for example, yes, people can leave their computers unlocked and unattended, but that's why we have screen locks or screensavers that time out. It's why, for example, you might have a turnstile at an entrance instead of just a badge swipe on a mechanical door, which opens up and lets anybody walk in when one person swipes.
So anyway, not remembering what your question was, I'll leave it there and-
Russell Spitler:
Yeah. So I mean, what I love about that is you just brought up so many great examples of security controls that actually embrace the realities of human behavior. Whereas when we think about cybersecurity, a lot of what we've done in the past is create controls where we treat humans like an extension of the computer: if we just stop them from doing something, we're going to be fine. But the reality, as you said, is they're still going to click on the links. What can we do to actually stop the impact of that click? Or they're still going to sign up for the [inaudible] or click an OAuth grant or whatever it's going to be. How do we actually start to mitigate the impact instead of trying to prevent the action?
Ira Winkler:
Yeah. And I mean, that's fundamentally where we seem to be losing it. Again, we are addressing the problem with tactics and not strategies, because it's a tactic to implement awareness. Now there might be a strategy behind awareness, but overall, awareness is a tactic. A secure email gateway is a very good tool, but it's a tactic, and we need these tactics, but we need them harmonized across a strategy. And that's where the industry is failing itself. I mean, even when I see awareness vendors, and I still consider most of them awareness vendors, saying, "We're now a behavioral modification company," I'm like, "Oh, how are you doing that?" They're pushing out videos in different ways and keeping different metrics. It's... okay.
Russell Spitler:
I always think about when I was learning to drive a car. I didn't learn how to do a three-point turn by reading about it in a book. I didn't learn it by doing it by myself 50 times. I had somebody sitting next to me helping me through the first 10 times I did it, and that behavioral modification was incredibly beneficial, having that advisor and that expert. When you think about how we bring that into human-centric programs and engagements, how can we do a better job of providing that in-context awareness or in-context correction? And how do you think that might map to the broader cybersecurity programs that people are putting in place?
Ira Winkler:
So that's essentially what... I mean, well, the name of your company, Nudge, is screaming it. That's a leading question, I'm pretty sure. But I think nudges are essentially awareness lessons delivered at the point in time where they're most valuable. And so, for example, the iconic one that most of us are familiar with... let me look in my cell phone, because there's actually a picture I took last week of the most... Yeah, sorry, I don't know if you're going to be able to see this. Can you see that?
Russell Spitler:
There we go. Wash their hands, right?
Ira Winkler:
Well, it says, "Of course our employees wash-"
Russell Spitler:
Oh, sure. Yeah.
Ira Winkler:
It says, "Of course our employees wash their hands." And the reason I took this picture was there's the iconic one that we're all familiar with, which was, "Okay, employees must wash hands before returning to work." That's the most common nudge I see in the real world where they're telling employees, "Yes, before you leave the restroom, you must wash your hands, period."
Russell Spitler:
Totally.
Ira Winkler:
But this one, I just like the wording. It has the same impact, but it's phrased like, "Okay, we don't hire morons," while it still gets the message across that if you're an employee, you've got to wash your hands.
Russell Spitler:
Yeah. Yeah, don't be the moron employee. Right.
Ira Winkler:
Yeah. So I like that one because it's delivered in a way that doesn't scare the potential customers while still giving the message to the employees, type of thing.
Russell Spitler:
Totally.
Ira Winkler:
So, that's a better way, but that's one example. You can also have things like interactions. A long time ago, more than a decade now, I was working with one of the anti-malware vendors, and we were experimenting with putting awareness messages into the response when, for example, DLP would block something, like, "Okay, you're sending out a message that contains PII." DLP should stop that, but not only should it stop it, it can deliver a message at the same time that says, "Wait a second, this message contains sensitive information." You don't just randomly stop them, you provide them information at the right point in time. So that was an example of a technical control combined with a nudge at the same time. And nudges can be coffee cups that people put on their desks. It could be mouse pads, it could be a screensaver, it could be stickers on laptops that remind people, "Do not leave this laptop unattended," things like that. Those are examples of nudges.
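To make the pattern concrete, here is a minimal sketch of the block-plus-nudge idea Ira describes: the control stops the risky action and explains why in the same moment. Everything in it, the regex, the function name, the message text, is hypothetical and invented for illustration; it is not any vendor's actual DLP API.

```python
import re

# Hypothetical PII pattern (US Social Security numbers), for illustration only.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def check_outbound_email(body: str) -> tuple[bool, str | None]:
    """Return (allowed, nudge) for an outbound message.

    Instead of silently dropping the email, a blocked send comes back
    with an explanation the sender sees at the moment they hit send.
    """
    if SSN_PATTERN.search(body):
        return False, (
            "Wait a second: this message appears to contain sensitive "
            "personal information (an SSN). Remove it, or use an "
            "approved secure channel to share it."
        )
    return True, None

allowed, nudge = check_outbound_email("Her SSN is 078-05-1120, as requested.")
if not allowed:
    print(nudge)  # delivered in context, at the point of the risky action
```

The design choice is the second return value: the control and the awareness message travel together, so the lesson arrives exactly when it is most valuable.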
Russell Spitler:
I think that DLP example is such a great one, because that's one of the areas where, well, DLP has a long and sordid history in terms of accuracy and effectiveness. When I've been in those environments, there have been times when I've tried to send documents for legitimate business purposes with legitimate data sets, and that more defensive control, blocking or filtering it, has caused me to go, "Okay, well, I'll throw it in Dropbox, we'll get around the DLP," whatever it might be. As opposed to a potentially more effective nudge, which would've been, "Hey, are you sure this recipient is appropriate? This is the type of content we found. Did you know this content was in there? Click a button to proceed."
And that's exactly where I see the opportunity for us to evolve in terms of security controls and programs, where we actually do give a lot more autonomy to those employees, but a lot more context for what the impact of their decision is, to ensure that it's appropriate. Now, there are always limits to that, but I see that as a future to strive for.
Just a couple of other questions. When you're thinking about designing and rolling out new security controls, obviously we've been talking about some of the awareness aspects, but when we think about more traditional controls, like an EDR agent or something along those lines, how can security leaders engage employees positively in that process to drive better outcomes? Is that something you've taken a look at before?
Ira Winkler:
Yeah, I mean, that's always been critical, how you deliver it. So, sorry, switching to a topic that seems completely irrelevant: I have a lot of friends who are CISOs, and I have a friend who, at the time, was the CISO of a Fortune 30 company, and he goes, "Ira, I have a question for you. I was just in the executive meeting with my CEO and all the other C-level people, and during the meeting, the CEO said that we are moving to a BYOD policy." And he's like, "Well, that's just impossible to control, blah, blah, blah," and all that sort of stuff. And as everybody was leaving, the CEO pulled the CISO aside and said, "Okay, here's the thing. In two weeks, you're going to tell me how we're going to implement BYOD securely, or in three weeks, I'm going to have a new CISO."
So the question then became how. In just about every book I write, I have this thing called the Department of How: cybersecurity should be figuring out how to do things, not just telling people they can't do something.
Russell Spitler:
Absolutely.
Ira Winkler:
In general, I shouldn't say always, you don't want cybersecurity to be the Department of No. It should be, "Okay, how do you do it, and how do you embed it within the practice?" Because too many times we look at security as something that's bolted on to everything else, as opposed to building cybersecurity practices into other systems, other processes, and things like that.
So here's a great example that I see a lot: accounts payable fraud is a big problem for a lot of companies. There was one company I was working with, it was a private company, so it can't be a Fortune 500, but it was a $60 billion private company, and they had a lot of issues where criminals would send account changes to the accounts payable people. For example, they'd say, "Oh, we just changed our bank account. Please now route our funding to this new account as opposed to the old account." And it was like, "Oh my God." Obviously they fell for it a couple of times, but then the question became, how can we implement this securely? How can we take the process and add verification of account changes?
And so for account changes, we put in manual processes that said, "You are not allowed to accept a change of bank account unless you go through the following checks and verifications." So it wasn't just that an email came in or something was faxed in; somebody had to pick up the phone. And to change the account, you needed two sign-offs confirming that the account was changed by the POC, and the request had to be on the POC's letterhead. So an attacker, in theory, couldn't just send an email in; they had to know who the POC was on top of knowing everything else about the company. I'm not going to say the company never experienced it again, but it made the fraud exponentially more difficult. There's no such thing as perfect security, but you can enhance your processes to account for different things.
But the key is it's not something you bolted on. It's something that was just accepted as part of the process of changing account information.
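As a rough sketch of what "built into the process" can mean in practice, here is that verification gate expressed as code rather than policy text. The field names and the two-sign-off threshold are hypothetical, chosen to mirror the checks Ira lists: a call-back to the point of contact on file, dual sign-off, and the POC's letterhead, all as preconditions the workflow cannot skip.

```python
from dataclasses import dataclass, field

@dataclass
class AccountChangeRequest:
    vendor: str
    new_account: str
    callback_verified: bool = False  # phoned the POC on file, not a number from the email
    signoffs: list[str] = field(default_factory=list)  # approvers of the change
    on_poc_letterhead: bool = False  # request arrived on the POC's letterhead

def may_apply_change(req: AccountChangeRequest) -> bool:
    """Gate a bank-account change behind the manual verification steps.

    An emailed or faxed request alone is never sufficient: every check
    below must pass before accounts payable may update routing details.
    """
    return (
        req.callback_verified
        and len(set(req.signoffs)) >= 2  # two distinct sign-offs required
        and req.on_poc_letterhead
    )

req = AccountChangeRequest(
    vendor="Acme Corp",
    new_account="12345678",
    callback_verified=True,
    signoffs=["A. Chen", "B. Okafor"],
    on_poc_letterhead=True,
)
print(may_apply_change(req))  # True only when every verification step is done
```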
Russell Spitler:
I think that's such a great example, because I often find myself in this conversation of risk tolerance versus friction for the employees. And the reality is it doesn't need to be a trade-off between those two attributes. There's another piece, as you mentioned, which is that we can introduce new processes that don't necessarily introduce friction but reduce our risk substantially as well. And that's, I think, an area where we have a huge opportunity for improvement as we go forward.
Ira Winkler:
Yep.
Russell Spitler:
Well, Ira, I really enjoyed talking with you. We could go on for hours about this topic, but it's time for us to wrap up today. Thank you, everyone, for joining us, and thank you, Ira, for being our guest today. It's been a pleasure to talk to you. Stay tuned for future episodes of Overshadowed; I hope you can join us again soon.
Ira Winkler:
Thank you.