Hello and thank you for joining us for episode five of our Overshadowed series, discussing how modern IT and security teams are dealing with the risks of shadow IT. I'm Russ Spitler, CEO and co-founder of Nudge Security, and with me today is Malcolm Harkins, chief security and trust officer at Epiphany Systems. Welcome, Malcolm.
Hey, Russ. Thanks.
Today, we'll be talking about the counterbalancing forces of risk and friction in security, a core design principle that we've kept front and center at Nudge Security. Malcolm, when we were talking before, it sounded like this has been a big focus throughout your career, but before we jump in, could you give us a quick overview of your background and your work at Epiphany?
Yeah, yeah. I have been with Epiphany for a couple of years. We actually just merged with a managed security provider we were sister companies with, called Reveald. We're in the process of working through all those changes, but prior to that, I was most notably at Cylance as chief security and trust officer, working for Stuart McClure, the CEO and co-founder. Previous to that, I was at Intel for 24 years, straight out of graduate school. Business roles for 10 or 11 years, and then I tripped my way into security almost 22 years ago, after 9/11 and after Code Red and Nimda. Andy Grove, still running Intel at the time, was frankly beating the crap out of the CIO to deal with the availability risk issues. He asked me to come in and run security and business continuity, and that led to me putting my arms around basically all aspects of things. Eventually I became chief security and privacy officer worldwide.
That's incredible. What a fun time to get introduced to security, I can only imagine what it must be like to have Andy Grove breathing down your neck.
As we start, I'd love to drill into how you think about implementing security programs, and how you define friction as it relates to building those programs and running them at scale.
Yeah. I've always had this view that... Being a former finance person, somebody who ran finance for a $1 billion business unit and helped start a few business units at Intel, when I got into security I was like, "What is the business outcome that Intel needs, that the shareholders need, from me?" Spending money on security doesn't add to the net income of the business, right? It's a risk management type thing. I was like, "Okay, a good security solution will create a demonstrable and sustainable bend in the curve of risk." Primary job: reduce risk. A great security solution will do that and lower my total cost of controls, because there's a cost of control. Not only the direct spend on the solution, the time, the effort, but it fans out across IT and sometimes the business unit. Then my mythical hero solution did both of those things, but would also reduce control friction, because control friction is not immutable.
It's something that we can design toward or design away from. In a high-control-friction environment, I always found at Intel, being in a high-tech industry, but I think this is by and large true in most organizations, one of two states occurs. People will adhere to the control and, if it causes too much friction, it creates a systemic business risk. It slows product development, slows time to market, slows productivity. That is a systemic business risk, because it affects the P&L. Or they'll say, "Screw you, security folks," and they'll go around the control. In which case, we've spent money on a control, a control theoretically for risk, that people are going around, and then we call it shadow IT or we blame the user. They're just trying to get their job done and we've made it hard for them to get their job done, so they ignore the control. Then we blame them, and then we buy another solution, and then the cycle continues.
My mythical hero would do all three of those things. If I could go to the business and say, "Give me $1 million and I'm going to deploy a solution that is going to bend the curve of risk down, lower the company's total cost of controls, and improve business velocity," what CFO's not going to say yes to that?
It's so refreshing to hear someone with such a considered perspective on the end user experience, because I do feel as though, for a long time, we've taken a lot of security controls and just said, "Hey, this is a new piece of our armor, it's going to happen, and our end users are going to deal with it." You're absolutely right, the effectiveness of a control is defined by end user compliance, and the user experience is such a key defining aspect of that.
Yeah. To me, it's always been a design and architectural element, and again, I've said for a long time, most of the security industry profits from the insecurity of computing. They make money the more risk that occurs. Guess what? If I sell you a solution that generates a risk issue, because it creates friction and then causes the user to go around, guess what? I can sell you another solution to deal with that problem, too. Economically, the vendor community has never taken a major design focus on that user experience and, on the internal aspects, we've always just said it's a trade-off between productivity and security. I'm like, "No." When you trade off, it's like a balance beam. One goes up, one goes down. When you start framing things that way, you're automatically sub-optimizing. To me, it's more like an optimization problem, a calculus equation. I got to maximize the coefficients of each thing. How do I maximize the security benefit? How do I maximize the user productivity? How do I minimize the total cost to the environment?
When you put it that way, I might have to revisit my business model here, Malcolm. Let's get concrete, though: can you give us an example of a program where that risk-friction balance was out of whack in favor of friction, and what you saw as the trade-offs and consequences?
Yeah. I'll give you a couple of examples, and they're real ones. I've written about them and talked about them before. Again, go back 20-plus years ago, when I landed running security. You start by doing massive hygiene initiatives. Guess what? I had authorization from Andy Grove, and I didn't care about the friction I created. We pushed patches to systems and gave users 30 seconds to save what they could save, and then they were getting rebooted, and we didn't care. Now, I did that intentionally, because frankly we needed to up-level very quickly, but that was a conscious, temporary, friction-oriented, "We're going to pull everybody through a knothole," and then we started figuring out how to dial the friction back. Why? Because I didn't want to have to look over my shoulder walking to my car...
Again, there was that, but I'll give you another really good example. This is public information. There was an employee who had stolen a few hundred million dollars' worth of intellectual property from Intel. His name was Pani, you can look it up. He went to federal prison for it. After that event... The individual had access to a top secret database, they were authorized, and all that type of stuff. The audit ARs and the direction I was getting from management and from various parts of the compliance team was to make it harder to get access to sensitive information. If you look at it from a sales perspective, particularly in those types of environments, you need design information. You need the socket data for processors to go sell to a Dell or a Lenovo or something. That's sensitive information, it's intellectual property. It was registered, so you had to go retrieve it, ask for it, get it, all those types of things. There was a push to put more controls in place.
Guess what? If you take a physical analogy and you implement the equivalent of Fort Knox for data, what does that do? It means you've just slowed down the sales cycle. You've just made it hard for a field application engineer to go work a design-in for a product to generate sales. That's directionally what I was told to do, and I didn't do it. Why? Because that would've slowed the business. I said, "What we need is to basically speed things up," so we went and looked at the system. I'll be directionally correct: 4000 people had access to it, requesting access 4000 times a month to those sensitive documents for that sales purpose. The average time for approval was over a week, and it usually required multiple requests, because the manager would be busy and forget and it would time out, right?
Okay. What I did is say, "We're going to auto-approve it. You've been given access to the system? You are authorized to get it as soon as you want, when you need it, and what I'm going to put in place is a detective control to look for unusual activity, gathering stuff that doesn't seem to make sense. Doing it at certain times of day, doing it with other things that would indicate a potential insider risk." Guess what? I sped up the design-in process and put a more appropriate control in place, because by and large, all those requests were eventually approved anyway, they were just taking longer.
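[Editor's note: the pattern Malcolm describes here, replacing a slow preventive approval with instant access plus a detective control, can be sketched roughly as follows. The thresholds, field names, and flagging rules are hypothetical, chosen only to illustrate the idea, not a description of Intel's actual system.]

```python
from datetime import datetime

# Hypothetical thresholds -- illustration only.
BUSINESS_HOURS = range(7, 19)   # 7:00-18:59 local time
MAX_DOCS_PER_DAY = 25           # unusually high retrieval volume

def flag_unusual_access(events):
    """Auto-approve every request, but flag patterns worth a second look.

    `events` is a list of dicts: {"user", "timestamp", "doc_id"}.
    Nothing is blocked; the function only returns the subset of events
    a human analyst should review (the detective control).
    """
    flagged = []
    daily_counts = {}  # (user, date) -> docs pulled that day

    for e in events:
        ts = datetime.fromisoformat(e["timestamp"])
        key = (e["user"], ts.date())
        daily_counts[key] = daily_counts.get(key, 0) + 1

        if ts.hour not in BUSINESS_HOURS:
            flagged.append({**e, "reason": "off-hours access"})
        elif daily_counts[key] > MAX_DOCS_PER_DAY:
            flagged.append({**e, "reason": "high-volume retrieval"})

    return flagged
```

The design choice is the point: access is never delayed, so the week-long approval friction disappears, while the risk signal moves to after-the-fact review of anomalous patterns.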
Yeah. That's a great example of the evolution of controls, right? Because, as you described the problem, that's immediately where I jumped in my head: "Okay, approval process, gating access." But the reality is it's not that, because it's an insider risk, and what we really need is better detection of abnormal behavior, not restriction of normal behavior.
Exactly. There's a study that was done a couple of years ago. I can't remember the medical association, but Vanderbilt University was involved. The dean of the Graduate School of Management co-authored it with some medical officers and a few other people. Headlines in the news read, "Cybersecurity-related incidents cost lives in healthcare," something of that nature. I went and grabbed the study and looked at it, and you would think it was the ransomware events a few years ago affecting the healthcare environment that were causing risk to patient care. Turns out that wasn't the case. It was the controls put in place after an event that slowed down critical care for people coming in with heart conditions, and that increased the death rate by 10 people for every 1000 patients treated. Our controls created friction in a critical healthcare environment, and that cost lives.
Just a personal anecdote: I was at a hospital once with my wife, when our children were coming into the world, and there was a nurse whose responsibility was to go around to every station, I'm not going to name names, and log in every 25 minutes, because the control said sessions had to be logged out after 30 minutes by default, and she never wanted a doctor to walk into a room and not have access to the computer the moment they needed it. A beautiful example of that friction versus security and risk trade-off.
Yeah, because at that point, it's like, "I could care less about the confidentiality of my healthcare records. I care about the availability, one, and the integrity of it, two." Confidentiality, I don't give a shit about when I'm going in for a medical issue that's urgent.
It's a complicated space and I'm glad that I'm not in that seat, to be sure. When you think about this, and particularly when you think about... I'm thinking about your use case with that IP database back at Intel. The reality that we're seeing more and more often is it's not as conveniently located in a centralized database that you're managing on-prem, more and more often it's out in the world in this ephemeral SaaS estate that organizations are dealing with. How do you think this calculation of risk and friction has changed as SaaS adoption has exploded and more organizations are using enterprise SaaS solutions for core parts of their business?
Yeah. Like you said, that data store has created an urban sprawl, it's everywhere. It certainly adds complexity, not only "Where is the data?" but "How do I assure the right person is accessing it on the right system, protected in the right way and, in some cases, in the right location?" Because there might be geographical restrictions. There's a lot of complexity there. I think a lot of people have frankly thrown MFA and cloud-related authentication capabilities at it to try and deal with that, but I say "try," because with my credit card I could go get HubSpot, or spin up an AWS instance. Or take a collaboration tool like Slack: I could create a Slack channel and include people external to the business, and I have no controls around any of that.
It becomes really hard, and what I see is that it's not well managed and its scope is not well articulated. There has to be a better way to really understand that spread and then create the right, I'll say, control environments, ones that in some cases enable the user to make better choices and in other cases give you a level of control. Both need to occur.
It's interesting, as you talk about it that way, because when I think about this in the categories of friction versus risk, there are some organizations I've interacted with who make assertions like, "Hey, I've whitelisted the 5000 domains my employees can go to on the internet." Of course, I would classify that as a high-friction environment, but what's interesting in those environments is I don't think they've actually managed risk on the other side. What they've done is driven the risk into less desirable behaviors: "Okay, I'm going to access those sites on my phone, on my personal computer," on whatever it might be that supplements the work console that's so heavily restricted.
You're 1000% correct. Again, another true story. Yammer was launched, what? In 2008, 2009. Sometime a while ago.
Yeah, something like that.
I was one of the first 50 users at Intel on Yammer. We were getting ready to do an earnings release, and there's a lot of sensitive information, and we always did these business update meetings. Literally right after the earnings release, the CEO would get on and do a live-streamed business update meeting and Q&A with employees. One of the employees matrixed to me, who was also part of the internal social media team doing wikis and stuff but was helping on security architecture, had blasted on an internal wiki, "Hey, let's all get on Yammer and chat about stuff during the business update meeting." The head of HR, the CEO, and the general counsel got wind of that. The CIO pulls me out of a meeting and tells me what was there, and I was like, "Yeah, I saw it."
He's like, "Paul," who was the CEO at the time, "wants you to shut off Yammer." I'm like, "No, I'm not doing that." He's like, "No, no. You don't understand, Paul told me to tell you to shut off Yammer." I was like, "I get why Paul's reacting that way, but I'm not doing that." [inaudible] The head of HR and legal calls me and is like, "You don't understand, we need to shut this off," and it's like, "No, you don't understand. People will pick up their phone, they'll use their tablet, they'll go on the guest network. It's a faux sense of control."
"What I need to do," which I had already started doing, was participate in those Yammer groups. Why? Because culture is the strongest form of control and, if I can shape the path... My job as a risk manager is to be the biggest risk-taker and be on the riskiest things first. Why? Because how do I know how to manage the risk if I'm not the first mover on the riskiest things? That's the thing that drives me nuts about security teams. You need to be the risk-taker and be in front of the business on the riskiest things in order to figure out how to design the control environment for those three outcomes I mentioned.
All right, Malcolm, here it comes. What do you think is the riskiest thing you're doing right now?
Besides talking to you? No, just kidding.
You know, to be honest, it's a great question. Again, there's this whole thing around large language models, ChatGPT. Do you use them? Do you not use them? All that type of stuff.
I've played with that stuff. I'm not afraid of it. There are certain behaviors and awareness things that we've got to educate users on. Maybe you put up a splash screen instead of blocking it.
You put a splash screen on the browser when somebody goes to one of those, and it says, hey, just recognize you're stepping off prem, you're going to go do this. Here's the equivalent of the ten commandments. Here are the three things we're going to have you attest you're not going to do: you're not going to share intellectual property, you're not going to share personal information, whatever. Click accept. Great. Now you've just made me informed. You've made me risk-aware.
You've got documentation that you've done that, so if I screw up, you can fire me. Okay? You're enabling. I have a trademark on three words: Protect to enable. That's what I believe our job is. Protect to enable the data and the business.
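[Editor's note: the inform-and-document flow Malcolm describes, a splash screen with attestations whose acceptance is logged for audit, can be sketched like this. The attestation wording, function names, and log format are hypothetical, purely for illustration.]

```python
import json
from datetime import datetime, timezone

# Hypothetical attestations -- the "ten commandments" for an external tool.
ATTESTATIONS = [
    "I will not share intellectual property.",
    "I will not share personal information.",
    "I understand this service is outside company control.",
]

def record_acknowledgment(user, destination, accepted, log):
    """Inform-and-log pattern: the user is made risk-aware, not blocked.

    Appends an auditable record of the user's acceptance so the
    organization can demonstrate informed consent later. `log` is any
    list-like sink; a real system would write to an append-only
    audit store instead.
    """
    entry = {
        "user": user,
        "destination": destination,
        "attestations": ATTESTATIONS,
        "accepted": accepted,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    log.append(json.dumps(entry))
    return accepted  # True -> let the navigation proceed
```

The user experiences one click of friction instead of a hard block, and the organization gains both awareness and documentation, exactly the "protect to enable" trade Malcolm is making.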
That's a great way to frame it.
When you think about going forward from here... And I certainly appreciate your perspective on the emergent risk of generative AI.
I kind of view it as the most acute example of what people have been doing for years, right? There's some magic box on the other end of my web browser, and I'm giving it data, and I'm assuming something's happening and there's some protection around that data, but I'm not thinking a whole lot about it, because there's some value I'm getting in return.
It's no different than when we first went to Google or Yahoo to search for something.
Exactly. Exactly. And we've kind of gotten over that hurdle in a lot of other areas, but I think it's, frankly, a great way for people to refocus on the reality of what they're doing in SaaS applications across the board. What advice would you have for security leaders looking to modernize their security programs to balance risk versus friction, as you so accurately describe it?

Yeah. Well, I think there's a couple of things.
One, set a design goal. You need a design goal at a macro level with the company. My design goal at Intel was no material or significant events. That framed everything we did, which also meant that, depending on the context, a high-friction control might be required to hit the design goal. But if it's not a material or significant risk, why would you put in place a high-friction environment? So you can really start thinking about it that way.
The other aspect: when I got to Cylance, that was the first design goal, and the second design goal was that only a nation-state actor should ever be able to get in, and they'd have to work for it.
Now I had alignment with the business around not only the level of risk we were willing to accept but also the threat actor we were willing to accept. And then my job, again, is to manage underneath that: the risk dial, the total cost dial, and the friction dial. Because if I just implement friction, and I cause people to go around the controls, guess what? I might create a material or significant event. Or I might make it easy for the threat actor I wanted to have to work for it, because I just made an opening when the user went around a control.
That's why I think when you think of this like a calculus equation and the optimization, you end up with a better answer for the business and a better answer for security.
I think that's a really mature way of thinking about those things, and it's obviously something I've been thinking about a lot over the years. Our back-of-the-napkin version was always, like, how secure do you want to be? There are users doing stupid stuff, there are broad-based attacks, and then there are nation-state actors.
And it's sort of like, pick your poison. What do you need to get rid of today? Right?
Yep.
And that's always the challenge. Well, we could go on for hours, but it's time for us to wrap up. Thank you, everyone, for joining. And Malcolm, thank you so much for making time for us today.
I really enjoyed talking to you. Stay tuned for future episodes of Overshadowed, and I hope you can join us again soon.