Anyone who spends even a short while listening to cybersecurity experts discuss the role of the human in data and network security will likely come away confused.
Depending on who you’re listening to, a network’s users are either its last and best line of defence, or its principal point of weakness. The engaged staffer is either your biggest ally in the fight against the phishers, or a living vulnerability about to allow havoc to be wreaked on your systems.
Somewhere between these poles, of course, lies the real world we all inhabit. One where systems can be protected, but bad things will still happen despite all the hard work and shared good intentions.
And it was in between these extremes that a lively and impassioned debate took place during a RANT Takeover – where representatives of event hosts Metomic sparred with some formidable members of London’s cybersecurity leadership class during a discussion of the company’s concept of the “human firewall”.
Attitude Academy
“We find that, in most companies, there’s a lot of sharing of sensitive data in SaaS applications,” Richard Vibert, Metomic’s CEO, explained in his opening remarks, referring to the Software-as-a-Service tools that have become a cornerstone of modern business communication and collaboration.
“They’re built to store and share sensitive data, and we have to let sensitive data be shared if we don’t want to break productivity,” he continued. “A lot of the problems arise when people don’t know how things should be shared in the business. We enable these employees with real-time controls. You share something, and immediately you get a notification – through applications like Slack and [Microsoft] Teams – saying that you’ve shared sensitive data. And then the employees can take action. This is what we call the Human Firewall.”
To sum it up: empower your employees to do what they feel they need to do to get the job done as efficiently and promptly as possible, but let them know – without having to leave the application they’re working in – about the potential consequences of what they’re doing.
Rather than approaching the data-sharing problem with a series of pre-emptive restrictions – which will likely limit leaks but almost certainly stifle the kind of enhanced productivity your business adopted Slack and Teams to capitalise on – you presume the sharing will happen and turn every data exchange into a teachable moment. In the process, your workforce gradually grasps both the benefits and the drawbacks and, over time, adds steel to your digital defences. What’s not to like?
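To picture the notification loop Vibert describes, here is a minimal, hypothetical sketch in Python: a message is scanned against a couple of toy regex detectors and, if something sensitive turns up, the sharer gets an immediate nudge via a Slack incoming webhook. The webhook URL, patterns and function names are illustrative assumptions, not Metomic’s product or API.

```python
import re
import requests

# Illustrative patterns only -- real sensitive-data classifiers are far more sophisticated.
SENSITIVE_PATTERNS = {
    "payment card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "AWS access key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "UK National Insurance number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
}

# Hypothetical Slack incoming-webhook URL -- substitute your own workspace's.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"


def check_outbound_message(text: str) -> list[str]:
    """Return the names of any sensitive-data patterns found in a shared message."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]


def notify_author(author: str, findings: list[str]) -> None:
    """Send the sharer an immediate, in-app nudge rather than blocking the share."""
    message = (
        f"Hi <@{author}> -- the message you just shared appears to contain: "
        f"{', '.join(findings)}. If that wasn't intentional, please edit or delete it, "
        "or move the data to an approved location."
    )
    requests.post(SLACK_WEBHOOK_URL, json={"text": message}, timeout=5)


def handle_message_event(author: str, text: str) -> None:
    """Called for each outbound message: detect, then turn it into a teachable moment."""
    findings = check_outbound_message(text)
    if findings:
        notify_author(author, findings)
```

The design point is that nothing is blocked: the message goes through, and the person who shared it gets an immediate, contextual prompt to put things right themselves.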
Bring Down the Walls…
On the other side of the argument, fellow panellists Laura Penhallow and Daniella Somerscales suggested that humans can’t be completely relied upon for ultimate data security.
“There’s no patch for humans! Sorry,” replied Laura Penhallow, Head of Information Security for a large hedge fund, and one of RANT’s co-respondents for the event. “The lines are so blurry. Everything’s on the internet now – so what’s only OK to share internally, and what’s public? In financial services, we’re really clear about what not to share, but it still happens. Having the critical ability to say, ‘This is OK to share but that’s not’ is why [security leaders] get paid. We need to give people good tools – but still, there’s no patch for the humans.”
“We can’t rely on humans,” Penhallow’s fellow RANT responder, Daniella Somerscales, Group Director of Business Information Security at the London Stock Exchange Group, agreed. “We all mean to do the right things, but in the end, we want to be nice as well – which is a problem. We all want to share stuff.”
Somerscales labelled herself a sceptic of the Human Firewall approach when she was first introduced to it, but, she said, she was keeping an open mind. “I looked up the ‘See It, Say It, Sorted’ campaign,” she explained, referring to the UK government’s much-lampooned attempt to reduce the chances of terrorist attacks in public places by encouraging people to report unusual behaviour to security authorities. “And it turns out it’s effective,” she added. “So I’m on the fence.”
Raw Deal
Technology, an attendee argued, was only part of the discussion that needed to be had. Other aspects were vital to consider too.
“One is culture,” they said. “Everyone leaves the house and automatically locks the door. When you leave your car, you lock it. It’s ingrained.”
The trouble is, when it comes to protecting business data, the link between an individual action and its possible personal consequences may not be as intuitive or as strong. “If you really want the Human Firewall to work, then what’s their [employees’] connection to the data?” they continued. “Why do they care?”
“Also, people don’t understand the impact,” Somerscales agreed. “In companies, there’s a lot of box-ticking. People don’t do [security-minded things] because they understand – they do it because they’re told to. At home, some of the really basic stuff – shredding documents… the average person not involved in security, they probably don’t do it.”
Metomic’s concept takes these understandable and built-in barriers into account, Vibert argued. Culture and understanding may not exist at a sufficiently strong level at the moment, but part of Metomic’s approach is to help instil a culture of security in individual users on an ongoing basis.
“I was a data scientist, frustrated by people like you telling me what I can and can’t do,” he said, with what looked like a smile but could also have been something of a grimace. “I think about [the Human Firewall concept] as a productivity enhancer with security in the background. You give people controls that help them get home early but meet their productivity goals.”
Bring Forth the Guillotine
As the pros and cons were debated back and forth, one or two RANTers dared to advocate for extreme solutions.
“The only way is to name and shame,” one hard-bitten security leader said. “Sack a few people and they’ll soon get the message.”
Another suggested that perhaps the best way to share sensitive data might be using a fleet of well-trained carrier pigeons.
Other potential responses felt rather more conventional, though no easier to implement and no more likely to improve productivity.
“I’ve worked for a number of companies and the only way I can see [of preventing sensitive data sharing in SaaS tools] is to turn them all off,” one said with a resigned shrug. “It may have consequences, but people will understand that at the beginning.”
Try finding a company that’s managed to stop classified data leaving the organisation, they suggested: if you can identify one, have a look at how they did it. Chances are it’s by preventing all sharing. “It’s the only way I’ve seen that works.”
Twenty Seconds To Comply
“I’m getting a sense of déjà vu,” another experienced security leader said as the conversation raced towards its close. “I’ve heard these conversations for a really long time. We talked about nudge-based security as far back as 2012, 2013. The threat’s changed with SaaS and cloud and generative AI. But it dawned on me that there’s no silver bullet. It’s all about risk reduction.”
Ultimately though, stopping employees from using SaaS applications just isn’t practical – from either a logistical or productivity standpoint.
“Turning off isn’t an option; just saying ‘No’ every time isn’t an option,” Penhallow said. “We have to find the path, then branch off where your industry needs to go.”
“We need to understand what the business needs, and put the right tools in place, build the right culture,” Somerscales agreed. “We need to start training people in school, not when they get to a workplace – never mind a regulated workplace.”
“What we say is, lock down your data and not your employees,” Vibert summed up. “Optimising for a golden path where everything’s perfect is not an option. Minimising exposure to risk is the way you achieve that.”