Also, UK Cyber Official Urges Businesses to Build a Safety-Focused ‘Reporting Culture’
Governments are increasingly calling on security researchers and the academic hacking community to improve the state of cybersecurity by better informing policymakers. And businesses must do more to foster a “safety culture” in which input from security teams gets translated into ongoing problem-solving.
Those were two themes to emerge at this week’s Black Hat Europe conference, held in recent years in London, but this year – for the first time – held virtually due to the ongoing COVID-19 pandemic.
Government efforts to seek greater input from security researchers and ethical hackers carry risks for all involved, said Jeff Moss, aka the Dark Tangent, in his opening conference remarks on Wednesday. Moss is the founder of both the Black Hat and Def Con conferences and a long-time member of the security research community.
Over the last five to 10 years, he said, researchers and hackers have increasingly been called upon to better inform those developing government policies. In Moss’ case, that includes serving on the Global Commission on the Stability of Cyberspace and stints as the CSO of ICANN and as an adviser to the U.S. Department of Homeland Security.
“Policymakers have grown up with technology and computers, and they’re asking us our opinion,” Moss said. “This is a really dangerous time for us right now, on one hand, because they’re finally asking us our opinion. So, great opportunity. The great risk, though, is they’re asking us our opinion, and if we screw this up, we may not be taken seriously.”
Government officials and hackers, by their nature, approach problems in different ways, he said. Many hackers, for example, literally take things apart to see how they work, regardless of copyright protections or restrictions designed to try to prevent technology from being reverse-engineered.
Hackers also tend to be vocal when something doesn’t work as advertised. “Through this kind of very energized, cyclical process, we’ve come up with disclosure, which led into commercial bug bounty programs,” Moss said. Unlike lobbyists, “we act as sort of a neutral third party, telling policymakers what is and isn’t possible.” But this can create tension, and “it’s very important that the community of the infosec researcher, and the community of government … learn from each other and we learn how to work through this tension.”
Focus: Internal Cybersecurity Communities
Another key message being delivered by government leaders is that businesses need to pay more attention to their own cybersecurity experts.
That theme was sounded by Pete Cooper, the deputy director for cyber defense at the U.K. Cabinet Office, who delivered a keynote presentation Wednesday.
Cooper said that when he goes onsite at an organization that has suffered a hack attack, he too often hears two contradictory versions of the impact – one from senior leadership and another from the trenches.
Following a ransomware attack at one organization, for example, he said the board of directors and leadership were sanguine, saying they’d “dodged a bullet” and done pretty well.
But the security operations center and other cybersecurity-focused team members offered a different assessment, telling him that “we got away with it by the skin of our teeth,” and that “in a way, we wish we had been slammed, because then the board would listen to us.”
Learn From Aviation’s Safety Culture
Cooper also spoke from his experience as a U.K. Royal Air Force pilot flying Tornado fighter aircraft. Subsequently, he became an air force safety officer charged with creating an engaged “reporting culture” in which individuals felt comfortable enough to “readily report problems, errors and near misses” so the organization could target the underlying problems.
“Aviation had really huge accident rates until the sector really started digging into the culture and understanding of what was causing those accidents – and it absolutely transformed safety,” he said.
Many organizations, however, are not set up to foster the required levels of input and action from their cybersecurity teams. “Really the final keystone of that culture is a questioning culture, empowering individuals to speak up if they see risks, mitigations or opportunities,” he said. “And if they think something isn’t right, even if they don’t know the solution, it’s really important to hear their voice.”
Of course, accidents will always happen, which is why technology must be “designed to fail safe,” he said. He referenced an early incident in his aviation career in which he accidentally shut down the engine of his single-engine training airplane, then identified the problem, restarted the engine and landed safely.
Human error remains a fact of life across every discipline. That’s why “defense in depth is critical in both flying and also cyber defense,” Cooper said. “Things will go wrong: Users will click links; a back-end team could potentially set up a service incorrectly.” But by properly training staff on well-designed processes and technology, “that will give you the confidence that no matter what’s thrown at you, you can get yourself back to a safe state and a secure state.”