I was at a security conference just recently where a vendor was giving out stickers that said, “Don’t Click on Sh*t!” A computer admin in front of me grabbed a handful and said, “I need a whole bunch of these. I’ve got lots of users who keep clicking on sh*t.”
I asked him, “Then why are you giving your users all that sh*t to click on? And by the way, if you know they’re going to click on sh*t, why aren’t you preventing the expected results?”
As a CFO, if you were to walk into a board meeting to report a multimillion-dollar financial loss and say, “This was the result of human error, but we can’t blame the employee,” the response would likely be, “You’re right. We blame you for not proactively creating internal and compensating controls.”
And that board member would be exactly correct. CFOs have all sorts of checks and balances in place to prevent losses from both fraud and mistakes. You have authorization procedures for approvals. You have segregation of duties, audits to proactively look for inaccuracies, monthly reconciliations. You require receipts for expenses over a certain dollar amount. Each of these controls recognizes the probability of error and uses processes to mitigate the effects.
So why is the situation different when an employee causes an expensive cybersecurity-related incident?
That sticker-collecting admin had no answer, and I suspect most CIOs would be similarly at a loss. The concept is so obvious, yet at the same time so contrary to how human error has been treated in the IT field.
I contend that there is no legitimate reason why your cybersecurity team is not just as responsible as the finance or operations teams for predicting where mistakes could happen, working to remove or reduce their likelihood and, knowing that human error can never be eliminated, providing controls to mitigate the effects on the business.
The fact is, the cybersecurity industry has long been given a pass in addressing human error. Meanwhile, in just about every other department, controls are in place. Your warehouse may have telematics, autonomous guided vehicles and barcode scanners to minimize accidents and picking errors. Manufacturing teams have layers of QA checks. Facilities groups have evacuation and safety drills, alarms and sprinklers in case someone lights the kitchen toaster oven on fire.
We all need to demand that cybersecurity teams likewise proactively address the inevitability of human error.
Industry luminaries frequently say, “You can’t stop stupid.” My response? “That’s our job.”
This means we work to mitigate the effects of dumb, malicious and plain old careless. This is not a unique concept in business. My latest book, “You Can Stop Stupid,” addresses how to accomplish this in cybersecurity.
For example, you know that employees will inevitably lose PCs, so make it a policy to run tracking and encryption on company-owned equipment. Limit permissible uses of personal mobile devices. If you expect phishing messages, put in place secure email gateways, nudges on suspicious mail and awareness training, paired with limited permissions to execute programs, which blunts malware’s ability to do damage. Then put in place data leak prevention, anti-malware and web content filtering software to limit the potential for data loss, even if all other measures fail.
The above are tactics that must be implemented within a comprehensive strategy, in the same way you, as a CFO, implement financial risk reduction. It is likely that your technical teams are already addressing user-initiated losses with tactics that seem useful in isolation but are not part of a cohesive strategy that considers the entire process.
It can be done, but you first need agreement that it should be done.
I may be preaching to the choir, as they say. Maybe you’ve paid out a ransom or eaten the cost of notifying customers that their personal data was compromised. Based on your organization’s past experience and future financial risk, you know, as a CFO, that you have a stake in the organization’s cybersecurity posture.
I realize it is an incredibly sensitive issue, but cybersecurity-related vulnerabilities are likely one of your organization’s greatest financial risks.
Take a look at this real-time ransomware cost tracker that compiles victim payments made in bitcoin to wallets linked to a dozen major ransomware variants. When I turned in this column, we were at $50,654,483.58 for 2021 so far. And that’s just the tip of the iceberg.
If this isn’t a CEO- and board-level discussion, it will be soon. One major research firm predicts that by 2024, as many as 75% of CEOs could be held personally liable for incidents with physical consequences if it is found that the organization did not focus on, or invest sufficiently in, cybersecurity. Changes in the regulatory climate are, in my view, likely.
Finance teams are accustomed to compensating for mistakes. It’s second nature. Discussing how your organization’s CISO and CIO are addressing their own human-error problems might be difficult, but it’s time for that conversation.
More From NetSuite
Application and Operational Security in NetSuite
Your business systems should mitigate human error, not enable it. NetSuite ERP includes essentials like role-based access controls and robust password policies plus layers of additional protection, including multi-factor authentication, IP address-based restrictions and more.
Ira Winkler, CISSP, is CISO for Skyline Technology Solutions and author of books including “You Can Stop Stupid,” “Corporate Espionage” and the bestselling “Through the Eyes of the Enemy.” He is in the Information Systems Security Association Hall of Fame, and CSO Magazine named him a CSO Compass Award winner. Ira writes for a number of industry publications and has been a keynote speaker at major information security events.
Ira began his career at the National Security Agency and went on to serve as president of the Internet Security Advisors Group, chief security strategist at HP Consulting and director of technology of the National Computer Security Association. He was also on the faculties of Johns Hopkins University and the University of Maryland.