Carol provides insights on why the insider threat is being ignored and why ignoring it is not a wise business decision.
Whenever I talk to clients or prospective clients and I even mention “insider threat,” I get one of two reactions: either they think I’m crazy (that could never happen because their employees wouldn’t have enough knowledge to do something inappropriate) or they’re offended (because they trust their employees and believe that none of them would ever do something nefarious).
To refute these reactions, I must point out that “insider threat” covers more than just events that occur due to malicious intent. Also, both of the assumptions that organizations make about their employees are quite inaccurate…whether organizations want to admit it or not. In addition, the number of insider incidents is rising.
More Than Malicious Intent
The term “insider threat” covers more than just employees or contractors performing malicious acts. It also applies to all of the accidental errors and misconfigurations that happen simply because we’re all human. This IBM X-Force IRIS Data Breach website shows that misconfiguration is second only to incidents whose cause was undisclosed. The most obvious accident-driven threat in the news right now is malware. Malware is wreaking havoc with many organizations, even causing some to go out of business. But malware doesn’t just leap into an organization’s computers. A human has to click on a bad link or download and open an infected file for malware to enter an organization.
Other examples of accidental errors include running an SQL UPDATE statement against a production file instead of one in a test library because the file’s *PUBLIC authority was set to *ALL. Or clicking the wrong icon and uploading the contents of an Excel spreadsheet rather than downloading it to refresh the data, leaving the production file corrupted with stale data. Or mapping a drive to IBM i and dragging the wrong object into the recycle bin on the desktop. I could go on, but I’m guessing at least one of these examples sounds familiar.
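As a point of reference, here’s a minimal sketch of how that first accident could be closed off on IBM i. The library, file, and group profile names (PRODLIB, PAYMAST, APPGRP) are hypothetical; the idea is that setting *PUBLIC to *EXCLUDE makes a stray UPDATE from someone without explicit authority fail with an authority error instead of corrupting production data:

   /* See who can touch the (hypothetical) production file today */
   DSPOBJAUT OBJ(PRODLIB/PAYMAST) OBJTYPE(*FILE)

   /* Set *PUBLIC to *EXCLUDE so only explicitly authorized profiles */
   /* (or programs that adopt authority) can get at the file */
   GRTOBJAUT OBJ(PRODLIB/PAYMAST) OBJTYPE(*FILE) USER(*PUBLIC) AUT(*EXCLUDE)

   /* Grant the application's group profile the access it actually needs */
   GRTOBJAUT OBJ(PRODLIB/PAYMAST) OBJTYPE(*FILE) USER(APPGRP) AUT(*CHANGE)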
All of these business disruptions could have been prevented. Even the effects of malware can be greatly reduced with employee education and a strong security scheme in place. Why am I calling these examples “business disruptions” and not security events? Because I believe “business disruption” more accurately describes the effect on your business. In every case, application users sit idle while the application is stopped and the system administrator restores the file from backup. (In the case of malware, the entire system may have to be restored.) Then someone in the business has to re-enter the transactions that occurred between the last backup and the accident. If that’s not business disruption, I’m not sure what is.
Malicious Actions Do Occur
I think one of the reasons IBM i teams refuse to accept the thought that someone may purposefully harm the organization is that most of us are trusting individuals with a strong moral compass. In other words, we think the best of everyone. Unfortunately, I’ve seen an insider turn into a criminal. When I was at IBM, there was a rash of laptops disappearing from offices shortly after their owners stepped out. When it was discovered who was behind the thefts, none of us could believe it. The individual was one of the last persons I would have thought capable of such an act. What we didn’t know was that he had amassed a large gambling debt. To keep it from his family, he started stealing and selling laptops to pay off the debt. That situation taught me that we cannot base our security scheme simply on trust. Was this person evil? Absolutely not. But he was desperate, and that caused him to take actions well beyond what anyone would have guessed him capable of. To assume that you know and can trust your employees and use that as the foundation of your security scheme is simply not a wise business decision.
Then there are those who, simply put, are evil…or I should say want to do harm or bring embarrassment to the organization, want fame for themselves, or want to gain financially. If you don’t think this happens, all you have to do is subscribe to one of the security newsletters I read. At least twice a month, there are headlines about someone who, before leaving their organization, planted a “logic bomb” (code that does harm after they’re gone). Or there’s a story about someone who logged back in to their former place of employment, stole trade secrets, and used them in their own business. Or someone who looked up the health records of famous people and published them on the Internet. It’s quite sad where people’s minds are, thinking that these activities are OK. But they do happen in real life, not just in a TV series or movie.
The sad thing is that most organizations could have prevented these evil deeds. The most famous insider in recent history, Edward Snowden, could have been prevented from getting to the information he released. Whether or not you agree with what he did, you have to understand that he could not have done it if his access had been restricted to only the information related to his job. What I’m trying to say is that security can stop many of the scenarios I’ve just described…yet many organizations don’t bother. Why is that? Perhaps it’s because organizations have forgotten that security can help prevent business disruption and that security is not just about regulations or compliance.
Unfortunately, the insider threat is increasing. The Verizon 2019 Data Breach Investigations Report shows that 34 percent of breaches were caused by insiders, compared to 28 percent in 2018 and 25 percent in 2017. It’s well understood that a malicious insider event is more costly than an outside hacker attack, primarily because insiders know what data the organization considers valuable and where it’s located. And the cost continues to rise, according to the Accenture/Ponemon 2019 Cost of Cybercrime study.
Granting users only the authorities they need to do their jobs is not only good security practice from a law-and-regulation perspective; it helps protect against the “casual” event as well. I attended a fraud-and-breach-prevention conference, and one of the speakers noted that employees may consider fraud but won’t even attempt it if they think they’re going to get caught. If you’ve implemented a deny-by-default security scheme, turned on auditing, and employees know they’ll be called to explain why they got an authority failure on the payroll file, do you think those individuals will attempt to download it? No, they won’t. And for those brazen individuals who try anyway, you’ve got proof of the attempt along with the peace of mind of knowing they didn’t get the file. If you base your security scheme on trust, those individuals who want to commit fraud (whether the intent is evil or just to get out of a bad situation) have an open door to take advantage of your business.
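To make that concrete, here’s a rough sketch of what deny-by-default plus auditing can look like on IBM i. The object names (PRODLIB, PAYMAST) are hypothetical; the commands are the standard ones for turning on security auditing and reviewing authority-failure (AF) entries in the audit journal:

   /* Turn on security auditing (creates QSYS/QAUDJRN if it doesn't exist) */
   /* and include authority failures in what gets logged */
   CHGSECAUD QAUDCTL(*OBJAUD *AUDLVL) QAUDLVL(*AUTFAIL *SECURITY)

   /* Audit every access attempt against the (hypothetical) payroll file */
   CHGOBJAUD OBJ(PRODLIB/PAYMAST) OBJTYPE(*FILE) OBJAUD(*ALL)

   /* Later, pull the AF (authority failure) entries so you can ask */
   /* "what were you trying to do?" */
   DSPJRN JRN(QSYS/QAUDJRN) ENTTYP(AF) OUTPUT(*PRINT)

With *PUBLIC set to *EXCLUDE and AF entries being logged, the download attempt fails and leaves a record of who tried.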
Summary
I’m not sure what else I can say to help convince organizations that the insider threat cannot be ignored. Unfortunately, I fear that people will continue to think of security only as a bother or something that’s required for compliance and never consider it as a tool to help protect their organization from disruption. But maybe, just maybe, this article will reach someone who will see the benefits of implementing a sound security policy that allows people to do their jobs but doesn’t provide access or capabilities beyond that. My hope is that you’ll understand that a sound security implementation will help you avoid business disruption.