The Security Buzzword
I fell into this field diagonally while looking for some tasty treats on a side quest. Back then, people were beginning to understand the importance of security but had not yet connected the dots between secure practices and the ROI of said practices. In fact, it seems quite a few organizations are still being dragged kicking and screaming into the "security is important" paradigm, usually by the loss, or near loss, of the Crown Jewels, or by reputational damage. The question is one of money.
In the past, when I used to meet with potential clients, the phrase "ROI" was thrown around a lot and not much was seen in terms of a positive return. After a breach, however, the barn door was replaced with 10-foot-thick lead doors guarded by automatic motion-sensing flamethrowers and a platoon of RoboCop-type androids. Inside the barn, however, was a half-eaten bag of chips, a few banana peels and a poor orangutan trying to figure out how it got there from the happy jungles of its native habitat. (The orangutan, though an interesting character, will be dealt with in another article.) Now, it seems, the pendulum has begun its inexorable swing the other way, and senior management are beginning to see the light at the end of the tunnel. The money, however, is being thrown at the problem without much being done by way of managing the actual issue.
Let's step back a bit and define the issue at this point.
Security is not a standalone technique (for want of a better term). It is, in fact, a state of mind. A learned behavior, if you will. Just as we have trained ourselves to turn off the car's ignition when we arrive, we need to train ourselves to use basic security practices without thought. A client I worked with in the past had spent millions of dollars on securing their architecture, but none of it helped when one of the employees allowed an important server to get cryptolocked. An investigation showed that if it hadn't been that particular employee, another would have allowed it to happen eventually. The issue wasn't maliciousness, it was laziness! Basic practices from the security handbook were not being followed. There had never been an audit of the security practices, and even though there was annual security training (for all employees, no less), it might as well have been a meditation session with Beethoven's 5th playing in the background to prepare everyone for the breach when it happened. (The "da da da DAAAAA, da da da DAAAAA" one.)
Another time, I was invited for a quick 2-day audit of a SIEM install that had taken about 6 weeks. A SIEM, by the way, is a tool organizations use to log everything, from every system, into a central repository. This repository can then be searched for specifics (breaches), used for its predictive capability (machine learning, e.g. imminent device failure), or simply retained (for legislative or regulatory requirements). Going through the entire architecture and setup, I found that even though things were humming along well at the time, basic practices had been either missed or ignored. These missed items would have made the tool less than ideal within a few months. Now, I have two problems with this.
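The central-repository idea is simpler than the enterprise price tags suggest. Here's a toy sketch of it in Python; the names (`central_store`, `ingest`, `search`) are made up for illustration and don't correspond to any real SIEM product's API:

```python
# Toy illustration of the SIEM concept: every system forwards its events
# to one central store, which can then be searched for specifics.
from datetime import datetime, timezone

central_store = []  # the "central repository": one place for all events

def ingest(source: str, message: str) -> None:
    """Log an event from any system into the central repository."""
    central_store.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,
        "message": message,
    })

def search(keyword: str) -> list:
    """Search the repository for specifics, e.g. signs of a breach."""
    return [e for e in central_store if keyword.lower() in e["message"].lower()]

# Different systems, one repository:
ingest("web-01", "User alice logged in")
ingest("db-02", "Failed login for user root")
ingest("fw-01", "Failed login for user admin from 203.0.113.7")

print(len(search("failed login")))  # two events worth a closer look
```

The real products add scale, parsing, retention policies and analytics on top, but the principle, and the place where sloppy installs go wrong, is exactly this: get everything in, and make it searchable.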
First, the tool being used is amazing (in my opinion); however, a lot of people have complained to me in the past that it gets very slow, and they blame the tool for the issue. (I'm going to start calling it software now; "tool" sounds too much like "tool".) 75% of the time, the slowness of the software is down to improper installation or configuration. That can be blamed on the initial install. (Hence the audits I get called in for.) The other 25% of the time can be blamed on one thing: knowledge. Which leads me to my second problem.
Incomplete knowledge is a very dangerous thing.
Story time:
In a land not so far away, a man decided to learn to fly. Aeroplanes weren't yet even a gleam of a thought; in fact, I think Michelangelo was still the stuff of stars (in the "we are all made of stars" kind of way). He saw birds flapping their wings as they flew, and decided to make wings for himself. Over the next few months, he went around collecting feathers and invented a strong glue that would let him stick them to pieces of branches tied together (somewhat like a gorilla's grip on a hapless poacher's neck). Finally, he had his wings, and was he proud of 'em. Now, his spouse was a really smart lady, and told him to maybe test them out first and be the first experimental scientist, as it were. Unfortunately, those were the days of patriarchal society, and women were, well, let's not go there. This is not that kind of an article (or story). He went to the highest cliff he could find, put on his wings and jumped, and flapped, and flapped some more, and panicked and flapped harder, and ended up breaking both his legs, an arm, a couple of ribs and some more besides; his hospital records are lost to time.
So, folly number one in our incomplete-knowledge quest: he had incomplete knowledge of whether the wings would work before jumping off the cliff. He might have experimented a bit and filled the holes to create a better product that would have led to, if not flying, at least landing safely. Historically, that's been the hardest part of flying.
Folly number two was not listening to good advice, regardless of source.
Let's tie the two ribbons of thought into a neat little bow now. We're almost at the end.
When hiring someone to do a job for you, you need to have an inkling of what it is they are doing for you. You do not need the same level of security across your entire organization. On one hand, if you are a library with public-access computers, you do not need to log each and every keystroke. On the other, you do not want your crown jewels protected by the same firewall and policies that protect your outside-facing FTP server (hopefully you are using SFTP, if you really need one at all). In fact, your crown jewels should not be connected to any outside-facing network at all.
Identify what your security needs are, then work into the contract exactly what is being protected and monitored, and how. Figure out whether your security partner is responsible for letting you know of any holes they find in your armor; most times, these things get ignored. Are you going to penalize your partner if you get breached while they are responsible for your security? What if it's your own employee that causes the breach through poor security hygiene? Do you allow audits? How often? Who does them?
Finally, do YOU personally follow good practices? Or are you just a little bit lazy?