A Seminar In Cybersecurity Risk Analysis

There’s a scene in the movie Caddyshack where Carl Spackler, the greenskeeper played by Bill Murray, utters to himself, “I have to laugh” as he molds a squirrel out of plastic explosives. At times I feel the same way when I read certain LinkedIn articles, and most of the Wall Street Journal for that matter. That is, I see articles such as “Top Qualities of High Performers” or “The Seven Keys to a Better Resume” and take them as fact, only to find out that they are all too often opinions. It’s then that I think of that scene from Caddyshack and find myself regretting the last 10 to 15 minutes of my life spent reading a piece of fiction.

No doubt you’ve seen articles like that many times; perhaps you’ve written one. When I read a headline like that, I instinctively think, “OK, the author ran a controlled experiment: he or she took a random sample, identified a control group and a test group, and after a series of observations performed some analysis and found evidence to support a particular claim.” The problem is, that may not be the case.

Many articles in the popular press are opinions. Unfortunately, even the ones with lots of numbers in them can still be opinions. There’s nothing wrong with an opinion piece, but readers often take them as fact: when I read the “resume article” mentioned above, my first thought was, “So if you do these seven things to your resume, you will get the job?”

It’s no secret we live in a world of opinions, but as Deming put it, “without data you are just another person with an opinion.” In our business of decision science, we teach that if there is a chance of being wrong and a cost of being wrong, it can be measured. Said differently, and to paraphrase the title of my boss’s first book, “you can measure anything”; you just need the right tools. This, by the way, is not an opinion; it’s a mathematical claim that can be proved.

One problem is that we often measure what is obvious or most visible. Then we erroneously assume that whatever we measured has a significant impact on the outcome of some decision, such as the decision to implement a sales portal to increase business. In practice, we observe that the variable measured, such as upfront development cost, often has little or no impact on the decision.

What we tend to find is that the variable that would have had the most significant impact on whether to accept or reject the investment is often something we never thought of before, or something we perceived as immeasurable. In our business we regularly find that organizations measure almost exactly the wrong things. This is known as the measurement inversion, and it applies broadly, including to many of the hot topics of the day, such as cybersecurity.
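One way to see the measurement inversion at work is to compute the expected value of perfect information (EVPI) for each uncertain variable in a decision: the amount by which knowing that variable’s true value would improve the expected payoff. The sketch below is a hypothetical Monte Carlo example, not anything from Hubbard’s book or the seminar; the sales-portal scenario, variable names, ranges, and dollar figures are all invented purely to illustrate how the “obvious” variable can carry almost no information value.

```python
import random

random.seed(1)

# Illustrative decision: invest in a sales portal. All numbers and
# variable names are made up for this sketch.
def sample_inputs():
    return {
        "dev_cost":      random.uniform(0.9, 1.1),  # $M; the "obvious" variable
        "revenue_gain":  random.uniform(0.0, 0.8),  # $M per year if fully adopted
        "adoption_rate": random.uniform(0.2, 1.0),  # fraction of target users
    }

def npv(x, years=3):
    """Net value of investing: adopted revenue over the horizon, minus cost."""
    return x["revenue_gain"] * x["adoption_rate"] * years - x["dev_cost"]

def expected_npv(fixed=None, n=2000):
    """Monte Carlo estimate of expected NPV, optionally with some inputs pinned."""
    total = 0.0
    for _ in range(n):
        x = sample_inputs()
        if fixed:
            x.update(fixed)
        total += npv(x)
    return total / n

# Best payoff under current uncertainty: invest only if expected NPV > 0.
baseline_payoff = max(expected_npv(n=20000), 0.0)

# EVPI for one variable: the average payoff of deciding *after* learning
# that variable's true value, minus the baseline payoff.
results = {}
for var in ["dev_cost", "revenue_gain", "adoption_rate"]:
    outer = 200
    payoff_with_info = 0.0
    for _ in range(outer):
        v = sample_inputs()[var]
        payoff_with_info += max(expected_npv(fixed={var: v}, n=500), 0.0)
    results[var] = payoff_with_info / outer - baseline_payoff
    print(f"EVPI({var}) = ${results[var]:.3f}M")
```

With these made-up ranges, the upfront development cost, the most visible number, has an EVPI of essentially zero: no value it could take would flip the decision. The less obvious revenue and adoption variables are the ones worth measuring.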

I read a variety of technical journals, including some in information technology, and I see countless ads for products in the cybersecurity sector, from antivirus software to firewalls. If this is your field of expertise, and it is not mine, how do you know the value of these products? Think about it: how do you know the real dollar value of a particular control, such as antivirus or anti-malware software? Moreover, how would you know whether that control can effectively mitigate your single biggest cybersecurity risk? For that matter, what is your single biggest cybersecurity risk? Doug Hubbard, author of How to Measure Anything (Wiley, 2007; 3rd ed., 2014), has just released a new book, How to Measure Anything in Cybersecurity Risk, with co-author Richard Seiersen.

On October 6th, 2016, Doug will co-host a one-day seminar in Arlington, VA. He is teaming up with Doug Samuelson, DSc in Operations Research and an experienced cyber-counterintelligence analyst, to introduce how to fix cybersecurity risk problems using quantitative techniques that produce measurably improved outcomes.

For more information and to register, click here.
