Uncertainty, Measuring CyberSecurity and Heisenberg
- Joel Van Dyk
- May 26, 2021
- 3 min read
The Concept of Measurement
“As far as the propositions of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.” —Albert Einstein
Measuring: it seems like such a simple concept. How long is something? Whip out the measuring tape. How heavy is something? Step on the scale (usually bad news). How many pints here in the UK? Measure it out against the glass (or 2 or 3).
For a long time in our history, that was all that was necessary. A foot was the king’s foot, a cubit the length from the king’s elbow to the tip of his fingers.
But, as anyone who has had to measure anything for construction has experienced: is it 88” exactly, or 1/8” more or less? Often the object’s actual length falls between the graduations on the scale, so the result depends on how good the measuring tape is and on your judgement. Even with a laser, your measurements are only as good as your aim. Every measurement is uncertain. In science we express this as +/- a percentage (%) of the measured quantity.
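As a minimal sketch of that convention, here is the 88” tape measurement stated with its uncertainty. The half-graduation rule and the specific numbers are illustrative assumptions, not from the original post:

```python
# Hypothetical sketch: stating a length measurement with its uncertainty.
length = 88.0          # inches, as read from the tape
resolution = 1.0 / 16  # smallest graduation on the tape, in inches

# A common convention: take the uncertainty as half the smallest graduation.
uncertainty = resolution / 2
relative = uncertainty / length * 100  # expressed as +/- a percentage

print(f"{length} in +/- {uncertainty} in ({relative:.3f}%)")
```

The absolute uncertainty is fixed by the instrument; dividing by the measured value turns it into the relative (%) form the text describes.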
This realization actually goes to the basis of our physical world in our understanding of Quantum Mechanics: the more precisely you know one quantity, the less precisely you know another. You can only measure things to a certain level of certainty, and you cannot measure a quantity without affecting the system that you are measuring. This is the Heisenberg Uncertainty Principle. While it is often thought to apply only to the very small in nature, it points to something true at our scale too: every measurement you make has a finite precision, from distance, to the speed of light, down to the mass of a standard kilogram. The kilogram was initially defined as a liter of water, but water’s weight varies with altitude and composition (salinity, minerals, etc.). These days, for the precision we need, a standard kilogram is defined in terms of three fundamental physical constants: the speed of light c, a specific atomic transition frequency ΔνCs, and the Planck constant h. The theme here is not exact measurement, but reduction of the uncertainty around measurement.
CyberSecurity is no less uncertain than these physical quantities, but it is also no less measurable.
Many of us have been subjected to scales from Risk Management Officers: “high”/“medium”/“low”, “red”/“yellow”/“green”, or 1-5. While this sounds like measurement, it is measurement with a large amount of uncertainty, and that is what makes these imprecise scales not useful. Science needs to be exact and to make predictions; you can’t do that with these. Once, I was in a meeting where a colleague who was in charge of CyberRisk was using this kind of scale. In his absence the meeting asked me, “Well, your colleague has 4 highs, 4 mediums, and 5 lows here. What is the aggregate risk?” I had to say, “I don’t know.” A bad answer. But that is about all you can say.
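The “I don’t know” can be made concrete with a small sketch. Any numeric mapping of the ordinal labels is arbitrary, so the “aggregate” changes with whichever mapping you pick (both mappings below are invented for illustration):

```python
# Hypothetical sketch: why "4 highs, 4 mediums, 5 lows" has no defined aggregate.
# An ordinal scale carries no unit to sum over; any numeric mapping is arbitrary.
findings = ["high"] * 4 + ["medium"] * 4 + ["low"] * 5

mapping_a = {"low": 1, "medium": 2, "high": 3}   # evenly spaced labels
mapping_b = {"low": 1, "medium": 5, "high": 25}  # roughly multiplicative labels

total_a = sum(mapping_a[f] for f in findings)
total_b = sum(mapping_b[f] for f in findings)
print(total_a, total_b)  # two equally "valid" aggregates that disagree
```

Both totals are defensible under their mapping, and they disagree by a factor of five, which is exactly why the aggregate is undefined.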
As with the distance measurement above, our goal is the reduction of this type of uncertainty. In CyberSecurity you measure key process indicators and key risk indicators. These are quantitative, even if some of them are binary (is there a governance committee: yes/no?). One common measure of CyberSecurity risk is the number of vulnerabilities an organization is carrying in its server software (stated in CVSS) plus the vulnerabilities it is carrying in its application software (stated in CWSS). Even this is not as easy as toting up the existence of the vulnerabilities. You need to correct for mitigations that may or may not be in place around each vulnerability, as well as for the efficiency of your measurement process. All of this leads to uncertainty, which needs to be stated and accounted for in a scientific measurement.
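As one possible sketch of that correction, consider summing severity scores, discounting vulnerabilities with mitigations in place, and inflating for the scanner’s miss rate. The scores, the mitigation fraction, the 90% detection efficiency, and the crude uncertainty bound are all illustrative assumptions, not a standard formula:

```python
# Hypothetical sketch: an aggregate exposure figure, stated with uncertainty.
server_cvss = [9.8, 7.5, 5.3]   # CVSS base scores for server software vulns
app_cwss = [62.0, 40.5]         # CWSS scores for application software vulns

mitigated = {9.8: 0.5}          # assumed fraction of exposure a mitigation removes
detection_efficiency = 0.90     # assumed: the scanner finds ~90% of what exists

raw = sum(s * (1 - mitigated.get(s, 0.0)) for s in server_cvss) + sum(app_cwss)
estimate = raw / detection_efficiency                 # correct for what was missed
uncertainty = estimate * (1 - detection_efficiency)   # crude +/- bound

print(f"aggregate exposure: {estimate:.1f} +/- {uncertainty:.1f}")
```

The point is not the particular factors, which would have to be estimated per organization, but that the output is a number with a stated +/- rather than a bare total.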
The ways to do this are amenable to statistics. More on that in another post.
