Norman Marks on Cyber Security

Norman Marks provides some welcome sanity in a recent post on his blog, On Governance, Risk Management, and Audit. Commenting on an April 2016 white paper, Cyber Security and the Board of Directors, by the Delta Risk company, Marks notes that Delta’s call for educating board members on technical details of cyber risk is likely unproductive.

Delta’s approach seems to stem from their identifying the Statement of Risk Appetite, required for banks by the Basel II accords, as a way for the board to communicate the organization’s risk boundaries and rationale. Delta fails to see that an assessment of risks, given a firm’s operations and objectives, is a discrete task requiring specific skills unlikely to be present on a board. It requires a good deal of rigor and must be continuously maintained. So while cyber risk should be incorporated into a risk-appetite statement, risk assessment is fundamentally different from the tasks of establishing priorities and communicating performance expectations.

Marks also gets my praise for, uncommonly in ERM, calling for expressing cyber risk in terms of the potential for a breach to affect the achievement of each of the enterprise’s objectives.

A Functional Hazard Analysis approach to modeling risk (more accurately, modeling hazards) in the context of specific operations and objectives of an enterprise would address this need. The refusal of this industry to use an FHA (or a systematic, enhanced BIA) approach has always puzzled me.
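To make the FHA idea concrete, here is a minimal, hypothetical sketch of what such a model might look like in code. The function, hazard, and objective names are invented for illustration; the point is the structure: each enterprise function is linked to the hazards that could defeat it, and each hazard to the objectives it would impact.

```python
from dataclasses import dataclass, field

@dataclass
class Hazard:
    description: str
    affected_objectives: list[str]  # enterprise objectives the hazard would impact
    severity: str                   # e.g., "catastrophic", "major", "minor"

@dataclass
class Function:
    name: str
    hazards: list[Hazard] = field(default_factory=list)

# Illustrative example entries (names are hypothetical)
payments = Function("process customer payments")
payments.hazards.append(Hazard(
    description="cardholder data exfiltrated via compromised POS",
    affected_objectives=["retain customers", "avoid regulatory fines"],
    severity="major",
))

# Roll up: which objectives are exposed, and through which hazards?
exposure: dict[str, list[str]] = {}
for hz in payments.hazards:
    for obj in hz.affected_objectives:
        exposure.setdefault(obj, []).append(hz.description)

print(exposure)
```

Even this toy structure forces the analyst to tie each hazard to specific objectives, which is exactly the linkage Marks calls for and which vague "cyber risk" registers omit.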

Marks is also admirably critical of neglect of probability in Delta’s recommendations.  As with ERM in general, many in cyber security seem to believe that vagueness is a cure for uncertainty.

While the Delta paper mentions metrics, it does so only in vague terms (“cyber-related status metrics” as KPIs). Marks correctly notes the absence of any metrics for deciding whether a firm’s information security program is effective. He asks how they might measure it.

To that I would also ask a potentially more revealing question: how would you know if it didn’t work? That question better probes a program’s provisions for low-frequency hazards, since merely searching for confirming evidence (e.g., “we intercepted this one…”) can overlook low-frequency, high-impact hazards that have never occurred simply because of limited exposure time.
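The exposure-time point can be made quantitative. Under a simple Poisson model with hypothetical numbers, a hazard occurring on average once per 20 years has a better-than-even chance of never appearing in a 10-year program history, so "it hasn't happened" is weak evidence that controls work.

```python
import math

# Hypothetical figures for illustration only.
rate = 0.05   # assumed annual occurrence rate (once per 20 years)
years = 10    # assumed exposure time (program history)

# Poisson probability of observing zero occurrences in the exposure window
p_zero = math.exp(-rate * years)
print(f"P(no occurrences in {years} yr) = {p_zero:.2f}")  # ≈ 0.61
```

With these assumptions, absence of the event over a decade is roughly as likely as not, regardless of whether the security program is effective.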

Use of FMEAs in risk management is common, despite their limited usefulness as a risk-analysis tool. Use of FHAs (or a structured version of Business Impact Analyses) seems nearly non-existent. I’ll be writing some recommendations for use of FHA in the future. Philosophizing about risk is a poor substitute for modeling hazards.


–  –  –

In the San Francisco Bay Area?

If so, consider joining us in a newly formed Risk Management meetup group.

Risk assessment, risk analysis, and risk management have evolved nearly independently in a number of industries. This group aims to cross-pollinate, compare and contrast the methods and concepts of diverse areas of risk including enterprise risk (ERM), project risk, safety, product reliability, aerospace and nuclear, financial and credit risk, market, data and reputation risk, etc.

This meetup will build community among risk professionals – internal auditors and practitioners, external consultants, job seekers, and students – by providing forums and events that showcase leading-edge trends, case studies, and best practices in our profession, with a focus on practical application and advancing the state of the art.

If you are in the Bay Area, please join us, and let us know your preferences for meeting times.
