In a speech to Harvard professors and students yesterday as reported by the Harvard Crimson, former President Clinton expressed a view on national risks that further illustrated the highly risk-averse nature of many American progressives.
Explaining his preferred method for dealing with risk, Clinton said, "We've got to try to avert disasters—not just be prepared to bomb somebody if a disaster occurs."
In many respects, the former president's statement not only reflects two terms in office that witnessed excessive avoidance of growing international threats, but also illustrates an approach that is increasingly common in corporations with respect to operational risk. The siren's call of risk prevention is an enticing one, luring many into the false hope that all bad things can be avoided. Indeed, some argue that much of the foundation of current progressive thought rests upon this avoidance philosophy.
Unfortunately for these optimistic believers, when outlier risk events ultimately occur, the theory fails them and rational response measures are lacking. Instead, false causes are usually established, and scapegoats are found and punished. "If only we had tried harder to prevent it and had more money to do so" is the conclusion offered in response to the inherent failure of the prevention strategy. Such systems usually persist through a significant decline until the participants recognize the folly of prevention-only rhetoric.
For those less familiar with the study of risk, the former president's perspective can be described as the belief that significant risks originate outside the system and consequently can be prevented. In risk nomenclature, this is known as exogenous risk, or risk that is "external to the system." The opposite belief, that risk is inherent to the system, is known as endogenous risk.
In many operations environments, managers are likely to find a pronounced exogenous-believing, risk-avoiding culture in which assets are deployed almost exclusively for risk prevention. In my information security experience with community banks, this corresponded to purchases of network firewalls and host firewalls, and the application of software patches, consuming the overwhelming majority of the information security budget. Methods that detect risk were a small minority of the budget, usually limited to periodic review of log files, while more expansive detection and response capabilities were simply nonexistent. Managers preferred to believe that risk was outside the bank's information processing environment and could be kept out by barring the virtual doors against threats. (In firewall culture, some refer to this as the "Great Wall" strategy, where all defenses are focused on a single great barrier between the exogenous Internet environment and the company's internal network.)
When such a threat does pass through the barrier, it usually magnifies quickly, finding a homogeneous environment in which to spread. Companies often run the same type and version of operating system (at the same patch level) on their servers, administrative usernames and passwords are often shared, and defensive configurations are identical from server to server. Once the threat has defeated one system's prevention capabilities, it has defeated them all. Only detection and response remain, but as we've seen, these have been neglected in many environments.
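The cost of that homogeneity can be illustrated with a toy simulation (entirely hypothetical; the function name and parameters are my own invention, not drawn from any security tool). An attacker holds an exploit for a single configuration; every server sharing that configuration falls at once:

```python
import random

def fleet_compromised(n_servers, n_configs, seed=None):
    """Simulate one attack: the attacker has an exploit for one randomly
    chosen configuration. Every server running that configuration is
    compromised simultaneously; the rest survive."""
    rng = random.Random(seed)
    # Assign each server a configuration (OS version, patch level, etc.).
    configs = [rng.randrange(n_configs) for _ in range(n_servers)]
    exploited = rng.randrange(n_configs)
    return sum(1 for c in configs if c == exploited)

# Homogeneous fleet: a single exploit takes everything.
print(fleet_compromised(20, 1, seed=42))   # 20

# Diversified fleet: the same class of exploit reaches only a fraction.
print(fleet_compromised(20, 10, seed=42))
```

The point is not the specific numbers but the shape of the outcome: with one configuration the loss is always total, while diversity caps the blast radius of any single exploit.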
Addressing this disparity requires the adoption of a philosophy in the business culture that risk is endogenous, i.e., "bad things can and will occur." Instead of fearing and seeking to ban all fires, a balance is struck between cost-effective prevention and effective detection and response capabilities. In the information security practice, this balance is well supported by best practices communicated by numerous professional and regulatory bodies (e.g., ISACA and (ISC)²). In fact, a risk management approach I've found successful in smaller operational environments is one that closely evaluates risk prevention systems to locate aversion bias and the occasional behaviors that produce leptokurtic skew in the organization's risk profile. Overconfidence in prevention appears to be highly associated with leptokurtosis in operational risk.
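For readers unfamiliar with the term, leptokurtosis shows up as positive excess kurtosis: a fourth standardized moment above the normal distribution's, i.e., fat tails. A minimal sketch with made-up loss figures (the series and numbers are illustrative only, not real data) shows how a prevention-only posture, which produces long quiet stretches punctuated by a disaster, is far more leptokurtic than a balanced posture that absorbs frequent small losses:

```python
import statistics

def excess_kurtosis(xs):
    """Sample excess kurtosis: the fourth standardized moment minus 3.
    Positive values indicate fat tails (leptokurtosis)."""
    n = len(xs)
    mean = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    return sum(((x - mean) / sd) ** 4 for x in xs) / n - 3

# Hypothetical daily loss series (illustrative numbers only):
prevention_only = [0.0] * 99 + [500.0]            # quiet... then a disaster
balanced        = [5.0, 3.0, 8.0, 2.0, 6.0] * 20  # frequent small losses

print(excess_kurtosis(prevention_only))  # strongly positive: fat-tailed
print(excess_kurtosis(balanced))         # negative: thin-tailed
```

Both series sum to roughly comparable totals over the period; what differs is the shape of the distribution, and it is the fat-tailed shape that leaves organizations blindsided.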
Additional options may emerge, including one under evaluation in my research that applies settlement processes (e.g., mark-to-market) to operational risk in order to support real-time risk recognition, similar to the daily settlement of positions found in futures markets. In such systems, the consequence of a risky position is felt quickly and usually with moderate impact, providing the organization with immediate feedback. I'll be commenting more as this model evolves.
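The daily-settlement mechanic this model borrows from futures markets can be sketched as follows (a simplified illustration under my own assumptions; the function name and values are hypothetical, and real futures settlement involves margin accounts and contract multipliers omitted here):

```python
def settle_daily(opening_value, daily_marks):
    """Mark a position to market each day: the holder realizes each day's
    gain or loss immediately as a cash flow, rather than discovering the
    accumulated result only at expiry."""
    realized = []
    prev = opening_value
    for mark in daily_marks:
        realized.append(mark - prev)  # today's settlement cash flow
        prev = mark
    return realized

# A position opened at 100, marked over three days:
flows = settle_daily(100, [98, 101, 95])
print(flows)       # [-2, 3, -6]
print(sum(flows))  # -5: same total loss, but recognized day by day
```

The total is identical to holding the position blindly to the end; the difference is that each day's loss is small, visible, and actionable, which is precisely the feedback property the operational risk model aims to reproduce.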
Saturday, May 12, 2007