Security B-Sides Vegas 2011 Review: Cultural Cues from High Risk Professions

Conference: B-Sides Las Vegas
Title: Cultural Cues from High Risk Professions
Speaker: Gal Shpantzer

In this B-Sides LV talk, Gal Shpantzer used the Swiss cheese model of catastrophe as a parallel for the information security industry. The model was originally developed by James Reason of the University of Manchester and Dante Orlandella[1], and has been used to analyze the causes of systemic failures in aviation, engineering, and healthcare. It likens an organization's layered defenses to slices of Swiss cheese: each weakness is a hole in a slice. The layers of systems and processes are designed to catch mistakes before they become catastrophic, but if the holes in each layer align, like a hole running all the way through the block of cheese, serious problems can result.
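
As a rough illustration (mine, not from the talk): if each defensive layer independently misses a given error with some probability, the chance the error slips past every layer is the product of those probabilities. The sketch below uses made-up layer names and numbers purely to show the arithmetic.

```python
# Toy illustration of the Swiss cheese model (layer names and numbers are invented,
# not from the talk). Each layer independently fails to catch a given error with
# some probability; a catastrophe requires the "holes" to line up across every layer.

layers = {
    "code review": 0.30,       # probability this layer misses the error
    "automated tests": 0.20,
    "change management": 0.25,
    "monitoring/IR": 0.10,
}

p_slip_through = 1.0
for name, p_miss in layers.items():
    p_slip_through *= p_miss

print(f"Chance an error evades all {len(layers)} layers: {p_slip_through:.4%}")

# The point: no single layer is airtight, but independent layers multiply the risk down.
# Cultural problems (fear of speaking up) correlate the failures across layers, so the
# holes align and the real probability is far worse than this independent-layer estimate.
```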

For example, Korean Air at one point had 17 times as many catastrophic incidents per million miles as United Airlines. Investigation revealed that the difference came down to processes and protocols. At United Airlines, practices such as volunteering information and seizing the controls in an emergency were incorporated into the official cockpit protocols; the captain was the authority but could be questioned. At Korean Air, by contrast, there was an atmosphere of over-deference in the cockpit where one does not question the captain. This cultural influence is also discussed in depth in Malcolm Gladwell’s book Outliers. And it wasn’t just Korean Air: other airlines headquartered in countries where respect for authority is deeply ingrained in the culture, such as Colombia, had the same problem.

In the info security space, Gal Shpantzer proposed protocols where people carry responsibility but are not afraid of (i.e., penalized for) volunteering information. Pain and hostility shut people down and lead to aligned holes in the Swiss cheese. In the medical profession, it was found that the more expert the physician, the more likely that physician was to miss simple steps, like administering aspirin before or after an operation to reduce the probability of cardiac problems.

Summary: I find little to disagree with. This is one of those common-sense, obvious-when-you-hear-it talks that is nonetheless worth mentioning, because when you don’t hear it, it tends not to get done. No product ideas, but good general security philosophy.
