Computer Security and The Human Factor

November 7th, 2004 | by ian

I recently read The Human Factor by engineering professor Kim Vicente. This is an eye-opening book about the importance of “adapting technology to people” instead of “forcing people to adapt to technology.” The field of Human Factors Engineering has apparently been around for a while, but never has society been in greater need of it: thanks to personal computers and the internet, most people I know deal daily with technology that fails to acknowledge human nature.

Technology for Technology’s Sake

Kim says technology has grown too complex because “technology wizards” tend to design for people like themselves or for the sake of the technology itself, and because there is a culture of reductionism in science that favours designing from the bottom up without understanding the emergent properties that appear when all the components are connected. Both of these struck a chord with me: as a scientist I tried to swim against the reductionist tide, and as a network and software engineer I strive daily to consider “layer 8” of the network: the user.

Systems Analysis

Prioritizing systems analysis in a world of cheap (and quite effective) reductionism is not an easy sell. It is complicated and costly, requiring careful study, testing and monitoring of interactions and interdependencies. Where, then, are you likely to find such approaches? Typically where the consequence of failure is the loss of human life. Of particular interest in Kim’s book were his contrasting case studies of safety records in the aviation and medical industries.

When Death is on the Line

There was apparently a time when flying was not safer than driving. According to Kim, up until the mid-70s “the FAA was simultaneously responsible for receiving information about near misses, and reprimanding people and organizations when they screwed up.” Unsurprisingly, a lot of really serious matters did not get reported because of the potential liability involved, and problems often would not be addressed until there was an accident. Loss of life in aviation is high profile and can lose you customers quickly, so the situation was unsustainable for the industry. Eventually the FAA collaborated with NASA to create the Aviation Safety Reporting System (ASRS), which allows confidential reporting of incidents. The details are all in the book, which I encourage you to read, but here is what seems to make the system work:
1) voluntary
2) confidential
3) non-punitive
4) independent
It costs $2 million annually, deals with about 40,000 incidents a year, and there has apparently NEVER been a breach of confidentiality of the reports! Thanks in large part to ASRS, aviation is much safer than it was in the 70s.

…Culture Matters

Kim also studies US hospitals, where 44,000-98,000 people die yearly from preventable causes. This horrific situation does not get attention proportional to the statistic, and it does not seem to be improving quickly. Taking responsibility for one’s actions is central to the culture of medicine, and you would think that would be a recipe for excellence. Yet fear of reprimand, job loss, loss of reputation, and the highly litigious US medical environment are all incentives against reporting incidents and accidents. A conspiracy of imperfect technology, ridiculously long shifts, and these more complex social issues acts as an “invisible hand” that guides the system toward generating more mistakes. Kim argues valiantly that the medical system stands to gain a lot from human factors engineering, and he provides some excellent examples.

Have a Safe Compute!

It seems like everyone could benefit from reading what Kim has to say and applying it to their own field. I certainly want to try, starting with a follow-up to a conversation I had with Adam.

Embracing internet use is a cost of entry to doing business in today’s world, and at first glance quite a low one. The true cost of computing on the internet is probably much higher once the risks are properly accounted for. But that accounting is a difficult task, as you would know if you have ever tried to get a security budget approved! Adam assures me this is an unsolved problem because many or most computer security incidents go unreported (if anyone has any estimates, please let me know and I will link to them here), leaving us without the data needed to assess risk.
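To see why the missing data matters, consider how a risk estimate would normally be put in budget terms. The sketch below uses annualized loss expectancy (ALE), a standard formula from the risk-management literature rather than anything from Kim’s book or from my conversation with Adam; the incident class and every number in it are invented for illustration.

    # Illustrative only: annualized loss expectancy (ALE) = single loss
    # expectancy (SLE) * annual rate of occurrence (ARO). All figures invented.
    def annualized_loss_expectancy(sle_dollars, incidents_per_year):
        """Expected yearly cost of one class of security incident."""
        return sle_dollars * incidents_per_year

    # Hypothetical incident class: a compromised web server.
    sle = 50000   # estimated cost of a single incident, in dollars
    aro = 0.3     # incidents per year; the figure under-reporting hides from us
    print(annualized_loss_expectancy(sle, aro))   # prints 15000.0

The arithmetic is trivial; the problem is that with so few incidents reported, the frequency term is pure guesswork, which is exactly why the security budget conversation stalls.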

This reminded me of the problems Kim discussed in aviation and medicine, since the major reason we don’t know more about security breaches is that there are penalties for disclosure:
-security staff may face reprimand or loss of job and reputation
-an organization won’t risk publishing the forensic analysis of an incursion or the results of a security audit, for obvious reasons
-companies don’t want to admit to their shareholders the financial damage caused by security breaches, or show weakness to the market

Can a CSRS help us design more secure products?

Would it be possible to have an effective “Computer Security Reporting System”? Would a non-profit organization, properly funded, be able to collect accurate data and produce meaningful reports and recommendations for industry architects and CIOs alike? Could it protect the identities of participants and the information that belongs to them? Could we generate key cost information from this data, educating people while helping to build better products and security budgets? Has this already been done, and does it meet the key criteria listed above? I found something here, but government-related agencies don’t seem to meet the “independent” criterion.

Perhaps it is not feasible to have a CSRS:
1) too many players and technologies involved
2) security, while expensive, is not usually “life and death”
3) too big a problem – too expensive to be cost-effective

I’m not a security insider, so I will “ask the audience” on this one. I know you’re out there!
