Sunday, November 14, 2010

Catching Up Part 3

I've found Vicente's The Human Factor very hard to read. His central thesis gets a bit repetitive: essentially, if you don't account for the human factor in your design, whatever can go wrong will go wrong (with apologies to Murphy's Law). My posts on Vicente's work are a requirement for this course, and since I'm not being paid to do a book review (nor did anyone ask for my thoughts on the book), I'll simply continue on with my obligatory reading responses. I'm going to try to kill three birds with one stone here and write some responses to Chapters 5-7 today.


Chapter 5 is entitled "Minding the Mind II: Safety-Critical Psychology", in which Vicente explores whether the human-tech design principles that make everyday technologies more user-friendly can also be used to design "green" products and systems that are more environmentally friendly. At first I got excited, because I thought he was going to discuss something like planned obsolescence, along the lines of what Annie Leonard has been examining with her "Story of Stuff Project" and her more recent film "The Story of Electronics" (see embedded video below).





I was a bit disappointed when he started talking about student projects. Student projects from 1994, mind you. He discussed the amount of energy used by PCs and the fact that people forget to turn off their computers at night, which led some of his students to design what he calls "his favorite" project, the Power Pig: an on-screen reminder prompting workers to power down their PCs at the end of the work day.


Vicente also discussed nuclear power plants once again, this time focusing on the Three Mile Island disaster. There were several flaws in the design process (which Vicente outlined in detail), and because of some of these major design flaws, Three Mile Island was "an accident waiting to happen". He bombards the reader with statistics: how much it cost to build Three Mile Island ($700 million), how many months it was fully operational (4 months), and that it cost $973 million to clean up all of the contamination.

It's too bad that nuclear power gets so maligned in this book, so I thought I would look up some statistics of my own. No one was killed during the Three Mile Island accident. While people did die at Chernobyl, and many people got sick, the poor design and safety violations there were so egregious and numerous that the International Nuclear Safety Advisory Group published a 148-page report in 1993 detailing every possible thing that went wrong and how it could have been easily fixed. That doesn't change the fact that everyone around the accident got massively screwed, of course, but it seems that our initial estimates of the long-term damage of a nuclear event may have been exaggerated.

Apparently we should be afraid of every other kind of energy production, though. For example, coal kills more miners every few years than the initial blast at Chernobyl did. That doesn't even take into account air pollution from coal, which dwarfs those numbers yearly. But come on, that's not really surprising, is it? We know coal is bad for us -- that's why we're developing all these great green forms of energy. They're renewable and better for the environment. Unfortunately, they're not necessarily safer than nuclear energy for the people involved in producing them. A study found that in Europe alone, wind energy has killed more people than nuclear energy and that, worldwide, hydroelectric energy has, too. The leading cause of accidents at wind energy farms is "blade failure": a turbine blade breaks, sending shrapnel flying through the air. I guess I'm just getting tired of Vicente's fear-mongering about nuclear power in this book.


Chapter 6 is entitled "Staying on the Same Page: Choreographing Team Coordination", and it focuses on one aspect of "soft" technology (as Vicente defines technology): "designers must create a system that is tailored to the characteristics and needs of the team as a distinct entity in its own right. If they don't, the system won't run effectively and accidents will occur." (Vicente, p. 156) He looks at examples from the aviation industry, including a crew that crashed a plane because everyone in the cockpit became so focused on a burnt-out light that they lost sight of their primary purpose: to fly the plane. He details the CRM (Cockpit Resource Management) training that is now standard in the aviation industry. The industry seems to be trying to learn from its mistakes, not only in the design of cockpits but also by studying "near mistakes" (through the ASRS, the Aviation Safety Reporting System, as outlined in Chapter 7) and interactions with the cabin crew.

He also looks at the perceived infallibility of doctors in Chapter 7. Apparently, it is becoming easier to report medical errors in Alberta: the Health Quality Council of Alberta has issued a foundational document called the Patient Safety Framework for Albertans, developed to guide, direct and support continuous and measurable improvement of patient safety in the province. Hopefully, these new reporting procedures will improve patient safety the way the ASRS has improved the aviation industry.



