I found Meyer's discussion of root cause analysis informative, particularly his hypothetical example of how a combination of factors creates situations that are difficult to detect. What makes Meyer's point interesting is his reference to Nancy Leveson.
Leveson's home page contains a good collection of papers on safety in engineering. One paper investigates the Therac-25, a radiation therapy machine whose software defects contributed to massive overdoses of six patients. The section on "Causal Factors" is particularly informative.
One conclusion from Leveson's paper is that focusing on particular bugs does not lead to a safe design. The mistakes attributed to the Therac-25 include poor software engineering practices and relying on software alone to ensure safe operation. You can't patch your way out of a poor implementation, and you shouldn't make software the sole safeguard for safety-critical functions.
Meyer's hypothetical point, that a combination of factors can be hard to detect and can result in catastrophic failure, is made real in Leveson's discussion of "Unrealistic Risk Assessment" in the Therac-25 paper.
It is also a good lesson in probabilities: a probability greater than zero means the event can occur, however unlikely it may seem.
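To make that concrete, here is a minimal sketch in Python. The numbers are assumptions for illustration, not figures from the Therac-25 case; it just shows how a per-use probability dismissed as negligible compounds over many uses.

    # A minimal sketch with made-up numbers (not from Leveson's paper):
    # a "negligible" per-use fault probability compounds over many uses.
    p = 1e-6          # assumed probability the fault occurs on a single treatment
    n = 1_000_000     # assumed total treatments across all installed machines

    # P(at least one occurrence in n independent uses) = 1 - (1 - p)^n
    p_at_least_once = 1 - (1 - p) ** n
    print(f"probability of at least one occurrence: {p_at_least_once:.3f}")  # ~0.632

With these assumed values, the chance of at least one occurrence is about 63%, which is why "it almost certainly won't happen on any single run" is not a safety argument.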