HW5: Reflections
The common thread in all of the articles is that they detail poor software designs that resulted in damage to property or potential loss of life. These incidents illustrate how communication failures, whether between software engineering teams or between a team and a client, can bring about disastrous and costly results.
Software engineers have just as much responsibility to provide clear instructions for the use of their products as members of any other profession. Failure to adequately communicate the correct use of a product can have deadly consequences for its users.
In the article “After Stroke Scans, Patients Face Serious Health Risks” by Bogdanich, GE’s failure to properly instruct hospital staff in the use of its machine resulted in many patients being severely over-radiated, causing hair loss and a significantly increased risk of developing cancer. As a patient interviewed in the article points out, many technologies in our society come equipped with safety measures that alert the user to a potentially hazardous situation, such as vehicles beeping when moving in reverse. This failure could have been avoided by more effective communication between the hospital staff who operated the equipment and the trainers from GE. One of GE’s clients, the Cedars-Sinai hospital in Los Angeles, California, released a statement that the GE trainers never informed its staff of the radiation increase when using the automatic feature of the product. Additionally, the hospital cited that the manual failed to mention that using the automatic feature for a CT scan would actually increase the amount of radiation used.
Another factor that contributed to the disasters in these articles is complacency. In “An Investigation of the Therac-25 Accidents,” Leveson asserts that AECL, the creator of the Therac-25, chose to test the hardware but not the software of the device, on the stated assumption that “Computer execution errors are caused by faulty hardware components . . . ” Leveson also notes that, in the event of a software error in choosing which mode to pick and how much energy to use, the machine fell back on values that AECL had provided arbitrarily, a pattern illustrated in the sketch below.
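To make the danger of such arbitrary defaults concrete, here is a minimal Python sketch of my own; it is not AECL’s actual code, and the mode names and energy values are hypothetical. It contrasts an error handler that silently falls back on hard-coded values with one that fails safe by refusing to operate.

from enum import Enum

class BeamMode(Enum):
    ELECTRON = "electron"  # hypothetical low-energy mode
    XRAY = "x-ray"         # hypothetical high-energy mode

class ModeSelectionError(Exception):
    """Raised when the requested treatment mode cannot be verified."""

# Risky pattern, analogous to the arbitrary defaults described above:
# on bad input the machine keeps running with values the operator
# never confirmed.
def select_mode_unsafe(requested):
    try:
        mode = BeamMode(requested)
    except ValueError:
        return BeamMode.XRAY, 25000  # arbitrary, unconfirmed fallback
    return mode, 25000 if mode is BeamMode.XRAY else 200

# Fail-safe pattern: halt and report rather than guess.
def select_mode_safe(requested):
    try:
        mode = BeamMode(requested)
    except ValueError:
        raise ModeSelectionError(
            "Unknown mode %r; halting instead of guessing." % requested
        )
    return mode, 25000 if mode is BeamMode.XRAY else 200

With the unsafe version, a garbled mode request quietly yields the high-energy setting; with the safe version, the same request stops the machine, which is the behavior a safety-critical system should prefer.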
In “The Role of Software in Spacecraft Accidents,” Leveson lists several spacecraft mishaps that resulted in damaged property. The main cause of these incidents was again complacency: to reduce costs, fewer resources were devoted to safety and to engineering techniques proven to work. Examples are the Mars Climate Orbiter and the Mars Polar Lander, both destroyed in 1999 by software errors that resulted from poor engineering practices.
The FBI’s Sentinel project seems like a prime example of a big-budget software failure. Although it was finally completed in 2012, it was almost three years late and was cited as being $26 million over its initial budget of $451 million; however, some claims put the actual cost at over $600 million when factoring in the costs of previous attempts at predecessor software, such as the FBI’s Virtual Case File and Trilogy. Additionally, in 2010, then Inspector General Glenn A. Fine reported that the FBI had spent 90 percent of its initial $451 million budget and was two years behind schedule (Stein).
The textbook asserts that reliable, secure software comes from proper software engineering practices such as testing and reviewing. Additionally, communication is key between clients and development teams, and among team members, throughout the software development process. The software fiascos listed here all resulted from complacency and a lack of communication; the injuries to people and the damage to property could have been prevented if better software engineering techniques had been used.