Extended abstract for Workshop on Human Error and Systems Development
Work in the field of human error has typically focused on operators of safety-critical
equipment, such as nuclear power plant controllers, and on the design of the human-machine
interfaces in such settings. Wider system issues have received only limited consideration.
Similarly, researchers and practitioners in the field of Dependable Systems are concerned
with the design of computer-based systems intended to operate in situations where the
consequences of failure are potentially catastrophic. For example, the failure of a
safety-critical system may cause great harm to people, property, or the environment. The
work reported in this paper is motivated by the need to ‘push back’ these concerns from
the operation and design of dependable systems to the process by which they are developed.
Errors in the Requirements Engineering (RE) process are widely considered to be the hardest
to discover. Consequently, they tend to remain undetected for the longest time, require the
greatest amount of rework, and are the most expensive to rectify of all errors in systems
development. Whilst efforts to detect and rectify errors in RE, and throughout the
development process, are a necessity, the nature and cost of requirements errors make a
strategy of avoidance rather than detection a more attractive prospect. The principal
benefit of such an approach is that rework can be reduced to a minimum, with corresponding
savings in cost and in time to completion of the system.