Users Are Part of the System: How to Account for Human Factors When Designing Operational Tests for Software Systems

July 2017
IDA document: D-8630
FFRDC: Systems and Analyses Center
Type: Documents
Division: Operational Evaluation Division
Authors: Laura J. Freeman, Kelly M. Avery, Heather M. Wojton

The Director, Operational Test and Evaluation (DOT&E) has issued several policy memos emphasizing statistical rigor in test planning and data analysis, including the use of design of experiments (DOE) principles and smart survey design and administration. Often, particularly when testing software-intensive systems, it is necessary to account for both engineering and human factors simultaneously to support a complete and operationally realistic evaluation of the system. While some software systems may be inherently deterministic, once placed in their intended environment with error-prone humans and highly stochastic networks, variability in outcomes can, and often does, occur. This talk will briefly discuss best practices and design options for including the user in the DOE and present a real-world example.
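
As a minimal, hypothetical sketch of one way to include the user in a designed experiment, the Python snippet below crosses notional engineering factors with an operator-experience factor and randomizes the run order. The factor names and levels are illustrative assumptions, not drawn from the talk or from DOT&E guidance.

```python
import itertools
import random

# Illustrative factors only: two notional engineering factors plus a
# human-factors variable (operator experience) treated as a design factor.
factors = {
    "network_load": ["low", "high"],               # engineering factor (assumed)
    "message_type": ["text", "imagery"],           # engineering factor (assumed)
    "operator_experience": ["novice", "expert"],   # human factor included in the DOE
}

# Full-factorial design: every combination of factor levels becomes a test run.
design = [dict(zip(factors, levels))
          for levels in itertools.product(*factors.values())]

# Randomize run order to guard against time-order effects such as operator learning.
random.seed(1)
random.shuffle(design)

for run, condition in enumerate(design, start=1):
    print(run, condition)
```

Treating operator experience as a crossed factor, rather than holding it fixed, lets the subsequent analysis separate user-driven variability from system effects.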