HUMAN SYSTEMS ENGINEERING FOR TECHNOLOGY DEVELOPMENT
One of the desired outcomes of human systems engineering is the prevention of errors. The consequences of usability errors are real and range from annoying to severe. The Three Mile Island accident, for example, was a catastrophe that could have been avoided through a proper human systems engineering process (Meshkati, 1991).
Despite the benefits that human systems engineering can provide, it is still too often neglected. Healthcare.gov offers an excellent example: its ill-fated rollout affected more than 8 million users in the first four days. At launch, the site attracted five times more visitors than it was designed to handle (Mullaney, 2013). The problem was exacerbated by the site design, which forced users to create an account before they could review and compare plans (Bryant, 2013). This laborious account creation process created a bottleneck and increased the number of simultaneous hits to the site. An unexpectedly large number of visitors can strain even the best-designed website, but the account creation bottleneck could have been easily identified with user-centered processes. As President Obama described:
Part of the problem with Healthcare.gov was not that we didn’t have a lot of hardworking people paying attention to it, but traditionally the way you purchase IT services, software, and programs is by using the same procurement rules and specification rules that were created in the 1930s... What we know is, the best designs and best programs are iterative: You start out with, “What do you want to accomplish?” The team starts to brainstorm and think about it, and ultimately you come up with something and you test it. And that’s not how we did Healthcare.gov.
It’s something, by the way, I should have caught, I should have anticipated: That you could not use traditional procurement mechanisms in order to build something that had never been built before and was pretty complicated. So part of what we’re going to have to do is just change culture, change administrative habits, and get everybody thinking in a different way. (Obama, 2015)
Even when programs are proactive about including human systems engineering, they may fall victim to budget cuts. For example, in 2008, the federal budget experienced cuts in basic human systems engineering research in aeronautics (DeAngelis, 2008), continuing a 15-year trend toward minimizing human-based research. Although government research efforts in general have been plagued by budget cuts, anecdotal evidence suggests that human systems engineering was cut at a disproportionately higher rate than other work. The reasons for this are unclear, but decisions may be influenced by a perception that human systems engineering is a soft science of diminished importance compared to the hard science of technology creation. In addition, military and government procurement processes ensure that the end user of the technology is not the same as the individual selecting or developing it. Warfighters do not select the technology they use—rather, they are expected to learn to cope with the design of systems selected by others. In the commercial world, if a product is not usable, consumers purchase it less frequently, so the manufacturer has an incentive to design with the user in mind. This effect is reduced, if not eliminated outright, by the military procurement process.
Some have argued that in system development there is “a long and successful record concerning the use of training to compensate for poor design” (Hancock & Hart, 2002). People may assume that systems have to be complex and thus will require highly trained operators when, in reality, time and money could be saved by investing in human systems engineering to simplify designs. This logic mirrors the pre-World War I mentality that humans are plentiful and that, if someone is confused by a system, the person should simply be swapped for someone more qualified. Training soldiers is not cheap, though: publicly available information from the UK’s Ministry of Defence estimates that basic training for an infantry recruit costs approximately £34,000* (United Kingdom Army Secretariat, 2015). Although exact figures were not obtained for the United States, training costs there are likely to be similar. Time spent on training should be focused on improving the performance of the human-system team, not on humans trying to overcome poor design. Fixing the design problem eliminates the need for training to understand a system and reallocates that time to training to perform well with the system.
Training is also unable to alleviate certain problems, such as inefficiency. Training someone to work with an inefficient design does not make the design more efficient. For example, UPS tracking numbers are 18 characters long; because humans have difficulty accurately transcribing sequences of this length (Johnson, 1991), interactions with these numbers are performed almost exclusively with bar code readers. Nor will additional training improve user satisfaction. In fact, the opposite may be true: users may actively avoid the system altogether and seek alternative ways to meet their objectives.
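One reason machine reading is preferred for long identifiers is that numeric codes commonly embed a check digit that detects the most frequent human transcription errors, such as mistyping a single digit. As an illustrative sketch only, the snippet below validates a digit string with the standard Luhn mod-10 algorithm (the scheme used in credit-card numbers); it is not a claim about UPS’s actual tracking-number format.

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn mod-10 check."""
    digits = [int(c) for c in number]
    # Double every second digit from the right; if doubling yields a
    # two-digit value, subtract 9 (equivalent to summing its digits).
    for i in range(len(digits) - 2, -1, -2):
        doubled = digits[i] * 2
        digits[i] = doubled - 9 if doubled > 9 else doubled
    return sum(digits) % 10 == 0

# A valid number passes; corrupting any single digit makes it fail,
# so a scanner (or software) can reject a mis-keyed entry immediately.
print(luhn_valid("79927398713"))  # valid example number
print(luhn_valid("79927398714"))  # last digit mis-transcribed
```

A barcode reader sidesteps the transcription problem entirely, but a check like this is the safety net when a human must key the number in by hand.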
There are clear benefits to ensuring that system development includes a focus on the user, including maximizing the performance achieved by the human-machine team, improving safety, and reducing inefficiencies. This understanding is gained through an iterative process (Nielsen, 1993). Both the living lab framework and IDEAS are built around the fundamental idea that good system design requires exploration through a scientific process of theory and empirical discovery. This scientific process hinges on the systematic quantification of humans (Sauro & Lewis, 2012), as exemplified by careful measurement of behavior, cognition, and relevant environmental factors as users work with design artifacts, prototypes, and fully functional systems. Understandings of the technology, the human, and how they interact should evolve together. When they do not, the final product may fall short of the need it was meant to meet.

*$51,411 at the time of estimation.