Progress towards Automated Human Factors Evaluation

dc.contributor.author: Cao, Shi
dc.date.accessioned: 2017-02-08T21:12:57Z
dc.date.available: 2017-02-08T21:12:57Z
dc.date.issued: 2015-10-23
dc.description: Cao, S. (2015). Progress towards Automated Human Factors Evaluation. 6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015) and the Affiliated Conferences, AHFE 2015, 3, 4266–4272. https://doi.org/10.1016/j.promfg.2015.07.414 This work is made available through a CC-BY-NC-ND 4.0 license. The licensor is not represented as endorsing the use made of this work. https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.description.abstract: Human factors tests are important components of systems design. Designers need to evaluate users' performance and workload while using a system and compare different design options to determine the optimal design choice. Currently, human factors evaluation and tests rely mainly on empirical user studies, which add a heavy cost to the design process. In addition, it is difficult to conduct comprehensive user tests at early design stages, when no physical interfaces have been implemented. To address these issues, I develop computational human performance modeling techniques that can simulate users' interaction with machine systems. This method uses a general cognitive architecture to computationally represent human cognitive capabilities and constraints. Task-specific models can be built by specifying user knowledge, user strategies, and user group differences. The simulation results include performance measures, such as task completion time and error rate, as well as workload measures. Completed studies have modeled multitasking scenarios in a wide range of domains, including transportation, healthcare, and human-computer interaction. The success of these studies demonstrates the modeling capabilities of this method. Cognitive-architecture-based models are useful, but building a cognitive model can itself be difficult to learn and master. It usually requires at least intermediate programming skills to understand and use the language and syntax that specify the task. For example, to build a model that simulates a driving task, a modeler needs to build a driving simulation environment so that the model can interact with the simulated vehicle. To simplify this process, I have conducted preliminary programming work that directly connects the mental model to existing task environment simulation programs. The model will be able to obtain perceptual information directly from the task program and send control commands back to it. With cognitive-model-based tools, designers will be able to watch the model perform the tasks in real time and obtain a report of the evaluation. Automated human factors evaluation methods have tremendous value for supporting systems design and evaluation.
dc.identifier.uri: http://dx.doi.org/10.1016/j.promfg.2015.07.414
dc.identifier.uri: http://hdl.handle.net/10012/11299
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartofseries: Procedia Manufacturing; 3
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Systems design
dc.subject: Usability tests
dc.subject: Cognitive architecture
dc.subject: Human performance modeling
dc.subject: Mental workload
dc.subject: QN-ACTR
dc.title: Progress towards Automated Human Factors Evaluation
dc.type: Conference Paper
dcterms.bibliographicCitation: Cao, S. (2015). Progress towards Automated Human Factors Evaluation. 6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015) and the Affiliated Conferences, AHFE 2015, 3, 4266–4272. https://doi.org/10.1016/j.promfg.2015.07.414
uws.contributor.affiliation1: Faculty of Engineering
uws.contributor.affiliation2: Systems Design Engineering
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Faculty
uws.typeOfResource: Text

Files

Original bundle (1 file)
Name: Cao-Shi_Progress-Towards-Automated-Human-Factors-Evaluation.pdf
Size: 120.74 KB
Format: Adobe Portable Document Format
Description: Publisher's version

License bundle (1 file)
Name: license.txt
Size: 4.46 KB
Format: Item-specific license agreed upon to submission