PUBLICATIONS

Conducting User-Centered Evaluations Across an Emerging Ecosystem of Prototype Bioinformatics Tools

Nicolette McGeorge¹, Nicholas Alico¹, Jacqueline Brandon², M. Reza Jalaeian², Stephane Kane¹, Ryan Kilgore¹, Susan Latiff¹, Matthew Miller¹, Akhil Modali², Jennifer Winner², Michael Rayo²

Poster presented at the 2025 International Symposium on Human Factors and Ergonomics in Health Care, Toronto, Canada (30 March – 2 April 2025)

Submission Summary
This presentation discusses methods and protocols being developed to guide the creation of an emerging ecosystem of disparate, highly automated biomedical data aggregation and analytics tools built by over a dozen independent teams. These include established user-centered evaluations as well as new methods that specifically assess how advanced automation tools work jointly with their human counterparts. Developing next-generation treatments and cures requires new forms of information services and analytics that automatically integrate biomedical data within a unified, semantically meaningful framework, with interfaces for intuitive access and exploration by diverse users. Such enabling bioinformatics tools require broad uptake, access, and use to provide any value. To ensure these tools are useful within established work contexts and enthusiastically adopted by a broad stakeholder community (including researcher, clinician, and patient user groups with varying levels of biomedical literacy), user needs and constraints must be identified and considered throughout development. Tools must also be assessed to determine whether they successfully address these needs and constraints, then refined accordingly when deficiencies or missed opportunities are identified.

We describe our effort to address these challenges through a multidisciplinary approach to health IT development, drawing on techniques from human factors, cognitive systems engineering, user-centered design, and sound software development practice. In our role, we seek to champion the user by delivering user evaluation insights to tool developers (for both component tool design and cross-tool functionality) and by informing the evolution of piece-part tools into a cohesive whole that addresses actual user needs and achieves critical user engagement. From this perspective, we discuss the complexities and challenges of selecting, tailoring, and applying an overarching user-centered evaluation process in the context of a real-world, real-time technology development effort for an emerging collection of biomedical data fabric tools at varying stages of fidelity and functional completeness. To accomplish this assessment and broadly foster user-centered development of interoperable tools, our approach applies demonstrated human factors and cognitive systems engineering (CSE; Woods & Roth, 1988) methods that holistically support technology requirements definition, design, development, and assessment activities. We bring new, force-multiplying resources to the broader community of tool developers by establishing a common, community-wide user-centric perspective that helps individual teams build the right tools and deliver them in useful, impactful ways, a perspective that complements, but is often lacking in, technology-focused development.

¹ Charles River Analytics, Cambridge, MA, USA
² Cognitive Systems Engineering Lab (CSEL), The Ohio State University, Columbus, OH, USA

For More Information

To learn more, contact Nicolette McGeorge.
