Research

The MedStar Health National Center for Human Factors in Healthcare research team focuses on leveraging theory to conduct applied research that addresses critical safety problems in health care. We are the nation's innovative leader in human factors and healthcare research, given our unparalleled access to a 10-hospital healthcare system, a diverse range of clinical expertise, and deep expertise in human factors, applied cognition, and safety engineering. In addition to reviewing the key information below about our research work, visit our Publications and Videos pages for more resources.

Methods 

We have expertise in rigorously applying a variety of methods, including: 

General methods are used to answer specific questions about human behavior in complex environments. 

  • Focus groups and interviews
  • Time-motion studies
  • Ethnography/Observations
  • Eye tracking
  • Physiological sensors
  • Data mining

Modeling focuses on developing quantitative techniques to understand human behavior and human interactions with the environment. 

  • Machine learning & statistical models
  • Natural language processing 
  • Mental models
  • Predictive analytics

Usability methods focus on identifying user needs, designing and developing for those needs, and assessing designs to ensure the needs are met. 

  • Heuristic analysis
  • Task analysis
  • User-centered design
  • Usability testing 

Safety and risk assessment methods focus on identifying potential points of failure.

  • Failure mode and effects analysis
  • Human error identification
  • Incident investigation
  • Space syntax

Key Research Grants and Contracts

Following is a summary of some of our largest and most recent research efforts. In particular, our MedStar Health National Center for Human Factors in Healthcare team is proud to have four active multi-year Research Project Grant (R01) awards from the U.S. Department of Health and Human Services, which are considered among the most prestigious grant mechanisms. Our top leaders are the principal investigators (PIs) on three Agency for Healthcare Research and Quality-funded R01s that focus on improving the safety of electronic medication administration records, usability and safety guidelines for health information technology, and technology to support emergency department communication, as well as a National Institutes of Health (National Library of Medicine)-funded R01 on developing effective clinical decision support.

For more information about our research work or to explore possible collaboration opportunities, please contact us at [email protected].

Cognitive Engineering for Complex Decision Making & Problem Solving in Acute Care (R01)

Agency for Healthcare Research and Quality
MedStar Human Factors Center PI: Zach Hettinger, MD

Purpose: This research applies human factors engineering—and, specifically, cognitive systems engineering methods—to examine the nature of cognition, tasks, and work in the context of the emergency department. Our results will help inform the design, development, and evaluation of future information technology (IT) innovations, thus providing a methodological example and “proof-of-concept” for translating cognitive engineering analysis into design.

Approach/Methods: Interviews, observations, screen-capture recordings, and retrospective data reviews are being conducted to perform a cognitive engineering analysis of emergency medicine and to iteratively develop design guidance and prototypes that address the key challenges identified. These data will be analyzed primarily using work domain analysis and abstraction hierarchy frameworks. The developed prototypes will be evaluated with user-centered evaluation methods in realistic simulated tasks set in a clinical simulation center. Multiple usability tests will be performed on each prototype as it is iteratively optimized.
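As a rough illustration of the kind of structure a work domain analysis produces, the sketch below encodes a toy abstraction hierarchy in Python, ordered from the most abstract level (functional purpose) to the most concrete (physical form). All node names here are hypothetical examples, not findings from this study.

```python
# Toy abstraction-hierarchy fragment for an emergency department.
# Level and node names are illustrative assumptions only.
abstraction_hierarchy = {
    "functional_purpose": ["Deliver safe, timely emergency care"],
    "abstract_function": ["Patient flow", "Information flow"],
    "generalized_function": ["Triage", "Diagnosis", "Disposition"],
    "physical_function": ["EHR order entry", "Tracking board", "Paging"],
    "physical_form": ["Workstations", "Monitors", "Whiteboard"],
}

def levels(hierarchy):
    """Return the hierarchy's levels from most abstract to most concrete
    (dicts preserve insertion order in Python 3.7+)."""
    return list(hierarchy)
```

In practice each node at one level would be linked by means-ends relations to nodes at the levels above and below it; a flat dict like this only captures the layering.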

Signaling Sepsis: Developing a Framework to Optimize Alert Design (R01)

National Library of Medicine
MedStar Human Factors Center PI: Kristen Miller, DrPH

Purpose: The current structure of electronic health record (EHR) systems gives clinicians access to raw patient data without interpreting its significance or applying validated illness-severity scoring systems. Focusing on sepsis detection and treatment, this research develops and tests multiple enhanced visual display models that integrate patient data into validated sepsis staging scores, with the primary objective of informing the development of future real-time clinical decision support (CDS) tools.

Approach/Methods: Simulated EHR environments were built using clinical scenarios based on real patient data. CDS alerts were developed for both the Predisposition, Infection, Response, Organ Dysfunction (PIRO) and quick Sepsis-Related Organ Failure Assessment (qSOFA) scores for use in usability evaluations. One-on-one usability sessions are conducted to evaluate participants’ thought processes, expectations, observations, challenges, and successes while using the simulated EHR and CDS alert designs. Usability is measured relative to users’ performance on a given set of tasks targeting: (1) performance-related measures including success rate (correctness of alert interpretation, appropriate response); and (2) preference-related measures including participants’ subjective satisfaction and cognitive load.
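As a sketch of how the two measure families described above might be tallied, the following Python fragment computes a task success rate (a performance-related measure) and a mean satisfaction score (a preference-related measure) over hypothetical session records. The field names and values are illustrative assumptions, not the study's actual data schema.

```python
# Hypothetical usability-session records; schema is illustrative only.
sessions = [
    {"alert_interpreted_correctly": True,  "responded_appropriately": True,  "satisfaction": 6},
    {"alert_interpreted_correctly": True,  "responded_appropriately": False, "satisfaction": 4},
    {"alert_interpreted_correctly": False, "responded_appropriately": False, "satisfaction": 2},
]

def success_rate(records):
    """Share of sessions where the alert was both interpreted correctly
    and acted on appropriately (performance-related measure)."""
    ok = sum(1 for r in records
             if r["alert_interpreted_correctly"] and r["responded_appropriately"])
    return ok / len(records)

def mean_satisfaction(records):
    """Average subjective satisfaction rating (preference-related measure)."""
    return sum(r["satisfaction"] for r in records) / len(records)
```

A real evaluation would also capture cognitive-load instruments (e.g., standardized questionnaires) rather than a single satisfaction number.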

Read more in the related press release.

Developing Evidence-Based User-Centered Design and Implementation Guidelines to Improve Health Information Technology Usability (R01)

Agency for Healthcare Research and Quality
MedStar Human Factors Center PI: Raj Ratwani, PhD

Purpose: To improve the usability and safety of health information technology (HIT), we will analyze patient safety event reports to determine how HIT contributes to safety events. From the HIT-related events, we will determine how to improve current design methods, policies, and testing procedures. This knowledge will inform vendor and provider practice, as well as policymaker decisions.

Approach/Methods: Natural language processing will be used to analyze millions of patient safety event reports. Expert consensus will be used to create guidelines for vendors, providers, and policymakers.
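As a hedged illustration only: analyzing millions of reports would require trained statistical NLP models, but the minimal Python sketch below shows the simplest form such text triage can take, a keyword lexicon that flags free-text reports as potentially HIT-related. The terms and threshold here are hypothetical assumptions, not the project's actual method.

```python
import re

# Hypothetical HIT-related lexicon; a real analysis would learn features
# from labeled reports rather than hand-pick terms.
HIT_TERMS = {"ehr", "emar", "cpoe", "alert", "dropdown",
             "screen", "interface", "autocomplete"}

def tokenize(text):
    """Lowercase a free-text report and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def hit_related_score(report):
    """Fraction of tokens matching the HIT lexicon (a crude proxy)."""
    tokens = tokenize(report)
    if not tokens:
        return 0.0
    return sum(1 for t in tokens if t in HIT_TERMS) / len(tokens)

def classify(report, threshold=0.05):
    """Flag a report as potentially HIT-related above a score threshold."""
    return hit_related_score(report) > threshold
```

Flagged reports would then go to human reviewers, with expert consensus (as the paragraph above notes) driving the resulting guidelines.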

Improving Patient Safety and Clinician Cognitive Support Through EMAR Redesign (R01)

Agency for Healthcare Research and Quality
MedStar Human Factors Center PI: Raj Ratwani, PhD

Purpose: The objective of the research is to reduce the patient safety hazards associated with electronic medication administration records (eMARs) by: (1) understanding current usability and safety gaps; and (2) creating design and development documents, wireframes, and prototypes to serve as the foundation for future eMARs that will eliminate these gaps. In particular, we focus on communication and information flow challenges between nurses, pharmacists, and physicians during medication administration and use of the eMAR.

Approach/Methods: Machine learning will be used to identify safety hazards from patient safety event reports. In addition, observations will be conducted to capture frontline clinician experiences.

Context is Critical: Understanding When and Why EHR-Related Safety Hazards Happen (R21)

Agency for Healthcare Research and Quality
MedStar Human Factors Center PI: Zach Hettinger, MD

Purpose: This study outlines a novel approach to retrospectively identify and analyze electronic health record (EHR)-facilitated errors based on the concept of a flight recorder or black box. The purpose of this study is to understand the context of errors by combining EHR safety event data queries, systemwide video capture of EHR interactions, and human factors heuristics analysis. The significance of this approach lies in the wide dissemination of this novel framework to detect, capture, review, and document health information technology errors.

Approach/Methods: Likely EHR-facilitated error patterns were selected by subject matter experts in medicine, usability, human factors, and safety science. The research team then actively searched for discrete occurrences of these common safety error patterns within the EHR. Results were filtered down to instances of potential errors, and videos of these errors occurring were subsequently abstracted and reviewed. This process will be used to determine the specific design elements of the EHR system that facilitate these errors, and to create data-driven design guidelines based on the most common error-causing elements.
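One way to picture the active search for discrete error occurrences: the Python sketch below scans a simplified EHR audit log for one illustrative error signature, an order cancelled shortly after it was placed, which can indicate a wrong-patient or wrong-item selection. The event schema, pattern, and time window are assumptions for illustration; the actual study also incorporates video capture, which this sketch does not model.

```python
from datetime import datetime, timedelta

# Hypothetical EHR audit-log events; the schema is illustrative only.
events = [
    {"user": "a", "action": "order_placed",    "time": datetime(2023, 1, 1, 9, 0, 0)},
    {"user": "a", "action": "order_cancelled", "time": datetime(2023, 1, 1, 9, 0, 20)},
    {"user": "b", "action": "order_placed",    "time": datetime(2023, 1, 1, 9, 5, 0)},
]

def rapid_retractions(log, window=timedelta(seconds=60)):
    """Flag order-then-cancel sequences by the same user within a short
    window -- one plausible signature of a selection error."""
    flagged = []
    placed = {}  # most recent order_placed event per user
    for e in sorted(log, key=lambda ev: ev["time"]):
        if e["action"] == "order_placed":
            placed[e["user"]] = e
        elif e["action"] == "order_cancelled":
            prior = placed.get(e["user"])
            if prior and e["time"] - prior["time"] <= window:
                flagged.append((prior, e))
    return flagged
```

Each flagged pair would then be matched against the corresponding screen-capture video for human review, as described above.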