The healthcare system:

Building safer systems to enhance clinical care delivery

Human factors

Published: May 2021
Reading time: 20 minutes

Introduction

What is meant by human factors?

Human Factors Science (HFS) studies the human characteristics, capabilities, and limitations that influence how people interact with their environments. The goal of HFS is to support the cognitive, physical, and technological work of healthcare providers, enabling safe patient care.1 In healthcare, considering human factors in the genesis of patient safety incidents (accidents in Québec) helps us prevent future incidents by guiding the design of better systems.

Human Factors Science

Human Factors Science includes the following elements:

  • people
  • technology
  • environment

The following diagram illustrates the influence of humans, technology, the work environment, and workplace culture on human performance. HFS experts design work systems to optimize individual and team performance while minimizing safety risks. Consideration of each component can help us engineer systems that minimize the likelihood of error and enhance safety.

The diagram shows the hospital/clinic context (culture, quality improvement and safety practices, policies/procedures) surrounding three interacting components, all in support of one goal: supporting the cognitive, physical, and technological work of healthcare providers.

  • Human: capabilities/limitations, cognitive biases, situational awareness, teamwork, emotions
  • Technology: computers, devices, networks, technology design, tasks
  • Environment: noise, interruptions, distractions, lighting

Adapted from the Canadian Patient Safety Institute (CPSI)2

Good practice guidance

Culture

  • Culture refers to the way we work together every day. Together, an organization’s values, leadership, traditions, behaviours, interpersonal interactions and beliefs/attitudes contribute to the emotional and relational environment of the workplace.
  • An organization with a just culture promotes a philosophy and everyday principles about how we engage as teams, hold each other accountable, and identify and fix problems that could otherwise lead to harm.3

Quality Improvement Science

  • Quality Improvement Science (QIS) seeks to continuously identify changes that can improve the quality of care in a given setting. It is not just about fixing issues after a patient safety incident, but about proactively seeking to identify and correct potential safety issues before they occur.
  • Most QIS practitioners use a range of improvement methods and tools to both measure and improve care, such as:
    • Plan-Do-Study-Act (PDSA) cycles (a minimal sketch of one cycle follows this list)
    • Lean methodology
    • Six Sigma
  • Human Factors Science (HFS) in healthcare explores a problem by considering the providers within a system. It then redesigns the system, tasks, interfaces, and environment to optimize safety and efficiency.
  • Combining QIS and HFS expertise can enhance practical approaches to improving healthcare. Both account for the complex roots of harm, with the shared goal of improving outcomes for patients and providers.
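
As a rough illustration of the PDSA structure mentioned above, here is a minimal sketch, in Python, of how one improvement cycle might be recorded. All field names and the example content are hypothetical, not drawn from any specific QIS tool.

```python
# A minimal sketch of recording a Plan-Do-Study-Act (PDSA) cycle as data.
# All field names and the example content are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    plan: str                 # the change to test and its predicted effect
    do: str = ""              # what was actually done, on a small scale
    study: str = ""           # what the measurements showed vs the prediction
    act: str = ""             # adopt, adapt, or abandon the change
    measures: dict[str, float] = field(default_factory=dict)

cycle = PDSACycle(plan="Move hand sanitizer to room entrances; predict higher compliance")
cycle.do = "Relocated dispensers on one unit for two weeks"
cycle.measures = {"baseline_compliance_pct": 62.0, "post_change_compliance_pct": 78.0}
cycle.study = "Compliance rose 16 percentage points, consistent with the prediction"
cycle.act = "Adopt on the pilot unit; test spread to a second unit in the next cycle"
print(cycle.act)
```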

Policies and procedures in support of safe care

  • Policies and procedures facilitate and standardize clinical care. They define the expectations of how providers will deliver care given specific clinical situations.
  • Generally, healthcare providers are expected to follow local policies; however, providers must also be flexible in managing unanticipated events.
  • When deviations from policies and procedures are identified, exploring the underlying reasons for the deviations may be quite instructive. Reviews may reveal a need to adjust policies and procedures to better reflect the realities of the clinical environment and the ability of providers to safely deliver effective and efficient care. Alternatively, reviews of policy deviations may provide important insights about a workplace culture by identifying behaviours, attitudes, or practices that endanger patient safety.

Environment

The human factors study of the environment provides helpful insights to optimize patient safety. Increased workload, interruptions, time pressures, and distractions can all affect performance and patient outcomes.

Typical examples include:

  • poor lighting that causes providers to misread labels
  • a culture of workplace bullying that inhibits speaking up and promotes workarounds
  • a focus on overtime costs that pressures providers to finish OR slates on time, which may lead to rushing critical steps of a procedure
  • lack of hand sanitizer dispensers, poorly located sinks, or lack of paper towels that contribute to poor adherence to hand hygiene rules

Human

A healthcare provider's mental, emotional, and physical state can interfere with their ability to think, reason, make decisions, analyze problems, plan, communicate, and work in a team.

Generally, according to dual process theory, two systems are involved in clinical decision-making.4

System 1 thinking is intuitive, fast, and almost subconscious. It makes a direct association between new information and a similar example in our memory. It is based on pattern recognition.

System 2 thinking is slower, analytical, and effortful.

Health providers use both systems to arrive at diagnoses, toggling between them. However, novices use the slower system 2 approach more often as they have not yet learned all the patterns required for the rapid system 1 thinking. On the other hand, experienced physicians use rapid system 1 thinking more often, based on their mental library of cases. For example, when an experienced physician sees a “typical” skin rash, identifies a particular combination of symptoms and signs, or recognizes a clinical syndrome, they may intuitively arrive at a diagnosis through the unconscious use of pattern matching to clinical templates previously learned through experience. If no match occurs or if the presentation is ambiguous, the physician may revert to analytical reasoning, which requires a more deliberate methodical approach to make the diagnosis.

Diagnosis by pattern recognition is quick, often effective, and usually correct, but is prone to interference by cognitive and affective biases that may mislead even the most experienced physicians. Errors in diagnosis may nevertheless arise from the use of either system.

Cognitive and affective biases

Cognitive biases (distortions of thinking) and affective biases (intrusion of the physician's prejudices and emotions) may interfere with reasoning and decision-making, and sometimes result in inaccurate judgments and the inability to reach a correct diagnosis.5

Some common cognitive biases include the following:

Anchoring — focusing on one particular symptom, sign, or piece of information, or focusing on a particular diagnosis early in the diagnostic process, and failing to make any adjustments for other possibilities.

How to alleviate:

  • Gather sufficient information.
  • Develop a differential diagnosis.
  • Reconsider the diagnosis if:
    • There are new symptoms or signs.
    • The patient is not following the natural course of the assumed illness and is not improving.

Premature closure — uncritically accepting an initial diagnosis and failing to search for information to challenge the provisional diagnosis or to consider other diagnoses.

How to alleviate:

  • Gather sufficient information.
  • Develop a differential diagnosis.
  • Identify any red flag symptoms and investigate appropriately.
  • Consider the worst case scenario—what you don’t want to miss.

Bandwagon effect (diagnostic momentum) — diagnostic labels may stick to a patient. If everyone else thinks it, it must be right!

How to alleviate:

  • Consciously decide to arrive at your diagnosis or differential diagnosis independently of the labels applied by others.
  • Take a diagnostic “time out” to reconsider the diagnosis.

Attribution bias — a form of stereotyping: explaining a patient’s condition on the basis of their disposition or character rather than seeking a valid medical explanation.

How to alleviate:

  • Avoid the rush to stereotype a patient based on their culture, gender, illness or disability, religion, or sexual orientation.
  • Acknowledge that you may not have the best rapport with a specific patient and take particular care not to have this impact your decision-making and judgment.

Availability bias — recent or vivid patient diagnoses are more easily brought to mind (i.e. are more available) and overemphasized in assessing the probability of a current diagnosis.

How to alleviate:

  • Be aware of the influence of recent diagnoses on your diagnostic acumen.
  • Watch for red flags, or symptoms or signs inconsistent with a common, less serious diagnosis.
  • Avoid the urge to over-investigate or over-treat based on an unexpected recent diagnosis in another patient.

Identifying and addressing biases aids in developing successful doctor-patient relationships, supports effective care delivery, and may help decrease the likelihood of diagnostic error. Physicians might consider consulting with colleagues or other specialists for patients who present diagnostic dilemmas, or where physicians feel that their own biases may be getting in the way of an accurate management plan.

There is ongoing debate in the literature as to whether it is possible to de-bias one’s thinking, but encouraging data is emerging suggesting that “formally slowing down,” and using metacognition and checklists may be effective in decreasing diagnostic error.6,7

Metacognition is the awareness of one’s own thought processes.8 Cognitive forcing is a form of metacognition that involves deliberately considering other possibilities, helping to avoid diagnostic delays, by asking questions such as:

  • What else could this be?
  • Does something not fit?
  • Could there be more than one process at play here?

Computerized decision support systems may reduce cognitive load by providing decision aids and suggesting a differential diagnosis, and may be helpful as metacognitive tools. As such, these systems may help clinicians who arrived at a diagnosis via system 1 thinking to consider a differential diagnosis or to broaden it by leveraging system 2 thinking.
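
To make this concrete, here is a minimal, illustrative Python sketch of how a rule-based decision-support aid might broaden a differential diagnosis. The condition-finding mappings and function names are hypothetical, invented for illustration, and are not clinical guidance.

```python
# A minimal, illustrative sketch of a rule-based decision-support aid that
# broadens a differential diagnosis. The condition->findings map below is
# hypothetical and deliberately simplified; it is NOT clinical guidance.

KNOWLEDGE_BASE = {
    "condition_a": {"fever", "rash", "headache"},
    "condition_b": {"fever", "cough"},
    "condition_c": {"rash", "joint_pain"},
}

def suggest_differential(findings: set[str], working_diagnosis: str) -> list[str]:
    """Rank alternatives by shared findings, prompting 'what else could this be?'"""
    scored = []
    for condition, typical_findings in KNOWLEDGE_BASE.items():
        if condition == working_diagnosis:
            continue  # we want alternatives to the current working diagnosis
        overlap = len(findings & typical_findings)
        if overlap:
            scored.append((overlap, condition))
    # Highest overlap first: the strongest prompts to reconsider
    return [condition for _, condition in sorted(scored, reverse=True)]

# A clinician anchored on "condition_b" is shown alternatives worth considering
print(suggest_differential({"fever", "rash"}, "condition_b"))  # ['condition_a', 'condition_c']
```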


Personal performance modifying factors

Personal performance modifying factors are particular to individual providers and may change from time to time. They include physical, emotional, and external factors such as:

  • lack of sleep, hunger
  • anxiety, anger, sadness, depression
  • alcohol, drugs, prescription medications
  • physical health conditions, pain
  • interpersonal issues (conflict with family members or other significant individuals)

System performance modifying factors

System performance modifying factors affect work performance through their impact on the work environment. Such factors may affect the performance of entire groups of individuals and may vary from one environment to another (i.e. unit to unit). They include issues such as:

  • competing values (e.g. timeliness versus safety)
  • availability (or lack) of appropriate equipment and supplies
  • effectiveness of orientation or training
  • team culture


Situational awareness

Situational awareness is considered one of the most important non-technical skills of a physician. It refers to a person's ability to perceive and understand dynamic information that is present in their environment and to project the implications of that information into the future so that they can anticipate what needs to be done.

Maintaining situational awareness at all times is challenging. Leveraging other team members’ insight, situational awareness, and knowledge can be helpful in maintaining team situational awareness. Speaking up and cross-monitoring are two useful practices that can be used in addition to team rituals like huddles, briefings, and debriefings to foster situational awareness.

Collective team competence

Teamwork is a joint cognitive endeavour. While each member of a team contributes to the overall care of the patient, overreliance on individual competence instead of collective competence can lead to missed opportunities to promote the reliability and safety of care.9 Achieving collective competence, however, may be a complex challenge.

  • A team may underperform even when each team member is individually competent, if the team does not function well as a whole.
  • A team can be highly competent even if one member is underperforming, if there are system redundancies designed to mitigate that effect.
  • A team can be competent in one situation and not in another, depending on the culture and environment.

Some Human Factors Science strategies may improve collective team competence, for example:

  • improving health information technology such that distributed members of the team have equal access to patient health information (e.g. electronic health records)
  • encouraging the use of checklists (e.g. surgical safety checklist) to engage all members of the team
  • creating policies and procedures that build in redundancies such as double checking medication administration
  • using simulation to focus teaching of high performing, team-based behaviours
  • involving patients as active participants in their team and encouraging speaking up

Technology

Medical equipment is integral to the delivery of quality patient care. While technology is an enabler and holds great promise for safe care, issues like design flaws may hamper its use or limit its ability to achieve intended goals. Involving healthcare providers in designing technology and evaluating it before it is purchased can maximize its value.

Equipment-related issues include:

  • equipment malfunctions and failures
  • design faults
  • improper or inadequate maintenance
  • complex interfaces that are difficult to use in an emergency or if an unanticipated event occurs
  • electronic medical record complexities that complicate documentation or access to required information

In addition to the above issues, which are inherent to the technology itself, the way in which technology is used by healthcare workers can also play a role in increasing the risk of harm. For example:

  • wrong application, improper use, or unapproved use of equipment
  • training and supervision deficiencies when introducing new equipment
  • turning off alerts designed to flag the possibility of harm


Human factors and ergonomics (HFE) engineering is used to design and evaluate safer and more effective tools, machines, systems, tasks, jobs, and environments, maximizing human capabilities and compensating for human limitations. Although it is unlikely that we can completely eliminate the factors that contribute to patient harm, HFE engineering principles can reduce their likelihood and impact by introducing mitigating measures.

Addressing equipment, technology, and environment related risks

Responsibility and accountability for equipment setup, care, and maintenance are often shared and depend on the care location, whether a hospital, clinic, or private office.

Organizational leadership should consider:

  • Is there a process to verify that required equipment is available, appropriately sterilized, functioning, and that the settings are appropriate for the specific clinical care?
  • Are users included in the choice of new equipment/technology to facilitate their work?
  • Are the manufacturer's recommendations for equipment maintenance, cleaning, calibration, and replacement followed?
  • Do clear policies and procedures exist for handling equipment concerns and recalls?
  • If new equipment/technology is introduced, do healthcare professionals receive appropriate training before using it?

Healthcare providers should consider:

  • Are you familiar with the equipment/technology you are using, and is it appropriate for the care you are providing?
  • Do you inspect equipment for completeness at the end of the procedure, particularly if the instrument breaks, is disassembled during the procedure, or parts have the potential to detach?
  • When supervising or delegating a procedure involving equipment or technology to a trainee or another healthcare professional, do you consider if the individual has the required knowledge, skills, qualifications, and experience to operate the necessary equipment, and what level of supervision is required, if any?
  • Do you appropriately report equipment/technology failures or concerns?
  • Do you take the opportunity to contribute to the review of near-misses and patient safety incidents involving equipment?


Fostering safe care by leveraging human factors

Not all measures have the same potential to foster safe care, with system-based safeguards generally being more effective than human-focused safeguards.

Hierarchy of intervention effectiveness



The diagram depicts seven interventions that can decrease patient safety incidents (accidents in Québec) as a staircase rising from left to right. System-focused interventions sit at the top of the staircase, on the right, as more effective; person-focused interventions sit at the bottom, on the left, as less effective. From least to most effective, the interventions are: education, policies/rules, redundancies, checklists, simplification/standardization, constraints, and forcing functions.

Adapted from Cafazzo et al., Healthcare Quarterly, April 2012.10

Multiple strategies are needed to achieve a decrease in patient safety incidents.

Forcing functions: make it impossible to do the wrong thing.

For example, the connectors to the oxygen and nitrous oxide gas sources are designed to make it impossible to physically connect them to the wrong source.

Another example would be to design a connection for syringes of vincristine that would make it impossible for the drug to be given intrathecally.
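
By analogy, here is a minimal Python sketch of a forcing function in software: the design makes the wrong combination unrepresentable, much like the incompatible physical connectors described above. The class and function names are hypothetical illustrations.

```python
# A minimal sketch of a software forcing function: the wrong combination is
# blocked by design. Class and function names are hypothetical illustrations.

class IntravenousSyringe:
    """Syringe type for drugs that must ONLY be given intravenously (e.g. vincristine)."""
    def __init__(self, drug: str):
        self.drug = drug

class IntrathecalSyringe:
    """The only syringe type that 'fits' the intrathecal connector."""
    def __init__(self, drug: str):
        self.drug = drug

def administer_intrathecally(syringe: IntrathecalSyringe) -> None:
    # Like a mismatched gas connector, the wrong type cannot "fit": a static
    # type checker flags the call, and the runtime check refuses it outright.
    if not isinstance(syringe, IntrathecalSyringe):
        raise TypeError("Connector mismatch: this syringe cannot be given intrathecally.")
    print(f"Administering {syringe.drug} intrathecally.")

administer_intrathecally(IntravenousSyringe("vincristine"))  # raises TypeError by design
```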

Constraints: make the right choices the easy choices and make it hard to do the wrong thing.

Constraints do not make it impossible to do the wrong thing like a forcing function, but they do make it difficult. They allow for many opportunities for checks, and decrease the likelihood of behavioural drift and the development of workarounds.

For example, store look-alike drugs in different areas or develop distinctly different packaging.
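
As an illustration, here is a minimal Python sketch of a constraint: the risky action stays possible, but an extra deliberate step makes the wrong choice harder. The look-alike drug pair and prompt wording are hypothetical.

```python
# A minimal sketch of a software constraint: the risky action stays possible,
# but an extra deliberate step makes the wrong choice harder. The look-alike
# drug pair and prompt wording are hypothetical illustrations.

LOOK_ALIKE = {
    "hydroxyzine": "hydralazine",
    "hydralazine": "hydroxyzine",
}

def dispense(drug: str) -> bool:
    if drug in LOOK_ALIKE:
        # The constraint: re-typing the full name slows the user down and
        # creates an opportunity to catch a selection error.
        confirmation = input(
            f"'{drug}' is easily confused with '{LOOK_ALIKE[drug]}'. "
            "Re-type the full drug name to confirm: "
        )
        if confirmation.strip().lower() != drug:
            print("Dispensing cancelled.")
            return False
    print(f"Dispensing {drug}.")
    return True

dispense("hydroxyzine")  # prompts for confirmation before dispensing
```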

Simplification: reduce the number of steps in processes or procedures.

Each step in a process of care may fail on occasion. Processes with a greater number of steps are generally more prone to failure than those with fewer steps. When a process is complex, inefficient, or both, health providers are more likely to develop workarounds to make it simpler. This may sometimes put patients at risk. Addressing such concerns through equipment redesign can minimize or eliminate undesirable workarounds.11
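
A small worked example shows why step count matters, under the simplifying assumption that steps fail independently: if each step succeeds with probability 0.99, a 5-step process succeeds about 95% of the time, while a 50-step process succeeds only about 61% of the time.

```python
# Illustration: assuming steps fail independently and each succeeds with
# probability p, overall success is p**n, so reliability drops as steps grow.
p = 0.99
for n in (5, 20, 50):
    print(f"{n:>2} steps: {p**n:.1%} overall success")
# ->  5 steps: 95.1%;  20 steps: 81.8%;  50 steps: 60.5%
```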

Standardization: promote consistency and eliminate confusion.

At the population level, standardization promotes patient safety by increasing reliability and decreasing unnecessary variation (based on preferences). Making allowances for necessary variation (indicated by patient need, for example allergy to an antibiotic) on a patient-by-patient basis (i.e. allowing the substitution of one antibiotic for another) can help achieve a balance between special patient needs and standardized care.

The use of standard order sets or standard operating room setups are two examples of standardization.
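
Here is a minimal Python sketch of how a standard order set with controlled substitution might look. The order-set contents and the approved-substitution table are hypothetical, invented for illustration.

```python
# A minimal sketch of a standard order set with controlled substitution:
# unnecessary variation is removed, while necessary variation (e.g. an
# allergy) is handled by an explicit, approved substitution. The order-set
# contents and substitution table are hypothetical illustrations.

STANDARD_ORDER_SET = ["antibiotic_a", "iv_fluids", "analgesic_b"]
APPROVED_SUBSTITUTIONS = {"antibiotic_a": "antibiotic_c"}  # e.g. for an allergy

def build_orders(allergies: set[str]) -> list[str]:
    orders = []
    for item in STANDARD_ORDER_SET:
        if item in allergies:
            substitute = APPROVED_SUBSTITUTIONS.get(item)
            if substitute is None:
                raise ValueError(f"No approved substitution for {item}; escalate to the prescriber.")
            orders.append(substitute)  # necessary, documented variation
        else:
            orders.append(item)        # the standardized default
    return orders

print(build_orders(set()))             # ['antibiotic_a', 'iv_fluids', 'analgesic_b']
print(build_orders({"antibiotic_a"}))  # ['antibiotic_c', 'iv_fluids', 'analgesic_b']
```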

Checklists: create checklists to aid memory and promote reliability.

Most people can hold only a limited number of pieces of information in their memory at any one time.12

Checklists help to reduce reliance on memory and promote reliability of care. Examples include the surgical safety checklist,12 and checklists for central line insertion or handovers.
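
A minimal Python sketch of a checklist as a memory aid follows: completion cannot be declared until every item is explicitly confirmed. The item names loosely echo a surgical time-out and are illustrative only.

```python
# A minimal sketch of a checklist as a reliability aid: completion cannot be
# declared until every item has been explicitly confirmed. Item names are
# hypothetical and loosely modelled on a surgical time-out.

class Checklist:
    def __init__(self, items: list[str]):
        self.items = {item: False for item in items}

    def confirm(self, item: str) -> None:
        if item not in self.items:
            raise KeyError(f"Unknown checklist item: {item}")
        self.items[item] = True

    def outstanding(self) -> list[str]:
        return [item for item, done in self.items.items() if not done]

    def sign_off(self) -> None:
        # Reliance on memory is replaced by an explicit, verifiable record.
        missing = self.outstanding()
        if missing:
            raise RuntimeError(f"Cannot sign off; outstanding items: {missing}")
        print("Checklist complete.")

time_out = Checklist(["patient identity confirmed", "site marked", "antibiotics given"])
time_out.confirm("patient identity confirmed")
time_out.confirm("site marked")
print(time_out.outstanding())  # ['antibiotics given'] -- sign_off() would raise here
```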

Redundancies: introduce redundant workflows to help decrease the likelihood of error.

Redundancies can help decrease the likelihood that an error at one step of a process would be propagated further. Asking two people to do the same thing—like verifying a medication dose independently—can achieve such a goal. Computers can facilitate the double check by alerting the provider to the wrong dose of medication or a contraindicated medication.
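
A minimal Python sketch combining an independent double check with a computerized dose alert follows; the drug name and maximum-dose table are hypothetical, for illustration only.

```python
# A minimal sketch of redundancy: a dose is accepted only when two independent
# entries agree, and a computerized alert adds a further check. The drug name
# and maximum dose below are hypothetical illustrations.

MAX_DOSE_MG = {"drug_x": 10.0}

def verify_dose(drug: str, dose_entered_by_a: float, dose_entered_by_b: float) -> float:
    """Accept a dose only if two providers' independent entries agree and pass the alert."""
    if dose_entered_by_a != dose_entered_by_b:
        # A single discrepant entry stops the process instead of propagating downstream.
        raise ValueError(f"Dose mismatch ({dose_entered_by_a} vs {dose_entered_by_b}): re-verify independently.")
    if dose_entered_by_a > MAX_DOSE_MG.get(drug, float("inf")):
        # The computerized redundancy: alert on a dose above the configured maximum.
        raise ValueError(f"Dose {dose_entered_by_a} mg exceeds the configured maximum for {drug}.")
    return dose_entered_by_a

print(verify_dose("drug_x", 2.5, 2.5))  # both entries agree and pass the alert: 2.5
# verify_dose("drug_x", 25.0, 25.0)     # a tenfold entry error would trigger the alert
```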

Policies and rules: create and adhere to patient safety policies to promote good habits.

Policies and procedures are designed to standardize care and promote best practices. Minimizing unnecessary variation in care helps promote safety by decreasing, for instance, the impact of team composition on the outcome of care.

Education: teach principles of importance to the members of the team.

Education is essential, but it cannot be the only response. Healthcare providers have traditionally looked to education, along with changes to policies and protocols, to advance patient safety, rather than also considering forcing functions, standardization, simplification, or constraints to address quality issues.

Education is essential for teaching the fundamentals when the central issue is a lack of knowledge about a new tool or process.13 Education is less likely to succeed if the system and culture do not support providers in adopting and adhering to the changes. Collaborating with human factors experts is key to maximizing the impact of education on safety in a given context.


References

  1. Russ AL, Fairbanks RJ, Karsh B, et al. The science of human factors: separating fact from fiction. BMJ Qual Saf. 2013;22:802-808. Available from: https://qualitysafety.bmj.com/content/22/10/802.short
  2. Adapted from Canadian Patient Safety Institute, 2020. Available from: https://www.patientsafetyinstitute.ca/en/toolsResources/Human-Factors-Network/Pages/default.aspx
  3. Paradiso L, Sweeney N. Just culture: It's more than policy. Nurs Manage. 2019 Jun;50(6):38-45. DOI: 10.1097/01.NUMA.0000558482.07815.ae.
  4. Croskerry P, Petrie DA, Reilly JB, et al. Deciding About Fast and Slow Decisions. Acad Med. 2014 Feb;89(2):197-200. DOI: 10.1097/ACM.0000000000000121. Available from: https://journals.lww.com/academicmedicine/fulltext/2014/02000/Deciding_About_Fast_and_Slow_Decisions.7.aspx
  5. O'Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. J R Coll Physicians Edinb. 2018 Sep;48(3):225-232. DOI: 10.4997/JRCPE.2018.306. PMID: 30191910.
  6. Moulton CA, Regehr G, Lingard L, et al. ‘Slowing down when you should’: initiators and influences of the transition from the routine to the effortful. J Gastrointest Surg. 2010;14(6):1019-26. Available from: https://pubmed.ncbi.nlm.nih.gov/20309647/
  7. Moulton CA, Regehr G, Mylopoulos M, et al. Slowing down when you should: a new model of expert judgment. Acad Med. 2007 Oct;82(10 Suppl):S109-16. DOI: 10.1097/ACM.0b013e3181405a76. PMID: 17895673. Available from: https://pubmed.ncbi.nlm.nih.gov/17895673/
  8. Norman GR, Monteiro SD, Sherbino J, et al. The Causes of Errors in Clinical Reasoning: Cognitive Biases, Knowledge Deficits, and Dual Process Thinking. Acad Med. 2017 Jan;92(1):23-30. DOI: 10.1097/ACM.0000000000001421. PMID: 27782919. Available from: https://pubmed.ncbi.nlm.nih.gov/27782919/
  9. Lingard L. Paradoxical Truths and Persistent Myths: Reframing the Team Competence Conversation. J Contin Educ Health Prof. 2016 Summer;36 Suppl 1:S19-21. DOI: 10.1097/CEH.0000000000000078. PMID: 27584064. Available from: https://pubmed.ncbi.nlm.nih.gov/27584064/
  10. Cafazzo JA, St-Cyr O. From discovery to design: the evolution of human factors in healthcare. Healthc Q. 2012;15 Spec No:24-9. DOI: 10.12927/hcq.2012.22845. Available from: https://pubmed.ncbi.nlm.nih.gov/22874443/
  11. Lin L, Isla R, Doniz K, et al. Applying human factors to the design of medical equipment: patient-controlled analgesia. J Clin Monit Comput. 1998 May;14(4):253-63. DOI: 10.1023/a:1009928203196. Available from: https://pubmed.ncbi.nlm.nih.gov/9754614/
  12. Canadian Medical Protective Association. Surgical safety checklists: A review of medical-legal data. CMPA; June 2015. Available from: https://www.cmpa-acpm.ca/documents/10179/47890/com_16_SurgicalSafety_Checklist-e.pdf
  13. Soong C, Shojania KG. Education as a low-value improvement intervention: often necessary but rarely sufficient. BMJ Qual Saf. 2020;29:353-357. Available from: https://qualitysafety.bmj.com/content/29/5/353
CanMEDS: Leader, Collaborator, Scholar

DISCLAIMER: The information contained in this learning material is for general educational purposes only and is not intended to provide specific professional medical or legal advice, nor to constitute a "standard of care" for Canadian healthcare professionals. The use of CMPA learning resources is subject to the foregoing as well as the CMPA's Terms of Use.