Coordinated Team Efforts: Sharing Insights from Health Care

Introduction
Paul Barach was a guest on Radio National’s Health Report this week (15 November). I listened to his interview with Norman Swan (link to podcast) as I was driving to a meeting with a colleague to discuss inter-professional learning, clinical education and performance in health services. His interview followed a presentation he had made titled Creating, Assessing and Sustaining High Performance in Healthcare Teams at the WCHA 2010 Conference in Melbourne.

Patient Safety
My ears pricked up at an early reference to “highly coordinated team efforts, leadership and a calm alignment”. Paul was discussing the response to the recent emergency on the Qantas QF32 flight out of Singapore: a team effort that drew on routines developed in training, carried out by a crew that did not panic.
I was intrigued to learn that the Qantas captain’s actions were akin to ‘open disclosure’ in health care. Open disclosure is “the open discussion of incidents that result in harm to a patient while receiving health care. The elements of open disclosure are: an expression of regret, a factual explanation of what happened, the potential consequences, and the steps being taken to manage the event and prevent recurrence.” There is a National Open Disclosure Standard. A key aspect of the standard is:

Ensuring that communication is open and honest, and that it is immediate is important to improving patient safety. While open disclosure is already occurring in many areas of the health system, this Standard is about facilitating more consistent and effective communication following adverse events.

Paul pointed out that open disclosure in the QF32 case gave passengers realistic expectations about what the next steps would be in the incident … “they were not left in the dark”.
There are a number of fascinating insights about teams offered by Paul in the Radio National interview. These are some of the key points for me (taken from a transcript of the program, available here):

  • Health care is not designed for teams. We select, we reward, we incentivise people to work as individuals and then we throw them together in teams. And so unless and until we change the incentive structure and give them the feedback as a team, it’s very difficult for them to achieve what I believe can be more sustainable improved results.
  • a lot of research suggests that when you throw people together they will not necessarily perform as a team, but if you invite them to think of themselves as a team in what we call a clinical microsystem in which they are inter-dependent, in which they are forced to communicate with each other in order to achieve their goal, in which there’s allocated time for briefing before the procedure and dedicate a time for debriefing after the procedure — then they’re actually able to reflect on their work, learn on the work and grow a greater respect for their colleagues.

Norman Swan asked Paul about his work with teams in paediatric surgery. These teams might perform immediate or staged surgery on a small child. Paul pointed out that “around this small infant you have six to ten people who are literally shoulder to shoulder. I like to use the analogy they can actually smell each other, that’s how close they are and they do this for multiple hours.”
In his study of these teams, Paul notes that:

We found a very complex ecosystem in which things are constantly being corrected and people are adapting and responding to each other, many times non-verbally, they are sort of learning to anticipate each other’s behaviour. So this is the anaesthetist, this is the surgeon, these are the nurses and the perfusionist. There’s a certain choreography that underlies the work but we’ve also discovered that there’s a whole series of what we call minor near misses that don’t lead to immediate harm but when they go undetected the cumulative effect leads to harm.

The remainder of the interview addressed the issues around near misses. Paul observed that:

  • Errors happen all the time. We cannot necessarily stop errors, but what we can do is we can stop the errors from propagating to harm. We can do that by developing mitigating systems, by attenuating the errors and by surfacing them so we understand where they’re happening. It’s building a defence in depth, where you have multiple opportunities to pick that up.
  • In teams that are unable to learn as avidly as others, in which the small events of near misses propagate, what happens is the teamwork starts to fall apart; in essence a lot of unhappiness emerges: comments that are sometimes snide, abrupt responses or a lack thereof, silence sometimes when there should be communication.
  • One is fear and the other is as things go wrong they turn internally, they turn inward to their micro team. In the operating room the teams are made up of sub-teams, the surgical team might have two or three members, the anaesthetist’s team might have two or three members, so when things go awry teams that don’t function as well we’ve noticed turn internally, so they are doing a lot of internal communication but they are not actually talking to the other sub-teams.


Paul’s research is exploring “non-technical skill sets, the behavioural aspects, things like leadership and decision-making, and sharing, problem solving.”
Performance Review
Surgical teams face production pressures and Paul suggests that “the scheduling logistics of before and after undermine the ability of these teams to perform at a higher level.” He notes that “Paediatric cardiac surgery is probably one of the riskiest and most complex procedures known to health care.”
Paul’s research indicates “that most of the learning around surgical procedures does not happen during the procedure, it actually happens after the procedure. So if you and me we have to run to do another case, if you have to run to your rooms, if you have to run somewhere else and the team doesn’t have time to reflect. Opportunities that could have been learned after the case are lost and then the adverse event happens again because there wasn’t time to say: ‘OK how could we have prevented this?'”
The challenge in such environments is to find time to review performance. Even under these production pressures, Paul notes that it is possible to find fifteen minutes after a four-hour procedure. In these fifteen minutes “there’s a very structured language and this is important, it’s not just a gripe session, it’s really: here are the social, cultural and technical issues, and we go around the room and we talk to them.”
With regard to the facilitation of these review meetings, Paul observes that:

I’d like to think that it’s not necessarily the function of the surgeon, anaesthetist or other, but the ability of an individual to best facilitate and to have trust amongst the team. It might be the surgeon but in general what we’ve discovered is that it might actually be helpful to have somebody outside the actual operating team who observes the team. And so what we’ve done is we’ve invited them to facilitate this process, to just hold the mirror up and say ‘here’s what we saw, what do you think about this?’ And they immediately will say ‘well we could have done it this way, we could have done it that way’ and so the culture of self criticism and learning is very developed in these teams.

This approach resonates with another approach to performance review, after action review (AAR), in which the “AAR facilitator provides a mission and task overview and leads a discussion of events and activities that focuses on the objectives.” Paul has discovered that in surgical teams “when individuals who work together review their process separately, they don’t learn as much as when they review it together as a team. So the group huddle, if you will, acts to bring them together for example to explore — is this the right patient, the so-called time out stuff that we’re doing in health care. More importantly, are there things that are concerning you about this case?”
Paul uses a mini-STAR questionnaire with surgical teams that “each member of the team or team completes prior to and immediately after each operation.” There are four questions: “quality of sleep, amount of information received about the patient, possible worries about other team members (all prior to operation), and the occurrence of adverse events and the atmosphere in which the operation was carried out (after operation)” (see Schraagen et al., 2009). He reports that “over a third of the members of these complex teams have serious insomnia before a big operation and they’re concerned the equipment might not be right, the personnel might not be appropriate, somebody might be late. So these small things we discovered weigh on them, cause a lot of anxiety and we’re starting to explore the relationship between that anxiety and some of the burnout and disruptive behaviours.”
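To make the shape of such an instrument concrete, here is a minimal sketch, in Python, of how pre- and post-operation responses along these four dimensions could be recorded. The field names and the simple 1 to 5 ratings are my own illustrative assumptions, not the published mini-STAR items.

    # Illustrative sketch only: recording mini-STAR style responses.
    # Field names and the 1-5 scales are assumptions, not the published instrument.
    from dataclasses import dataclass

    @dataclass
    class PreOperationResponse:
        team_member: str
        quality_of_sleep: int        # 1 (very poor) to 5 (very good)
        patient_information: int     # amount of information received, 1 to 5
        worries_about_team: str      # free-text concerns about other team members

    @dataclass
    class PostOperationResponse:
        team_member: str
        adverse_events: str          # brief description, or "none"
        operating_atmosphere: int    # 1 (tense) to 5 (calm and supportive)

    # Example: one team member's pre-operation entry
    entry = PreOperationResponse(
        team_member="perfusionist",
        quality_of_sleep=2,
        patient_information=4,
        worries_about_team="equipment for the second stage may not be ready",
    )
    print(entry)

Collected before and after every case, even a handful of records like these would let a team look for patterns between, say, poor sleep or equipment worries and the near misses Paul describes.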
The outcome of this review process is that results improve. “There seems to be a relationship that the more the team is able to be present in the procedure the more they’re able to pick up these near misses and minor events, the more they are able to prevent the major events.”
Paul is contemplating adding a Line Operations Safety Audit (LOSA) to his work with teams to provide trusted feedback from a colleague. LOSA is used in aviation to support pilots’ learning. An observer joins a flight and afterwards gives the pilots a report on what occurred in the cockpit during the flight. The report is confidential to the pilots and is not shared with the airline. (A rough sketch of what such an observation record might look like follows the ICAO description below.)

The International Civil Aviation Organisation (2002) provides more information about LOSA:

  • LOSA uses expert and highly trained observers to collect data about flight crew behaviour and situational factors on “normal” flights. The audits are conducted under strict no-jeopardy conditions; therefore, flight crews are not held accountable for their actions and errors that are observed. During flights that are being audited, observers record and code potential threats to safety; how the threats are addressed; the errors such threats generate; how flight crews manage these errors; and specific behaviours that have been known to be associated with accidents and incidents.
  • LOSA is closely linked with Crew Resource Management (CRM) training. Since CRM is essentially error management training for operational personnel, data from LOSA form the basis for contemporary CRM training refocus and/or design known as Threat and Error Management (TEM) training. Data from LOSA also provide a real-time picture of system operations that can guide organizational strategies in regard to safety, training and operations. A particular strength of LOSA is that it identifies examples of superior performance that can be reinforced and used as models for training. In this way, training interventions can be reshaped and reinforced based on successful performance, that is to say, positive feedback. This is indeed a first in aviation, since the industry has traditionally collected information on failed human performance, such as in accidents and incidents. Data collected through LOSA are proactive and can be immediately used to prevent adverse events.
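To picture what a LOSA observer actually writes down, here is a small illustrative sketch of the kind of coded observation record the ICAO description implies: a threat, how it was managed, any error it generated, and how the crew handled that error. The categories and field names are my assumptions, not an official LOSA data schema.

    # Illustrative sketch of a LOSA-style coded observation record.
    # Categories and field names are assumptions drawn from the ICAO
    # description above, not an official LOSA data schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ThreatObservation:
        description: str          # e.g. "late runway change"
        managed_effectively: bool
        resulting_error: str      # "" if the threat did not generate an error
        error_outcome: str        # e.g. "trapped", "exacerbated", "inconsequential"

    @dataclass
    class FlightObservation:
        flight_phase: str                                   # e.g. "descent/approach"
        threats: List[ThreatObservation] = field(default_factory=list)
        notable_behaviours: List[str] = field(default_factory=list)  # including superior performance

    # Example: a single no-jeopardy observation
    obs = FlightObservation(
        flight_phase="descent/approach",
        threats=[ThreatObservation(
            description="late runway change",
            managed_effectively=True,
            resulting_error="",
            error_outcome="inconsequential",
        )],
        notable_behaviours=["clear cross-checking of the revised approach briefing"],
    )
    print(obs)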

Discussion
Paul Barach’s work offers some great insights for those involved in performance environments. As ever I am thinking about the potential to use these insights in learning environments. I am very interested in the implications for coaches in professional sports and how teams share their mental models of what occurs in team contexts. Paul’s interviews and his research papers have a synchronicity for me too.
Recently I completed some Advanced Firefighter Training with the New South Wales Rural Fire Service in which Crew Resource Management was a key issue. James Reason’s (2000) ‘Swiss Cheese Model’ of accident causation was discussed in detail. Following Paul’s interview I am keen to look at the interconnections between safety, review, performance and learning.
I was really fortunate that after such a fascinating interview I was able to discuss it with my colleague before we got on to inter-professional learning. A key aspect of inter-professional learning is the translation of theory into practice. This post is my attempt to start the move from theory to practice. I hope it provides a good topic to share with colleagues in the Faculty of Health at the University of Canberra, some of whom are keen to explore performance in allied health contexts.
Photo Credits
Operating Room
Weekly Creative Team Meeting
787 Flight Deck Simulator
Postscript
After reading this post, my colleague Laurie Grealish alerted me to Knowledge systems, health care teams, and clinical practice: a study of successful change by Curtis Olson, Tricia Tooman and Carla Alvarado (2010). The abstract for the paper is:

Clinical teams are of growing importance to healthcare delivery, but little is known about how teams learn and change their clinical practice. We examined how teams in three US hospitals succeeded in making significant practice improvements in the area of antimicrobial resistance. This was a qualitative cross-case study employing Soft Knowledge Systems as a conceptual framework. The purpose was to describe how teams produced, obtained, and used knowledge and information to bring about successful change. A purposeful sampling strategy was used to maximize variation between cases. Data were collected through interviews, archival document review, and direct observation. Individual case data were analyzed through a two-phase coding process followed by the cross-case analysis. Project teams varied in size and were multidisciplinary. Each project had more than one champion, only some of whom were physicians. Team members obtained relevant knowledge and information from multiple sources including the scientific literature, experts, external organizations, and their own experience. The success of these projects hinged on the teams’ ability to blend scientific evidence, practical knowledge, and clinical data. Practice change was a longitudinal, iterative learning process during which teams continued to acquire, produce, and synthesize relevant knowledge and information and test different strategies until they found a workable solution to their problem. This study adds to our understanding of how teams learn and change, showing that innovation can take the form of an iterative, ongoing process in which bits of K&I are assembled from multiple sources into potential solutions that are then tested. It suggests that existing approaches to assessing the impact of continuing education activities may overlook significant contributions and more attention should be given to the role that practical knowledge plays in the change process in addition to scientific knowledge.

In following Laurie’s lead I also discovered Communicating, Coordinating, and Cooperating When Lives Depend on It: Tips for Teamwork by Eduardo Salas et al. (2008).
