Sunday, March 25, 2012

Systemic Quality and "The Iron Triangle" of Quality, Cost and Schedule.

The work on Safety and Quality systems by James Reason and Charles Perrow redefined the world of Quality, showing up in the accelerating improvement of Safety in Aviation post-1970.

But what do you call this approach?

I'd like to suggest, "Systemic Quality".

Perrow called them "Normal Accidents" and Reason "Organisational Accidents". Both were talking about System-created Accidents, where multiple events, not individuals, are the cause of unintended poor outcomes. But neither coined a term for this approach to Safety and Quality.
My reasoning for the naming is:
Name the approach after the cause being addressed: Systems create the problems, so it's Systemic Quality.
The text below is adapted from a piece suggesting Medicine become a Modern Profession, like Aviation.

What Dr W. Edwards Deming understood so well is that Quality, Process Improvement and Performance Improvement are linked through the same fundamental:
Deliberate, focussed review of work outcomes, with intentional Learning and Adaptation, is necessary for, and common to, all three.
This is enshrined in Deming's P-D-S-A (Plan-Do-Study-Act) cycle, which he called the Shewhart Cycle.
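To make the shape of the cycle concrete, here is a toy sketch in Python. The function, the numbers and the "defect rate" are entirely mine and purely illustrative, not Deming's; the point is simply that Study and Act close the loop, feeding what was learnt back into the next Plan.

```python
import random

# A toy sketch of the Plan-Do-Study-Act loop.
# The "system" here is just a defect rate; names and numbers are illustrative only.

def run_pdsa(defect_rate: float, cycles: int = 5) -> float:
    for cycle in range(1, cycles + 1):
        # Plan: predict what a proposed process change should achieve
        predicted = defect_rate * 0.8
        # Do: carry out the work and observe the actual outcome (noisy in real life)
        observed = defect_rate * random.uniform(0.7, 0.95)
        # Study: compare the outcome with the prediction -- the step most projects skip
        gap = observed - predicted
        print(f"cycle {cycle}: predicted {predicted:.3f}, observed {observed:.3f}, gap {gap:+.3f}")
        # Act: adopt what worked, feeding the improved rate into the next Plan
        defect_rate = observed
    return defect_rate

print(f"final defect rate: {run_pdsa(0.05):.3f}")
```

Drop the Study and Act steps and you are left with Plan-Do, Plan-Do: repeating the same work with no mechanism for improvement.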

Systemic Quality, through its design and nature, improves Safety, Performance/Productivity and Economic Performance/Profitability.
Something that Apple Inc knows and Microsoft, the long-time market leader, does not.



What does a modern Profession look like?
Aviation as a perfect model.

In Project Management, there is the "Iron Triangle", usually explained as "Good, Fast, Cheap: pick any two".

Alternatively, the "Iron Triangle" is described as: "scope, schedule and cost constraints".
This definition, with no explicit mention of emergent or unspecified Dimensions like Safety and Quality, can lead Project Managers astray. Deming showed that Cost and Quality are intimately linked, and that focusing only on costs ultimately drives costs up while driving quality down.

This piece of received wisdom says that Economic Profitability, Job Performance and Product/Process Quality are competing dimensions: to optimise one of them, the others have to be sacrificed.

This just isn't so.

It only appears that way if a) you examine a single project (in the short-run), and
 b) your Project Methodology doesn't include the last half of Deming's cycle (Plan - Do - Study - Act to improve the system).

Dr Deming's proven theories on Performance and Quality rely on two fundamentals which you might recognise from the Scientific Method:
  • Be inquisitive: examine your own performance and look for insights into your work and outcomes; self-examination is the precursor to insight, and
  • constantly try to improve both your knowledge and practice, consciously learning from both your failures and successes.
This "conscious, deliberate learning" mindset is a necessary condition for constant improvement in all three aspects of the Iron Triangle: Profitability, Performance and Quality.

It's a long-run, not short-run, effect. It doesn't appear within a single project, but after the execution of many. The most important part of every project is the Analysis/Learning phase after it, the Project Review.

For cottage-industry crafts, where you only practise "as learnt" skills without deliberate improvement or correction, the "Pick any two" rule is both obviously true and unbreakable.

For modern Professions which practise Systemic Quality, i.e. "Do it Right, First Time", the 'rule' is wrong and misleading.

Back to Aviation, a modern Profession where, in most but not all countries, Systemic Quality is pervasive and firmly embedded in the culture and practice of each discipline and speciality, as well as in the governance of the whole Industry and its component parts.

More importantly, there is free, public data on the performance of the Industry.

Page 11 of the EASA's 2010 Annual Safety Review has a powerful chart [Fig 2-1] showing how the Industry has progressed/improved, and some words that should make the Australian Medical Profession both ashamed and envious:
The data in Figure 2-1 show that the safety of aviation has improved from 1945 onwards. Based on the measure of passenger fatalities per 100 million passenger miles flown, it took some 20 years (1948 to 1968) to achieve the first 10-fold improvement from 5 to 0.5. Another 10-fold improvement was reached in 1997, almost 30 years later, when the rate had dropped below 0.05. For the year 2010 this rate is estimated to have stayed at 0.01 fatalities per 100 million miles flown.
The accident rate in this figure appears to have been flat over recent years. This is the result of the scale used to reflect the high rates in the late 1940s.
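Those quoted rates also make a nice back-of-the-envelope calculation. The rates below are as quoted by EASA (taking 1997 as roughly 0.05); the fold-improvement and per-year figures are my own arithmetic, not EASA's:

```python
# Passenger fatality rates per 100 million passenger miles, as quoted above from EASA.
# The fold-improvement and per-year figures below are my arithmetic, not EASA's.
rates = {1948: 5.0, 1968: 0.5, 1997: 0.05, 2010: 0.01}

years = sorted(rates)
for start, end in zip(years, years[1:]):
    fold = rates[start] / rates[end]
    span = end - start
    yearly = fold ** (1 / span)  # average improvement factor per year
    print(f"{start}-{end}: {fold:.0f}-fold improvement over {span} years "
          f"(~{(yearly - 1) * 100:.0f}% better each year)")
```

That works out to sustained, compounding improvement of roughly 10% a year, maintained for over six decades.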
Another source, a Canadian educational/reference site with an inspiring graph on the improvement in Aviation Safety, says:
Up to the early 1970s the number of fatalities increased with some proportionality with the growth of air traffic. By the 1970s, in spite of substantial growth levels of air traffic, fatalities undertook a downward trend. This is jointly the outcome of better aircraft designs, better navigation and control systems as well as comprehensive accident management aiming at identifying the causes and then possible mitigation strategies.
This isn't isolated or peculiarly European: 
The reason for these massive, on-going improvements is the detail and seriousness of incident investigations. Notably, commercial "Air Carriers" have improved their Safety and Operations by several orders of magnitude while remaining profitable in a cut-throat industry experiencing a 1,000-fold increase in services delivered, whereas "General Aviation" has improved, but only by approximately 5-fold. The difference isn't in the technology, the training available or the processes/procedures laid down. It's the Professional "Right First Time, Every Time" approach of Systemic Quality and attention to preventing Organisational Accidents.

The early-2009 ditching of US Airways Flight 1549 into the Hudson River, piloted by Capt. "Sully" Sullenberger, was dramatic, widely reported, and resulted in no fatalities and only a handful of injuries.
In most professions, it would be regarded as a huge success and not studied.

Yet it led to 35 "Recommendations" by the US official investigator, the NTSB (National Transportation Safety Board). Think how different this is to most Medical practice: even injuries resulting in the permanent disabling of patients, like the 2010 preventable and foreseeable injury to Grace Wang, a repeat of a well-known Error, led to news reports, but no obvious investigation and certainly no consequences for anyone involved.

These NTSB "Recommendations" will be implemented, their implementation will be checked by a regulatory body [the FAA], and failure to comply will result in proportional, direct, personal and organisational consequences.

This is completely at odds with the 550+ page report of the 2005 Queensland Public Hospitals Commission of Inquiry, triggered by Jayant Patel and others, where the Recommendations are optional, their (timely) implementation won't be checked, nor will there be consequences for anyone repeating these Known Errors, Faults and Failures.

One of the reasons for this cultural change in Aviation, and the resulting on-going improvement of Safety, Quality and Performance, is the theoretical work of two men:
NASA uses Perrow [PDF] as a basis for its Safety programmes.

Prof. Reason seems to have retired from Academe, but is still listed as an advisor to "The Texas Medical Institute of Technology (TMIT)".

James Reason's work is well known in the medical community: it was used by Dr Brent James and colleagues in the remarkable turnaround and improvement of Intermountain Healthcare, reported in "Minimising Harm to Patients". On the Wikipedia page on "The Swiss Cheese Model", a large number of the pieces listed under "Further Reading" are medical.

Where this leads is a 2012 article published in the "Journal of Patient Safety", available on the TMIT site, "An NTSB for Healthcare, Learning from Innovation: Debate and Innovate or Capitulate", where the authors, Medicos and Aviators with 100 medical papers between them, call for applying what is known to work in Aviation to Medicine.

An idea that seems long overdue, although they don't go as far as suggesting the second, necessary, pillar of the Aviation system: a regulatory and compliance organisation like the US FAA or the UK's CAA (also responsible for the provision of common services, like Air Traffic Control). These organisations are charged with implementing, and then continually checking on, NTSB recommendations, bringing direct, personal consequences to those not complying.

Without "accountability", recommendations and findings have little likelihood of being fully and consistently practised.
Abstract:
Economic and medical risks threaten the national security of America. The spiraling costs of United States' avoidable healthcare harm and waste far exceed those of any other nation. 
This 2-part paper, written by a group of aviators, is a national call to action to adopt readily available and transferable safety innovations we have already paid for that have made the airline industry one of the safest in the world. This first part supports the debate for a National Transportation Safety Board (NTSB) for health care, and the second supports more cross-over adoption by hospitals of methods pioneered in aviation. 
A review of aviation and healthcare leadership best practices and technologies was undertaken through literature review, reporting body research, and interviews of experts in the field of aviation principles applied to medicine. An aviation cross-over inventory and consensus process led to a call for action to address the current crisis of healthcare waste and harm. 
The NTSB, an independent agency established by the United States Congress, was developed to investigate all significant transportation accidents to prevent recurrence. Certain NTSB publications known as "Blue Cover Reports" used by pilots and airlines to drive safety provide a model that could be emulated for hospital accidents. An NTSB-type organization for health care could greatly improve healthcare safety at low cost and great benefit. A "Red Cover Report" for health care could save lives, save money, and bring value to communities. 
A call to action is made in this first paper to debate this opportunity for an NTSB for health care. A second follow-on paper is a call to action of healthcare suppliers, providers, and purchasers to reinvigorate their adoption of aviation best practices as the market transitions from a fragmented provider-volume-centered to an integrated patient-value-centered world.
