Cause and Effect Analysis is often considered a comprehensive approach to addressing problems and identifying solutions. This holds in closed systems, where linear cause-and-effect relationships typically occur. Many areas of manufacturing and assembly fit this model.
Most work situations, however, do not occur in closed systems. They sit in open systems, where many interacting factors affect an outcome.
Ishikawa recognised this in his ‘fishbone analysis’ approach to problems. He recognised that many causes, and layers of causes, can affect an outcome. He also advocated collecting data on causes and outcomes before relying on an intervention to have a desired effect.
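Ishikawa's advice to gather data before intervening can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (the `defect_log` records and category names are invented, not drawn from any real dataset): it tallies how often each classic fishbone category is implicated, so an intervention targets the causes the data actually supports rather than the ones we assume.

```python
from collections import Counter

# Hypothetical defect log: each record pairs an observed outcome with
# the suspected cause, grouped by classic fishbone categories
# (People, Process, Equipment, Materials, Environment, Measurement).
defect_log = [
    ("late delivery", "Process"),
    ("late delivery", "Equipment"),
    ("late delivery", "Process"),
    ("scrap part", "Materials"),
    ("scrap part", "Process"),
    ("late delivery", "People"),
]

# Count how often each cause category appears across outcomes.
cause_counts = Counter(category for _, category in defect_log)

# Rank categories by frequency before choosing an intervention.
for category, count in cause_counts.most_common():
    print(f"{category}: {count}")
```

Even a rough tally like this can reveal that the loudest suspected cause is not the most frequent one, which is exactly the trap Ishikawa warned against.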
If we ignore this advice we implement actions that have unintended consequences. Send a ‘communication’ and it is interpreted as a threat. Give a pay-rise to one person and it is seen as a betrayal of trust by a dozen others. Instigate an inspection system and it generates a new industry (and associated cost) for the sector. Seek to drive down costs by reducing resources and it increases costs by causing repeated failures.
If you have a lunchtime to spare it is worth listening to Peter Senge’s discussion about systems in organisations:
Peter Senge’s keynote speech “Systems Thinking for a Better World” at the 30th Anniversary Seminar of the Systems Analysis Laboratory “Being Better in the World of Systems” at Aalto University, 20 November 2014.
One thought on “Systems ignorance and unintended damage”
Given that Ishikawa’s approach to analyzing problems recognizes that open systems have multiple factors affecting the outcome, what steps can be taken to ensure that all potential causes are identified and properly evaluated?