Tag Archives: knowledge

You couldn’t make it up!

Picture the scene… a middle-aged man is digging in his garden when he hits an object, possibly a root, which slows his spade. Lifting the top 3 inches of turf away, he clears the space to find that the object is an intact, unfired rifle cartridge – complete with bullet. He thinks ‘Hmmm – either an intact live cartridge or a replica, but 3 inches under my lawn’ and decides to call the police non-emergency helpline to request its secure collection (the nearest police station is over 10 miles away).

The helpline service is reasonably helpful – it automatically patches the call to his county police constabulary and after a little wait he gets through to a real person. The operator is personable and thorough, and takes note of the details (the middle-aged man used to be a marksman, so knows a bit about bullets and is able to convey this). The operator agrees to call back to arrange the collection.

About 5-10 minutes later the police helpline operator calls back and informs the man that, after consultation with her supervisor, they advise that the bullet simply be disposed of. The man points out that he is not going to throw it in a fire or anything – but how should he dispose of it? He is told to put it in a bag and pop it in the dustbin.

The waste bin was collected by the council service that day…

The man, knowing a bit about ammunition, had already decided that he would not put the bullet in the waste and instead planned to find a way of getting it to the authorities – ‘waste worker shot in freak accident’ is not a headline he wants to see in the Sunday newspapers.

Two minutes later the helpline calls again: ‘sorry’, says the operator, ‘my supervisor and I have spoken to the inspector, who suggests we get someone to collect this bullet from you. You haven’t thrown it away, have you?’

What is the point of this story? It is this:

However good your procedures and however willing and polite and committed your operators, a helpline service must have expertise at the point of transaction – the operator. If not, you tend to add re-work (e.g. lots of follow-up calls) or waste (or worse).

Sadly, most advice centres are NOT designed this way, instead using less-qualified people on the phones – creating waste and error, and discouraging users.

In this instance neither the operator nor their supervisor had the expertise (or judgement) to make this decision. How did the discussion and advice arise between calls 1 and 2? How did the second discussion, with the inspector, arise – was it luck, or part of an escalation procedure? What would have happened if the inspector had not been there? What if the man HAD put the bullet in his dustbin? What if the man had gone shopping before the third call was made?

How many police would have been required to search the man’s bin, or worse, the contents of 5,000 bins at the council refuse centre had the bin been collected? Or what if all bin lorries had to be stopped on the roads for inspection to remove the suspect bullet? What if the bullet had exploded? You get the point.

This was a real incident involving real people on a Bank Holiday Monday sometime in the past year. You really couldn’t make it up.

 

Further reading on call centres:

Seddon, J. (2005) Freedom from Command and Control, Vanguard Press, Buckingham, UK.

 

Don’t do it to people

One person’s management hero is another’s villain

“Management involves getting the most efficient utility from people and resources;

Leadership involves getting people to do things they would not otherwise choose to do.”

EVEN IF TRUE
DOES THAT MAKE IT RIGHT?

In a nutshell, the statements on management and leadership summarise conventional wisdom accrued since 1900: first through the ‘scientific management’ methods of Frederick Taylor, and later the alternative ‘human relations’ approach advocated by Elton Mayo. The latter’s approach was apparently intended to counteract the rigidity and hierarchies of the former. Unfortunately both approaches have the same defective focus – ‘doing it to people’. They are both a reflection of a command-and-control mindset which many would perceive as ‘managerialism’.

Improvement comes from understanding the system and making meaningful changes to it to ensure better outcomes. Doing it to people does not achieve this. Whilst efficiency in car manufacture increases, so do the additional costs: higher salaries to compensate for boring jobs, poorer industrial relations and (at best) static levels of quality – in other words, total costs go up.

Whilst most managers and leaders do not want to be working for the ‘dark side’ and genuinely want better for their teams, they must understand that if they follow the scientific management/human relations approach the consequences of their actions are de-motivation, a loss of dignity, a diminished sense of purpose, and a reduction in the productivity of their staff.

In knowledge industries, additional contributions to the total cost of this disruption are hidden – for example, the loss of skilled workers, high staff turnover and the associated costs of recruitment, and so on.

The choice is clear: managers and leaders need to find a better way…

Reading:

Hanlon G. (2015) The Dark Side of Management: A secret history of management theory, Routledge

Roscoe, P. (2015) How the takers took over from the makers. Times Higher Education, 26 November, p48

Seddon, J. (2003) Freedom from Command and Control, Vanguard Press, Buckingham, UK.

Experience versus Vitality and Innovation

Daniel Sturridge, Steven Gerrard & Chris Smalling

‘Experience’ is an often-quoted strength in a job candidate or team member.

If this is relevant to the work that needs to be done, then great. However, the term ‘experience’ is often read as ‘knowledge’, and that is not always the case. An experienced person may refer back to situations that are not relevant to the present. An experienced person may rely on approaches which are not the best, but which merely work ‘OK’.

An experienced person’s views may now be out of date. In the 1960s a Japanese delegation visited a British car factory in the Midlands and was guided around the operation by a proud production manager. The visitors had many questions about the facility and how it worked, but felt they were not being given the answers that they wanted. One of the Japanese visitors asked the manager ‘How long have you worked in this factory?’ to which the manager answered ‘Over 20 years!’

The Japanese visitor was overheard to mutter ‘more like 20 minutes…’

The manager did not know what was really happening in the production facility – they did not have relevant knowledge, nor an understanding of how to improve the work or quality of output.

A valuable, experienced professional is one who has the humility (and experience!) that allows them to ask the right questions and not to be the source of all the answers.

Reading:

MacDonald, J. (1998) Calling a Halt to Mindless Change, Amacom, UK

Seddon, J. (2005) Freedom from Command and Control, Vanguard Press, Buckingham, UK.

Four years of reflection: many years of learning

This article marks the completion of four years of blogging on this site and is the 112th article. There is a wide range of material available across the site.

Use our search facility for any keywords you wish, to find a relevant resource.

Key themes we have highlighted over the years include:

•   Don’t do it to people: understand the system of work first

•   Don’t chase things that don’t exist (like supposed trends in data)

•   Build knowledge, not opinion

•   Don’t rely on top-down change: culture change is not something that you ‘do’ to people

•   Change can be quick & painless at the right point of intervention

•   Leadership is about followers more than about the leader

Bradford City ‘picked up the ball & ran with it’, working together, playing to strengths, committing effort, taking responsibility, keeping discipline, and always believing the dream!

•   Decision making can involve people in many different ways

•   Teamwork is about Purpose, Goals & Process more than about Behaviour

Some key searches which may be of interest include:
Team; Improvement; Leadership; Motivation

Key source articles include those by:
Deming; Herrero; Seddon; Senge; Covey; Scholtes

 

Do costs of ‘improvement’ really indicate the relevance of an initiative?

Improvement: shoot this nag and replace it with a new horse, or an ox, or a steam tractor. Instead, why not just give it decent food, water and exercise? What is the impact? What is the cost?

A lot of money is spent on ‘change’ and ‘improvement’. Often a major restructure or an IT implementation is at the fore of improvement investment, with new facilities or equipment upgrades (both of which are costly) not far behind on the list.

It is also common for money or time (usually both) to be spent on customer surveys or staff surveys to glean ‘data’ which it is hoped will inform what type of improvement is needed. Is this always necessary? Is the money which is spent on improvement a good indicator of whether that improvement will be worthwhile – is it a decent ‘return on investment’? This is not always clear, since a change may set off a spiral of outcomes – some generating cost savings, others creating new cost burdens – which may or may not be included in the overall analysis of ‘total cost’ (when they really should be).
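As a purely illustrative sketch (every figure below is hypothetical, invented for the example rather than drawn from any real initiative), a few lines of arithmetic show how a ‘headline’ return on investment can look healthy while the total cost tells a different story:

```python
# Hypothetical figures, for illustration only - not data from any real initiative.
investment = 250_000                # headline spend on the new IT system
visible_annual_saving = 120_000     # savings claimed in the business case

# Hidden outcomes the change sets off (often left out of the analysis)
hidden_annual_costs = {
    "extra rework from failure demand": 45_000,
    "staff turnover and recruitment": 30_000,
    "workarounds and duplicate systems": 25_000,
}

headline_payback_years = investment / visible_annual_saving
net_annual_benefit = visible_annual_saving - sum(hidden_annual_costs.values())

print(f"Headline payback: {headline_payback_years:.1f} years")
if net_annual_benefit > 0:
    print(f"Payback counting total cost: {investment / net_annual_benefit:.1f} years")
else:
    print("Counting total cost, the change never pays back at all.")
```

On these invented numbers the change appears to pay for itself in just over two years; once the hidden burdens it sets off are counted, the payback stretches to more than twelve.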

Pat Nevin identified how a small (low-cost) change to the surface around top-flight professional football pitches could improve the quality of football during competitive matches – an analysis achieved just by looking at the ‘system’ of football in modern stadiums. The cost-benefit might be hard to gauge, but at the very least, a reduced likelihood of player injury (e.g. slipping on the surface and twisting a knee) is likely to be worth hundreds of thousands of pounds.

Are IT system introductions always based on knowing how the system should operate to deliver its correct purpose? Are restructures based on knowing how the system will deliver the team’s correct purpose? Will a new piece of equipment enable a worker to deliver their correct job purpose? Or will these changes just enable a piece of work (which may in itself not be relevant any more) to be done faster, more cleanly, in a ‘modern’ way, in a more ‘user-friendly’ manner, yet have no impact on delivering the things that matter (the purpose of the work)?

Understanding the impact of incremental improvements is important. We need to assess what is happening in the work and whether the patterns are consistent and predictable, then make a reasoned change, monitor whether the impact is positive, and continue the cycle. This is continuous improvement and is based upon building knowledge. It is less ‘sexy’, has a lower profile and takes time, but the outcomes are far superior – a better way.

Further Reading:

Herrero, L. (2006) Viral Change, meetingminds, UK.

Juran, J. (1989) Juran on Leadership for Quality, The Free Press, New York, NY.

Scholtes, P. (1998) The Leader’s Handbook: A Guide to Inspiring Your People and Managing the Daily Workflow, McGraw-Hill, New York, NY.

Seddon, J. (2005) Freedom from Command and Control, Vanguard Press, Buckingham, UK.

Senge P. (1990) The Fifth Discipline: The Art and Practice of the Learning Organisation, Doubleday, New York.

The Head, Heart and Guts of Leadership Character

Are leaders born or made? This question dominated leadership thinking until the 1940s and, despite the growth in leadership development (particularly since the 1960s and 1970s), it is still frequently asked.

The question (or perhaps its answer) is usually framed in terms of ‘personality’ on one hand and ‘skills and abilities’ on the other. The suggestion is that ‘personality’ is what we are born with, whilst many of our ‘skills and abilities’ can be learned. We can achieve this learning to some degree of effectiveness or another. However, as human beings we have enormously elastic capabilities – our learning is often governed by choice, not just genes.

When I discuss practical leadership – working with people to get things done – I use a simple three-part model: Head, Heart and Guts. An imbalance in one of these three dimensions would make us appear cold, or gushing, or irrational, or inconsistent, or unpredictable, or a steamroller, or someone who bends in every wind (or worse).

Covey talks about balancing ‘consideration’ with ‘courage’ (Heart versus Guts), but we also know we need to balance our ‘rational’ side with ’emotional’ empathy (Head versus Heart), and we also need to balance Guts with Head! If you want to develop as an effective leader, then your skills in planning and decision-making need to be combined with interpersonal skills and the development of sound judgement.

Reading:

Covey, S. (1989) The 7 Habits of Highly Effective People, Simon & Schuster, New York, NY.

Jacobs, C.J. (2009) Management Rewired: Why Feedback Doesn’t Work and Other Surprising Lessons from the Latest Brain Science. Penguin Group Portfolio, NY

Good Performers will fail in a bad system

Examples of failing systems are numerous, although often the finger of blame for failure is pointed at the people who are at the sharp end (for example, over-worked social workers spring to mind).


If we cast our minds back to the security shortfall scandal before the 2012 Olympic Games, the British Army had to bring in thousands of troops at the last minute to work as security staff at the venue entry gates, due to critical shortfalls in the numbers of trained security personnel promised by a commercial provider. This shortfall was not caused by a lack of recruitment, but apparently by failures in the system of appointing people into the jobs, plus late scheduling of the training and induction needed to prepare recruits to start their roles on time. Allegedly, some new recruits were never given confirmed dates for their training; others, despite being recruited months before the Games, did not complete their training until just a few days before the Games programme ended (so late were the arrangements that many recruits didn’t bother to attend, since the Games had only a couple of days left to run, and some people had already found jobs elsewhere). These failures were not “nobody’s problem” – they were the problem of managers in the security company.

In a blame culture managers will identify the problem as being the people at the sharp end (so blame those pesky security recruits for not showing up to training just before the Games ended – what a lack of commitment!). Blame is both a self-fulfilling and a self-deluding philosophy.

There is a neat way to define the power of the system, versus the expectations placed on people, in a quote which I understand is attributed to Geary Rummler:

    “Put a good performer in a bad system
and the system wins every time”

But why blame the manager, then? Well, simply, because their job is to manage the system (and to improve it). In fact, that is pretty much all that their job should involve.

Further reading:

Deming W.E. (1982) Out of the Crisis, MIT CAES, Cambridge MA.

Rummler, G. and Brache, A. (1995) Improving Performance: How to Manage the White Space on the Organization Chart, Jossey-Bass, San Francisco.

Don’t let sight of knowledge be blinded by emotions

There are often occasions when we are presented with information or a situation which raises our hackles: a picky complaint, a misplaced rumour, an assumption, a one-off gaffe. We know that the situation does not reflect the general reality (our team doesn’t usually screw things up), but we still get annoyed.

Think about it – we get wound up, we try to bottle up the emotion, and perhaps it will annoy us for the next hour, the day, or even the whole week. It really defeats us one way or another – and it might only be a trivial thing (although sometimes it can be more than trivial, for example if a senior colleague complains).

What can we do? Chew on it all (and get ourselves down or our blood boiling), stand up for it (and risk being seen to be defensive), roll over and take the negativity (and appear passive and weak)?

At the 2014 football World Cup we saw the first use of goal-line technology, aimed at removing the referee’s subjective decision on whether the ball had crossed the line for a goal. The goal camera’s analytical video was shown on the stadium screens. In the match between France and Honduras, a shot by a French player hit the goalpost, ran back across the goalmouth, rebounded off the goalkeeper and headed towards the goal. Had it crossed the line? The referee indicated a goal; then the video replay showed the movement of the ball onto the post and the indication ‘no goal’.

Honduras players react to the ‘injustice’, but their outrage was based on imperfect knowledge

The Honduran players were apoplectic – it was no goal surely! But wait, what had really happened? The video instantly replayed the next sequence – the ball travelling across the goal, hitting the goalkeeper and crossing the line – and the video indicated for this second sequence ‘GOAL’. The referee’s decision was correct (he gets automatic signals only for GOAL).

The ball initially hits the post, the cameras are triggered and identify that the ball does not cross the line… NO GOAL… but…
…the ball instantly bounces back to the keeper, who pushes it over the line, and this time the cameras show GOAL. Simples.

This is not about goal-line technology.

The issue is that the Honduran team not only had an unjustified emotional reaction, but their reaction also distracted them from their work (football) – they lost 3-0. If they had been rational about it they would have waited for the verdict on the goalkeeper’s ‘save’ on the goal line.

The problem we have as human beings is that the emotional centres in our brains operate much more quickly than our rational centres, so we are triggered into an emotional response when a rational response would be better (Peters 2012).

What could be the solution to this? I suggest one. When you are confronted with a difficult situation that you are inclined to react to emotionally – seek knowledge (Deming 1982). What do we know? Does this always happen? Why did they ask this? Why did the incident occur? What does the data tell us? Is it a one-off or a repeating occurrence?

Don’t focus on the people, but examine the situation first.

 

Reading:

Deming W.E. (1982) Out of the Crisis, MIT CAES, Cambridge MA.

Peters S. (2012) The Chimp Paradox: The Mind Management Programme to Help You Achieve Success, Confidence and Happiness. Vermillion, London.

 

Links:

BBC Sport (2014) World Cup 2014: Goal-line technology TV process reviewed. http://www.bbc.co.uk/sport/0/football/27864393

 

 

Improving service starts with a leap of fact, not faith

  • What should we improve and why?
  • What has changed?
  • How do we improve things, where … when?
  • Who should we involve?

If we start to address these questions and filter out assumptions and preconceptions, we are able to make some sensible decisions about how to make effective changes that will have a positive effect on performance.

The world is not perfect and we are unlikely to always have the time and resources to gather the complete picture of what is happening. Nevertheless it is important that we seek out and analyse relevant data in order to make some reasonably robust assumptions about what we can do.

There are two common failures of action – let’s call them type 1 and type 2 (which is what statisticians call them) – or, put another way, mistakes in distinguishing between ‘common causes’ and ‘special causes’ of variation. Without understanding the difference we risk just ‘tampering’, where we feel like we are doing something useful but are actually only making things worse (Deming, 1982).

“Common Causes”

Common cause situations are those where performance goes up and down over time and, if analysed properly, can be seen to vary within a relatively predictable pattern: if we change nothing, the performance level will most likely continue. The problems arise when someone thinks they see a real difference between points of data when in fact no such thing exists. This is a type 1 error: we observe a ‘change’ which is really only a natural effect of background ‘noise’, yet we choose to act on it. For example, someone in the office achieves a great result whilst others do not. Is the difference because of the person, or something else in the wider context? Perhaps, as is often the case, they just got lucky and happened to be the one who achieved the good result. Next week it might be someone else. The analogy is a fire alarm going off to indicate a fire when in fact there is no fire. It is easy to fall into type 1 errors, assuming highs and lows of performance which don’t exist. This is a ‘mistake of commission’ – doing something that should not have been done (Ackoff et al 2006).

“Special Causes”

Some special causes are obvious, for example a major increase or decrease in performance, or a freak accident. However, sometimes hidden patterns of performance can indicate a real change which might easily go undetected if we consider each data point as a ‘one-off’. This is a bit like a fire breaking out but the fire alarm not ringing. The fundamental point is that these genuine changes are due to ‘special causes’ – something real which is impinging on the system. The issue here is that the solution sits outside the system – don’t redesign what you have, because the redesign will not address the situation; that is just meddling and will make things worse. For example, cycles of deteriorating work output followed by improving work output by one person might indicate an underlying special cause which needs to be addressed (health, for example), so meddling with the design of the work itself would be counterproductive. Furthermore, if the manager does not look at performance over time, these cycles might not be detected anyway – on average they might look like a reasonable level of output. Ackoff calls this a ‘mistake of omission’ – not doing something that should have been done.

Of course, to detect the difference between special cause and common cause variation in performance requires new skills and disciplines of thinking. When you understand the organisation as a system, improving service starts with a leap of fact, not faith.
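One practical way to build that discipline is a simple control chart. The sketch below is minimal and purely illustrative: the weekly output figures are hypothetical, and this is just one common variant of the technique (an ‘individuals’ or XmR chart, with limits derived from the average moving range):

```python
# Minimal individuals (XmR) control chart sketch - the weekly figures are hypothetical.
weekly_output = [52, 48, 55, 50, 47, 53, 49, 51, 80, 50, 46, 54]

mean = sum(weekly_output) / len(weekly_output)

# Average moving range between consecutive points
moving_ranges = [abs(b - a) for a, b in zip(weekly_output, weekly_output[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Conventional XmR limits: mean +/- 2.66 * average moving range
upper = mean + 2.66 * mr_bar
lower = mean - 2.66 * mr_bar

print(f"Process average {mean:.1f}, predictable range {lower:.1f} to {upper:.1f}")
for week, value in enumerate(weekly_output, start=1):
    if value > upper or value < lower:
        print(f"Week {week}: {value} -> possible special cause, investigate")
    else:
        print(f"Week {week}: {value} -> common cause variation, leave the process alone")
```

Points falling inside the limits are the routine noise of the system – reacting to them individually is tampering; a point outside the limits suggests a special cause worth investigating outside the routine workings of the process.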

Reading:

Ackoff, R.L., Addison, H.J. and Bibb, S. (2006) Management f-Laws: How Organizations Really Work, Triarchy Press.

Deming W.E. (1982) Out of the Crisis, MIT CAES, Cambridge MA.

Seddon, J. (2005) Freedom from Command and Control, Vanguard Press, Buckingham, UK.

 

Focusing arguments upon sound knowledge: common fallacies of logic and rhetoric

Dealing with change usually involves debate: what to change, why, where, when, how and who?

There is often the danger that skeptical inquiry can creep towards defensiveness and cynicism. Here are some things to challenge when these attributes appear in negative arguments presented by others (adapted from Paine 2013):

POOR responses in discussions include:

  • Attacking the person not the argument, or stereotyping a position to make attacks easier.
  • Relying on ‘authority’. Hierarchy should make no difference: one person’s opinion should be no weightier than another’s (both are, after all, just opinions) – what are the facts?
  • Observational selection (counting positives and forgetting the negatives, or vice versa).
    ● Statistics of small numbers (such as drawing conclusions from inadequate sample sizes – see the sketch after this list)
    ● The ‘sample of one’: using a single case which could be an extreme outlier rather than the norm
  •  ‘conveniently’ considering only two extremes to make the opposing view look worse:
    ● Excluding the middle options in a range of possibilities
    ● Short-term v long-term: “why pursue research when we have so huge a budget deficit?”
    ● Slippery slope – unwarranted extrapolation: “give an inch and they will take a mile” – would they… always?
  • Misunderstanding the nature of statistics
  • Confusing correlation & causation (cause & effect): ‘it happened after, so it was caused by’ – is this really justified?
  • Appeal to ignorance (but – absence of evidence is not evidence of absence).
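To make the ‘statistics of small numbers’ point concrete, here is a small, purely illustrative simulation (the success rate and sample sizes are invented for the example): two teams with an identical underlying success rate can appear very different when judged on only a handful of cases.

```python
import random

random.seed(1)
TRUE_RATE = 0.7  # both teams are genuinely identical underneath

def observed_rate(sample_size: int) -> float:
    """Success rate seen in one random sample of cases."""
    return sum(random.random() < TRUE_RATE for _ in range(sample_size)) / sample_size

# Compare two identical teams many times over, at a small and a large sample size
for n in (5, 500):
    gaps = [abs(observed_rate(n) - observed_rate(n)) for _ in range(1000)]
    print(f"Sample size {n}: largest apparent gap between identical teams = {max(gaps):.0%}")
```

Judging either ‘team’ on five cases invites conclusions the data simply cannot support; the apparent gaps shrink dramatically once the samples are large enough.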

To address these arguments ask: what is the purpose of the discussion? what do we know? what are the facts? what are we assuming? what knowledge can we reasonably base our decision making upon? how can we examine, predict and monitor outcomes?

As Deming says, most of what is important is unknown or unknowable – but that does not mean we should assume it does not exist.

Bring the skeptics into the argument, involve their questions in the testing and development of ideas. Make resistance useful.

Further reading:

Deming W.E. (1982) Out of the Crisis, MIT CAES, Cambridge MA.

Deming W.E. (1993) The New Economics, MIT CAES, Cambridge MA.

Herrero, L. (2006) Viral Change, meetingminds, UK.

Paine, M. (2013) Baloney Detection Kit, prepared excerpt from The Planetary Society Australian Volunteer Coordinators. http://www.carlsagan.com/index_ideascontent.htm#baloney