Lean Management Systems and Mysterious Performance Metrics

An effective lean management system, among other things, drives process adherence and process performance. The daily accountability portion of the system includes brief tiered meetings with the stakeholders.

At the tier I level, the core meeting participants are pretty much the natural work team (ideally with key support people and rotating attendance by the manager(s)). You know, the folks who actually do the value-adding work.

The backdrop for tiered meetings is often a performance metric board, supplemented by things like task accountability boards and thoughtful reflection on what leaders see as they conduct their standard work.

Mystery

Sometimes the performance metric board, its purpose, "story," relevance and "actionability" are a mystery to the tier I stakeholders. It fails the "So what?" test. If it can't pass that test, the meeting is muda.

How can that be?

Well, my experience is that it stems from a combination of things: part training and communication, part "presentation" (board design and execution), part change management, part performance management...and so on.

The categories of lean performance metrics are simple. True north metric families are pretty much quality, delivery, cost and human resource development. To that, you can add continuous improvement. Everything else is more or less a derivative of those families.

A performance metric board should answer relevant questions about the team's balanced process performance within the value stream. Questions like, "Are we satisfying customer requirements relative to time, accuracy, completeness? Are we becoming more productive? Are we performing our work more safely?" And the answers should give us insight into the what, why, where, when, how and how many.

Often the focus is on the last 24 hours and the next 24 hours. But we must care about trends, we must understand targets, and there has to be appropriate vertical and horizontal alignment within the organization. It's all part of the dynamic of PDCA.

When performance metrics are a mystery, then we miss out on a whole dimension of engagement.

Assume that you're a tier I stakeholder who has just been indoctrinated into the tier I meeting process. The experience too often goes like this (in your head): "Hey look, there's a board...with lots of metrics on it. What does it mean? Heck, I can't even read it. Too small, too many numbers. Where do those numbers come from? I don't even think the team leader knows what it means. Why do we suddenly care about this stuff? What is the target? The leaders keep talking about the elimination of waste - this meeting is 10 minutes of waste. 'Blah, blah, blah...'"

Take the Mystery out of It

Clearly, folks must be trained in the lean management system and its elements. This will provide a necessary foundation for understanding, application and change.

When it comes to team-specific performance metrics, the training must be pretty deep for the stakeholders. Unfortunately, we often take shortcuts here.

Instead, when metrics are under development (think PDCA), there must be a kind of precision to ensure that the critical few, balanced metrics do pass the "So what?" test. In order to do this, consider creating a metric profile for each and every metric. The profile forces rigor, and it can then be used to help train people on the metric itself.

Furthermore, the metric profile should be hung up on the metric board underneath the metric. Think of it as metric standard work. Update it as you clarify it and make improvements.

So, what should be included in a metric profile? Here are some elements that I usually include:

  • Metric name. This one is obvious.
  • A picture of the metric. It helps to know what it looks like...or should look like - line graph, stacked bar chart, etc. It's OK for the template to be computer generated, but the data, bars and/or lines, etc. should be hand drawn - quicker to generate and easy to read from 10+ feet away.
  • Purpose of the metric. It's very important to understand the "why" of the metric. For example, a cumulative production run chart provides insight into the linearity/levelness of production day over day.
  • Implications, a.k.a. the "So what?" To continue the example from above, if the cumulative production run chart reflects less than level production (here an upper and lower control limit can provide a target), then the leader should investigate the root cause(s). Potential root causes can include demand variation, overproduction, capacity constraints, etc. The implications follow suit.
  • Metric target. Good PDCA usually requires targets. Folks need to understand expectations and the magnitude of the performance gap(s).
  • Data source. It's important to specify where the data reflected (directly or through calculation) within the metric comes from in order to ensure accuracy and consistency.
  • Calculation, if applicable. Many times data is taken directly from a report, stick count, etc. and posted/charted on the metric template. Sometimes the metric calls for a calculation using source data. For example, prior day productivity (number of units/person/hour) may require someone to take prior day output, divide it by that day's staffing, then divide by hours worked (see the sketch after this list). There should be no guesswork on how to perform the calculation.
  • Frequency. Metric "actionability" typically calls for more frequent measures. Much of the time this means daily measurement; however, weekly and even monthly may be more pragmatic for less dynamic metrics (for example, employee satisfaction survey results).
  • Owner. It makes sense to specify the keeper of the metric so that there is no ambiguity. This does not preclude rotating the preparation and presentation of a given metric among meeting stakeholders to facilitate understanding and engagement.
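To take a little more mystery out of what a profile might capture, here's a minimal sketch (in Python, purely for illustration) of the elements above expressed as a simple data structure, using the prior day productivity calculation as the worked example. The MetricProfile class, its field names, and the sample numbers are my own assumptions for this sketch, not part of any standard template.

```python
from dataclasses import dataclass


@dataclass
class MetricProfile:
    """Illustrative metric profile; field names are assumptions, not a standard."""
    name: str          # Metric name
    chart_type: str    # What the metric looks like (line graph, stacked bar chart, etc.)
    purpose: str       # The "why" of the metric
    implications: str  # The "So what?" - what a performance gap should trigger
    target: float      # Expectation against which gaps are measured
    data_source: str   # Where the underlying data comes from
    frequency: str     # How often it is measured and posted (daily, weekly, ...)
    owner: str         # Keeper of the metric


def prior_day_productivity(units_produced: float, staffing: float, hours_worked: float) -> float:
    """Prior day productivity: output divided by staffing, then divided by hours worked
    (units per person per hour), matching the calculation described in the profile."""
    return units_produced / staffing / hours_worked


# Hypothetical example values for illustration only.
profile = MetricProfile(
    name="Prior day productivity",
    chart_type="Hand-drawn run chart on a computer-generated template",
    purpose="Show whether the team is becoming more productive day over day",
    implications="If below target, investigate root causes (staffing, downtime, demand mix)",
    target=4.0,  # assumed target, in units per person per hour
    data_source="End-of-shift production report and daily staffing sheet",
    frequency="Daily",
    owner="Team leader",
)

print(prior_day_productivity(units_produced=640, staffing=20, hours_worked=8))  # 4.0 units/person/hour
```

On the floor, of course, this lives as a one-page sheet hung under the chart rather than as code; the point is simply that every element is explicit, so the calculation and the "So what?" leave no guesswork.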

What else should be on the profile?

 

There are 5 Comments

David Williams

Great post Mark. When I think back over a year ago when I started this process in my previous employment, everything you have said above makes sense. I couldn't agree more with your advice to draw or write down the results rather than rely on computer-generated reports. Getting the team members to fill in the metrics was key to their understanding. My new company is on its lean journey and visual metrics will be a big part of this. I'm playing a part in this journey and will be using what you taught me previously and what I learned in my last employment to ensure we get successful visual metrics implemented!

markrhamel

Hi David,

Thanks for the comment and it's great to hear that you are doing well! There definitely is a certain psychology around having people manually record data rather than relying on a computer.

Best of luck with this new chapter.

Warm regards,
Mark

Andy Bonczkowski

Hi Mark,

I'm about 10 months into leading a plant through a lean transformation and I totally agree with all of the points you make on what goes into a good metric. The two biggest 'Ah Ha' moments for me were when I put an owner on each measure and started getting the charts updated, and when I took out the calculations. We had a hard time updating OEE due to the calculations, and it didn't pass the 'so what' criteria because not everyone understood it, so I took that out and just put in production and yield.
I can't think of anything I would add to the metric profile, but what I like to do is put an action item list under the metric so that items that fall outside the control limits can be followed up on. The action item list is just date, owner, problem, countermeasure, and due date.

Thanks,
Andy

markrhamel

Hi Andy,

Thanks for the comment! Great insight.

I really like the action item list application. Typically, I try to capture that stuff either on a separate countermeasure tracker sheet (kind of a kaizen newspaper template) for tier I or a task accountability board for tier II and above.

Best regards,
Mark