Measuring Process Performance


BPTrends – May 2010

Written By: Alan Ramias, Cherie Wilkins

One of the most important – but frequently most challenging and vexing – aspects of installing business process management in an organization is metrics. There is seldom much argument anymore about the necessity of having metrics at the process level to enable process owners and performing teams alike to monitor performance, diagnose variation, and make effective course corrections. Once a business process has been created or redesigned, measurement of process performance is critical. Measurement can be used to ensure that the process is installed properly, that it produces desired results, and that design integrity is maintained. Ongoing measurement is the basis for continuous improvement.

But selecting, designing, implementing, and using metrics is a complex set of activities loaded with pitfalls, and enabling software can either help or make matters worse, depending on the human intelligence applied to such questions as what to measure, when to measure, who should be watching performance, what to do with the data, and how to diagnose and react to performance issues.

This Column will be the first of three devoted to process metrics. We will start by citing some of the pitfalls we have encountered over the years – some of the mistakes we have made ourselves or seen others commit. We’ll describe metrics that don’t work as effective indicators of process performance, or that end up being barriers to understanding and collaboration among those who perform, manage, or support a process. Some of these pitfalls have to do with the design of metrics themselves; others have to do with how they are used as management tools.

Then, in future Columns, we will describe the approach and tools we use to overcome these obstacles to process performance measurement, the aim being to help you avoid some of the difficulties we have experienced and to speed you forward to effective design, implementation, and use of process metrics.

The Bolt-On Approach

The most common pitfall we have witnessed in organizations that have installed process metrics is what we call the “bolt-on approach.” Metrics have been identified (usually after an improvement project) and the data is being dutifully collected and looked at by someone. But measurement and accountability for the performance of the process as a whole is lacking. A close look at the metrics themselves reveals that they usually look a lot like the old functional metrics. There may be some area of performance that has not been measured before – cycle time, let’s say – but for the most part the new metrics are simply added to the existing pile. The old metrics never really went away; there has been no fundamental rethinking of what is important in the organization and what should be measured; there is just more data being collected – making this a bolt-on measurement system. Nobody is held accountable for process performance.

And bolt-on measurement goes hand-in-hand with bolt-on management. The forum to review process metrics is often a separate management meeting – an extra event that those dubbed process owner or process manager or process management team need to attend, even though everyone knows that the real management decisions are made at the “regular” management meetings. Very often these process management meetings are not connected to anything. The data being reviewed are not integrated or correlated with other data regarding business performance. And thus the decisions or corrective actions are also not integrated with the real management decisions and, in fact, are quite often undone by them.

Metrics without Management

The next most common scenario we’ve seen in organizations is the development of process metrics, and that’s it: Nobody asked for them, nobody is accountable, and nobody is in charge of process performance, so nobody in particular wants to see these metrics or the data.

Why would this happen? Because metrics is something that most business people understand. An organization that enters BPM territory may get confused and uncertain about all the concepts and terminology and tools, but metrics is a familiar device, and it gets latched onto as something that can be quickly designed and implemented – but with little understanding of how these metrics should be used, or of the transformative effect that should take place as horizontal management is integrated into the existing vertical, functionally oriented management system.

Another version of this we have seen is that the right people are not looking at the process performance data. Instead, only staffers are gathering and reviewing the information, and they lack the authority to act on what the data tells them. So, once again, it’s a measurement “system” left dangling, not connected to the existing business performance management system.

A Chaos of Metrics

Without some structure or logic for selecting which metrics will be useful in understanding performance, an organization can end up with metrics that make little sense: measuring virtually everything (every activity, every output, every variable), not measuring the most important things, or measuring things simply because we can (and software makes this more and more tempting).

Sometimes we find multiple, even redundant measurement systems, where people in different parts of the company are measuring essentially the same thing but with a slightly different slant or by different names. Every department has its own version of the truth.

These practices lead to a mess: piles of data, reports, and indicators that don’t add up, don’t provide a clear picture of performance, and ultimately confuse the people trying to cut through the noise and understand what is going on. Evidence that something like this has happened can be found in organizations where people seem to be drowning in reports yet have difficulty explaining succinctly what’s happening in and to the business. Adding a set of process metrics just deepens the clutter.

“Bad” Metrics

Aside from how process metrics are often used inside organizations, the metrics themselves can be flawed in design. In addition to being bounded by functions (because the process has been defined functionally), we frequently see the following:

  • Metrics not connected to customer requirements, focusing only on internal requirements.
  • Metrics defined only where data already exist, instead of figuring out how to get the data we need.
  • Metrics defined without the rest of the management system: who will watch this, why, and what will they do about it?
  • No distinction between temporary metrics (those used to ensure that implementation happens properly) and ongoing management metrics (those needed to manage the process post-implementation).

The funniest example of temporary metrics we observed was in a company that had everyone in the process verifying that the data going into a system matched the data coming out. This metric was first created as a data integrity check of newly implemented software and intended to be in place for only a short time, but years later people were still verifying the data, even though it always matched.
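One lightweight safeguard against this kind of zombie metric is to classify every metric as temporary or ongoing at design time and give each temporary one an explicit review date. The sketch below illustrates the idea; the class, field names, dates, and example metrics are all invented for illustration, not part of any particular BPM toolset.

```python
# Hypothetical sketch (invented names and dates): give every temporary,
# implementation-phase metric an explicit review date, and flag any
# metric that has outlived it.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Metric:
    name: str
    purpose: str                      # "implementation" (temporary) or "ongoing"
    review_by: Optional[date] = None  # temporary metrics carry a sunset date

def overdue_for_retirement(metric: Metric, today: date) -> bool:
    """True if a temporary metric is past its review date and should be
    re-justified or retired."""
    return (metric.purpose == "implementation"
            and metric.review_by is not None
            and today > metric.review_by)

# The data-integrity check from the anecdote, versus a genuine ongoing metric.
data_check = Metric("input/output data match", "implementation", date(2010, 9, 1))
cycle_time = Metric("order-to-ship cycle time", "ongoing")

print(overdue_for_retirement(data_check, date(2011, 1, 1)))  # True
print(overdue_for_retirement(cycle_time, date(2011, 1, 1)))  # False
```

A periodic sweep over the metric inventory with a check like this would have surfaced the data-integrity metric years before anyone noticed it by accident.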

Integration into Flawed Management Systems

As we have said, very often process metrics are stand-alone devices, not integrated into the existing management system, thereby greatly reducing their potential. But in those cases where the process metrics have been included in existing measurement and management systems, there can still be problems if the existing management system has design flaws of its own. These are some of the defects we see far too frequently:

  • The Balanced Scorecard approach popularized by Kaplan and Norton* made measurement “dashboards” ubiquitous in corporations. Many executives love these red-yellow-green indicator boards as a quick way to take a snapshot of organizational performance. The problem is that these indicators are often shallow. They tend to be readings of “spot” data rather than trends, or to flag only major spikes in trend data. In either case, they trigger knee-jerk responses; in fact, they are designed to do just that. A “yellow” or “red” creates a flurry of action, often ineffectual, rather than reasoned analysis and careful response. And the opposite can happen too: everyone is happy with the green lights until one day something turns red, to everyone’s great surprise. In fact, performance had been trending toward red for months, but it never hit the yellow threshold, so nobody knew to look into the gradually eroding results.
  • Dashboard metrics tend to be single-variable (that is, each measures one dimension, such as a financial one). If the metrics on the dashboard are not correlated with each other, diagnosis can lead to a superficial understanding of causes.
  • Dashboard metrics also tend to be lagging indicators. By the time a light flashes yellow, it is often too late to do much about the performance. Effective linkage of the corporate metrics to the work processes can help alleviate this lag, but if the linkage is not there, all the data is necessarily lagging.
  • Finally, it is sometimes unclear who has responsibility for diagnosing and acting on the data shown on a dashboard. That’s why the everyone-out-for-a-pass reaction becomes so common.
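The spot-versus-trend problem in the first bullet can be made concrete with a small sketch. The on-time-delivery metric, the thresholds, and the data below are all invented for illustration; the point is only that a spot reading can stay “green” while the trend is clearly eroding.

```python
# Hypothetical illustration: a "spot" dashboard reading vs. a trend check.
# The metric (on-time delivery %), thresholds, and data are invented.

def spot_status(value, yellow=90.0, red=85.0):
    """Classic red-yellow-green reading of a single data point
    (higher is better)."""
    if value < red:
        return "red"
    if value < yellow:
        return "yellow"
    return "green"

def trend_slope(history):
    """Least-squares slope of the metric over time: a simple way to
    surface gradual erosion that a spot reading never shows."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Twelve months of on-time delivery, eroding steadily but never
# dipping below the yellow threshold of 90%.
history = [97.0, 96.5, 96.0, 95.2, 94.8, 94.0,
           93.5, 92.8, 92.1, 91.5, 91.0, 90.4]

print(spot_status(history[-1]))        # "green" -- the dashboard is happy
print(round(trend_slope(history), 2))  # -0.61 points per month
```

The dashboard shows green every month, yet the slope says the process is losing about six tenths of a point per month; a trend alarm on the slope would have raised the issue long before the first red light.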

Effective use of process performance information requires more than the metrics, and more than the designation of someone to “be in charge” of the process. It requires a transformation of the business from a vertical orientation to a horizontal one, from management of functional areas as if they were independent fiefdoms to management of business processes that require interdependent decisions and actions.

Requirements for Effective Measurement of Process Performance

Now that we have trashed much of the well-intentioned measurement work we have seen among process practitioners, it is incumbent on us to provide some requirements for good measurement. These are the requirements we use in our own measurement design work:

  • Metrics should measure the right things, which are outputs and results, not activities.
  • Metrics should measure the relevant variables, or dimensions, of a given output or result.  The variables may be the usual ones of time, cost, and quality, or they may be special and unique to a given output, but, in any case, you need to know what those variables are.
  • It is often necessary to have multiple metrics, correlated across multiple variables (whatever is important to the customer and the business).
  • Whatever is measured at the process, subprocess, or task level should be traceable upward to business and customer requirements. There should be a clear line of sight from process to total business variables.
  • Metrics should track trends, not single snapshot data. Overreaction and under-reaction are both less likely when using trend data.
  • Metrics should be assigned at each management level so it is clear who is responsible for tracking, reporting, diagnosing, acting, and following up. (We often see cascading measurement systems that skip whole levels of management or have gaps from, say, the business level to the job level.)
  • At least some metrics should be leading indicators of future performance problems. These are singled out for special attention.

With these requirements in mind, we will talk next time about building the metrics for a given business process.

*Kaplan, R.S. and Norton, D.P., “The Balanced Scorecard: Measures that Drive Performance,” Harvard Business Review, February 1992.

Copyright 2010 © The Performance Design Lab. All Rights Reserved.

Related Links

This article is Part 1 of a 3-part series on the topic of metrics and measurement. For other articles in the series: Part 2, Part 3.

For more detailed information on the management system, consult our book Rediscovering Value.
