binary BPM
Mark Norton 20 May 2015 11:17:02 AM
This post was written by guest author John Salamito.

It is a truism to say that everything is a process. It is reasonable to suggest that processes can be modelled and improved. Thus the usual transformation effort starts with mapping the processes, identifying waste and rework to eliminate, and then automating steps as far as possible. It is a seemingly foolproof method and offers boundless opportunity.
In practice we know it isn't that simple; there is a 'knowing-doing' chasm. But do we know why? The blame could be laid on 'doing' factors, like project or change management, but the more likely root cause lies at the 'knowing' end; that is to say, it is much harder than presumed to map the processes in the detail required. There are simply too many variations and too many compromises imposed by pre-existing processes and promises. This often comes as a surprise because at first process improvement feels 'all-so-doable'; the exponential variations don't raise their heads until later.
Mountain climbers know that getting to Everest base camp is a modest achievement. It is climbing the last few hundred metres that is a monumental achievement, requiring mammoth preparation as well as good fortune. Scaling the process mountain is similar: level 1 is straightforward and builds early confidence (of the kind that climbers know not to fall prey to), but getting to the final goal of OPOTOP - i.e. what one person or machine does at one time in one place - is elusive, as each exception, constraint, and new requirement breeds exponential complexity.
The way around this is to avoid pre-designing processes altogether! Of course processes always exist, because they are the way activities string together. While this is always evident in hindsight, the open question is whether there is value in knowing it with foresight. After all, process adds no value; it is the activities that add value, so we must recognise process as a means to an end and not an end in itself. So... what if processes simply emerged as needed rather than being predefined? By definition this would not only best fit the situation at hand but would also be simpler.
We call the capability that enables this real-time orchestration 'decision management'. A decision model determines the best next action given the context at that one time for that one customer. If you use a sat-nav system you are already familiar with the concept: the sat nav constantly adjusts in real time to whatever new context you find yourself in. A bank, an insurer, or any other organisation can do the same thing.
Even in a complex bank the number of activities (or services) is finite, say only a few hundred. These can be arranged in millions of ways, but for any given context there is only a small number of sensible next steps to take, perhaps even only one according to company policy. A decision model can determine what this next action is. Decision models can be built quickly (weeks) because the only input is company policy without regard to existing operational design, systems, processes, etc.
The decision model captures company policy for (a) risk and compliance, (b) product offerings, and (c) customer service standards. Together these policies define exactly what we want to happen in any situation. The decision model is the 'knowing' model. It is not a 'doing' model and does not say how these policies get implemented via the activities that involve people, computers and suppliers.
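A minimal sketch of what such a decision model might look like in code. The policy rules, context fields, and action names below are invented for illustration only; a real decision model would encode the company's actual policies, but the shape is the same: context in, single next action out.

```python
# Hypothetical decision model: given a customer's context, return the best
# next action. Each rule encodes a fragment of company policy -- (a) risk and
# compliance, (b) product offerings, (c) service standards -- not a process flow.

def next_best_action(context: dict) -> str:
    """Return the single next action dictated by policy for this context."""
    # (a) Risk and compliance policies take precedence.
    if not context.get("identity_verified"):
        return "verify_identity"
    if context.get("claim_amount", 0) > 50_000:
        return "refer_to_underwriter"
    # (b) Product policies.
    if context.get("policy_expired"):
        return "offer_renewal"
    # (c) Customer service standards.
    if context.get("complaint_open"):
        return "escalate_complaint"
    return "no_action_required"

print(next_best_action({"identity_verified": False}))  # verify_identity
print(next_best_action({"identity_verified": True,
                        "policy_expired": True}))      # offer_renewal
```

Note that nothing here refers to existing systems or process maps; the only input is policy, which is why such models can be built quickly.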
Armed with a decision model and a catalogue of business activities/services, a massively simpler description of the end to end business emerges. We can now see it as a finite set of activities that are triggered according to a finite set of policies. Each activity is self contained, and most of them can be automated. The medium in which they are invoked is flexible - e.g. by a device or a human channel, etc. - and does not impact the core activity itself. Similarly, the event that triggers the activity is flexible - e.g. it could be a calendar date, an ad hoc phone call, etc. - and does not change the core activity. To clarify this, suppose an activity is to value a property. This could be triggered by a calendar event like 'end of policy' or by a request from the customer, and the medium could be a smart app or a letter, but in all cases the property valuation activity itself can be the same.
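The property-valuation example can be sketched as a self-contained activity. This is an illustrative sketch (the function and field names are assumptions, not from any real system): trigger and medium arrive as plain data, so neither changes the core activity.

```python
# Sketch: the property-valuation activity written as a self-contained service.
# The trigger ('end_of_policy', 'customer_request', ...) and the medium
# ('smart_app', 'letter', ...) are passed in as data and affect only how the
# result is delivered, never the valuation logic itself. Names are illustrative.

def value_property(property_id: str, trigger: str, medium: str) -> dict:
    """Value a property; identical core logic regardless of how it was invoked."""
    valuation = {"property_id": property_id, "value": 450_000}  # stubbed result
    return {"result": valuation, "notify_via": medium, "triggered_by": trigger}

# Two invocations, two triggers, two media -- one unchanged activity.
a = value_property("P-123", trigger="end_of_policy", medium="letter")
b = value_property("P-123", trigger="customer_request", medium="smart_app")
assert a["result"] == b["result"]
```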
Let's summarise. The business can be reframed into its policies (via the decision model) and its activities, which are a finite set of reusable building blocks. Together they fully describe what the business thinks and what the business does. New products are mostly rearrangements of existing building blocks. New channels and events can be connected to existing building blocks (easily, if service enabled). New regulations are adjustments to the existing decision model, and will mostly use existing building blocks, perhaps with minor modification. So far in the story there is little justification for ongoing complexity, and one can see how the business could handle new products, new channels, and new regulations easily.
However, the truth is companies can't do these things easily and have become enormously complex. Why? It is largely because predefining how the building blocks of activity are orchestrated is enormously complex, and this is aggravated by building block duplication. There is now a chicken-and-egg conundrum as to whether unnecessary new building blocks arose in order to satisfy orchestration patterns, or vice versa, but the net result is seen universally: too many duplicated building blocks and too many process variations, often only minutely different but nonetheless causing major difficulties for flexibility, cost, and compliance.
It is worth further explaining that these complexities are often not conscious choices but adopted implicitly when particular solutions are bought or developed. For example, a product admin system might impose an approach to process management. A customer relationship system might have product rules embedded in it. And so on until inevitably a mishmash of paradigms, processes, rules, controls and data pools exist in uneasy co-dependency.
But despite all this, we can reframe the company as being little more than a decision model that directs a finite set of self contained activities. For the most part the traditional complexities of process management can be eliminated, and this makes it practical to find a path to this much simplified operating model that is distinguished by its level of automation, agility, and real time response to each individual customer interaction.
Because of the binary nature of the linkages between the activities that make up the processes, we refer to this approach to process analysis as 'binary BPM'.
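Under this view, a process is nothing more than the trail left by repeatedly asking the decision model for the next activity. A toy orchestration loop, with invented activity names and context fields, might look like:

```python
# Toy orchestration loop: no process is predefined. Each completed activity
# updates the context, and the decision model alone picks the next link,
# one binary activity-to-activity hop at a time. All names are illustrative.

def decide(context: dict):
    if not context.get("identity_verified"):
        return "verify_identity"
    if not context.get("property_valued"):
        return "value_property"
    return None  # policy says nothing more is required

ACTIVITIES = {
    "verify_identity": lambda ctx: ctx.update(identity_verified=True),
    "value_property":  lambda ctx: ctx.update(property_valued=True),
}

context, trail = {}, []
while (action := decide(context)) is not None:
    ACTIVITIES[action](context)   # execute one self-contained activity
    trail.append(action)          # the 'process' emerges as this trail

print(trail)  # ['verify_identity', 'value_property']
```

The trail printed at the end is the process, but it was never designed in advance; it emerged from the decisions taken in context.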
Further detailed discussion on this topic can be found in this IDIOM article 'Taming the IT Beast with Decision Centric Processes'.