Wednesday, December 06, 2006

Process Analytics - separate is better

When it comes to sales scenarios, customers typically ask BPMS vendors for evidence of their built-in process reporting and analytics capabilities. For many process deployments, having built-in tools is desirable, since they typically come with canned reports that show the generic information a process manager is likely to need. What I'm recognizing is that there is considerable value in having analytics tools that are completely separated from the BPM engine and able to provide information across a range of systems and processes.

My previous post included a discussion of why process analytics is more than just analyzing the process. Even in a BPMS that has a 'full-bodied' view of process data, including descriptive business metadata, built-in analytics tools will probably struggle to present more than basic business and process information. This is partly because built-in tools tend to run standard SQL queries directly against live data, and partly because they are still limited to the data within the system, however broadly defined that may be.
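
To make that contrast concrete, here is a minimal sketch in Python of the alternative approach: periodically copying completed process records out of the engine's database into a separate store that the analytics tool owns, so reports never run against live production tables and the copied data can later be joined with information from other systems. The table and column names are invented for illustration; no particular BPMS schema is implied.

    import sqlite3

    def extract_completed_work(bpms, since):
        """Pull finished activity records out of the engine's own (hypothetical) tables."""
        return bpms.execute(
            "SELECT process_id, activity, started_at, ended_at "
            "FROM activity_instance WHERE state = 'COMPLETED' AND ended_at > ?",
            (since,),
        ).fetchall()

    def load_into_warehouse(warehouse, rows):
        """Append the extracted rows to a reporting table that the analytics tool owns."""
        warehouse.executemany(
            "INSERT INTO process_activity_fact VALUES (?, ?, ?, ?)", rows)
        warehouse.commit()

    if __name__ == "__main__":
        # In-memory databases stand in for the live engine and the separate store.
        bpms = sqlite3.connect(":memory:")
        bpms.execute("CREATE TABLE activity_instance "
                     "(process_id, activity, state, started_at, ended_at)")
        bpms.execute("INSERT INTO activity_instance VALUES "
                     "('LOAN-001', 'Assess application', 'COMPLETED', "
                     "'2006-12-01 09:00', '2006-12-01 09:20')")

        warehouse = sqlite3.connect(":memory:")
        warehouse.execute("CREATE TABLE process_activity_fact "
                          "(process_id, activity, started_at, ended_at)")

        load_into_warehouse(warehouse, extract_completed_work(bpms, "2006-11-30"))
        print(warehouse.execute("SELECT * FROM process_activity_fact").fetchall())

The point of the sketch is only that the heavy analytical lifting happens against the copy, not against the engine.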

It seems that separating out the business process analytics tool from the BPMS has some advantages:

  1. Analytics is complex - separate tools are more specialized and capable
  2. Analytics should look at more than the process - separate tools are designed to access integrated BPMS and third-party information systems
  3. Analytics is processor intensive - avoid overloading the production system
  4. Analytics should incorporate your SOA strategy - analytics can provide a more complete performance view if it can provide and use SOA-based information

1, 2 and 3 are fairly obvious. Specialized business process analytics tools can provide broader, better, faster analytical information. I didn't include 'cheaper', as there is a cost attached, and it will be greater if the selected tool requires specific custom integration (or a lot of configuration) with the BPMS. Pre-integrated BPMS + analytics tools, although separate packages, will almost always be more manageable in the long run than a custom integration.

So what about the SOA strategy? Here I will sidestep the SOA vs. BPM debate about where they intersect. However you look at it, SOA presents several challenges and opportunities for process analytics. First, how do you make use of the data and process that overlap from human-centric BPM into the SOA-orchestrated process? Second, how do you design an SOA that enables access by analytics tools?

The first point acknowledges that there is a large overlap between SOA and BPM. Imagine a human-centric business process that also consumes business services and orchestrates integrations - for example, a loan application process that receives electronic applications from a business partner and coordinates a process involving human assessments, automated decisioning, and updates to the main banking system of record and the CRM system. Embedded BPMS analytics tools can easily provide a range of analytical information within the bounds of the human-interactive process. The problem is how to gain value from the systems interactions, in terms of:

  • business data - for the business
  • technical performance - for IT
  • cost - for B2B interactions

Using a BPMS that can seamlessly orchestrate service invocation and integration alongside human interaction will greatly assist in providing the appropriate information for process analytics, since much of the data will be available in one place. If this is not the case, the analytics tool must be able to look inside service requests and integrated systems and reconcile the information it finds. If the analytics tool is open and extensible, this will be an option.
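
As a rough sketch of what that reconciliation might look like, the fragment below joins records from the workflow engine, the service layer and B2B billing on a shared business key (the loan application id), so that a single view carries the business, technical and cost dimensions listed above. All of the event shapes and field names are assumptions for illustration, not any product's format.

    from collections import defaultdict

    # Hypothetical event feeds: one from the human workflow engine, one from the
    # service layer, one from B2B billing. The only assumption is that every
    # record carries the same business key (here, the loan application id).
    workflow_events = [
        {"app_id": "LOAN-001", "step": "Assess application", "minutes": 20},
    ]
    service_calls = [
        {"app_id": "LOAN-001", "service": "CreditScore", "latency_ms": 340},
    ]
    b2b_charges = [
        {"app_id": "LOAN-001", "partner": "BrokerCo", "fee": 12.50},
    ]

    def reconcile(*feeds):
        """Merge records from each feed under their shared business key."""
        merged = defaultdict(lambda: {"workflow": [], "services": [], "charges": []})
        for name, feed in zip(("workflow", "services", "charges"), feeds):
            for record in feed:
                merged[record["app_id"]][name].append(record)
        return dict(merged)

    print(reconcile(workflow_events, service_calls, b2b_charges))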

In many cases it will make sense for the analytics to focus purely on systems-based business services in the SOA. An example is an online loan customer inquiry. In a call center environment, process analytics would typically focus on attributes like time to answer, abandon rate, loan value and type of request, all broken down by class of customer. In the online world, similar information should be provided to the business to assess the effectiveness of its website, marketing and backend application processing, but website analytics may not be the most manageable place to gather it, especially since much of the required information is not exposed at that level.
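
As an illustration, here is a small sketch that computes those call-center-style measures per customer class from raw request events captured at the orchestration layer rather than at the website. The event fields are made up for the example.

    from statistics import mean

    # Hypothetical online loan-inquiry events, as the orchestration layer might
    # record them; none of these field names come from a real product.
    requests = [
        {"customer_class": "gold", "answered_secs": 2.1, "abandoned": False, "loan_value": 250000},
        {"customer_class": "gold", "answered_secs": None, "abandoned": True,  "loan_value": 180000},
        {"customer_class": "std",  "answered_secs": 4.8, "abandoned": False, "loan_value": 90000},
    ]

    def metrics_by_class(events):
        """Compute call-center style measures, broken down by customer class."""
        out = {}
        for cls in {e["customer_class"] for e in events}:
            group = [e for e in events if e["customer_class"] == cls]
            answered = [e["answered_secs"] for e in group if not e["abandoned"]]
            out[cls] = {
                "requests": len(group),
                "abandon_rate": sum(e["abandoned"] for e in group) / len(group),
                "avg_time_to_answer": mean(answered) if answered else None,
                "avg_loan_value": mean(e["loan_value"] for e in group),
            }
        return out

    print(metrics_by_class(requests))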

By tying process analytics into the system that orchestrates real-time request services, valuable analytical information can be extracted from the business data flowing through it. This again is made simpler if the BPMS that handles the human-centric backend processes also orchestrates and publishes the services used by the website. Even if the human and systems orchestrations are handled in separate systems, using a process analytics tool that is not trapped inside either engine will simplify access to the required information.
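
One simple way to keep the analytics outside either engine is to have the orchestration layer publish an event for every request it serves and let the analytics tool consume that stream on its own schedule. The sketch below uses an in-process queue as a stand-in for whatever channel would be used in practice (a JMS topic, a staging table); none of it reflects a specific vendor's API.

    import json, queue, time

    # Stand-in event channel that a separate analytics tool would subscribe to.
    analytics_stream = queue.Queue()

    def handle_loan_inquiry(request):
        """Orchestrate a (pretend) real-time inquiry and publish what happened."""
        started = time.time()
        result = {"app_id": request["app_id"], "decision": "refer"}  # dummy work
        analytics_stream.put(json.dumps({
            "event": "loan_inquiry",
            "app_id": request["app_id"],
            "customer_class": request["customer_class"],
            "latency_ms": round((time.time() - started) * 1000, 1),
        }))
        return result

    handle_loan_inquiry({"app_id": "LOAN-002", "customer_class": "gold"})
    print(analytics_stream.get())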

I'm really starting to understand the value of having true process analytics sit outside the process systems, able to look across any information it requires. In the converging BPM/SOA space this makes more and more sense to me, though I'm sure that as I dig into this further with real tools I'll find that my thinking needs plenty of refinement.


2 comments:

Anonymous said...

An old data guy once told me that applications should spit out data, all the data. This can be stored in a data warehouse. Then the data can be analysed separately, to whatever level the available funding allows, but, and this was his point ... first collect all the data.

Anonymous said...

You may find this post interesting: our experience with process analytics in the financial services world.
http://diamondinfoanalytics.com/blog1/2006/12/14/core-process-cycle-times-drive-call-volumes/