The Challenge of Process Mining with Legacy Applications
When process mining is the goal, basic data describing the behaviour of the system is an essential prerequisite.

Most modern systems can report when a process starts, what it does (along with additional contextual information) and when it ends, in the form of “process logs”. Most modern process mining tools can decode such logs and graphically present the data they contain as a business process model.
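To make this concrete, the sketch below shows the kind of minimal process log most mining tools expect: one row per event, with a case identifier, an activity name, and a timestamp. The data and column names here are illustrative, not from any particular tool; grouping events by case recovers one activity sequence ("trace") per process instance.

```python
import csv
from collections import defaultdict
from io import StringIO

# A minimal, hypothetical process log: each row records one event with a
# case identifier, the activity performed, and a timestamp -- the three
# columns virtually all process-mining tools require as a starting point.
LOG = """case_id,activity,timestamp
42,Receive Order,2024-01-05T09:00:00
42,Check Stock,2024-01-05T09:02:10
43,Receive Order,2024-01-05T09:03:00
42,Ship Order,2024-01-05T11:30:00
43,Reject Order,2024-01-05T09:10:00
"""

# Group events by case to recover one trace (activity sequence) per case.
traces = defaultdict(list)
for row in csv.DictReader(StringIO(LOG)):
    traces[row["case_id"]].append(row["activity"])

for case, activities in sorted(traces.items()):
    print(case, "->", " -> ".join(activities))
```

From a collection of such traces, a mining tool can reconstruct the branching and merging of the underlying process.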

As technology advances, many legacy systems are left behind, unable to generate process information, or indeed any logging information at all.

Such legacy systems are often built on old software technology, with little or no design information, poor documentation, and no access to the original architects and developers.

From a process-modelling point of view, these systems are not “observable”.

Objektum Modernization specializes in providing powerful technologies that accelerate the understanding of legacy systems, with the goal of gaining knowledge about their structure, functionality and behaviour from both a static and a dynamic perspective. Our technology is currently employed by customers seeking to maintain such systems more efficiently, or to migrate them to new platforms.

We recently embarked on a study to explore how our dynamic analysis technology can be extended to support business process modelling, taking advantage of our automated approach without the need for in-depth analysis.

Why not use a small fraction of such power and automation to make legacy systems “observable”?

We discovered that all a legacy system needs is a limited number of logging instructions embedded at key points to generate a process log containing the essential information required for process mining. Our study found that such logging can be achieved with a lightweight, top-level, and mostly automated analysis of the legacy application, without having to develop a deep understanding of its behaviour, platform or code.
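As a rough illustration of how little is required, the sketch below shows a hypothetical legacy routine with just two logging calls added at its entry and exit. The `emit` helper, the record format, and the routine itself are all assumptions for illustration; the point is that the business logic is untouched, and each call emits exactly the case-id/activity/timestamp triple that process mining needs.

```python
import datetime

LOG_RECORDS = []  # in a real deployment this would be a file or log sink


def emit(case_id, activity):
    """Append one process-log record: case id, activity, ISO timestamp."""
    ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
    LOG_RECORDS.append(f"{case_id},{activity},{ts}")


# A hypothetical legacy routine with two logging instructions embedded at
# its entry and exit; the original logic between them is unchanged.
def process_order(order):
    emit(order["id"], "process_order:start")
    order["status"] = "shipped"        # ...original legacy logic...
    emit(order["id"], "process_order:end")
    return order


process_order({"id": "42", "status": "new"})
```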

We have now extended our current technology to include capabilities dedicated to extending the “observability” of such legacy systems as they execute. This technology can also be employed to extend and enhance the information reported by applications built on more modern technology. Our technology is capable of analysing most programming languages, including (but not limited to) .NET, Java, C, C++, COBOL, C#, Fortran, Algol, and Ada.

How does it work?

The following steps explain what we do in general terms:

1.     We automatically produce a dynamic model of the running application code. The output of this step is a set of sequence diagrams that graphically show how the application behaves and how control flows between its components as it executes over time.

2.     Using the sequence diagrams, we annotate where processes start and end, as well as other key variables that are needed to group process actions together. We tag the actions on the sequence diagrams with directives that tell our automation why and how they are special.

3.     We automatically regenerate the application, combining the original source code with the directives on the sequence diagrams. The result is an application that behaves identically to the original, with the additional capability of process logging as described by the sequence diagram directives.
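The effect of Steps 2 and 3 can be sketched in miniature as a wrapper that is applied wherever a sequence-diagram directive was placed. The `observe` decorator below is a hypothetical stand-in for that regeneration step, not Objektum's actual mechanism: it logs start and end events for a tagged action, keyed by a named case-id argument, while leaving the wrapped function's behaviour identical.

```python
import functools
import time

RECORDS = []  # stand-in for the generated process log


def observe(activity, case_arg):
    """Hypothetical directive: wrap a function so it logs start/end
    events for `activity`, keyed by the argument named `case_arg`,
    without changing the function's behaviour."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            case = kwargs.get(case_arg)
            RECORDS.append((case, activity + ":start", time.time()))
            try:
                return fn(*args, **kwargs)   # original behaviour preserved
            finally:
                RECORDS.append((case, activity + ":end", time.time()))
        return wrapper
    return decorator


# An action tagged on the sequence diagram as the "Ship Order" activity.
@observe("Ship Order", case_arg="order_id")
def ship(order_id, items):
    return f"shipped {len(items)} items for {order_id}"


result = ship(order_id="42", items=["a", "b"])
```

Because the wrapper only records events around the call, the instrumented application behaves identically to the original, which mirrors the guarantee described in Step 3.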


This diagram shows a typical sequence diagram (Step 1) which has been automatically generated using Objektum technology. It has been annotated with process information (Step 2) in the form of stereotypes and tags. This information is then used to instrument and regenerate the application (Step 3).

The application is then redeployed into the field with the process-logging modifications. As the application executes, it generates process-log information, making the system suitable for process mining.

Once the business process model has been extracted from the new logs using commercially available tools, the above steps can be repeated iteratively to further refine and reveal the details of specific critical processes identified by the process mining activities.

Context diagram and solution overview.


The above diagram provides an overview of our solution. On the left is our Legacy Explorer, which is capable of analysing legacy applications and automatically constructing both a static and a dynamic model describing the system. On the right is our new capability to use that model to perform business process mining.

Contact us:

If you would like any further information, or a demonstration of our capability, please contact me. Thank you.