Streaming conformance checking

From Wikipedia, the free encyclopedia

Streaming conformance checking is a form of conformance checking in which deviations, if any exist, are reported as soon as they occur. Instead of an event log, streaming conformance checking techniques take an event stream and a process model as input; each event received from the stream is compared with the model as it arrives.

Differences from conventional conformance checking

Conventional conformance checking techniques use an event log as input. An event log is a static data source that records business activities over a span of time. Once the event log has been fully recorded, conformance checking techniques are applied and any deviations are reported. This approach, however, has several drawbacks:

  • Resource limits: In large-scale companies, event logs are enormous. Traditional conformance checking techniques cannot handle such large data sets, i.e., the data does not fit in main memory.
  • No real-time monitoring: Conformance checking algorithms work a posteriori, so deviations are only detected some time after they have occurred. In settings where timely corrective action is crucial, e.g., monitoring a patient's health status or financial transactions, late detection can have severe consequences.

Streaming conformance checking techniques, by contrast, use an event stream as input. An event stream is a continuous stream of events executed in the context of an underlying business process.[1] Each event in the stream is denoted as (c, a), where c is the case identifier and a is the activity name of the event.

With this kind of data, conformance checking can be performed continuously along the stream: for each executed activity, the analysis immediately determines whether that activity deviates from a given process model. This provides a continuous way of monitoring a process and detecting deviations in real time.
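
This continuous monitoring pattern can be sketched as a loop that consumes (c, a) events one at a time and checks each against per-case state. The sketch below is illustrative only: the function names and the toy alphabetical-order check are assumptions for this example, not part of any specific tool.

```python
# Minimal sketch of the generic streaming conformance-checking loop:
# events arrive one at a time as (case id, activity) pairs and are
# checked immediately. `check` stands in for a concrete algorithm
# such as the ones described in the sections below.

def monitor(stream, check, state):
    """Yield each event together with a boolean: does it conform?"""
    for case, activity in stream:
        yield (case, activity), check(case, activity, state)

# Toy check: within a case, activities must arrive in alphabetical order.
def alphabetical_check(case, activity, state):
    ok = activity >= state.get(case, "")
    state[case] = activity   # remember the last activity of this case
    return ok

stream = [("1", "a"), ("2", "a"), ("1", "b"), ("2", "a"), ("1", "a")]
results = list(monitor(stream, alphabetical_check, {}))
# The last event deviates: case 1 already executed "b" before "a".
```

Note that the checker keeps state per case (here, the last activity), because events of different cases are interleaved in the stream.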

Algorithms

The fundamental difference between online and offline conformance checking is the completeness of the input. In an event log, the behavior of each case is complete: it is known whether a case is still running or has already stopped. This is not so for an event stream. At the moment an activity of a case is executed, it is unknown whether the case will ever stop or is already complete, i.e., whether any future event will belong to this case. Because of this difference, conventional conformance checking algorithms are not (fully) applicable in the online context and need to be adjusted.

Footprints

Input: An event stream and a footprint matrix of the according process model.

Algorithm:

For each received event (c, a)

  • If a is the first activity observed for case c, the algorithm checks whether a is allowed to be a start activity.
  • Otherwise, the algorithm checks whether a is allowed to directly follow the last executed activity of case c. A dictionary stores the last executed activity of each case for this purpose.
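
The two steps above can be sketched as follows. This is a minimal illustration under assumed data structures: the footprint matrix is represented as a dictionary mapping each activity to the set of activities allowed to directly follow it, and all names and the toy model are hypothetical.

```python
# Hypothetical sketch of streaming footprint-based conformance checking.

def check_event(event, footprint, start_activities, last_activity):
    """Check one (case, activity) event against the footprint matrix.

    Returns True if the event conforms, False otherwise, and updates
    the per-case dictionary of last executed activities.
    """
    case, activity = event
    if case not in last_activity:
        # First event of the case: must be a valid start activity.
        ok = activity in start_activities
    else:
        # Must be allowed to directly follow the previous activity.
        ok = activity in footprint.get(last_activity[case], set())
    last_activity[case] = activity
    return ok

# Toy model describing the sequence a -> b -> c.
footprint = {"a": {"b"}, "b": {"c"}, "c": set()}
start_activities = {"a"}
last = {}

stream = [("1", "a"), ("1", "b"), ("2", "b"), ("1", "c")]
results = [check_event(e, footprint, start_activities, last) for e in stream]
# ("2", "b") deviates: b is not a start activity.
```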

Token-based replay

Input: An event stream and a Petri net model.

Algorithm:

For each received event (c, a)

  • The algorithm checks whether the given Petri net can fire activity a, based on the previous executions of case c.
  • For each case, the algorithm stores the status of the Petri net (its current marking), the last executed activity, and a counter for the number of missing tokens.
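
A simplified sketch of this idea is given below, under the usual token-based-replay convention that a missing input token is recorded as a deviation and inserted artificially so replay can continue. The encoding of the net as a transition-to-places map and all names are assumptions for this example.

```python
# Illustrative sketch of streaming token-based replay.
from collections import Counter

class CaseReplayer:
    """Keeps a per-case marking and counts missing tokens per case."""

    def __init__(self, net, initial_marking):
        self.net = net                      # activity -> (inputs, outputs)
        self.initial = Counter(initial_marking)
        self.markings = {}                  # case id -> current marking
        self.missing = Counter()            # case id -> missing-token count

    def replay(self, case, activity):
        marking = self.markings.setdefault(case, Counter(self.initial))
        inputs, outputs = self.net[activity]
        for place in inputs:
            if marking[place] > 0:
                marking[place] -= 1
            else:
                # Token missing: record the deviation and continue replay
                # as if the token had been there.
                self.missing[case] += 1
        for place in outputs:
            marking[place] += 1
        return self.missing[case] == 0      # True while the case conforms

# Sequential net: start -[a]-> p1 -[b]-> p2 -[c]-> end
net = {"a": (["start"], ["p1"]),
       "b": (["p1"], ["p2"]),
       "c": (["p2"], ["end"])}
replayer = CaseReplayer(net, {"start": 1})

ok1 = replayer.replay("1", "a")   # conforms
ok2 = replayer.replay("1", "c")   # c fired without b: token missing in p2
```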

Temporal profile

A temporal profile records, for each pair of activities, the average time between their occurrences and the standard deviation of these durations.

Input: An event stream and a temporal profile model.

Algorithm:

For each received event (c, a)

  • The algorithm computes the durations between activity a and the previously executed activities of case c, and checks whether they are allowed by the model.[2]
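
A common way to operationalize "allowed by the model" is to flag a duration that lies more than a chosen number of standard deviations from the profiled mean; the sketch below uses that interpretation. The threshold k, the data structures, and all names are assumptions for this illustration.

```python
# Hypothetical sketch of streaming temporal-profile checking.

def check_temporal(event, profile, history, k=2.0):
    """event is (case, activity, timestamp); history maps a case to its
    previously executed (activity, timestamp) pairs. Returns the list
    of activity pairs whose observed duration violates the profile."""
    case, activity, ts = event
    violations = []
    for prev_activity, prev_ts in history.get(case, []):
        key = (prev_activity, activity)
        if key in profile:
            mean, std = profile[key]
            # Flag durations more than k standard deviations off the mean.
            if abs((ts - prev_ts) - mean) > k * std:
                violations.append(key)
    history.setdefault(case, []).append((activity, ts))
    return violations

# Toy profile: b normally follows a after 10 +/- 1 time units.
profile = {("a", "b"): (10.0, 1.0)}
history = {}

check_temporal(("1", "a", 0.0), profile, history)       # nothing to compare yet
v = check_temporal(("1", "b", 25.0), profile, history)  # duration 25 >> 10
```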

Log skeleton

A log skeleton consists of constraints that describe the relationships between activities in a process.[3]

Input: An event stream and a log skeleton model.

Algorithm:

For each received event (c, a)

  • The algorithm checks whether the execution of activity a violates any declared constraint, given the current status of the process.
  • The current status of the process is updated with the new execution of activity a.
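
As an illustration, the sketch below checks two of the log skeleton's relations, "always before" (an activity may only occur after a required predecessor has occurred in the same case) and "never together" (two activities must not co-occur in a case). The encoding of the constraints and all names are assumptions for this example.

```python
# Illustrative sketch of streaming log-skeleton conformance checking.

def check_skeleton(event, always_before, never_together, seen):
    """seen maps a case id to the set of activities executed so far.
    Returns the constraints violated by this event."""
    case, activity = event
    executed = seen.setdefault(case, set())
    violations = []
    # "always before": every required predecessor must already have occurred.
    for pred in always_before.get(activity, set()):
        if pred not in executed:
            violations.append(("always before", pred, activity))
    # "never together": the activity must not co-occur with a forbidden one.
    for other in never_together.get(activity, set()):
        if other in executed:
            violations.append(("never together", other, activity))
    executed.add(activity)   # update the case status with this execution
    return violations

always_before = {"c": {"b"}}                    # c requires an earlier b
never_together = {"d": {"a"}, "a": {"d"}}       # a and d must not co-occur
seen = {}

check_skeleton(("1", "a"), always_before, never_together, seen)  # conforms
v = check_skeleton(("1", "c"), always_before, never_together, seen)
# c occurred without a preceding b -> one violation reported
```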

References

  1. ^ van Zelst, Sebastiaan J.; van Dongen, Boudewijn F.; van der Aalst, Wil M. P. (2017-05-15). "Event stream-based process discovery using abstract representations". Knowledge and Information Systems. 54 (2): 407–435. arXiv:1704.08101. doi:10.1007/s10115-017-1060-2. ISSN 0219-1377. S2CID 13301390.
  2. ^ Stertz, Florian; Mangler, Juergen; Rinderle-Ma, Stefanie (2020-08-17). "Temporal Conformance Checking at Runtime based on Time-infused Process Models". arXiv:2008.07262 [cs.SE].
  3. ^ Verbeek, H. M. W.; de Carvalho, R. Medeiros (2018-06-21). "Log Skeletons: A Classification Approach to Process Discovery". arXiv:1806.08247 [cs.AI].