What is Complex Event Processing?


 

What is Complex Event Processing? (Part 1)

(Originally Published by Tim Bass, TIBCO Software Inc. , April 23, 2007)

Complex event processing (CEP) is an emerging network technology that creates actionable, situational knowledge from distributed message-based systems, databases and applications in real time or near real time. CEP can provide an organization with the capability to define, manage and predict events, situations, exceptional conditions, opportunities and threats in complex, heterogeneous networks. Many have said that advancements in CEP will help advance the state-of-the-art in end-to-end visibility for operational situational awareness in many business scenarios. These scenarios range from network management to business optimization, resulting in enhanced situational knowledge, increased business agility, and the ability to more accurately (and rapidly) sense, detect and respond to business events and situations.

Possibly, one of the easiest ways to understand CEP is to examine the way we, and in particular our minds, interact with the world around us. To facilitate a common understanding, we represent the analogs between the mind and CEP in a table:

| Human Body | Complex Event Processing | Functionality |
| --- | --- | --- |
| Senses | Transactions, log files, edge processing, edge detection algorithms, sensors | Direct interaction with the environment; provides information about the environment |
| Nervous System | Enterprise service bus (ESB), information bus, digital nervous system | Transmits information between sensors and processors |
| Brain | Rules engines, neural networks, Bayesian networks, analytics, data and semantic rules | Processes sensory information, "makes sense" of the environment, formulates situational context, relates the current situation to historical information and past experiences, formulates responses and actions |

Table 1: Human Cognitive Functions and CEP Functionality

 

In a manner of speaking, CEP is a technology for extracting higher-level knowledge from situational information abstracted from business-sensory information. Business-sensory information is represented in CEP as event data, or event attributes, transmitted as messages over a digital nervous system, such as an electronic messaging infrastructure.

In order to effectively create a processing environment that can sustain CEP operations, the electronic messaging infrastructure should be capable of one-to-one, one-to-many, many-to-one, and many-to-many communications. In some CEP applications, a queuing system architecture may be desirable. In other CEP applications, a topic-based publish-and-subscribe architecture may be required. The architect's choice of messaging infrastructure design patterns depends on a number of factors that will be discussed later in this blog.
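To make the one-to-many pattern concrete, here is a minimal, illustrative in-memory topic-based publish-and-subscribe sketch in Python. The class and topic names are hypothetical; a production EDA would of course use a real messaging product rather than an in-process bus:

```python
from collections import defaultdict
from typing import Callable

class TopicBus:
    """Minimal in-memory topic-based publish/subscribe bus (illustrative only)."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic name -> list of handler callbacks

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # One-to-many: every subscriber on the topic receives the event.
        for handler in self._subscribers[topic]:
            handler(event)

bus = TopicBus()
orders_seen = []
audit_log = []
bus.subscribe("orders.created", orders_seen.append)
bus.subscribe("orders.created", lambda e: audit_log.append(e["order_id"]))
bus.publish("orders.created", {"order_id": 42})
# Both subscribers receive the single published event.
```

A queuing architecture differs in that each message is consumed by exactly one receiver; the pub/sub pattern above is the one-to-many case described in the text.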

Deploying a messaging infrastructure is the heart of building an event-driven architecture (EDA). It follows that an EDA is a core requirement for most CEP applications. It is also safe to say that organizations that have funded and deployed a robust high-speed messaging infrastructure, such as TIBCO Rendezvous® or TIBCO Enterprise Message Service™ (EMS), will find building CEP applications easier than organizations that have not yet deployed an ESB.

When an organization has established an EDA and event-enabled its business-sensory information, it can consider deploying CEP functionality in the form of high-speed rules engines, neural networks, Bayesian networks, and other analytical models. With modern rules engines, organizations can take advantage of powerful declarative programming models to optimize business processes, detect opportunities or threats, improve operational efficiency and more. If the business solution requires statistical measures such as likelihood, confidence and probability, events are processed with mathematical models such as Bayesian networks, neural networks or Dempster-Shafer methods, to name a few.
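As a sketch of the declarative style, here is a toy rule set in Python: each rule is a named condition/action pair, and the engine simply fires every rule whose condition matches an incoming event. The rule names, thresholds and event fields are all hypothetical; a real rules engine adds pattern matching, working memory and much more:

```python
alerts = []

# Hypothetical declarative rules: each rule is (name, condition, action).
rules = [
    ("large_order",
     lambda e: e["amount"] > 10_000,
     lambda e: alerts.append(f"review {e['id']}")),
    ("foreign_login",
     lambda e: e.get("geo") not in {"US", "CA"},
     lambda e: alerts.append(f"verify {e['id']}")),
]

def evaluate(event: dict) -> None:
    """Fire every rule whose condition matches the event."""
    for name, condition, action in rules:
        if condition(event):
            action(event)

evaluate({"id": "tx1", "amount": 25_000, "geo": "US"})   # matches large_order
evaluate({"id": "tx2", "amount": 50, "geo": "FR"})       # matches foreign_login
```

The point of the declarative model is that the business logic lives in the rule definitions, not in procedural control flow.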

Event Processing Reference Architecture

Figure 1. The JDL Model for Multisensor Data Fusion Applied to CEP

Solving complex, distributed problems requires a functional reference architecture to help organizations understand and organize the system requirements. Figure 1 represents such an architecture. The Joint Directors of Laboratories (JDL) multisensor data fusion architecture was derived from working applications of blackboard architectures in AI. This model, reliably applicable to detection theory where patterns and signatures are discovered by abductive and inductive reasoning, has been the dominant functional data fusion model for decades.[1] Vivek Ranadivé, TIBCO’s founder and CEO, indirectly refers to this model when he discusses how real-time operational visibility, in the context of knowledge from historical data, is the foundation for Predictive Business®.[2]

In part 2 of What is Complex Event Processing, I will begin to describe each block of our functional reference architecture for complex event processing. [3] In addition, I will apply this reference architecture to a wide range of CEP classes of business problems and use cases. I will also address the confusion in terminology and the differences between CEP and ESP (event stream processing), here on the CEP blog, in the coming weeks and months.

 

What is Complex Event Processing? (Part 2)

(Originally Published by Tim Bass, TIBCO Software Inc. , April 23, 2007)

In a previous blog entry, What is Complex Event Processing? (Part 1), we introduced a few basic event processing concepts and a functional reference architecture for CEP based on the JDL model for multisensor data fusion. One of the most important concepts in our reference architecture is the notion of events, which is the topic of today’s entry.

What is an Event?

Similar to many topics in science and engineering, the term event has different meanings based on who is observing the event and the context of the observation. Let’s review a few of the different definitions from the points of view of various observers, keeping in mind that in CEP we are primarily interested in processing events related to business. First, we take a wider survey.

If you are a mathematician, you might view an event via the lens of probability theory, which states that an event is a set of outcomes (a subset of the sample space) to which a probability is assigned. So, for example, if we were processing many banking application log files, in real-time, looking for fraud, there exists some conditional probability at any moment that a fraud is being orchestrated against the bank. The event is the fraud (detected or undetected outcome); and based on a number of factors, the probability of a fraudulent event against the bank changes over time.
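That time-varying probability can be sketched with a simple Bayesian update; all of the priors and likelihoods below are hypothetical numbers chosen for illustration:

```python
def posterior_fraud(prior: float, p_alert_given_fraud: float, p_alert_given_ok: float) -> float:
    """Bayes' rule: updated probability of fraud after observing one suspicious event."""
    evidence = p_alert_given_fraud * prior + p_alert_given_ok * (1 - prior)
    return p_alert_given_fraud * prior / evidence

p = 0.01                              # prior: 1% of sessions are fraudulent (hypothetical)
p = posterior_fraud(p, 0.9, 0.05)     # first suspicious event observed
p = posterior_fraud(p, 0.9, 0.05)     # second suspicious event: probability keeps rising
```

Each observed event shifts the conditional probability of the fraud "event" in the probabilistic sense described above; two suspicious observations already push the hypothetical posterior above 75%.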

On the other hand, if you are a particle physicist, an event is a single collision of two particles or a decay of a single particle! A collision, in particle physics, is any process which results in a deflection in the path of the original particles, or their annihilation. This view seems to imply that atomic and subatomic exceptions and state transitions are the foundation for events, which may be significant if you are a particle physicist. Assuming most of the readers of the blog are not particle physicists, you may be interested in the draft definition of an event from the Event Processing Technical Society (EPTS) CEP glossary working group, summarized below:

Event: Something notable that happens.

Examples:

- a financial trade

- an airplane lands

- a sensor outputs a reading

- a change of state in a database, a finite state machine

- a key stroke

- a natural or historical occurrence such as an earthquake

- a social or historical happening, e.g., the abolition of slavery, the battle of Waterloo, the Russian revolution, and the Irish potato famine.

Event (also event object, event message, event tuple): An object that represents, encodes or records an event, generally for the purpose of computer processing. Notes: Events are processed by computer systems by processing their representations as event objects. Events are immutable objects. However, more than one event may record the same activity.

Examples:

- a purchase order (records a purchase activity)

- an email confirmation of an airline reservation

- stock tick message that reports a stock trade

- a message that reports an RFID sensor reading

- a medical insurance claim document

Overloading: Event objects can contain data. The word “event” is overloaded so that it can be used as a synonym for event object. In discussing event processing, the word “event” is used to denote both the everyday meaning (anything significant that happens) and the computer science meaning (an event object or message). The context of each use indicates which meaning is intended.
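A minimal sketch of an event object in the sense defined above - an immutable record of something that happened - using a Python frozen dataclass. The field names are illustrative, not drawn from any particular product:

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen=True makes the event object immutable
class StockTick:
    """An event object recording a stock trade (field names are illustrative)."""
    symbol: str
    price: float
    quantity: int
    timestamp: float

tick = StockTick("ACME", 101.25, 500, 1_700_000_000.0)
# tick.price = 99.0  # would raise dataclasses.FrozenInstanceError: events are immutable
```

Immutability matches the glossary note: the event object records what happened and is never changed afterward, although more than one event object may record the same activity.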

As one can see, none of these definitions are completely satisfying! For example, if we look at financial market data, some might observe that it appears a bit pedestrian to say that each trade is an event. Why? Because the market data is the entire sample space and each trade is an element of the set of trades of that particular equity (for example) on a particular day. To call each trade an “event” may be unsatisfactory for some people.

On the other hand, when a business is processing market data using a VWAP (volume-weighted average price) algorithm, for some, the event occurs when the price of a buy trade is lower than the VWAP. Conversely, if the price is higher than the VWAP, the event would be an indication to sell.

This example tends to align more closely with the mathematician’s view of events: an outcome from the sampled set with an assigned probability. The fact that the event is “significant” is due to the context of the theory and the application of the VWAP strategy - without VWAP there might be no event, in this context. Similar analogies can be illustrated for fraud detection, supply-chain management, scheduling and a host of other CEP-related business problems.
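For reference, the VWAP benchmark itself is simply total traded value divided by total traded volume; a minimal sketch with made-up trades:

```python
def vwap(trades):
    """Volume-weighted average price over (price, volume) pairs."""
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume

trades = [(10.0, 100), (10.5, 200), (9.8, 100)]
benchmark = vwap(trades)   # (1000 + 2100 + 980) / 400 = 10.2
# A buy trade priced below this benchmark would, in the context above, constitute an event.
```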

Events are Context Dependent

For example, if you have thousands of packages with RFID tags traveling the globe, is the event when the RFID reader registers the RFID tag? Or is the event when an exception occurs, for example, a lost package? One view is that the RFID reader is simply recording data and the associated RFID data is the sampled set (not necessarily the event). The outcomes of interest, with assignable probabilities based on the business context, are exceptions, which, in turn, become business events. On the other hand, another view might be that each RFID reading is an event, and CEP is detecting “situations” - in this use case, the situation we refer to as “lost package”.

If you are interested in other terms related to CEP, please visit the Draft Event Processing Glossary. Your comments on the glossary are both welcome and much appreciated!

In What is Complex Event Processing (Part 3), I will begin to discuss the functional components of event processing based on the functional reference architecture introduced in What is Complex Event Processing? (Part 1).

 

What is Complex Event Processing? (Part 3)

(Originally Published by Tim Bass, TIBCO Software Inc. , April 27, 2007)

In an earlier blog entry, What is Complex Event Processing? (Part 1), we introduced a functional reference architecture for event processing. Now, we discuss another important component of distributed CEP architectures, event preprocessing.

Event preprocessing is a general functionality that describes normalizing data in preparation for upstream, “higher-level,” event processing. Event preprocessing, referred to as Level 0 Preprocessing in the JDL model (see figure below), encompasses data normalization, validation, prefiltering and basic feature extraction. When required, event preprocessing is often the first step in a distributed, heterogeneous complex event processing solution.

Event Processing Reference Architecture

As an illustrative example, visualize a high-performance network service that passively captures all inbound and outbound network traffic from a web server farm of 300 e-commerce web servers. We must first normalize the network capture data so that it can be further processed. How do you extract HTTP session information from an encrypted click stream in real time? What information do you forward as an event? Do you send just HTTP header information, or other key attributes of the payload? Do you strip out the HTML image files, or do you replace them with the image metadata? These are examples of important questions that must be considered in a web-based event processing application.

In another example, we are building a network management related CEP application and will be correlating events using a stateful, high-speed rules engine. The event sources, for example, are SNMP traps and log file data from two network applications. How do we normalize (transform) the data for event processing? How much filtering is performed at the data source versus at the upstream event processing agent?
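A minimal sketch of this kind of normalization in Python - parsing a hypothetical firewall log line into a normalized event dictionary. The log format, field names and values are invented for illustration; real devices each have their own formats, which is exactly why the normalization step exists:

```python
import re

# Hypothetical raw syslog-style line from a network device.
RAW = '2007-04-27T10:15:32 fw01 DENY tcp 10.0.0.5:4431 -> 192.168.1.9:22'

PATTERN = re.compile(
    r'(?P<ts>\S+)\s+(?P<host>\S+)\s+(?P<action>\S+)\s+(?P<proto>\S+)\s+'
    r'(?P<src>\S+)\s+->\s+(?P<dst>\S+)'
)

def normalize(line: str) -> dict:
    """Transform a raw log line into a normalized event dict for upstream processing."""
    m = PATTERN.match(line)
    if m is None:
        raise ValueError(f"unparseable line: {line!r}")
    event = m.groupdict()
    # Split endpoint strings into separate, uniformly named fields.
    event["src_ip"], event["src_port"] = event.pop("src").split(":")
    event["dst_ip"], event["dst_port"] = event.pop("dst").split(":")
    return event

event = normalize(RAW)
```

Once every source emits events with the same field names and types, the upstream correlation engine can treat SNMP traps and log entries uniformly.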

Heterogeneous, distributed event processing applications normally require some type of event preprocessing for data normalization, validation and transformation. Some simple applications, for example self-contained processing of well-formatted, homogeneous streaming market data, require very little preprocessing. However, most classes of complex event processing problems require the correlation and analysis of events from different event sources. BTW, this is a major difference between true CEP classes of problems and event stream processing (ESP) classes of problems. I will discuss this in more detail in a later blog entry.

Often our customers at TIBCO use our BusinessWorks® product to prefilter, normalize and transform raw data into JMS or TIBCO Rendezvous® messages. What is important to remember is that raw data must be transformed (normalized), securely transmitted as an electronic message across the network and formatted in a manner that optimizes event processing throughput.

What is Complex Event Processing? (Part 4)

(Originally Published by Tim Bass, TIBCO Software Inc., April 30, 2007)

In What is Complex Event Processing? (Part 3), we discussed event preprocessing in event processing applications. Now, in Part 4, we discuss event refinement, also referred to as object refinement or track and trace.

Event Processing Reference Architecture

Event refinement is the functional aspect of event processing that describes refining a single event object: an iterative process of operating on event data to determine the attributes of individual event objects and to build and track their behavioural characteristics. Here, the term event object refers to a distinct object. A track is often constructed based on detections of an individual identified event object, but it can also be based indirectly on detecting actionable parameters of event objects. In addition, event refinement includes the functionality of state estimation and prediction for individual event objects – the trace aspect.

Examples of event refinement (track and trace) are:

- tracking market transactions in equities and calculating a VWAP on each tracked equity;

- tracking user sessions in an on-line e-commerce application and ranking sessions for likelihood of fraudulent behavior;

- tracking an individual container or package (with RFID, for example) as it travels around the globe and looking for delays or other exceptional conditions;

- tracking a log file in a network device or applications and searching for a specific pattern or anomalous behavior;

- tracking the path of a single aircraft, vessel or train in motion; or,

- tracking a patient in a hospital as they move through various stations and treatments.

Kindly note that in the examples above the event objects are, respectively: a stream of single stock transactions, an on-line user, a package or container, a log file, an aircraft (or vessel or train), and a patient. We can all think of many different examples of objects in our businesses that are, or should be, tracked and traced in order to efficiently run the business and search for threats and opportunities to the business.

Event refinement, or track and trace, when applied to digital event information is very similar in functionality to data stream, or event stream, processing (ESP). Event stream processing is a very important component of both event processing and complex event processing applications. Streams of events generally consist of event objects related and comparable by time; for example, market transactions of an individual equity are related by the time of execution. Entries in a single log file, the tracking data of an individual aircraft, and other sensor readings are also generally recorded with a time stamp.
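The track-and-trace idea can be sketched as a small stateful tracker. Here, hypothetically, we flag a package whose RFID readings arrive too far apart in time - the "delayed package" exception from the examples above (all identifiers and thresholds are invented):

```python
class PackageTracker:
    """Track-and-trace sketch: remember each package's last checkpoint and flag delays."""
    def __init__(self, max_gap_seconds: float):
        self.max_gap = max_gap_seconds
        self.last_seen = {}   # package id -> (checkpoint, timestamp)

    def reading(self, package_id: str, checkpoint: str, ts: float) -> list:
        """Process one RFID reading; return any exception events it implies."""
        exceptions = []
        if package_id in self.last_seen:
            _, prev_ts = self.last_seen[package_id]
            if ts - prev_ts > self.max_gap:
                exceptions.append(("delayed", package_id, checkpoint))
        self.last_seen[package_id] = (checkpoint, ts)
        return exceptions

t = PackageTracker(max_gap_seconds=3600)
first = t.reading("pkg1", "warehouse", 0)      # no prior state: no exception
late = t.reading("pkg1", "airport", 7200)      # two hours later: delay exception
```

The tracker refines a stream of raw readings into per-object state plus derived exception events - which is precisely the hand-off point to situation refinement in the next part.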

Additionally, events may not have known time stamps but are related by causality, ontology or taxonomy. Often the causality is hidden or unknown. Finding hidden causality when debugging distributed systems was the genesis of the work in complex event processing by Dr. David Luckham. Here, the relationships are more complex than tracking and tracing objects in a stream of comparable objects within a known time series.

In my next post, What Is Complex Event Processing, Part 5, we will get into the heart of CEP: analytics where various (multiple) objects are compared, aggregated, correlated and/or analyzed to detect various business situations and scenarios.

 

What is Complex Event Processing? (Part 5)

(Originally Published by Tim Bass, TIBCO Software Inc., May 9, 2007)

In What is Complex Event Processing? (Part 4), we discussed event refinement, also referred to as object refinement or track and trace. Today, in part 5 of What is Complex Event Processing, we discuss the heart of CEP, situation refinement.

Event Processing Reference Architecture

Situation refinement is the functional component of event processing that describes refining multiple event objects in order to identify business situations and scenarios in real-time. Situation refinement analyzes multiple event objects and aggregated groups of events against existing detection templates, patterns, algorithms and historical data to provide an estimate of the current situation and to suggest, identify and predict future opportunities and threats.

Examples of situation refinement are:

- debugging a distributed computing application to determine cause-and-effect relationships between heterogeneous event sources;

- calculating a VWAP on a tracked equity or basket of equities and correlating these event objects with high-confidence news reports in real-time;

- correlating tracked user sessions in an on-line e-commerce application with credit card activities and geolocation information of the same user in real-time;

- associating containers and packages (with RFID, for example) with weather and traffic information to predict delays in shipments in real-time;

- correlating multiple log files from selected network devices or applications and searching for specific patterns or anomalous behavior in real-time;

- analyzing the projected tracks of multiple aircraft, vessels or trains in motion, looking for potential collisions before they happen;

- correlating patient information across multiple hospitals, looking for trends in viral epidemics and predicting future outbreak areas;

- correlating locations, crews, schedules, cargo, stations and other constraints in a transportation network to optimize network resources; or,

- correlating the customer information of multiple retail banking channels with real-time customer interaction personnel to enhance the user experience and maximize marketing effectiveness.
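Several of the examples above reduce to correlating events from different sources within a time window. A minimal, hypothetical sketch - raising a "possible fraud" situation when the same user appears in two different locations close together in time (sources, fields and the window are all invented for illustration):

```python
from collections import deque

class WindowCorrelator:
    """Situation refinement sketch: correlate events from two sources in a time window."""
    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.recent = {"login": deque(), "card_swipe": deque()}  # per-source buffers
        self.situations = []

    def _expire(self, now: float) -> None:
        # Drop buffered events older than the correlation window.
        for buf in self.recent.values():
            while buf and now - buf[0][0] > self.window:
                buf.popleft()

    def on_event(self, source: str, user: str, location: str, ts: float) -> None:
        self._expire(ts)
        other = "card_swipe" if source == "login" else "login"
        for _, other_user, other_loc in self.recent[other]:
            if other_user == user and other_loc != location:
                # Same user, two locations, close in time -> candidate fraud situation.
                self.situations.append(("possible_fraud", user))
        self.recent[source].append((ts, user, location))

c = WindowCorrelator(window_seconds=300)
c.on_event("login", "alice", "Paris", ts=0)
c.on_event("card_swipe", "alice", "Tokyo", ts=60)   # impossible travel inside the window
```

The output of this stage is not a raw event but a named situation - the "complex event" defined next - ready for impact assessment.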

It is interesting to note that situations are often referred to as complex events. The terminology (glossary) working group of the Event Processing Technical Society (EPTS) uses the following definitions:

Complex event: an event that is an abstraction or aggregation of other events called its members.

Composite event: Composite event types are aggregated event types that are created by combining other primitive or composite event types using a specific set of event constructors such as disjunction, conjunction, sequence, etc. Note: This definition is from Active Database terminology.

Derived event (also synthesized event): an event that is generated as a result of applying an algorithmic function or process to one or more other events.

Relationships between events: Events are related by time, causality, aggregation, abstraction and other relationships. Time and causality impose partial orderings upon events.

This leads us to the current working EPTS definition of complex event processing:

Complex-event processing (CEP): Computing that performs operations on complex events, including reading, creating, transforming or abstracting them.

In my next post, What Is Complex Event Processing, Part 6, we will discuss another important area in CEP, impact assessment - where detected business situations are compared, correlated, and/or analyzed in “what if” type of scenarios to determine and predict business consequences.

What is Complex Event Processing? (Part 6)

Posted by: Tim Bass

In What is Complex Event Processing? (Part 5), we discussed situation refinement, the functional component of event processing that describes refining multiple event objects in order to estimate and identify business situations and scenarios in real-time. Today, in Part 6 of What is Complex Event Processing, we discuss impact assessment - where detected business situations are compared, correlated, and/or analyzed in “what if” type of scenarios to determine and predict business consequences.

 

After event detection and situation refinement, businesses are very concerned with ascertaining or predicting outcomes and financial gains or losses if a detected situational threat or opportunity materializes. Impact assessment is the functional component of event processing that is focused on the estimation and prediction of the priority, utility or cost of an estimated business situation, complex event or scenario.

At this stage of the CEP reference model (above), we estimate the impact of an assessed situation, which includes likelihood and/or cost/utility measures associated with potential outcomes. From this inference, loss projections and liabilities (or gains) may be projected. In addition, resource allocation and processing priorities may be estimated.
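A minimal numerical sketch of such an estimate - expected cost over outcome probabilities. All probabilities and dollar figures below are hypothetical:

```python
def expected_impact(outcomes):
    """Expected cost/utility of a detected situation.

    outcomes: list of (probability, cost) pairs; probabilities should sum to 1.
    """
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * cost for p, cost in outcomes)

# Hypothetical fraud situation: 70% contained, 25% partial loss, 5% full loss.
impact = expected_impact([(0.70, 1_000), (0.25, 50_000), (0.05, 500_000)])
# 700 + 12,500 + 25,000 = 38,200 expected loss - compare against the cost of mitigation.
```

An estimate like this is what lets the next stage, process refinement, prioritize which detected situations get resources first.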

Opportunities and threats in business generally need to be predicted based upon an estimate of the current situation, known plans and predicted reactions. Examples of real-time predictive business use cases are:

- determining the expected consequences of a fraudster’s actions in an e-commerce scenario, given the current estimated threat to the business;

- estimating the consequences of a failure in a distributed computing application and the effects on other systems that interact with the failed component;

- estimating the potential profit if an algorithmic trade is executed on a tracked equity or basket of equities;

- predicting how delays in shipping affect the supply chain, including consumer choices and behavior;

- predicting network congestion and outages based on specific patterns of anomalous network behavior in real-time;

- assessing risk and losses in a potential aircraft collision based on information about the planes, their locations and their cargo or passengers;

- predicting the impact of a viral epidemic on different geographic areas and populations;

- predicting cost savings based on optimizing network resources in a transportation or supply chain network; or,

- predicting potential losses if an identified class of missile reaches its projected target.

Impact assessment generally requires real-time correlation with historical data that resides in databases. This is represented by the Database Management component of the event processing reference architecture.

In my next post, What Is Complex Event Processing, Part 7, we will discuss another important area in CEP, process refinement – actions taken, parameters adjusted, resources allocated (for example) based on detected (and/or predicted) business situations and scenarios.

 

What is Complex Event Processing? (Part 7)

Posted by Tim Bass

In What is Complex Event Processing? (Part 6), we discussed impact assessment in event processing applications. Today, we introduce process refinement - the feedback loop, resource management and work flow components of event processing architectures. Business process management (BPM) also is a part of process refinement, depicted in slide 13 of my March 14, 2007 IDC presentation in Lisbon, Portugal.

This overarching event-correlation-assessment-decision-action concept in CEP is also discussed very nicely by John Bates and Giles Nelson, Progress Apama, in their on-demand webinar, Using CEP and BAM for Fraud & Compliance Monitoring. John and Giles do an excellent job covering a number of fraud detection, risk and compliance use cases that illustrate how raw events, like RFID tags in poker chips, are correlated in real-time to effect actionable processes. In addition, Paul Vincent, TIBCO Software, does a nice job illustrating another application of direct feedback based on real-time analytics in his blog post, CEP and the alphabet soup (Part 2): BI.

Event Processing Reference Architecture

In a nutshell, we can easily see that all of the components of the CEP functional reference architecture we discussed earlier, events, event pre-processing, event refinement, situational refinement and impact assessment, add value only if they lead to high confidence, resource efficient actions. This is one of the motivations behind David Luckham’s recently posted white paper, SOA, EDA, BPM and CEP are all Complementary.

Process refinement is the functional component of event processing that takes action based on detected situations and predicted impacts.

Examples of process refinement in real-time are:

* executing a trade in equities based on a series of events that lead to a high yield opportunity;

* alerting security by initiating incident workflow in an on-line e-commerce application when the likelihood of fraudulent behaviour and loss potential is high;

* notifying downstream suppliers and customers when an actionable exceptional condition was detected in the supply chain;

* adding a new firewall rule when high-confidence anomalous behaviour is detected on the network;

* notifying airlines, the FAA and the media as early as possible when an air disaster may be about to happen;

* automatically moving a camera in a casino to game tables where suspicious dealings have been detected while notifying security; or,

* turning on sensors ahead of the projected path, while turning off sensors behind the historical path, of a long-range missile in flight.
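A minimal sketch of this detection-to-action step: a hypothetical "playbook" mapping a detected situation's type and assessed impact to response actions. All names and actions here are invented for illustration; a real deployment would drive BPM workflows, not Python functions:

```python
# Hypothetical response actions for detected situations.
def alert_security(situation):
    return f"security paged for {situation['id']}"

def block_account(situation):
    return f"account {situation['id']} frozen"

def log_only(situation):
    return f"logged {situation['id']}"

# Playbook: (situation type, assessed impact) -> ordered list of actions.
PLAYBOOK = {
    ("fraud", "high"):   [block_account, alert_security],
    ("fraud", "medium"): [alert_security],
}

def refine_process(situation: dict) -> list:
    """Pick and execute actions for a detected situation based on type and impact."""
    key = (situation["type"], situation["impact"])
    actions = PLAYBOOK.get(key, [log_only])
    return [action(situation) for action in actions]

taken = refine_process({"id": "sess-9", "type": "fraud", "impact": "high"})
```

The essential point is the feedback loop: the assessed impact from the previous stage, not the raw event, selects the response.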

Alan Lundberg and I referred to this as an event-decision architecture when we collaborated on my keynote at the First Workshop on Event Processing, Processing Patterns for Predictive Business. Other folks on the net, Brenda Michelson for example, refer to the process we are describing as a business-decision architecture.

What is important to note is that the overall goal of processing events is to take raw events as input and process the events to detect actionable situations with high confidence; and then affect the right process, with the right action, at the right time, as James Taylor correctly points out in his post on business decisioning.

In my next post in this series, What Is Complex Event Processing? Part 8, we will review another important aspect of event processing, visualization and the user interfaces to the components of the CEP reference architecture.

What is Complex Event Processing? (Part 8)

Posted by Tim Bass

In What is Complex Event Processing? (Part 7), we introduced process refinement - the feedback loop, resource management and work flow components of event processing architectures. Today we review another critical area in the functional CEP reference architecture - visualization, which includes both user interfaces and scientific visualizations.

Solving complex distributed problems requires visualization at all levels of the inference model we have been describing in this blog series, What is Complex Event Processing? All of us have heard anecdotal stories about how today’s computers have the equivalent intelligence of an earthworm. Alas, it is true! Present-day high-speed computers, operating at millions of operations per second, still cannot compete with the human brain in many areas - especially pattern matching.

Event Processing Reference Architecture

This brings to memory “the good old days” back around 1992-1993 when I was consulting for Sprint. Sprint (Sprintlink) had the contract from the NSF to transition an academic-oriented NSFNet to a commercial Internet backbone. I was leading efforts to develop and build the network and security management for Sprintlink. It is interesting to note that good friends David Luckham and John Bates, two leaders in event processing, also have their roots in network management and security.

Well, to make a long story short, I had installed HP OpenView to help manage the network and was monitoring the Internet traffic at the major backbone routers, including FIXEAST and FIXWEST. Looking at the graph, the traffic looked odd; it would peak very high and then drop to almost zero, over and over again at regular intervals. I called everyone over, excited like a kid with a new toy (the Internet backbone is a nice toy, BTW) and exclaimed, “The Internet has a heartbeat! It’s alive!”

Well, as an electrical engineer, it started to make perfect sense to me. The Internet is based on a queuing model for communications, with packets transmitted, queued and retransmitted across the network, rising and falling just like a heartbeat. It was the visualization that brought theory and practical application into sharp focus. There is nothing in the known universe that compares to the human mind, and the impact visualization has in helping us understand and solve complex problems.

Complex event processing requires visualization at every level of the event processing model (above).

Examples of visualization in CEP are:

* Event Pre-processing: using an XPATH tool to map raw sensor input data to a JMS message format;

* Event Refinement: tracking and graphing a stock price, foreign exchange rate, or other event object;

* Situational Refinement: providing a visual list in a network management center of the top 20 detected threat-related situations in an on-line banking application, with an estimated “name” and conditional probability;

* Impact Assessment: providing a visual list of the top 10 detected equity trading opportunities with estimated profit, along with a risk KPI, if a trade is executed;

* Process Refinement: providing a visual graphic of alternative routes for commercial aircraft during a snowstorm; or,

* User Interfaces (UIs): tools to model and design event processing scenarios, rules and other analytics.

Business Activity Monitoring, or BAM, is another example of a current buzzword (another over-hyped subject!) for visualization in business applications.

So, What is Complex Event Processing?

I hope this brief eight part series on CEP was useful to readers interested in event processing and how to apply CEP to their area of expertise.

 

  1. Hall, D. and Llinas, J. editors, Handbook of Multisensor Data Fusion, CRC Press, Boca Raton, Florida, 2001.
  2. Ranadivé, V., The Power to Predict, McGraw-Hill, NY, NY, 2006.
  3. Bass, T., Fraud Detection and Event Processing for Predictive Business, TIBCO Software Inc., Palo Alto, CA, 2006.