How knowledge graphs based on atomic events provide continuous decision intelligence
The award-winning Metaverse Moonshot Catalyst of 2023 focuses on enabling an industrial metaverse to solve complex ecosystem challenges and transform travel experiences. This requires virtual and cyber-physical systems to share real-time information that provides continuous intelligence for optimal decision-making and effective action. And this, in turn, requires event-driven systems that adopt common models for interoperability.
Event-driven systems produce, consume, and react to events, where an event represents an entity’s state change, such as a change in room temperature, the completion of a purchase order, or an aircraft landing.
Gartner notes that digital businesses must quickly detect and leverage new opportunities presented by digital events and predicts that most business systems will be built around event-driven architecture (EDA) and IoT data rather than traditional master data. EDA aligns IoT models and concepts with those of business systems.
Event-driven systems are designed for unpredictable and asynchronous environments, providing a simple and scalable infrastructure for real-time information exchange and highly distributed workflows.
With event brokers, IoT, cloud computing, blockchain, in-memory data management, and AI, businesses can detect and analyze these events more efficiently.
EDA uses autonomous messages to communicate digital representations of state-change events. Unlike traditional synchronous models, it leverages pub/sub to push notifications out to connected systems asynchronously. This allows for independent and frequent changes with minimal impact on other systems.
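The decoupling described above can be illustrated with a minimal in-process pub/sub sketch (the broker class and topic names here are illustrative assumptions, not a specific product's API): producers publish state-change events to a topic, and every subscriber is notified without the producer knowing who, or how many, the consumers are.

```python
from collections import defaultdict

class EventBroker:
    """Minimal in-process pub/sub broker (illustrative sketch).

    Producers publish state-change events to a named topic; the broker
    pushes each event to all handlers subscribed to that topic. Producers
    and consumers never reference each other directly, so either side can
    change independently.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Push the event to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received = []
broker.subscribe("room/temperature", received.append)
broker.publish("room/temperature", {"room": "4A", "tempF": 76})
```

A production broker would add asynchronous delivery, persistence, and addressable destinations, but the decoupling pattern is the same.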
EDA is often integrated with message-driven architectures that route events, as message payloads, to specific addressable destinations.
Billions of connected devices within the Internet of Things generate unprecedented volumes of event data that must be efficiently indexed, shared, stored, queried, and analyzed to create value.
However, data from these devices and cloud services is often stored and communicated in various formats, making it difficult to understand the data's meaning and context. This lack of semantic information means that significant effort is required to normalize the data before it can be used by systems to generate value effectively.
Thus, it’s necessary for real-time systems to supplement state change events with contextual metadata to ensure that connected systems can understand the data being exchanged – in other words, be semantically interoperable.
The atomic event model, tied to a common ontology, provides a "lowest common denominator" for machine-to-machine (M2M) distribution of entity state changes, where an entity represents an instance of an entity class within the ontology.
EDA implementations have typically defined a custom event model for each entity class, with each event comprising multiple class attributes. In contrast, each event based on the atomic event model represents a state change of a single attribute, which enables one event model to be utilized across any entity class.
The atomic event model includes a time point, entity-attribute-value (EAV) data model elements, the entity’s class, and the source system that produced the event, as shown in the example below.
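One illustrative rendering of such an event, expressed here as a Python dictionary (the field names are assumptions for illustration, not a published schema):

```python
# Illustrative atomic event: a single attribute's state change, with
# time point, EAV elements, entity class, and producing source system.
# Field names and values are assumptions, not a standard.
atomic_event = {
    "time": "2023-10-25T10:43:00Z",   # time point at which the fact is valid
    "entity": "floor4",               # EAV: entity identifier
    "attribute": "airTemperature",    # EAV: the single attribute that changed
    "value": 76,                      # EAV: the new value
    "entityClass": "BuildingFloor",   # the entity's class in the ontology
    "source": "hvac-sensor-gateway",  # system that produced the event
}
```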
Each atomic event represents a basic statement of fact that’s valid at a time point (a temporal fact). Atomic events are appended to an event-driven system's knowledge base, serving as the authoritative source of data. Events can be neither changed nor deleted, so the knowledge base provides a reliable audit trail of entity state changes in a distributed environment.
By reading atomic events within the knowledge base, an analytics or simulation service can recreate the state of related entities at a specific point in time. This event-sourcing pattern is particularly beneficial for highly distributed data ecosystems such as supply chains, airports, and cities. It provides a simple, scalable, and traceable alternative to the traditional CRUD model and enables intelligent services and complex event processing to build previous, current, and projected states from a consistent source of historic atomic events.
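The replay step of this event-sourcing pattern can be sketched as a fold over the append-only log: to recover an entity's state at a time point, apply every event for that entity up to that point, in time order (the event fields and timestamps below are illustrative assumptions).

```python
# Replay an append-only log of atomic events to rebuild entity state
# as of a chosen time point (a minimal event-sourcing sketch; field
# names and timestamp format are assumptions).
def state_at(events, entity, as_of):
    """Fold the entity's events, in time order, up to `as_of`."""
    state = {}
    for e in sorted(events, key=lambda e: e["time"]):
        if e["entity"] == entity and e["time"] <= as_of:
            state[e["attribute"]] = e["value"]
    return state

log = [
    {"time": "10:00", "entity": "floor4", "attribute": "airTemperature", "value": 72},
    {"time": "10:43", "entity": "floor4", "attribute": "airTemperature", "value": 76},
    {"time": "10:50", "entity": "floor4", "attribute": "occupancy", "value": 12},
]

state_at(log, "floor4", "10:43")  # {"airTemperature": 76}
```

Because the log is never mutated, the same function can reconstruct past states, the current state, or feed a simulation of projected states from one consistent source.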
Entity relationships are supported by atomic events that contain an entity identifier within their “value” attribute. This enables temporal knowledge graphs to be generated for any time point.
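A sketch of that generation step, under the same illustrative field names as above: events whose value is an entity identifier become edges, and the latest event before the chosen time point wins, yielding the graph as it stood at that moment.

```python
# Build the set of (subject, relation, object) edges valid at a time
# point from atomic events whose "value" refers to another entity.
# Field names and timestamps are illustrative assumptions.
def graph_at(events, as_of):
    """Later events for the same (entity, attribute) supersede earlier ones."""
    latest = {}
    for e in sorted(events, key=lambda e: e["time"]):
        if e["time"] <= as_of:
            latest[(e["entity"], e["attribute"])] = e["value"]
    return {(ent, attr, val) for (ent, attr), val in latest.items()}

log = [
    {"time": "09:00", "entity": "fan6", "attribute": "locatedOn", "value": "floor4"},
    {"time": "11:00", "entity": "fan6", "attribute": "locatedOn", "value": "floor5"},
]

graph_at(log, "10:00")  # {("fan6", "locatedOn", "floor4")}
```

Querying the same log at a later time point yields the updated relationship, which is what makes the knowledge graph temporal.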
Event-driven systems and services are consumers of atomic events, where the events are identified, and the appropriate reaction is selected and executed. This can also lead to additional atomic events being produced and processed. For example, an inbound atomic event may indicate that “at 10:43 on 10/25 the air temperature of floor 4 is 76 degrees Fahrenheit” (event 1 in the figure below). An event-driven system can orchestrate the processing of this event among services based on their rules. This can trigger an action that produces an outbound atomic event to “set rotation speed of actuator fan6 to 30 RPM” (event 2).
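The floor-temperature example above can be sketched as a single rule: one inbound atomic event is identified, a reaction is selected, and an outbound atomic event is produced (the threshold, entity names, and field names are illustrative assumptions).

```python
# Simple event processing: one inbound atomic event triggers one
# outbound atomic event. The 75 °F threshold and identifiers are
# assumptions for illustration.
def on_event(event):
    """If floor 4's air temperature exceeds 75 °F, emit a command
    event setting fan6's rotation speed; otherwise do nothing."""
    if (event["entity"] == "floor4"
            and event["attribute"] == "airTemperature"
            and event["value"] > 75):
        return {
            "time": event["time"],
            "entity": "fan6",
            "attribute": "rotationSpeedRPM",
            "value": 30,
            "entityClass": "Actuator",
            "source": "hvac-controller",
        }
    return None

inbound = {"time": "10:43", "entity": "floor4",
           "attribute": "airTemperature", "value": 76}
outbound = on_event(inbound)  # the "set fan6 to 30 RPM" event
```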
This is an example of simple event processing, where a specific, measurable state change initiates downstream action(s). Simple event processing can drive real-time, cross-domain workflow (events 3-5).
Complex event processing (CEP) evaluates a confluence of atomic events based on rule patterns and then acts. Machine learning can automate the generation of these rule patterns. CEP is commonly used to detect and respond to anomalies, threats, and opportunities. For example, evaluating historic atomic events reflecting changes in temperature and occupancy on floor 4 may result in changes to process rules within an energy management system. This, in turn, can impact simple event processing.
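A minimal CEP sketch along these lines: rather than reacting to one event, a rule pattern fires only when a confluence of recent events is observed together (the window size, thresholds, and field names are illustrative assumptions, and real CEP engines support far richer patterns).

```python
from collections import deque

# Complex event processing sketch: fire only when a confluence of
# recent atomic events matches a rule pattern. Window size and
# thresholds are illustrative assumptions.
class CepRule:
    def __init__(self, window=10):
        self.recent = deque(maxlen=window)

    def observe(self, event):
        self.recent.append(event)
        return self.matches()

    def matches(self):
        """Fire when the window shows both high temperature and high
        occupancy on floor 4."""
        floor4 = [e for e in self.recent if e["entity"] == "floor4"]
        hot = any(e["attribute"] == "airTemperature" and e["value"] > 75
                  for e in floor4)
        busy = any(e["attribute"] == "occupancy" and e["value"] > 10
                   for e in floor4)
        return hot and busy

rule = CepRule()
rule.observe({"entity": "floor4", "attribute": "airTemperature", "value": 76})
rule.observe({"entity": "floor4", "attribute": "occupancy", "value": 12})  # → fires
```

In practice, a firing like this might update the process rules of an energy management system, which then alters downstream simple event processing.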
Common data services adopted by all connected systems can support distributed state management through atomic events, including state synchronization between digital twins and their physical counterparts. These common services can create, store, update, access, and share the state of digital entities in a distributed environment, including the ontology itself.
This atomic event model can effectively support the semantic heterogeneity of events in large and open ecosystems and provides event consumers with the minimal information necessary to react to any state change occurrence. This design pattern can support an overall architecture that is simple, scalable, and sustainable.