T-Cell Architecture for DUET Digital Twins
The DUET project creates digital twin platforms for urban regions that transcend the scope of a single use case to support:
Adding new (possibly third-party) data sources to an existing digital twin case
Adding simulation models to an existing case for comparison
Adding a simulation model to an existing case to serve as (additional) input for another model
Adding visualization/interaction clients to enrich the reporting for a case
Extending the digital twin ontology to support more city domains or expand existing ones
For maximum flexibility, the team has created a DUET-Cell architecture as a plug-in interface to support all these features. The DUET data broker corresponds to the DUET-Cell. It is shielded by APIs that allow components to connect to the T-Cell’s internal message streaming system, on which all data flows between the different components.
The Digital Twin Data message streaming system lies at the core of the DUET T-Cell and facilitates data streams between all components. Its streaming features allow components to subscribe to data events, and it provides quick access to data in responsive digital twin scenarios where data must be kept ready for use in models and/or visualizations. One of the most essential tasks of a digital twin system such as DUET is to combine different data sources for use in (simulation) models or visualizations. More often than not, data sources are not fully compliant with known standards, and even when they are, it may not be the standard the user was hoping for, but a competing one. Integrating data sources from different suppliers therefore remains non-trivial, and the DUET architecture should allow use case designers and users to deal with that.
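The subscription pattern described above can be sketched as a minimal in-process publish/subscribe system. This is an illustrative stand-in, not DUET code: the actual message streaming technology is not specified here, and the topic name and event fields are hypothetical.

```python
from collections import defaultdict
from typing import Callable

class MessageStream:
    """Minimal in-process stand-in for the T-Cell message streaming system."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # A component (model, visualization client, ...) registers for data events.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # A data source pushes an event; every subscribed component receives it.
        for handler in self._subscribers[topic]:
            handler(event)

stream = MessageStream()
received = []
stream.subscribe("traffic.sensor", received.append)  # hypothetical topic
stream.publish("traffic.sensor", {"sensor_id": "a12", "vehicles_per_min": 42})
```

In a production setting the in-memory dispatch would be replaced by a durable streaming platform, but the subscribe/publish contract components program against stays the same.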
The proposed approach is to map data to a common language, called the DUET ontology. Onboarded data is mapped to that ontology at the entrance of the platform. The DUET common ontology is stored in the knowledge graph and reflects DUET’s understanding of the smart city and its domains. It is largely inspired by existing common standards, but it can be extended when needed to support specific cases and scenarios. To connect different models to DUET, we need mechanisms to control these models from DUET on the one hand, and data APIs that allow these models to consume DUET data on the other. The latter are an integral part of the DUET data broker API, which is discussed in more detail in DUET Deliverable D3.5. Models also produce output; as such, they act as DUET data sources and should be registered as such to make them available to other models or visualization and interaction clients. The Data Catalog plays an essential role in registering data sources along with their metadata. This does not imply uploading data, but rather letting the DUET data broker know where to find it. Not only is this essential to make data sources discoverable; the data catalog is also instrumental in determining whether a data source is compatible with a certain use case.
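The two mechanisms in this paragraph, mapping onboarded data to the common ontology and registering a source in the Data Catalog by reference, can be sketched as follows. All field names, ontology terms, identifiers, and the endpoint URL are hypothetical examples, not actual DUET vocabulary.

```python
# Hypothetical mapping from one supplier's field names to DUET ontology terms.
FIELD_MAP = {
    "no2_ugm3": "duet:NO2Concentration",
    "ts": "duet:observationTime",
}

def map_to_ontology(record: dict, field_map: dict) -> dict:
    # Translate supplier-specific keys into the common vocabulary at onboarding;
    # fields the ontology does not cover are dropped here for simplicity.
    return {field_map[k]: v for k, v in record.items() if k in field_map}

# A catalog entry stores metadata plus a pointer to the data,
# not the data itself -- the broker fetches from the endpoint on demand.
catalog_entry = {
    "id": "air-quality-ghent",               # hypothetical identifier
    "endpoint": "https://example.org/aq",    # where the data can be found
    "ontology_classes": ["duet:NO2Concentration"],
    "format": "json",
}

mapped = map_to_ontology({"no2_ugm3": 31.5, "ts": "2021-05-01T10:00Z"}, FIELD_MAP)
```

The `ontology_classes` metadata is what lets the catalog check compatibility: a use case that needs `duet:NO2Concentration` can discover this source without ever ingesting its payload.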
Visualisations and interaction clients are the most visible parts of the DUET architecture, but they are clients of the DUET data broker: they consume data on the one hand and send back messages on the other (for instance, interaction messages that trigger model recomputations). The agent APIs referred to here are API specifications that enable the DUET data broker to interact with models. It is up to each model provider to implement the behavior, unless some standard means of model orchestration can be applied, for instance using Kubernetes.
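An agent API of this kind amounts to a contract that each model provider implements in their own way. The sketch below shows one plausible shape of such a contract; the method names, the scenario payload, and the dummy model are assumptions for illustration, not the actual DUET specification.

```python
from abc import ABC, abstractmethod

class ModelAgent(ABC):
    """Hypothetical agent API: the contract the DUET data broker would use
    to control a model. Names are illustrative only."""

    @abstractmethod
    def start_run(self, scenario: dict) -> str:
        """Trigger a (re)computation, e.g. after an interaction message."""

    @abstractmethod
    def status(self, run_id: str) -> str:
        """Report whether a run is pending, running, or finished."""

class DummyTrafficModel(ModelAgent):
    """Toy provider-side implementation of the contract."""
    def __init__(self):
        self._runs = {}

    def start_run(self, scenario):
        run_id = f"run-{len(self._runs) + 1}"
        self._runs[run_id] = "finished"  # a real model would compute asynchronously
        return run_id

    def status(self, run_id):
        return self._runs[run_id]

# The broker only sees the abstract interface, never the model internals.
model = DummyTrafficModel()
rid = model.start_run({"closed_street": "Main St"})
```

Keeping the broker coded against the abstract interface is what makes models swappable: a provider can replace the dummy with a Kubernetes-orchestrated service without changing broker-side code.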
Implementation Note: The presented DUET architecture does not rely on storing entire data sets inside the system. In general this is not even possible when data sets get very large, which is often the case for historical IoT data and geo data. Although it is possible to store (smaller) data sets inside the data catalog, we do not intend to use that feature for now. This implies that data is streamed across the platform.
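The streaming-instead-of-storing approach can be illustrated with a generator that passes records through in small chunks, so memory use stays bounded regardless of data set size. This is a sketch of the general technique, not DUET code; the chunk size is an arbitrary example.

```python
def stream_records(source, chunk_size=2):
    """Yield records from a source in small chunks, so the platform never
    has to hold (or store) the full data set at once."""
    chunk = []
    for record in source:
        chunk.append(record)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:  # flush the final, possibly partial, chunk
        yield chunk

# Works the same whether `source` is a list, a file, or a remote cursor.
chunks = list(stream_records(range(5)))
```

Because the generator is lazy, a consumer such as a model or visualization can start processing the first chunk while later ones are still being fetched from the registered endpoint.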