OpenAlex · Updated hourly · Last updated: 20.03.2026, 22:47

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Data Protection in Enhanced EXplainable Digital Twins: Insights from Hybrid AI

Year: 2025 · Citations: 2 · Authors: 3

Open full text at publisher
Abstract

Deep Learning (DL) based artificial intelligence (AI) is becoming pervasive in almost all day-to-day applications. DL algorithms require large amounts of data, which can now be collected easily by deploying sensors in the environment of the system under observation. These algorithms, however, suffer from well-known drawbacks: they are criticised as 'black boxes', they are biased toward generating predictions from past data, and they ignore human expertise. To address this, an architecture of Enhanced eXplainable Digital Twins (EXDTs) was proposed. EXDTs remove the 'black-box' nature of DL algorithms by using XAI approaches. The digital twins (DTs) underlying the EXDTs are constructed from sensor data and support real-time simulation. To further improve explainability and draw on human expertise and experience, EXDTs also use qualitative data from domain experts. EXDTs are thus a class of Hybrid AI systems, combining numeric data for the underlying DL algorithms with the subjective knowledge of domain experts. This multimodal data opens up new attack fronts for data manipulation that can alter the EXDTs' outputs. In this manuscript, we discuss data protection in EXDTs by simulating various types of attacks and suggesting possible ways the system can counter them. We believe our proposed framework is useful for stakeholders in domains such as precision agriculture, healthcare, and manufacturing.
