OpenAlex · Updated hourly · Last updated: 26.04.2026, 02:20

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Ensemble Transitive Bidirectional Decoupled Self-Distillation for Time-Series Classification

2026 · 3 citations · IEEE Transactions on Systems, Man, and Cybernetics: Systems · Open Access
Open full text at publisher

3

Citations

7

Authors

2026

Year

Abstract

Numerous existing deep learning models for time-series classification (TSC) tend to overlook the intricate interplay between higher- and lower-level semantic information. While the focus is often on extracting higher-level semantics from lower-level sources, the reciprocal influence of lower-level information on higher levels is undervalued. To address this, we propose an ensemble transitive bidirectional decoupled self-distillation (ETBiDecSD) method for TSC. ETBiDecSD enhances the robustness of higher-level semantic information using an average feature ensemble (AFE) method to amalgamate the output from each level. Simultaneously, the integrated features are transmitted to each lower level through a directional decoupled distillation (DD) structure. Additionally, to promote deep interaction between higher- and lower-level semantic information, ETBiDecSD introduces a transitive bidirectional DD (TBDD) structure, facilitating the transfer of target-class and nontarget-class knowledge between higher and lower levels. Experimental results demonstrate that whether a fully convolutional network (FCN) with four convolutional blocks or InceptionTime with four Inception blocks is used as the baseline, ETBiDecSD outperforms a number of well-established self-distillation algorithms across 85 widely used UCR2018 datasets, as evidenced by the “win”/“tie”/“lose” and average-rank metrics derived from accuracy and F1-scores.
Notably, compared to a non-self-distillation FCN, ETBiDecSD achieves “win”/“tie”/“lose” results of 64/4/17 in terms of accuracy and 65/4/16 in terms of F1-score. Similarly, compared to a non-self-distillation InceptionTime, ETBiDecSD attains “win”/“tie”/“lose” results of 60/12/13 for accuracy and 57/12/16 for F1-score.
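The abstract mentions transferring target-class and nontarget-class knowledge separately, which is the core idea of decoupled knowledge distillation. The paper's exact loss is not given on this page, so the following is only an illustrative NumPy sketch of a generic decoupled distillation loss: the teacher's soft labels are split into a binary target-vs-rest distribution and a renormalized distribution over the nontarget classes, and the student matches each part via a KL divergence. All function names and the `alpha`/`beta`/`T` weighting are assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def decoupled_kd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=1.0, T=4.0):
    """Illustrative decoupled distillation loss (hypothetical sketch):
    target-class knowledge = KL on the binary (target vs. rest) split,
    nontarget-class knowledge = KL over the remaining classes, renormalized."""
    ps = softmax(student_logits / T)
    pt = softmax(teacher_logits / T)
    n = np.arange(len(target))
    # Target-class knowledge: binary distribution (p_target, 1 - p_target).
    bs = np.stack([ps[n, target], 1.0 - ps[n, target]], axis=1)
    bt = np.stack([pt[n, target], 1.0 - pt[n, target]], axis=1)
    tckd = np.sum(bt * (np.log(bt) - np.log(bs)), axis=1)
    # Nontarget-class knowledge: mask out the target class and renormalize.
    mask = np.ones_like(ps, dtype=bool)
    mask[n, target] = False
    qs = ps[mask].reshape(len(target), -1)
    qt = pt[mask].reshape(len(target), -1)
    qs = qs / qs.sum(axis=1, keepdims=True)
    qt = qt / qt.sum(axis=1, keepdims=True)
    nckd = np.sum(qt * (np.log(qt) - np.log(qs)), axis=1)
    # T^2 scaling keeps gradient magnitudes comparable across temperatures.
    return float(np.mean(alpha * tckd + beta * nckd) * T * T)
```

In a bidirectional scheme such as the one described, a loss of this shape would be applied in both directions: ensemble-level outputs teaching each lower block, and lower-block outputs feeding back upward.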

Similar works

Authors

Institutions

Topics

Time Series Analysis and Forecasting · Machine Learning in Healthcare · EEG and Brain-Computer Interfaces