OpenAlex · Updated hourly · Last updated: 18 Mar 2026, 21:15

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

SGBoost<sup>+</sup>: Efficient and Privacy-Preserving Vertical Boosting Trees for Federated Outsourced Training and Inference

2025 · 1 citation · IEEE Transactions on Information Forensics and Security
Open full text at the publisher

Citations: 1 · Authors: 8 · Year: 2025

Abstract

Vertical federated learning for boosting trees has gained significant attention due to its ability to enable participants to collaboratively train high-quality models while preserving data privacy. However, existing privacy-preserving vertical boosting tree schemes suffer from high computation and communication costs or potential security vulnerabilities. Recently, SGBoost, a federated outsourced training and inference scheme, was proposed to address these challenges. However, its performance and security still require significant improvements. Therefore, we propose SGBoost<sup>+</sup>, an efficient and privacy-preserving vertical boosting tree framework for federated outsourced training and inference. Building upon the strengths of SGBoost, we introduce an RLWE-based lossless and secure internal node construction and an efficient oblivious inference algorithm to finish the model training and inference, significantly enhancing both security and efficiency. To reduce communication cost, we design a ciphertext compression algorithm for model training, which drastically minimizes data transmission costs. Additionally, we analyze the security of a symmetric encryption scheme, specify the required security conditions and parameters, and optimize our model inference based on its improved and secure version. Detailed security analysis confirms that SGBoost<sup>+</sup> offers strong privacy guarantees. Extensive experiments demonstrate that SGBoost<sup>+</sup> achieves efficient model training and inference with significantly lower computation and communication costs compared to state-of-the-art schemes.
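As context for the abstract: in the *vertical* federated setting it describes, parties hold different feature columns for the same set of samples, and only one ("active") party holds the labels. A minimal sketch of that data partitioning, with purely illustrative names and values (this is not the SGBoost<sup>+</sup> protocol itself):

```python
# Toy illustration of vertical data partitioning (not the SGBoost+ protocol).
# Two parties hold disjoint feature columns for the same sample IDs;
# only the active party holds labels.

# Party A (active): holds labels plus some feature columns.
party_a = {
    "features": {"u1": {"age": 34}, "u2": {"age": 51}, "u3": {"age": 29}},
    "labels":   {"u1": 1, "u2": 0, "u3": 1},
}

# Party B (passive): holds a disjoint set of feature columns.
party_b = {
    "features": {"u1": {"income": 42000},
                 "u2": {"income": 87000},
                 "u3": {"income": 55000}},
}

def joined_row(uid):
    """What a centralized learner would see for one sample. In vertical
    federated learning this plaintext join never happens: parties exchange
    only encrypted gradient statistics during training."""
    row = {}
    row.update(party_a["features"][uid])
    row.update(party_b["features"][uid])
    return row

print(joined_row("u1"))  # {'age': 34, 'income': 42000}
```

Schemes like the one in this paper aim to compute split decisions over such a virtual joined table without any party revealing its columns or labels in plaintext.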


Topics

Privacy-Preserving Technologies in Data · Cryptography and Data Security · Artificial Intelligence in Healthcare and Education