OpenAlex · Updated hourly · Last updated: 17.03.2026, 14:05

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Decoding Drift: Trustworthy Prompt Optimization in High-Stakes AI Environments

2025 · 0 citations
Open full text at the publisher

0 citations · 5 authors · 2025

Abstract

The reliability and trustworthiness of Large Language Models (LLMs) become critical as high-stakes industries such as healthcare, finance, and legal services adopt them in their applications. However, prompt drift (minor, cumulative deviations in model behaviour caused by differences in prompt structure and contextual framing) poses a major threat to output consistency. We introduce trustworthy prompt optimization (TPO), a systematic and comprehensive methodology that mitigates drift in three ways: (1) a drift-sensitive evaluation criterion that measures semantic and policy deviation in LLM responses, (2) a reinforcement-learning-based prompt tuning algorithm that balances performance and interpretability, and (3) a human-in-the-loop calibration module for high-stakes decision settings. Experiments on three benchmark datasets covering clinical, fraud-detection, and legal-reasoning tasks show that the proposed TPO framework improves output stability by up to 27 percent and factual faithfulness by 19 percent over competing prompt-engineering baselines. This work lays the groundwork for ethically sound and reproducible prompt design in safety-sensitive AI products, providing a template for regulatory compliance and ethical assurance in LLM deployment.
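The abstract does not specify how the drift-sensitive evaluation criterion is computed. A minimal sketch of one way to quantify output stability, using mean pairwise lexical similarity across responses to semantically equivalent prompt paraphrases (the function name, the similarity choice, and the sample responses are all illustrative assumptions, not the paper's actual metric):

```python
from difflib import SequenceMatcher
from itertools import combinations

def stability_score(responses):
    """Mean pairwise similarity across responses to paraphrased
    prompts; 1.0 means all responses are lexically identical,
    lower values indicate drift between prompt variants."""
    pairs = list(combinations(responses, 2))
    if not pairs:  # zero or one response: trivially stable
        return 1.0
    return sum(SequenceMatcher(None, a, b).ratio() for a, b in pairs) / len(pairs)

# Hypothetical responses to three paraphrases of the same clinical question.
responses = [
    "The patient should fast for 8 hours before the test.",
    "The patient should fast for 8 hours before the test.",
    "Fasting is not required for this test.",
]
print(stability_score(responses))
```

A semantic variant would swap `SequenceMatcher` for embedding cosine similarity, which better matches the "semantic deviation" the abstract describes, at the cost of an external model dependency.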

Topics

Artificial Intelligence in Healthcare and Education · Ethics and Social Impacts of AI · Adversarial Robustness in Machine Learning