This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Continuous Architecture Assurance: Measuring Governance Decay in Enterprise AI Systems
Citations: 0
Authors: 1
Year: 2026
Abstract
Three major AI vendors made 19 to 24 documented behavioral changes in twelve months, yet most enterprises review vendor AI systems only annually or semi-annually. This paper introduces Governance Decay Rate (GDR), a metric that quantifies the governance gap by dividing behavioral change events by completed reviews. Using public changelog data from OpenAI, Anthropic, and Google Gemini, the analysis demonstrates that all three vendors exceed governance coverage targets under typical review cadences. Deployment at a US healthcare organization identified four systems with unreviewed gaps, triggered an unscheduled review that found measurable behavioral drift on clinical queries (Jensen-Shannon divergence = 0.12, p < 0.01), and produced lasting process changes including a shift from annual to semi-annual governance cadence. Multi-agent pipelines compound the problem beyond any feasible human review frequency, suggesting compositional AI architectures require automated governance mechanisms. GDR was originally developed within the GAIF framework but is evaluated independently here. The open-source toolkit is available at github.com/aman210122/gaif-governance-observatory.
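The abstract's definition of GDR (behavioral change events divided by completed reviews) and its Jensen-Shannon divergence result can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names and the example numbers used in the comments are hypothetical.

```python
import math

def governance_decay_rate(change_events: int, completed_reviews: int) -> float:
    """GDR as defined in the abstract: documented behavioral change
    events divided by completed governance reviews in the same window.
    (Hypothetical helper; the paper's toolkit may differ.)"""
    if completed_reviews == 0:
        return float("inf")  # no reviews at all: coverage gap is unbounded
    return change_events / completed_reviews

def _kl_divergence(p: list[float], q: list[float]) -> float:
    """Kullback-Leibler divergence in bits (log base 2)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p: list[float], q: list[float]) -> float:
    """Jensen-Shannon divergence between two discrete distributions,
    the drift measure the abstract reports (JSD = 0.12 on clinical queries)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * _kl_divergence(p, m) + 0.5 * _kl_divergence(q, m)

# Illustrative reading: 24 documented vendor changes against a single
# annual review in the same window gives GDR = 24.0, i.e. 24 unreviewed
# behavioral changes per completed review.
```

With log base 2, JSD ranges from 0 (identical output distributions) to 1 (disjoint support), so the reported 0.12 indicates modest but statistically significant drift.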
Related Work
The global landscape of AI ethics guidelines
2019 · 4,711 citations
The Limitations of Deep Learning in Adversarial Settings
2016 · 3,884 citations
Trust in Automation: Designing for Appropriate Reliance
2004 · 3,502 citations
Fairness through awareness
2012 · 3,301 citations
AI4People—An Ethical Framework for a Good AI Society: Opportunities, Risks, Principles, and Recommendations
2018 · 3,192 citations