This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Epistemic Authority and Clinical Governance in AI-Integrated Radiography: A Practice-Oriented Narrative Review
Citations: 0
Authors: 1
Year: 2026
Abstract
Artificial intelligence in radiography reconfigures professional accountability, radiomic data stewardship, and clinical governance obligations across the imaging workflow. This narrative review proposes epistemic authority — the radiographer's accountable role as guarantor of the validity, interpretability, and governance of AI-mediated outputs — as the unifying framework for understanding these transformations. AI integration redistributes procedural tasks without transferring professional responsibility: radiographers retain non-delegable accountability for radiation justification, technical–clinical compromise, and patient-centred judgement, particularly in paediatric, frail, trauma, and non-standard contexts where algorithmic quality thresholds conflict with ethical proportionality. Radiomic validity is contingent on acquisition reproducibility; variability in kVp, mAs, reconstruction algorithm, and voxel dimensions generates epistemically unstable feature distributions, positioning technical standardisation as a form of scientific accountability rather than mere protocol compliance. Systemic vulnerabilities — automation bias, professional deskilling, dataset inequity, and algorithmic drift — are compounding, longitudinal threats requiring governance architectures designed around technical–clinical–organisational interfaces. The EU AI Act formally classifies healthcare AI as high-risk; sustainable compliance requires active professional operationalisation, not passive institutional adoption. Epistemic authority is the defining competency of contemporary radiographic practice, and its cultivation — through Critical AI Literacy embedded in pre-registration curricula, continuing professional development, and institutional governance — represents a genuine expansion of professional authority, not a diminution of it.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,611 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,504 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 8,025 citations
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
2019 · 6,835 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations