This is an overview page with metadata for this scientific article. The full article is available from the publisher.
ArtiQ.QC facilitates spirometry quality control in asthma and COPD clinical trials
Citations: 1
Authors: 8
Year: 2021
Abstract
<b>Introduction:</b> Acquiring high-quality spirometry data in clinical trials is important, particularly when using FEV1 or FVC as primary endpoints. In addition to quantitative criteria, the ATS/ERS quality control standards include subjective evaluation, which introduces inter-rater variability. Within clinical trials, over-readers usually review spirometry curves to ensure data quality. <b>Objectives:</b> This study explores the value of artificial intelligence-based quality control software (ArtiQ.QC) to determine spirometry quality in clinical trials. <b>Methods:</b> A total of 2000 spirometry sessions (8258 curves) were randomly selected from Chiesi COPD and asthma clinical trials (1000 sessions per disease). Acceptability according to the 2005 ATS/ERS guidelines was determined by over-reader review and compared with acceptability defined by ArtiQ.QC (Das et al. ERJ 2020). In addition, two respiratory physicians jointly reviewed a subset of curves. <b>Results:</b> ArtiQ.QC agreed with over-readers in 87% of cases, with 93% sensitivity and 93% positive predictive value (PPV). In a subset of data, when ArtiQ.QC and over-readers agreed, the independent physician review also agreed in 47/50 (94%) curves. When ArtiQ.QC and over-reader labels disagreed, the independent physicians agreed with ArtiQ.QC in 103/156 (66%) curves. Inter-rater variability in quality control assessment likely impacts the sensitivity and specificity of the software. <b>Conclusion:</b> ArtiQ.QC software results are comparable to the over-readers' and could assist in the quality assessment of spirometry in clinical trials. By providing immediate and consistent results, using ArtiQ.QC may benefit clinical trial conduct and reduce variability in outcomes.
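For readers unfamiliar with the agreement metrics quoted in the results, the following sketch shows how sensitivity, PPV, and overall agreement are derived from a 2x2 confusion matrix comparing ArtiQ.QC labels against the over-reader reference. The counts used are hypothetical, chosen only to illustrate the formulas; they are not the study's actual data.

```python
def sensitivity(tp: int, fn: int) -> float:
    # Fraction of curves the reference rater (over-reader) labels acceptable
    # that the software also labels acceptable: TP / (TP + FN).
    return tp / (tp + fn)

def ppv(tp: int, fp: int) -> float:
    # Fraction of the software's "acceptable" labels confirmed by the
    # reference rater: TP / (TP + FP).
    return tp / (tp + fp)

def agreement(tp: int, tn: int, total: int) -> float:
    # Overall fraction of curves where both raters assign the same label.
    return (tp + tn) / total

# Hypothetical counts (not from the paper), purely to exercise the formulas.
tp, fp, fn, tn = 930, 70, 70, 430
total = tp + fp + fn + tn

print(f"sensitivity = {sensitivity(tp, fn):.2f}")   # 930/1000 = 0.93
print(f"PPV         = {ppv(tp, fp):.2f}")           # 930/1000 = 0.93
print(f"agreement   = {agreement(tp, tn, total):.2f}")
```

Note that sensitivity and PPV can both be high while overall agreement is lower, because agreement also depends on how the raters classify the (typically smaller) set of unacceptable curves.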
Similar works
The PRISMA 2020 statement: an updated guideline for reporting systematic reviews
2021 · 85,193 citations
Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement
2009 · 82,801 citations
The Measurement of Observer Agreement for Categorical Data
1977 · 76,929 citations
Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement
2009 · 62,792 citations
Measuring inconsistency in meta-analyses
2003 · 61,509 citations