OpenAlex · Updated hourly · Last updated: April 7, 2026, 23:42

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Innovations In Machine Assessment Of Replicability

2026 · 0 citations · 40 authors

Abstract

Automated methods for assessing the replicability of scientific claims offer a scalable complement to replication studies and traditional peer review. Drawing on a large dataset of claims, human judgments, and a limited set of replication outcomes, we developed and evaluated three distinct artificial intelligence systems designed to predict human expert assessments of replicability, using diverse methodologies including synthetic prediction markets, interpretable feature-based modeling, knowledge graph reasoning, and semantic parsing with argument structures. While these systems achieved modest calibration to human judgment distributions, they failed to discriminate between replicable and non-replicable claims. Our findings suggest that while machine assessments of research replicability may complement human reasoning, their current performance limitations and potential for bias demand careful evaluation before real-world application.
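The abstract's distinction between calibration and discrimination is worth unpacking: a system can match the overall distribution of human judgments (calibration) while being unable to rank replicable claims above non-replicable ones (discrimination). The sketch below is purely illustrative and not taken from the paper; the toy labels and scores are invented to show that a constant base-rate predictor is calibrated on average yet has chance-level AUC.

```python
def auc(labels, scores):
    """AUC: probability that a random positive outranks a random negative (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical toy data: 1 = claim replicated, 0 = did not replicate.
labels = [1, 1, 1, 0, 0, 1, 0, 1, 0, 1]
base_rate = sum(labels) / len(labels)  # 0.6

# Predicting the base rate for every claim: calibrated in aggregate,
# but it cannot separate replicable from non-replicable claims.
constant = [base_rate] * len(labels)

# A (hypothetical) informative predictor that ranks positives higher.
informative = [0.9, 0.8, 0.7, 0.2, 0.3, 0.8, 0.1, 0.9, 0.2, 0.7]

print(auc(labels, constant))     # 0.5 (chance-level discrimination)
print(auc(labels, informative))  # 1.0 (perfect ranking on this toy data)
```

This mirrors the failure mode described in the abstract: systems whose output distribution tracks human judgments can still perform no better than chance at ranking individual claims.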
