OpenAlex · Updated hourly · Last updated: 22.03.2026, 03:12

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

AuroraMem: Towards Eternal Memory in Conversational AI

2025 · 0 citations
Open full text at publisher

0 citations · 3 authors · Year: 2025

Abstract

Conversational agents built on large language models (LLMs) exhibit strong reasoning and dialogue abilities but remain fundamentally constrained by the lack of persistent, structured memory. Without principled mechanisms to extract, store, and reuse salient information, these systems suffer from amnesia-like behavior, leading to incoherence, redundancy, and degraded long-term performance. We present AuroraMem, a modular memory framework designed to systematize the full memory lifecycle: extraction, write, retrieval, and usage. AuroraMem introduces a dual-phase design: an add phase that distills salient facts, generates dynamic tags, and incorporates code-aware ingestion for structured storage; and a search phase that executes parallel hybrid retrieval, salience- and recency-aware ranking, and diversification to provide high-value context efficiently. Evaluations on two challenging benchmarks, LoCoMo and LongMemEval, demonstrate that AuroraMem consistently outperforms strong baselines, including full-context prompting, retrieval-augmented generation (RAG), and prior memory-augmented agents, achieving higher accuracy, lower latency, and improved judged quality. Ablation studies further reveal substantial token compression and robustness across diverse LLM backbones. These results establish AuroraMem as a scalable, interpretable, and extensible approach to long-horizon conversational memory, moving closer to the vision of conversational AI endowed with enduring and adaptive memory.
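The dual-phase lifecycle the abstract describes (an add phase that stores distilled, tagged facts, and a search phase that ranks by relevance, salience, and recency before diversifying results) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: all names (`MemoryStore`, `add`, `search`), the lexical-overlap stand-in for hybrid retrieval, and the additive scoring formula are assumptions for exposition only.

```python
import math
import time

class MemoryStore:
    """Toy sketch of a dual-phase memory: an add phase storing distilled
    facts with tags, and a search phase ranking by relevance + salience +
    recency, then diversifying retrieved facts by tag."""

    def __init__(self, half_life_s=3600.0):
        self.half_life_s = half_life_s  # recency decay half-life, seconds
        self.facts = []

    def add(self, text, tags, salience=1.0, ts=None):
        # Add phase: in the real system a fact would be distilled from
        # dialogue and tagged dynamically by an LLM; here the caller
        # supplies both directly.
        self.facts.append({
            "text": text,
            "tags": set(tags),
            "salience": salience,
            "ts": time.time() if ts is None else ts,
        })

    def search(self, query, k=3, now=None):
        # Search phase: token overlap stands in for hybrid retrieval.
        now = time.time() if now is None else now
        q_tokens = set(query.lower().split())
        scored = []
        for f in self.facts:
            overlap = len(q_tokens & set(f["text"].lower().split()))
            recency = math.exp(-math.log(2) * (now - f["ts"]) / self.half_life_s)
            scored.append((overlap + f["salience"] + recency, f))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        # Diversification: skip facts whose tags were already covered.
        picked, seen_tags = [], set()
        for _, f in scored:
            if f["tags"] & seen_tags:
                continue
            picked.append(f["text"])
            seen_tags |= f["tags"]
            if len(picked) == k:
                break
        return picked
```

A usage example under the same assumptions: a fact that matches the query, carries higher salience, and shares no tags with earlier picks is ranked first and retained after diversification.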

Related works

Authors

Institutions

Topics

Topic Modeling · Multimodal Machine Learning Applications · Artificial Intelligence in Healthcare and Education