OpenAlex · Updated hourly · Last updated: 21 Mar 2026, 22:37

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Passing the Torch: Leveraging Generative Artificial Intelligence to Preserve and Retrieve Knowledge for the Next Generation

2025 · 0 citations · Propellants, Explosives, Pyrotechnics · Open Access
Open full text at publisher

Citations: 0

Authors: 1

Year: 2025

Abstract

Nearly a decade ago, two experimentalists entered my office, puzzled by an unfamiliar observation during an exploding bridge wire experiment. The electrical current trace they had obtained had a surprising feature in the signal: a brief preliminary current surge followed by a drop to nearly zero current, and then a current resurgence after a modest delay. As they discussed their findings, I recalled documents published nearly 50 years earlier that detailed the electrical "current pause" phenomenon and the underlying physics. This moment underscored a critical issue we faced then and even more so today: the risk of information loss as seasoned experts retire and valuable knowledge fades from collective memory. In the modern era of information overload, the challenge of capturing, retaining, and passing on critical insights poses a significant risk to the continuity of our collective memory while adversely impacting future scientific developments. As subject matter experts (SMEs) retire, transferring decades of knowledge to new staff, each with diverse learning styles shaped by contemporary academic experiences, becomes a significant challenge. How people learned a decade ago differs significantly from today's methods. This, coupled with staff turnover averaging every 5 years, poses a technical risk and potentially increases the cost of our research. How do you retain and easily retrieve critical knowledge after SMEs retire? The departure of experienced SMEs jeopardizes knowledge retention. These individuals carry invaluable insights, intuition, and practical wisdom that cannot be easily documented and accessed. Responses from the community to the challenges of knowledge loss have been limited. Suggestions such as hosting technical seminars for new staff or encouraging colleagues to read seminal books or papers fall short of addressing the systemic issues at play. A growing stockpile of valuable information often resides on hard drives belonging to retired senior staff.
Additionally, SMEs have been asked to create final presentations that may not be widely disseminated or utilized. Fortunately, technological innovations present new avenues for preserving and retrieving knowledge. Large Language Models (LLMs) are advanced artificial intelligence systems, trained on vast amounts of data, that are designed to understand and generate language in familiar, human-like interactions. They utilize deep learning techniques to predict and produce coherent responses, making them valuable tools for tasks such as natural language processing, content generation, image and video analysis, and information retrieval. However, relying solely on LLMs does not adequately capture the full breadth of knowledge within a given field. An LLM is only as good as the data on which it was trained, and new information cannot be easily added to the model. Additionally, if the user poses a query for which the LLM has not been trained, the quality of the predicted answer suffers significantly, resulting in a 'hallucination'. Retrieval-augmented generation (RAG) techniques offer a promising alternative to complete reliance on an LLM by utilizing external knowledge bases consisting of chunks of relevant information drawn from a corpus of documents or imagery. Using semantic similarity techniques, a user query is first compared against chunks of text in a vector datastore. The most relevant chunks are then extracted and sent to the LLM to construct a synthesized response. Traditional RAG approaches have been used to satisfactorily answer simple question-and-answer queries requiring a single piece of evidence. More advanced generative artificial intelligence (AI) techniques leverage structured representations of information chunks (e.g., knowledge graphs). These techniques are beginning to provide adequate solutions for complex queries that require connecting multiple pieces of information through intermediate steps (e.g., multi-hop) to achieve the goal of reasoning.
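The retrieve-then-generate flow described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the corpus chunks are hypothetical, and a toy bag-of-words "embedding" stands in for the learned sentence-embedding model a real vector datastore would use.

```python
import math
from collections import Counter

# Toy "embedding": bag-of-words term counts. A production RAG system would use
# a learned embedding model; this stand-in only illustrates the retrieval flow.
def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical document chunks standing in for an indexed corpus.
chunks = [
    "The current pause occurs when the bridge wire vaporizes and resistivity spikes.",
    "Stability tests for propellants follow standardized thermal protocols.",
    "Exploding bridge wire detonators use a rapid current surge to initiate.",
]
index = [(c, embed(c)) for c in chunks]

def retrieve(query, k=2):
    # Rank stored chunks by similarity to the query and keep the top k.
    qv = embed(query)
    ranked = sorted(index, key=lambda ce: cosine(qv, ce[1]), reverse=True)
    return [c for c, _ in ranked[:k]]

def build_prompt(query):
    # Retrieved chunks are concatenated into the LLM prompt as grounding
    # context; the model is instructed to answer only from that context.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context."

print(build_prompt("Why did the current pause in the bridge wire experiment?"))
```

The key design point is that the LLM never needs retraining when the corpus grows: adding a retired SME's documents means only embedding and indexing new chunks.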
The energetic materials and initiator communities can harness generative AI to create a virtual subject matter expert (SME) with access to a comprehensive local or global knowledge base comprising various data types, including text, images, and videos. Using the techniques mentioned above, the virtual SME would possess the flexibility to respond to a wide range of queries, such as "What is…?", "How do I…?", and "Can you explain or analyze this?" Additionally, it could provide historical analyses and assist in developing tailored learning curricula, thereby enhancing the educational experience and facilitating knowledge transfer within the community. In the case of the unusual "current pause" experimental result, researchers could either query the virtual SME to cite previous cases where "the current paused" or ask it to analyze an image of a current trace and obtain an explanation of the behavior along with cited sources for further learning. Additional examples of how an energetic materials virtual SME could be used include: "What are the properties of material A compared to material B?", "What are the best practices for conducting a stability test?", "What are the common causes for failures in similar experiments?", "What topics should be included for new researchers in energetics?", "Who are the leading experts in the field of propellant formulation?", or "What safety measures are recommended for handling a specific type of explosive?" The combination of an LLM and a structured knowledge base offers users an effectively unlimited range of question types. The goal of the virtual SME is to serve as a risk reduction strategy for community knowledge and as an effective means of passing information to the next generation. The preservation of knowledge is not merely an academic concern; it is a vital necessity for the continued advancement of our field.
By addressing the challenges of information overload and the retirement of experienced SMEs, fostering community collaboration, and embracing technological innovations, we can create a sustainable ecosystem for knowledge retention and retrieval that benefits all stakeholders. The future of our community's knowledge depends on our collective commitment to safeguarding its rich legacy.

Similar works

Authors

Institutions

Topics

Reservoir Engineering and Simulation Methods · Artificial Intelligence in Healthcare and Education