OpenAlex · Updated hourly · Last updated: 05.05.2026, 07:19

This is an overview page with metadata about this scholarly work. The full article is available from the publisher.

To Study the "Development of a Self-Hosted, Offline AI Chatbot using Llama 3, Ollama, and FastAPI"

2026 · 0 citations · Zenodo (CERN European Organization for Nuclear Research) · Open Access
Open full text at publisher

Citations: 0
Authors: 1
Year: 2026

Abstract

Artificial Intelligence (AI) chatbots traditionally rely on cloud-based large language models (LLMs), limiting their usability in environments with privacy concerns, restricted internet access, or high latency. This research presents the development of a self-hosted, completely offline AI chatbot using Llama 3, Ollama, and FastAPI. The system is designed to run locally while maintaining strong performance in natural language understanding, multimodal input processing, and real-time response streaming. The architecture integrates an efficient model-serving layer (Ollama) with a lightweight backend (FastAPI) and supports text and image-based queries. Experimental evaluation shows that the offline chatbot achieves competitive accuracy, low latency, and strong privacy guarantees. This work demonstrates a practical framework for organizations seeking cost-efficient, secure, and internet-independent AI solutions.
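The abstract describes an architecture in which a FastAPI backend forwards user queries to a locally running Ollama model-serving layer and streams the response back. The paper does not include code, so the following is only a minimal sketch of the Ollama side of such a pipeline, using just the Python standard library. The endpoint, port, and JSON fields are Ollama's documented REST API defaults; the function names and the assumption of a stock local install are illustrative.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: stock install on the same host).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a streaming /api/generate request for a local Ollama server."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": True}
    ).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def stream_answer(prompt: str) -> str:
    """Send the prompt and concatenate the streamed NDJSON chunks.

    Ollama streams one JSON object per line; each carries a partial
    "response" string, and the final object has "done": true.
    """
    chunks = []
    with urllib.request.urlopen(build_request(prompt)) as resp:
        for line in resp:
            part = json.loads(line)
            chunks.append(part.get("response", ""))
            if part.get("done"):
                break
    return "".join(chunks)
```

In the architecture the abstract outlines, a FastAPI route would wrap a generator like `stream_answer` (e.g. via `StreamingResponse`) so tokens reach the client as they are produced rather than after the full completion, which is what keeps perceived latency low in an offline deployment.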

Topics

AI in Service Interactions · Big Data and Digital Economy · Artificial Intelligence in Healthcare and Education