This is an overview page with metadata for this scholarly work. The full article is available from the publisher.
GPT (Generative Pre-Trained Transformer)— A Comprehensive Review on Enabling Technologies, Potential Applications, Emerging Challenges, and Future Directions
Citations: 469
Authors: 12
Year: 2024
Abstract
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in natural language processing, propelling us toward machines that can understand and communicate using language in a manner that closely resembles that of humans. GPT is based on the transformer architecture, a deep neural network designed for natural language processing tasks. Due to their impressive performance on natural language processing tasks and their ability to converse effectively, GPT models have gained significant popularity among researchers and industrial communities, making them among the most widely used and effective models in natural language processing and related fields, which motivated us to conduct this review. This review provides a detailed overview of GPT, including its architecture, working process, training procedures, enabling technologies, and impact on various applications. We also explore the potential challenges and limitations of GPT, and we discuss potential solutions and future directions. Overall, this paper aims to provide a comprehensive understanding of GPT, its enabling technologies, their impact on various applications, emerging challenges, and potential solutions.
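To make the transformer architecture the abstract refers to concrete, the following is a minimal sketch of a decoder-only transformer block in the GPT style, written in PyTorch. This is our illustration, not code from the paper under review; the dimensions, layer names, and pre-norm layout (used in later GPT variants) are assumptions.

```python
# Minimal sketch of one GPT-style decoder block (illustrative only;
# sizes and the pre-norm layout are assumptions, not from the paper).
import torch
import torch.nn as nn

class GPTBlock(nn.Module):
    """One pre-norm decoder block: masked self-attention + feed-forward."""

    def __init__(self, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask: True entries block attention to future positions.
        seq_len = x.size(1)
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=x.device),
            diagonal=1,
        )
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + attn_out               # residual connection around attention
        x = x + self.ff(self.ln2(x))   # residual connection around feed-forward
        return x

# Usage: a batch of 2 sequences, 10 tokens each, embedding size 256.
block = GPTBlock()
hidden = torch.randn(2, 10, 256)
print(block(hidden).shape)  # torch.Size([2, 10, 256])
```

A full GPT model stacks many such blocks between a token/position embedding layer and an output projection over the vocabulary; the causal mask is what makes the model generative, since each position can only condition on earlier tokens.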
Related Works
Deep Residual Learning for Image Recognition
2016 · 215,889 citations
U-Net: Convolutional Networks for Biomedical Image Segmentation
2015 · 85,845 citations
ImageNet classification with deep convolutional neural networks
2017 · 75,547 citations
Very Deep Convolutional Networks for Large-Scale Image Recognition
2014 · 75,404 citations
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
2016 · 52,596 citations