This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Role of chat-generative pre-trained transformer (ChatGPT) in anaesthesia: Merits and pitfalls
Citations: 4
Authors: 4
Year: 2023
Abstract
Dear Editor, Large language models (LLMs) are artificial intelligence models intended to imitate human language-processing abilities. The generative pre-trained transformer (GPT) is a form of LLM created by OpenAI (San Francisco, CA, USA). The latest model (GPT-3), released in 2020, was trained on a massive text dataset (570 GB) and has 175 billion parameters, allowing it to generate realistic and coherent text.[1,2] The potential uses of this technology in the field of anaesthesia are far-reaching [Table 1]. It could be used to educate patients, provide perioperative instructions, and aid the anaesthesiologist in clinical decision-making, such as selecting the type and dose of an anaesthetic agent based on patient data and the surgical procedure.[3,4] It could reduce the documentation load, allowing anaesthesiologists to focus on critical clinical matters.[3,4] ChatGPT could also potentially be used in the emergency area to triage patients, leading to faster delivery of healthcare and more efficient use of manpower and resources. It could also enable communication, allowing patients to receive care or undergo pre-anaesthetic evaluation remotely, reducing the need for in-person visits.[4]

Table 1: Potential applications and major pitfalls of ChatGPT

ChatGPT may also help non-native English speakers overcome language barriers, thus enhancing research equity and versatility.[5,6] ChatGPT also has the potential to conduct surveys and track outcomes over extended periods.[4] It even has the potential to write entire manuscripts [Supplementary File 1].
It may also facilitate the dissemination of scientific research by converting complex findings into language that is more comprehensible to the general community.[1,2] There are myriad concerns and challenges regarding the integration of this technology into routine anaesthesia practice [Table 1].[3,5,6] ChatGPT-3 can generate inaccurate content, with a risk of hallucination (factually incorrect information that appears scientifically credible), leading to untoward consequences.[4,6] ChatGPT lacks the medical proficiency and background needed to fully understand the complex causal relationships between different diseases and treatments.[1] Because ChatGPT was trained on datasets generated before 2021, it cannot presently be used as an up-to-date source for literature review. There is also potential for research fraud and copyright issues, as ChatGPT is not an acceptable author under the existing Committee on Publication Ethics and International Committee of Medical Journal Editors authorship guidelines.[5] Artificial intelligence can only assist and complement an experienced anaesthesiologist, and thorough human supervision of ChatGPT is a must.

Financial support and sponsorship: Nil.

Conflicts of interest: There are no conflicts of interest.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,493 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,377 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,835 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,555 citations