
The great promise of generative pretrained transformer models in healthcare: A glimpse at the future



Pre-trained models enable AI/ML implementations at scale. Among them, generative pre-trained transformer models such as GPT-3 have become highly popular in industry, and implementations through APIs and service models are increasingly used and discussed. Yet implementations in the healthcare domain remain limited, even though there are many opportunities to explore. In this post, I will share an excerpt from our recently published study to illustrate how to leverage GPT as a service in healthcare, with a specific focus on chatbot triaging and text summarization.


GPT-3 as-a-service: Chatbot triaging and triage text summary services in healthcare

The power of meaningful text generation by GPT-3 makes it an ideal tool for human-machine conversations, especially those conducted via chatbots. In our case, we illustrate the use of GPT-3 within a hospital network. In this hypothetical example, the hospital provides a chatbot triaging mechanism for incoming patients to reduce overhead at clinics and increase the safety and quality of care during the COVID-19 pandemic. The chatbot is connected to the hospital network and combined with a triage text summary service whose output is reviewed and then stored in the electronic health record (Figure). Putting aside the front-end details of this workflow, this use case outlines a typical implementation of GPT-3 as a service within a health system.
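As a concrete illustration of the request-formatting step in this workflow, the sketch below shows how a hospital system might package a chatbot triage conversation into a request for a GPT service. The `TriageRequest` envelope, its field names, and the task labels are all hypothetical; a real deployment would follow the hospital's API gateway conventions and the vendor's actual request schema.

```python
import json
from dataclasses import dataclass, field

@dataclass
class TriageExchange:
    """One question/answer turn in the chatbot triage conversation."""
    question: str
    answer: str

@dataclass
class TriageRequest:
    """Hypothetical envelope sent from the hospital system to the
    GPT-3-as-a-Service platform through the API gatekeeper."""
    patient_id: str   # internal EHR identifier, not raw patient identifiers
    task: str         # e.g. "question_answering", "translation", "summarization"
    exchanges: list = field(default_factory=list)

    def to_prompt(self) -> str:
        """Flatten the conversation into a text prompt the model can interpret."""
        return "\n".join(f"Q: {e.question}\nA: {e.answer}" for e in self.exchanges)

    def to_payload(self) -> str:
        """Serialize to JSON for transfer as a request."""
        return json.dumps({
            "patient_id": self.patient_id,
            "task": self.task,
            "prompt": self.to_prompt(),
        })

# Example: a short COVID-19 screening conversation
req = TriageRequest(patient_id="pt-001", task="summarization")
req.exchanges.append(TriageExchange("Do you have a fever?", "Yes, 38.5 C since yesterday"))
req.exchanges.append(TriageExchange("Any shortness of breath?", "No"))
payload = json.loads(req.to_payload())
```

Keeping the task label explicit in the payload is what lets the service side allocate the request to the right model capability, as described next.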

GPT-3 use case (chatbot triaging and patient note summarization)


In this example, triage could be initiated by a patient or by the hospital to conduct a health screening. The triage is operationalized through a chatbot mechanism over a front-end application, which could be a patient portal app, voice assistant, phone call, or SMS text messaging. Once a connection is established, the hospital system formulates GPT-3 requests by gathering patient health information and formatting it so that it is interpretable by the GPT-3 model.

Within the secure hospital network, GPT-3 sits outside of the EHR and is provided as the "GPT-3-as-a-Service" platform. The application programming interface enables interoperability and acts as a gatekeeper for the data transfer of requests and responses. Once a request is received, the "GPT-3-as-a-Service" platform preprocesses the data and requests, allocates the tasks to be completed, produces outputs in an interpretable format, and sends the outputs to users. The type of tasks allocated depends on the request; in our case, these are question answering, text generation or culturally appropriate language translation, and text summarization. The response is sent back to the EHR system and then to the front-end application.

At the end of triage, similar to an after-visit summary, the conversation text is summarized. To reduce the additional clinical burden of reading the whole conversation, GPT-3 summarizes the text (similar to a digital scribe) and stores it in the patient's health record.

To avoid or address potential biases, correct errors, and increase control over patient data use and the model, a human-in-the-loop model can be implemented, either through a report-back mechanism at the front end or by giving the clinical team oversight of the GPT-3-integrated process in the hospital EHR system at the back end. Furthermore, the error corrections and adjustments in the text can be used to fine-tune the GPT-3 model to increase its accuracy and effectiveness.
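The back-end human-in-the-loop step can be sketched as follows. Here the `summarize` function is a stub standing in for the actual GPT-3 summarization call, and the review-and-correction flow is a hypothetical illustration of how a clinician's edits could gate what reaches the record and be captured for later fine-tuning.

```python
def summarize(conversation: str) -> str:
    """Stub for the GPT-3 summarization call; a real system would send
    the conversation through the GPT-3-as-a-Service API instead."""
    first_line = conversation.splitlines()[0]
    return f"Triage summary (draft): {first_line} ..."

# Clinician-corrected (conversation, summary) pairs kept for fine-tuning
fine_tuning_examples = []

def review_and_store(conversation: str, clinician_edit=None) -> str:
    """Human-in-the-loop: the draft summary is stored in the EHR only
    after clinician review; any correction is retained for fine-tuning."""
    draft = summarize(conversation)
    final = clinician_edit if clinician_edit is not None else draft
    if clinician_edit is not None and clinician_edit != draft:
        fine_tuning_examples.append((conversation, clinician_edit))
    return final  # this is what gets written to the patient's record

convo = "Patient reports fever of 38.5 C since yesterday.\nNo shortness of breath."
stored = review_and_store(convo, clinician_edit="Fever 38.5 C x1 day; no dyspnea.")
```

The design choice here is that model output never reaches the record unreviewed, and every correction doubles as a training signal, which is the feedback loop the fine-tuning point above relies on.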
To execute this use case in a real-world setting, health care practitioners and decision makers should consider and address the following operational and implementation challenges.

In the study, we further discussed the implementation considerations [(1) processing needs and information systems infrastructure, (2) operating costs, (3) model biases, and (4) evaluation metrics] and the operational factors that drive the adoption of GPT-3 in the US health care system [(1) ensuring Health Insurance Portability and Accountability Act compliance, (2) building trust with health care providers, and (3) establishing broader access to the GPT-3 tools]. To read more, see https://medinform.jmir.org/2022/2/e32875/


