What can doctors and patients do with ChatGPT? | Expert Opinion
It’s no secret: everyone Googles their medical concerns. Now there’s a new player in town, ChatGPT. But be careful using AI for health questions.
Medical information is no longer the exclusive purview of medical professionals — anyone with internet access can consult “Dr. Google.”
Now there’s a new virtual medical assistant in town: ChatGPT, a chatbot that uses artificial intelligence to respond to any question.
The tool has strong entertainment value, acing a Wharton professor’s MBA test and spinning yarns about Gritty. It’s also found concrete uses in real estate, retail, and health care.
As physicians, we see tremendous promise in ChatGPT, but also have serious concerns about how we and our patients may interact with it.
ChatGPT is not currently HIPAA compliant, which limits its use in treating patients. Even so, we can use it to search the medical literature and other databases, speeding up our ability to diagnose patients and develop treatment plans.
One day, we may be able to use ChatGPT’s voracious appetite for information to manage and organize vast amounts of patient information, and enable us to quickly search for needed details in our (somewhat clunky) electronic medical record systems.
For patients, ChatGPT can make finding health information much easier. Whereas Dr. Google connects users to web pages where they must search for answers themselves, ChatGPT does the searching for them and can engage in a personalized conversation about a topic of interest. ChatGPT may also help patients determine when to seek a real human doctor.
Health systems and physicians could work with the software to build strategies that help patients understand what symptoms or situations require emergency evaluation in a traditional ER versus a trip to a local urgent care, a call to their primary doctor, or just some rest at home.
At the same time, doctors worry about over-reliance on a technology that makes mistakes and sometimes misinterprets complex medical information. AI algorithms may also amplify racial, ethnic, and gender biases.
We asked ChatGPT, “When used for medical purposes, do you prefer to be called Dr. ChatGPT?” This was its response:
No, as an AI language model, I do not have the credentials or qualifications to be a medical doctor, nor am I capable of diagnosing or treating medical conditions. Therefore, I should not be referred to as Dr. ChatGPT in a medical context. It is important to consult a qualified medical professional for any medical concerns or advice.
The future of AI in medicine remains exciting, and we need to lean into this revolutionary technology to ensure that we harness it in effective, safe, and equitable ways.
It has enormous potential to support the health of individuals and communities, but also real potential to exacerbate current disparities or create new ones. Patients and their doctors need to adapt and learn alongside models like ChatGPT to understand how we can grow together to support human health.