American Association for Physician Leadership

Artificial Intelligence and ChatGPT: Physician Friend or Foe?

Neil Baum, MD


David F. Mobley, MD


May 9, 2024


Healthcare Administration Leadership & Management Journal


Volume 2, Issue 3, Pages 107-109


https://doi.org/10.55834/halmj.7392561549


Abstract

ChatGPT is a language model trained on massive volumes of Internet data. It generates human-like text and can perform various roles in healthcare and health research. The program enables users to tap into vast amounts of information and to understand words in context. ChatGPT draws on large amounts of data and computational techniques to make predictions and string words together meaningfully. This article discusses the history of artificial intelligence, provides an example of using artificial intelligence to obtain prior authorization, and discusses the ethical considerations of this exciting new technology.




The past 500 years have seen a number of events that have changed the world. The invention of the printing press by Johannes Gutenberg in the 15th century might be considered the beginning of education for large masses of the population. Next, the steam engine appeared in 1712 and made moving goods and people easier and more economical. In 1879, Thomas Edison developed the first commercially practical incandescent light bulb, which began to illuminate the world and dramatically improved quality of life. This was followed by Henry Ford and the assembly line, which put automobile ownership within reach of the middle class. Then came the Internet, which originated in the 1960s and achieved widespread use in the 1990s, revolutionizing nearly every area of business, medicine, industry, and our personal lives. In November 2022, OpenAI (OpenAI.com) released ChatGPT. This application of artificial intelligence (AI) will change the delivery of healthcare throughout the world. ChatGPT can be used to manage both clinical and nonclinical aspects of medical practices. This article presents a brief history of ChatGPT; discusses its potential medical applications, including obtaining a prior authorization; and considers the ethical and legal concerns of using AI in the healthcare profession.

A BRIEF HISTORY OF CHATGPT

ChatGPT was developed by OpenAI and launched as a prototype on November 30, 2022. By mid-December 2022, the program had over one million users. OpenAI allows the public to use and experiment with the technology at no charge; as with Grammarly.com, the free version lets users try the technology, while the company monetizes a paid upgrade, ChatGPT Plus, which currently costs $20 per month. OpenAI’s market valuation in 2023 was reported at $29 billion.(1)

APPLICATIONS AND POTENTIAL USES OF CHATGPT IN MEDICAL PRACTICES

An easy way to start the ChatGPT experience is to use it for mundane tasks such as writing back-to-work certificates, medical exemptions from jury duty, and letters requesting that insurance companies pay for patients’ expensive medications. In this role, ChatGPT functions as a high-level personal assistant, speeding up bureaucratic tasks and increasing time for patient interaction.

ChatGPT also could assist with more serious medical activities, such as triage algorithms that help decide which patients receive kidney dialysis or intensive care beds, a critical task in settings where resources are limited. The technology also could be used to help enroll participants in clinical trials. The potential in the medical field is almost endless.

It is too early to know all the ethical implications of adopting ChatGPT in healthcare and research. The more this technology is used, the more precisely the implications will be understood. However, questions regarding potential risks and oversight of ChatGPT in medicine will inevitably be part of future conversations.

One administrative task that ChatGPT can help automate is obtaining prior authorization, which is time-consuming, accounts for many frustrating hours of uncompensated physician time each week, and is a veritable nightmare for most medical practices. ChatGPT can streamline the process: what usually takes 30 to 60 minutes per prior authorization can be reduced to seconds. Here is an example that could be used in a medical practice:

A urologist wants to write a letter to a patient’s insurance company requesting prior authorization for a multiparametric MRI (mpMRI) to aid in the diagnosis of prostate cancer. They open ChatGPT at OpenAI.com.

The physician or an appropriately trained assistant provides the patient’s information but does not include the patient’s name or any other identifying information, because they cannot be sure the site is encrypted or whether entering protected health information would risk a HIPAA violation. They give ChatGPT the following instruction:

Write a letter to United Health regarding patient John Doe (membership ID XXXXXXXX, birthdate XX/XX/XXXX), who has an elevated PSA of 8.3 ng/mL and a free/total PSA ratio of 19% (normal is greater than 25%). I am requesting a multiparametric MRI and a prostate biopsy to complete his evaluation for prostate cancer. Please provide recent references regarding the appropriateness of this procedure. Signed Dr. X, MD.

The letter was generated in less than 45 seconds; another 60 seconds were required for the nurse or medical assistant to add the patient’s name and identification number.

This was the letter created by ChatGPT:

Dear United Health,

I am writing to request approval for a multiparametric MRI and prostate biopsy for patient John Doe, who holds a membership ID of XXXXXXXX and was born on XX/XX/XXXX.

Upon examination, Mr. Doe’s PSA levels were elevated at 8.3 ng/mL, with a free/total PSA ratio of 19% (normal is greater than 25%). Given these findings, I strongly recommend a multiparametric MRI and prostate biopsy to assess the condition of Mr. Doe’s prostate gland and determine if further treatment is necessary.

Could you please approve this study for Mr. Doe’s benefit and make sure he receives the care he needs?

I am including recent references that show how important it is to manage men with an elevated PSA level. These are references from peer-reviewed literature on the value of mpMRI in managing patients with elevated PSA levels. It is essential to remain up-to-date on the latest advancements and recommendations.

Thank you for your attention to this matter. Please feel free to reach out to me with any questions or additional information.

Sincerely,

Dr. X, MD

Recent references on mpMRI

Multiparametric magnetic resonance imaging (mpMRI) helps diagnose prostate cancer. Here are some references that can be useful:

  1. National Comprehensive Cancer Network (NCCN) guidelines for prostate cancer. The NCCN guidelines provide recommendations for the use of mpMRI in diagnosing and managing prostate cancer. The guidelines can be found at www.nccn.org/professionals/physician_gls/pdf/prostate.pdf.

  2. European Association of Urology (EAU) guidelines for prostate cancer. The EAU guidelines also provide recommendations for the use of mpMRI in diagnosing and managing prostate cancer. The guidelines can be found at https://uroweb.org/guideline/prostate-cancer/.

  3. American Urological Association (AUA) guidelines for prostate cancer. The AUA guidelines provide recommendations for using mpMRI in the initial diagnostic evaluation of men with suspected prostate cancer. The guidelines can be found at www.auanet.org/guidelines/prostate-cancer-diagnosis-and-treatment-clinically-localized.

  4. Prostate Imaging-Reporting and Data System (PI-RADS). PI-RADS is a standardized reporting system developed by the American College of Radiology (ACR) to provide a consistent reporting method for prostate mpMRI findings. The latest version of PI-RADS can be found at www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/PI-RADS.

  5. European Society of Urogenital Radiology (ESUR) guidelines for prostate MRI. The ESUR guidelines provide recommendations for the use of mpMRI in the diagnosis and management of prostate cancer. The guidelines can be found at www.esur.org/guidelines/genitourinary-radiology/prostate-mri/.

These references provide comprehensive information on using mpMRI for diagnosing prostate cancer, including indications, interpretation of findings, and recommendations for clinical practice.

This prior authorization request was faxed to the insurance company, and approval was obtained within 24 hours. Approval could have taken days or weeks if a nurse or physician assistant had made the request by phone. This is just one example of ChatGPT’s effectiveness in the medical office setting.
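For practices that eventually want to build this step into their office software rather than typing into the web interface, the same request can be scripted against OpenAI’s API. The sketch below is a minimal illustration under stated assumptions, not the workflow described above: it presumes the practice holds an OpenAI API key, the model name is only an example, and the placeholder identifiers are left in place so that no protected health information leaves the office.

    # Minimal sketch of scripting the prior-authorization request with the
    # OpenAI Python library (pip install openai). Illustrative only; the
    # example in this article used the ChatGPT web interface.
    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    # The prompt mirrors the instruction shown above; placeholders (XXXXXXXX,
    # XX/XX/XXXX) are kept so no protected health information is transmitted.
    prompt = (
        "Write a letter to United Health regarding patient John Doe "
        "(membership ID XXXXXXXX, birthdate XX/XX/XXXX), who has an elevated "
        "PSA of 8.3 ng/mL and a free/total PSA ratio of 19% (normal is "
        "greater than 25%). I am requesting a multiparametric MRI and a "
        "prostate biopsy to complete his evaluation for prostate cancer. "
        "Please provide recent references regarding the appropriateness of "
        "this procedure. Signed Dr. X, MD."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # example model name; use whichever model the practice licenses
        messages=[{"role": "user", "content": prompt}],
    )

    # The output is a draft: staff add the real identifiers, and the physician
    # verifies the clinical content and references before anything is faxed.
    print(response.choices[0].message.content)

As with the web interface, the script produces only a draft; the physician’s obligation to verify the letter and its references before it leaves the office is unchanged.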

The possibilities with ChatGPT are nearly limitless. Other AI platforms are being developed, and as physicians, we will constantly be discovering new ways AI can assist and improve patient care. Algorithms for differential diagnosis in rare or obscure conditions will be developed. A physician confronted with a confounding clinical situation may be able to give AI the symptoms and objective findings and rapidly receive help in preparing a differential diagnosis. Treatment algorithms likewise will be developed and become rapidly available. The uses in clinical practice will be limited only by our imagination. AI will continue to improve, benefiting clinicians and patients alike.

POTENTIAL ETHICAL, MORAL, PRIVACY, AND LEGAL CONSIDERATIONS WITH AI AND CHATGPT

In the world of education, professors and teachers are being inundated with student work that is, in fact, generated by ChatGPT. Algorithms are being developed to detect submissions that are not the students’ original work. Students who use ChatGPT to help with research face the challenge of using the generated information appropriately, as an aid rather than as the author of their papers.(2)

In the medical field, we must ensure we don’t fall prey to “the easy way out.” New technology, especially technology in its infancy with which we are only now developing familiarity, leaves open moral, ethical, and perhaps legal questions yet to be uncovered. We are unaware of any lawsuits against medical practices regarding documents created with ChatGPT; still, caution with these AI-developed documents is encouraged. For example, if a physician asks ChatGPT, “What is the best first-line treatment for……?” and relies on the recommendation without confirming its accuracy, the result might be improper patient care, with all the attendant liabilities. High-quality, accurate data is the cornerstone of how we practice medicine and provide evaluation and treatment options to our patients. Physicians can now use ChatGPT to help obtain clinical information and as an aid in making medical decisions. Still, data from peer-reviewed published papers, along with data presented at medical conferences, will likely remain our best source of reliable medical decision-making material for the present time.

Incorporating this new technology into medical research, especially into reporting research results, could be a huge boon. Still, it also could lead to incorrect reporting of data, opening the possibility of misleading conclusions based on unreliable results, with unwelcome consequences for authors and patients. Consent and confidentiality also must be considered when using AI in medical research.

Keeping in mind that the “A” in AI stands for “artificial,” it is essential that clinicians and staff who generate documents with ChatGPT review them for accuracy. The same holds true for using AI for clinical decision-making. Ultimately, physicians are responsible for the information used in their medical practice and for their patients’ care. With this new technology, we will enjoy many time-saving and educational opportunities. Still, it remains up to us to ensure the validity of what we prepare and present.

Bottom Line: Artificial intelligence and ChatGPT have exploded into nearly every business and sector, including the healthcare profession. However, privacy and ethical concerns must be addressed to ensure data security and responsible AI use. Overall, ChatGPT promises to improve healthcare efficiency, accessibility, and medical breakthroughs. The ChatGPT train is leaving the station—all aboard!

References

  1. Varanasi L. ChatGPT creator OpenAI is in talks to sell shares in a tender offer that would double the startup’s valuation to $29 billion. Business Insider. www.businessinsider.com/chatgpt-creator-openai-talks-for-tender-offer-at-29-billion-2023-1. Accessed January 18, 2023.

  2. McDade A. Studies show ChatGPT cheating is on the rise among students — young and old — as teachers remain divided on bans. Business Insider. February 8, 2023. www.businessinsider.com/teachers-caught-students-cheating-chatgpt-survey-shows-2023-2.


Neil Baum, MD

Neil Baum, MD, is Professor of Clinical Urology at Tulane Medical School, New Orleans, Louisiana, and author of Medicine is a Practice: The Rules for Healthcare Marketing (American Association for Physician Leadership, 2024).


David F. Mobley, MD

David F. Mobley, MD, is a practicing urologist in Houston, Texas. He is an associate professor of urology at Weill-Cornell Medicine.
