
ChatGPT For Health Counseling

Apr 06, 2023

In light of research showing that the AI invents false information when asked about cancer, doctors are advising against using ChatGPT for health counseling.

Following a study that revealed ChatGPT fabricated health statistics when asked for information on cancer, doctors are advising against using it for medical advice. The AI chatbot answered one in ten questions about breast cancer screening incorrectly, and even its correct responses were not as "complete" as those found with a quick Google search.

According to the researchers, the AI chatbot occasionally cited fake scholarly publications to back up its assertions.

[Image: a hand holding an AI face, looking at the words ChatGPT and OpenAI]

The software comes with warnings that users should exercise caution, since it has a propensity to "hallucinate," or make things up.

Researchers from the University of Maryland School of Medicine asked ChatGPT to answer 25 questions about breast cancer screening recommendations.

Each question was posed three times because the chatbot is known to vary its responses. Three radiologists trained in mammography then reviewed the answers. The "vast majority" of responses, 88 percent, were correct and easy to understand. However, the researchers cautioned that some of the responses were "inaccurate or even fictitious."

For instance, one response was based on outdated information. It advised postponing a mammogram for four to six weeks after receiving a Covid-19 vaccination, but that guidance was updated more than a year ago to advise women not to wait.

ChatGPT also gave inconsistent answers to questions about where to get a mammogram and the likelihood of developing breast cancer. The study found that the responses "varied significantly" each time the same question was asked.

"We've seen in our experience that ChatGPT sometimes makes up fake journal articles or health consortiums to support its claims," said study co-author Dr. Paul Yi.

Consumers should be aware that these are new, unproven technologies and should continue to seek medical advice from their doctor rather than from ChatGPT.
