
Man develops rare condition after ChatGPT query over cutting salt from his diet

A US medical journal has warned against using ChatGPT for health information after a man developed a rare condition following an interaction with the chatbot about removing table salt from his diet.

An article in the Annals of Internal Medicine reported a case in which a 60-year-old man developed bromism, also known as bromide toxicity, after consulting ChatGPT.

The article described bromism as a "well-recognised" syndrome in the early 20th century that was thought to have contributed to almost one in 10 psychiatric admissions at the time.

After reading about the negative effects of sodium chloride, or table salt, the man told doctors he had consulted ChatGPT about eliminating chloride from his diet and began taking sodium bromide over a three-month period. This was despite reading that "chloride can be swapped with bromide", though likely for other purposes, such as cleaning. Sodium bromide was used as a sedative in the early 20th century.

The authors of the article, from the University of Washington in Seattle, said the case highlighted "how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes".

They added that because they could not access the patient's ChatGPT conversation log, it was not possible to determine the exact advice the man had received.

However, when the authors themselves asked ChatGPT what chloride could be replaced with, the response also included bromide, provided no specific health warning and did not ask why they were seeking the information, "as we presume a medical professional would do".

The authors warned that ChatGPT and other AI applications could "generate scientific inaccuracies, lack the ability to critically discuss results, and fuel the spread of misinformation".

OpenAI, the developer of ChatGPT, has been approached for comment.

Last week, the company announced an upgrade to the chatbot, claiming one of its greatest strengths was in health. Now powered by the GPT-5 model, ChatGPT would be better at answering health-related questions and more proactive at "flagging potential concerns", such as serious physical or mental illness, the company said. However, it stressed that the chatbot was not a replacement for professional help.

The journal article, which was published before last week's launch of GPT-5, said the patient had apparently used an earlier version of ChatGPT.

While acknowledging that AI could be a bridge between scientists and the public, the article said the technology also carried the risk of promoting "decontextualised information", and that it was highly unlikely a medical professional would have suggested sodium bromide to a patient seeking a substitute for table salt.

As a result, the authors said, doctors will need to consider the use of AI when checking where their patients obtain information.

The authors said the bromism patient presented himself at a hospital claiming that his neighbour might be poisoning him. He also said he had multiple dietary restrictions. Although thirsty, he was paranoid about the water he was offered.

He tried to escape from the hospital within 24 hours of being admitted and, after being sectioned, was treated for psychosis. Once the patient had stabilised, he reported other symptoms indicative of bromism, such as facial acne, excessive thirst and insomnia.
