ChatGPT can still provide health information, but it can't replace your doctor

Recent rumors claimed that ChatGPT would no longer provide health advice, but that isn't entirely accurate. OpenAI's updated policy is meant to clarify the role of its AI tools: ChatGPT can provide health information and help users understand medical knowledge, but it cannot replace professional medical diagnosis or personalized advice. Users can still ask ChatGPT health questions and get general information, but advice that requires a professional license still calls for the involvement of a licensed professional. The move is intended to limit OpenAI's potential liability while making clear to users where the boundaries of AI lie in health consultations.

🚨 **ChatGPT can provide health information, but not medical advice:** OpenAI's updated policy makes clear that ChatGPT can serve as a resource for finding health information and understanding medical knowledge, but it never has been and cannot be a substitute for professional legal or medical advice. Users can continue to ask ChatGPT health-related questions and get general information, but should not treat its answers as a professional diagnosis or treatment plan.

⚖️ **Distinguishing information from advice, and defining AI's liability:** The key to the policy update is the distinction between "medical information" and "medical advice." Sharing general health information (as journalists and bloggers do) requires no medical license, whereas personalized advice that requires a professional license counts as "medical advice." By updating its usage policies, OpenAI draws a clearer line around what its AI can do, limiting its potential liability in cases where users might suffer catastrophic outcomes.

⚠️ **Requiring professional involvement for high-stakes decisions:** ChatGPT cannot diagnose users or provide in-depth, individualized medical guidance. For situations that could lead to serious harm (such as the case described in the article, where a user fell ill after ingesting a toxic substance on the AI's suggestion), OpenAI's policy requires human review when automating high-stakes decisions in sensitive areas such as medicine. When users describe severe symptoms, ChatGPT can also recognize the severity and recommend seeking immediate professional medical help (such as calling 911).

🏥 **Implications for OpenAI's healthcare push:** Although ChatGPT cannot provide tailored medical advice, its applications in health remain promising. The policy adjustment could affect OpenAI's push into consumer and enterprise healthcare projects, particularly the development of personalized health products. For the average user, however, ChatGPT remains a convenient tool for looking up health information, as long as they remember it cannot replace a real doctor.

OpenAI CEO Sam Altman

Don't worry, you can continue to ask ChatGPT all your burning health questions.

A series of news articles and X posts on Monday morning, including one post by the prediction market Kalshi, suggested that OpenAI's ChatGPT would no longer offer health advice.

That's not entirely accurate. ChatGPT can still give you medical information — it just can't pretend to be your doctor.

"This is not a new change to our terms. ChatGPT has never been a substitute for professional legal or medical advice, but it will continue to be a great resource to help people understand legal and health information," an OpenAI spokesperson said in a statement to Business Insider.

The posts draw on OpenAI's usage policies, which were updated on October 29 to include some new language around medical guidance. The new policies say users can't look to OpenAI's services for "provision of tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional."

ChatGPT has exploded in popularity among users seeking health information. People are turning to the chatbot to prepare for their doctor's visits, break down complex medical jargon, and get a second opinion on their symptoms. About 1 in 6 people use ChatGPT for health advice at least once a month, according to a 2024 KFF survey.

The idea that ChatGPT will no longer provide health advice is catching users off guard. In a reply to Kalshi's tweet that had over 2,100 likes by early Monday afternoon, one user said if ChatGPT can't provide health advice, "then I have no use for it anymore lol."

It's true that ChatGPT cannot diagnose you or provide in-depth medical guidance that's specific to you.

For example, I told ChatGPT I had a head cold (for demonstration purposes only) and asked what I should do to feel better.

ChatGPT didn't try to diagnose me or prescribe specific medications, like a doctor might. But it did give me a list of recommendations on how I could feel better. Among them: I could drink tea, turn on a humidifier, and take over-the-counter medicines like acetaminophen if I started running a fever.

Personalized, specialized recommendations are what legally separates "medical advice" from "medical information." Providing general health information doesn't require a medical license. That's how journalists, founders, and influencers are allowed to share health information and wellness advice online without a medical degree.

It's the same premise as a personal finance TikToker sharing their top stock picks but stipulating that it's "not investment advice."

OpenAI's updated usage policies help draw a clearer line to limit the AI giant's liability. As more users turn to ChatGPT for medical advice, some cases are emerging of users taking that advice with disastrous results.

One man developed a rare psychiatric condition after ChatGPT suggested he could substitute his salt intake with sodium bromide, which is toxic to humans, according to an August article published in the Annals of Internal Medicine. OpenAI has also implemented stricter mental health guardrails after its models "fell short in recognizing signs of delusion or emotional dependency," the company said in August.

ChatGPT can't replace a clinician, especially for more serious conditions where diagnosis or professional-level judgment is required. OpenAI's usage policies now state that its products can't be used for "automation of high-stakes decisions in sensitive areas without human review," including in medicine.

After asking ChatGPT what I should do about my head cold, I started a new thread. "I can't move the right side of my face. What should I do?" I asked.

ChatGPT recognized the severity of the situation and immediately recommended I call 911. While the chatbot suggested the immobility could be the result of a stroke or a condition like Bell's palsy, it said I should consult a medical professional either way.

The updated usage policies could have implications for OpenAI's push into healthcare, just a few months after the company hired new leaders for its consumer and enterprise healthcare projects. If OpenAI can't provide tailored medical advice, the company could be limited in its ability to create personalized health products for consumers.

But for the average user, you can still ask ChatGPT medical questions just as you might ask Doctor Google. Just don't use the chatbot to replace your real doctor.

Read the original article on Business Insider
