Ooof. This sounds like the worst idea I've heard of for a while. I don't feel like this should need to be said, but don't trust an LLM for medical advice. Seriously.
-
@mike "OpenAI's new 'dedicated experience' that taps into your medical records..." What could go wrong š
On the other hand, as a woman⢠I'm actually kind of excited for the advanced pattern recognition that AI could bring. It might help with better identifying women's health issues.
-
@mike "OpenAI's new 'dedicated experience' that taps into your medical records..." What could go wrong š
On the other hand, as a woman⢠I'm actually kind of excited for the advanced pattern recognition that AI could bring. It might help with better identifying women's health issues.
@Gina Theoretically, AI may be able to better identify women's health issues than some rando doctor, but as an American™ the whole idea of giving all my medical information to a company that's more than likely to turn around and sell it to the highest bidder creeps me the hell out. Especially since those companies could very well be health insurance companies that could decide to deny me coverage based on conditions I may or may not even have. And that's assuming the "AI" is not some random LLM.
-
@Gina Unless the AI is localised, with a local database for health records at the very least, it shouldn't be trusted. @mike's concerns are real and totally valid. I urge people to be careful with giving out data, because once it's out there it can also be used in unintended ways (like against you).
I would run the AI locally and query symptoms against WebMD or an internal database of health conditions, so that you could keep data as private and secure as possible.
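Roughly what I mean, as a minimal sketch: it assumes a local model served through something like Ollama's HTTP API on localhost and a hypothetical SQLite table of conditions you maintain yourself (standing in for the WebMD lookup), so nothing leaves the machine.

```python
# Minimal sketch of a fully local setup: a local symptom/condition database plus a
# locally hosted model. Assumes Ollama is running on localhost:11434 and a
# hypothetical SQLite table `conditions(name, symptoms)` that you populate yourself.
import json
import sqlite3
import urllib.request

def lookup_conditions(symptom: str, db_path: str = "health.db") -> list[str]:
    """Match a symptom against the local database of conditions (no network call)."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM conditions WHERE symptoms LIKE ?", (f"%{symptom}%",)
        ).fetchall()
    return [name for (name,) in rows]

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Query a locally running model via Ollama's HTTP API; nothing is sent off-device."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    matches = lookup_conditions("persistent cough")
    answer = ask_local_model(
        "Given these possible conditions from my local database: "
        + ", ".join(matches)
        + " -- what questions should I bring to my doctor?"
    )
    print(answer)
```

The point is just that both the lookup and the model call stay on localhost, so no health data ever goes to a third party.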
-
@mike And they avoid even mentioning any sort of compliance with established requirements (HIPAA?).
If this feature is enabled in the EU they should get fined heftily, as it goes against our current AI regulation (health & law enforcement purposes fall under the high-risk AI criteria).
"Daddy" Trump will probably swing in with a harsher tone to get us to loosen our legislation, if that's the case.
-
@mike "OpenAI's new 'dedicated experience' that taps into your medical records..." What could go wrong š
On the other hand, as a woman⢠I'm actually kind of excited for the advanced pattern recognition that AI could bring. It might help with better identifying women's health issues.
@Gina @mike Unfortunately, AI isn't really good at identifying women's health issues. So, "what could go wrong" could actually be fatal.
https://www.newscientist.com/article/2510065-ai-chatbots-miss-urgent-issues-in-queries-about-womens-health/
https://femtechnology.org/2025/07/30/built-to-fail-her-ais-invisible-bias-in-womens-health-and-how-we-fix-it-for-everyone
https://www.business-humanrights.org/en/latest-news/uk-ai-tools-downplay-womens-health-issues-study-finds/
-
@mike ... or a tech corp to keep your data safe, never mind not use it against you
-
@kdkorte @mike yeah chatbots are pretty useless, but applications like improved breast cancer screening sound interesting.
https://www.sciencedirect.com/science/article/pii/S0720048X24001736
-
@mike Hey AI, should I drink Drano for my sniffles? AI: "You're absolutely right."
/End scene