
©2025 Poal.co

Archive: https://archive.today/iG6XA

From the post:

>Health practitioners are becoming increasingly uneasy about the medical community making widespread use of error-prone generative AI tools. The proliferation of the tech has repeatedly been hampered by rampant "hallucinations," a euphemistic term for the bots' made-up facts and convincingly-told lies. One glaring error proved so persuasive that it took over a year to be caught. In their May 2024 research paper introducing a healthcare AI model, dubbed Med-Gemini, Google researchers showed off the AI analyzing brain scans from the radiology lab for various conditions.

[–] 1 pt

Bet some people would want that new part added, because the computer recommended it.

[–] 1 pt

You guys 'member the AI mouse penis? If we never get past the point of "needing to double-check the AI," then don't replace the people, especially in essential services.

[–] 1 pt

Meh, I've heard people refer to the basal ganglia as "basilar"; it's simply lazy corruption of speech and jargon.

AI is only as good as the shit put into it.

[–] 1 pt

It doesn't exist in "real" humans...

[–] 1 pt

Doesn't exist in humans... yet.

[–] 1 pt

The AI will force compliance. You are the one who is not correct, and a shot will be given shortly to correct the matter.