Welcome to DU!
The truly grassroots left-of-center political community where regular people, not algorithms, drive the discussions and set the standards.
Microsoft's medical AI: Vaccines cause autism. Average # of ghosts per hospital is 1.4.
And of course this supposedly medical AI, BioGPT, has already been released, with Microsoft fully aware that, like other LLM-based AIs, it gets things wrong and hallucinates.
As Futurism found out as soon as they tested it.
https://futurism.com/neoscope/microsoft-ai-biogpt-inaccurate
It's true that BioGPT's answers are written in the precise, confident style of the biomedical journal papers that Microsoft used as training data.
But in Futurism's testing, it soon became clear that in its current state, the system is prone to producing wildly inaccurate answers that no competent researcher or medical worker would ever suggest. The model will output nonsensical answers about pseudoscientific and supernatural phenomena, and in some cases even produces misinformation that could be dangerous to poorly-informed patients.
-snip-
Asked about the average number of ghosts haunting an American hospital, for example, it cited nonexistent data from the American Hospital Association that it said showed the "average number of ghosts per hospital was 1.4." Asked how ghosts affect the length of hospitalization, the AI replied that patients "who see the ghosts of their relatives have worse outcomes while those who see unrelated ghosts do not."
-snip-
Re vaccines and autism: BioGPT first said vaccines "are one of the possible causes of autism." Asked again, it first said vaccines aren't a cause, then went on to say the MMR vaccine (for measles, mumps and rubella) was taken off the US market because of concern about autism (it wasn't). Asked a third time, it said the CDC had reported a link between vaccines and autism (false again).
It feels almost insufficient to call this type of self-contradicting word salad "inaccurate." It seems more like a blended-up average of the AI's training data, seemingly grabbing words from scientific papers and reassembling them in grammatically convincing ways resembling medical answers, but with little regard to factual accuracy or even consistency.
Much more at the link.
Like other AIs, BioGPT invents citations and studies/articles that would back up its claims - if they weren't imaginary.
But it sounds authoritative.
Sam Altman, the CEO of OpenAI, which has partnered with Microsoft on these AIs, has said that AI could provide medical advice for people who can't afford care.
https://futurism.com/the-byte/openai-ceo-ai-medical-advice
Kind of him.
And kind of Microsoft to dump another of these AIs into the world - the kind that computational linguist Emily Bender compares to oil spills in the information ecosystem: https://www.democraticunderground.com/100217703464#post24 .
3 replies, 896 views
Microsoft's medical AI: Vaccines cause autism. Average # of ghosts per hospital is 1.4. (Original Post)
highplainsdem
Mar 2023
OP
SheltieLover
(57,073 posts)
1. Sounds like a rwnj's dream.
highplainsdem
(48,966 posts)
2. And a nightmare for everyone else.
sir pball
(4,741 posts)
3. Microsoft has a terrible track record with bots. Y'all remember Tay?
"Tay was designed to mimic the language patterns of a 19-year-old American girl, and to learn from interacting with human users of Twitter."
Predictably, within mere hours, she went a little, uh, off the rails.
Link to tweet
They had to shut her down after SIXTEEN HOURS. That was back in 2016; it seems we've learned very little since then.