Anyone consulting LLMs for anything already had a preexisting psychological condition called Stupid as Fuck.
You get totally different answers to “is X healthy” vs “is X unhealthy”
But yeah, if ChatGPT tells you to order restricted substances on the internet, probably don’t do that
The thing that bothers me about LLMs is that people will acknowledge the hallucinations and lies LLMs spit out when they're discussing information the user is familiar with.
But that same person will somehow trust an LLM as an authority on subjects they're not familiar with. Especially on subjects that are on the edges of, or even outside, human knowledge.
Sure, I don't listen when it tells me to make pizza with glue, but its ideas about Hawking radiation are going to change the field.
This is literally the Dunning-Kruger effect in action - people can’t evaluate the quality of AI responses in domains where they lack the knowledge to spot the bs.
Oh fun! When Americans lose their healthcare I expect these kinds of stories to rise.
He did his own research! /s
I told my wife about this story and she told me that everything on the internet should be taken with a grain of salt. I told her this guy took a few too many grains.
It sounds more like a crazy person did a crazy thing and happened to use AI.
A man uses the internet to poison himself. A story as old as time. But if we stick the AI in the title, we can get some sweet clicks out of it.
Yup reads exactly the same.
Finally evolution back in effect