Hundreds of tourists are flocking to hot springs in a small town in Tasmania, Australia. However, there is one small wrinkle in their trip planning. The hot springs are actually an AI “hallucination” that erroneously appeared on a travel advice website and was widely shared before people figured out what was happening.
“that erroneously appeared”
I knew it was gonna be idiots going somewhere an AI suggested without verifying it
Strange way to spell ‘consciously added by a dipshit who didn’t do any checks whatsoever’ but ok
This just keeps happening. People assign agency to the chatbot and refuse to take responsibility for their own mistakes.