Discussion about this post

Kelsey Wharton:

I had not heard of this term, but the exact thing happened to me when I asked ChatGPT to generate a list of furniture for a room. It gave me links and plausible-sounding IKEA product names, but all the links were dead ends. I thought maybe it was using outdated data; now I realize it was hallucinating. A tool that will appease us even if it has to lie… sounds concerning to me!

