2 Comments
Kelsey Wharton

I had not heard of this term, but the exact thing happened to me when I asked ChatGPT to generate a list of furniture for a room. It had links and plausible-sounding IKEA names! But all the links were dead ends. I thought maybe it was using outdated data; now I realize it was hallucinating. Creating a tool that will appease us even if it has to lie… sounds concerning to me!

Cheyenne Dominguez

Interesting hallucination! AI can be a fantastic tool, but it has limitations. I like to think of it as an assistant who is still learning from us rather than an all-knowing expert.

Thanks for sharing!
