The term "hallucination" with regard to AI (specifically LLMs) is just a way of subtly suggesting that AI has a sensory experience it inherently can't have
We're being conditioned to accept AGI as an inevitability (it's not), and we need to see the subtle ways language is used to convince us it is