I worry about anthropomorphising generative AI tools. They don’t ‘hallucinate’; they output errors.
When AI companies talk about ‘hallucinations’, what they mean is their product failed. Its output was garbage. But this language is embedded now. We’re stuck with it.