As with hallucinations in medical transcription services, AI models do well with clean, structured input, but as soon as anything is a little hinky, they hallucinate in order to make their output "make sense."