This.
LLMs just give you something that has the form of a statistically plausible response to your input. They work on resemblance, not content: the output only has to look like the kind of text that follows such a prompt, not correspond to any fact. That's why they hallucinate. If you ask for an address, they won't (can't) look it up; they'll just give you output that looks like a plausible address.
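To make that concrete, here's a toy sketch. It's a word-level frequency table, nothing like a real transformer, and the "training data" is invented for the example, but the shape of the process is the point: generation is just "sample whatever word tends to come next", and there is no lookup step anywhere that could make the address correct.

```python
import random
from collections import defaultdict, Counter

# Toy "training data" -- entirely made up for the example.
corpus = (
    "the office is at 12 main street springfield . "
    "the office is at 45 oak avenue portland . "
    "the office is at 78 main street portland . "
).split()

# Count which word follows which: these statistics are the whole "model".
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(prompt_word, length=5):
    """Sample a statistically plausible continuation; nothing here looks anything up."""
    out = [prompt_word]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        words, counts = zip(*candidates.items())
        out.append(random.choices(words, weights=counts)[0])
    return " ".join(out)

# e.g. "at 78 main street springfield ." -- plausible-looking, never in the data
print(generate("at"))
```

A real LLM swaps the count table for a neural net over tokens, but the loop has the same shape: predict the next token, sample it, append, repeat. The street address that falls out is whatever combination looks statistically likely, not a record retrieved from anywhere.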