Dragon Cobolt 3 months ago
This is actually a really good illustration of why LLMs are just... kind of not that good at a lot of the things they're sold as being good at: the thing doesn't fucking understand or know what the question is, it's doing pattern recognition, and that breaks down on a long enough timescale.