We as humans tend to personify things, and this whole thread is a prime example. LLMs cannot think; they can only respond. They may emit a thought process, but that's not thinking. An LLM is just a large matrix that gets multiplied to produce the next response. It has no more feeling than predictive text does.
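The "large matrix" point can be sketched with a toy example. Everything here (the vocabulary, the weight matrix, the context vector) is made up purely for illustration; a real LLM has billions of parameters and many layers, but the core step is the same shape: multiply, score, pick.

```python
# Toy sketch, illustrative only: "prediction" as a matrix-vector
# product followed by picking the highest-scoring word.
# All values below are invented for the example.

vocab = ["the", "cat", "sat", "mat"]

# Hypothetical 3-dimensional representation of the current context.
context = [0.9, 0.1, 0.2]

# Hypothetical weight matrix: one row of 3 weights per vocab word.
W = [
    [0.1, 0.0, 0.2],  # "the"
    [0.3, 0.5, 0.1],  # "cat"
    [0.8, 0.2, 0.4],  # "sat"
    [0.2, 0.1, 0.3],  # "mat"
]

def next_token(context, W, vocab):
    # Multiply: one dot product per vocabulary word gives a score.
    logits = [sum(w * x for w, x in zip(row, context)) for row in W]
    # "Respond" by taking the highest-scoring word; no thought involved.
    return vocab[max(range(len(vocab)), key=logits.__getitem__)]

print(next_token(context, W, vocab))  # → "sat"
```

The output is just whichever row of the matrix scores highest against the input; there is nothing in the mechanism that resembles deliberation.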