Really, I just think anthropomorphizing LLMs is a dangerous road in many ways, and it's mostly marketing BS anyway.
I haven’t seen anything that shows evidence of LLMs being anything beyond a very sophisticated computer system.