
Idk, I think saying it's "computing" is more precise, because "thinking" applies to meatbags. An LLM is emulating thinking.

Really, I just think anthropomorphizing LLMs is a dangerous road in many ways, and it's mostly marketing BS anyway.

I haven’t seen anything that shows evidence of LLMs being anything beyond a very sophisticated computer system.



