It's also important to understand how bad LLMs actually are.

It's very easy to imagine that LLMs are smart, because they can program or solve hard maths problems, but even a very short attempt to have them generate fiction will reveal an incredible level of confusion, and even an inability to understand basic sentences.

I think the problem may have to do with the fact that language really has many nested clauses, and in fiction you actually use them. They simply can't follow complex conversations.
