
"No LLM has ever been as good as people said it was."

The reason is that LLM companies have tuned their models to aggressively blow smoke up their users' asses.

These "tools" are designed to aggressively exploit human confirmation bias, so as to prevent the user from identifying their innumerable inadequacies.


