
> It's pointless to have a conversation with a human about code they didn't write and don't understand.

This was a problem before LLMs, too.





Scale can be transformational: getting shot was always bad, but when guns lowered the skill requirement and increased lethality, wars became even deadlier. Likewise, LLMs greatly expand the pool of potential scammers and raise the cost of detecting them.

It was, and those PRs should be banned too...


