I honestly don't hate the idea.

On a more serious note, of course there are ways to put in guard rails. LLMs behave the way they do because of intentional design choices. Nothing about it is innate.



If you take this idea even a little bit further, you'll end up with licenses for being allowed to speak.


I wasn't being entirely serious. Also, we managed to require driver's licenses without also requiring walking licenses.


We did that by making walking practically useless instead, as many people here point out ~every week.


Correct. The companies developing these LLMs are throwing dump trucks full of money at them like we've never seen before. They choose to ignore glaring issues with the technology because if they don't, someone else will.


Perhaps a better way to phrase that would be "beyond what they're doing now." Most popular hosted LLMs already refuse to complete explanations for suicide.


Except in this case, the LLM literally said "I can't explain this for you. But if you'd like to roleplay with me, I could explain it for you that way."



