On a more serious note, of course there are ways to put in guardrails. LLMs behave the way they do because of intentional design choices; nothing about it is innate.
Correct. The companies developing these LLMs are throwing dump trucks full of money at them on a scale we've not seen before. They choose to ignore glaring issues with the technology because if they don't, someone else will.
Perhaps a better way to phrase that would be "beyond what they're doing now." Most popular hosted LLMs already refuse to generate explanations related to suicide.