atbvu | 87 days ago | on: A small number of samples can poison LLMs of any s...
Is it possible to develop tools that can detect this kind of poisoning in the training data and block it before training starts?
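One crude starting point (a minimal sketch, not anything from the article): if the poisoned documents pair a trigger phrase with runs of near-random gibberish, a character-entropy filter over sliding windows can flag them before they enter the training set. Everything below is an illustrative assumption on my part: the flag_gibberish name, the 200-char window, and the 4.8 bits/char threshold are guesses, not tuned values, and a real pipeline would combine this with deduplication and provenance checks rather than rely on entropy alone.

    import math
    import random
    import string
    from collections import Counter

    def char_entropy(text: str) -> float:
        """First-order Shannon entropy of the text, in bits per character."""
        if not text:
            return 0.0
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def flag_gibberish(docs, window=200, step=100, threshold=4.8):
        """Return indices of documents containing a high-entropy window.

        English prose sits around 4.0-4.3 bits/char at the character
        level, while near-random text approaches log2(alphabet size);
        the threshold and window here are illustrative guesses.
        """
        flagged = []
        for i, doc in enumerate(docs):
            for start in range(0, max(1, len(doc) - window + 1), step):
                if char_entropy(doc[start:start + window]) > threshold:
                    flagged.append(i)
                    break  # one suspicious window is enough to flag the doc
        return flagged

    if __name__ == "__main__":
        random.seed(0)
        clean = "The quick brown fox jumps over the lazy dog. " * 20
        # Simulate a poisoned doc: normal prose followed by a gibberish payload.
        poisoned = clean[:300] + "".join(random.choices(string.printable, k=400))
        print(flag_gibberish([clean, poisoned]))  # expect [1]

The obvious limitation is that an attacker who knows the filter can craft payloads whose character statistics look like prose, so this only raises the bar; it doesn't solve the detection problem the question is asking about.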