Clearly you know fuck all about cars or guns.
Safeties on guns are to prevent accidental, unintended discharge. I’m pretty sure someone using one for suicide is performing an intentional discharge.
Edit: Safeties prevent a gun from going off when dropped. Disengaging one becomes automatic, so automatic that they're useless for preventing an unintentional discharge by someone pulling the trigger at the wrong time (which they weren't really intended for anyway). Hell, the Glock safety is built into the trigger itself and is disengaged by pulling the trigger, so it clearly can't stop someone from pulling the trigger at the wrong time.
Safety systems in cars are similar: they exist to prevent injury from the vehicle itself in a crash.
Seat belts keep us from being thrown from the car. Airbags keep us from crushing our chests on the steering wheel, or from head trauma from hitting a window.
Crumple zones absorb the energy of a collision so it’s not transferred to the occupants of a car.
None of this is to prevent a person from intentionally doing harm.
I’ve lost 2 friends to suicide by car - none of the safety systems had any chance of preventing it, and there is no way to prevent it.

Granted, with something like AI systems it's easier and faster, but libraries could be faulted the same way - they hold the same information; the only difference is learning how to look for it.
There's a problem here, for sure, but how can it be addressed? Frankly, I have no idea, especially since you can host these LLMs on your own computer these days.