regarding the "probability of superintelligent AI killing us all within the next ten years" debate I think a more important...
regarding the “probability of superintelligent AI killing us all within the next ten years” debate I think a more important question that came up is “probability of self-driving cars actually working within the next ten years”
On a half-notch more serious note, most human systems have a lot of circuit breakers, like “having to physically travel 15 miles to a particular building,” that an AI has no way into
Like, can humans create a nuclear apocalypse? Yeah, I guess they’ve had that capability for a while. Can any given human? Fuck no! Even the specific few that could would need the cooperation of a lot of the others.