Humans face three existential threats: super-intelligence, nuclear weapons, and the climate crisis, says blockbuster director as he announces new Hiroshima project
I mean it would be, if our civilization were likely to last long enough for that to happen. But we’re doing a fine job of creating our own apocalypse, thank you very much.
So fucking stupid. There are real things to worry about, and an LLM Skynet isn’t one of them.