
Film director James Cameron considers artificial intelligence one of the main existential threats to humanity and believes a Terminator-style scenario could come close to reality.
In his view, the danger of AI is on par with nuclear weapons and the destruction of the natural environment. Modern warfare, he explains, demands extremely fast decision-making, and under those conditions human reactions may simply not be fast enough.
“I do think there’s still a danger of a Terminator-style apocalypse where you put AI together with weapons systems, even up to the level of nuclear weapon systems, nuclear defense counterstrike, all that stuff,” Cameron says.
That is why, according to him, there is growing pressure to hand such decisions to a superintelligence. But even with humans kept in the loop, mistakes cannot be ruled out, and human error has more than once brought the world to the brink of large-scale international incidents.
“I warned you guys in 1984! And you didn’t listen,” the director jokes, referring to the premiere of The Terminator that year.
At the same time, Cameron continues to engage with AI in his own work. His next movie, “Avatar: Fire and Ash,” is promised to speak out against artificial intelligence. The film will be released on December 19. Sam Worthington and Zoe Saldana will reprise their roles as Jake Sully and Neytiri, while the new enemies will be the Ash People, a Na’vi tribe led by Varang (played by Oona Chaplin), who has allied with Colonel Quaritch.
“I feel like we’re at this cusp in human development where you’ve got the three existential threats: climate and our overall degradation of the natural world, nuclear weapons, and superintelligence. They’re all sort of manifesting and peaking at the same time. Maybe the superintelligence is the answer. I don’t know. I’m not predicting that, but it might be,” the director concludes.
Source: Rolling Stone