Before his death in 2018, the renowned physicist Stephen Hawking wrote a moving letter to humanity highlighting the risks posed by artificial intelligence. Though Hawking is gone, his dire forecasts about AI, and about the dangers of communicating with extraterrestrial life, continue to carry weight.
“The development of full artificial intelligence could spell the end of the human race,” Hawking cautioned in a 2014 BBC interview. Long before the technology took its current shape, he foresaw how it would advance and affect our lives, especially if it were ever to surpass human intelligence.
He said, “It would take off on its own and re-design itself at an ever-increasing rate.”
Hawking continued: “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
It wasn’t the first time the astrophysicist had warned us about AI. In 2015, Hawking joined roughly a hundred experts in signing an open letter to the UN about the dangers of unregulated AI development.
Furthermore, in 2017, the year before his death, he told the magazine Wired, “I fear AI may replace humans altogether.”
In his book Brief Answers to the Big Questions, released a few months after his death, he elaborated on the threat: “We may face an intelligence explosion that ultimately results in machines whose intelligence exceeds ours by more than ours exceeds that of snails.”
“It’s easy to dismiss the idea of intelligent machines as science fiction, but this would be a mistake—and possibly our biggest error ever,” he said.