Stephen Hawking seems to enjoy exploring the universe. However,
he is one of the most outspoken critics of actually trying to communicate with alien civilisations, an act he says could endanger humanity: a distant alien civilisation might view us as inferior, weak, and ripe for conquest. “If so, they will be vastly more powerful and may not see us as any more valuable than we see bacteria.”
Hawking’s warning is rooted in the idea that any alien civilisation able to pick up our signals and pinpoint their source could be billions of years more advanced than us, making us an easy target to invade or overthrow.
…A few weeks ago, in a lecture at the University of Cambridge, Hawking said artificial intelligence might prove to be “either the best, or the worst thing, ever to happen to humanity”, a concern shared by other experts and leaders, such as Elon Musk.
This fear stems from the fact that AI can learn and improve on its own, allowing it to surpass human abilities, while we must rely on biological evolution – a slow process, to say the least – to become better.
“[Artificial intelligence] would take off on its own, and re-design itself at an ever increasing rate,” he told Rory Cellan-Jones at the BBC. “Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded.”