Stephen Hawking's predictions for the future: Humans would spread out into the cosmos given the chance
Stephen Hawking's fame was founded on the research he did on general relativity and black holes. But he often stepped outside his own field, using his recognition to highlight what he saw as the great challenges and existential threats facing humanity in the coming decades. His pronouncements drove headlines in the media and sometimes proved controversial.
Leaving Earth
Hawking was clearly troubled that we were putting all our eggs in one basket - that basket being Earth. For decades, he had been calling for humans to begin the process of permanently settling other planets, a message that made news headlines again and again.
Hawking's rationale was that humankind would eventually fall victim to an extinction-level catastrophe - perhaps sooner rather than later. What worried him were so-called low-probability, high-impact events - a large asteroid striking our planet is the classic example. But Hawking perceived a host of other potential threats: artificial intelligence, climate change, genetically modified viruses and nuclear war, to name a few.
In 2016, he told the BBC: "Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or 10,000 years."
In line with these concerns, Hawking also attached his name to the Breakthrough Starshot initiative, a project researching technologies for interstellar travel.
Rise of the machines?
Hawking recognised the great opportunities that advances in artificial intelligence could bring, but also warned about the dangers.
In 2014, he told the BBC that "the development of full artificial intelligence could spell the end of the human race".
Hawking said the primitive forms of artificial intelligence developed so far had already proved very useful; indeed, the tech he used to communicate incorporated a basic form of AI. But Hawking feared the consequences of advanced forms of machine intelligence that could match or surpass humans.
Some academics thought the comments drew on outdated science fiction tropes. Others, such as Prof Bradley Love, from UCL, agreed there were risks: "Clever AI will create tremendous wealth for society, but will leave many people without jobs," he told The Conversation.
But he added: "If we are going to worry about the future of humanity we should focus on the real challenges, such as climate change and weapons of mass destruction rather than fanciful killer AI robots."
Tipping point
Hawking regarded global warming as one of the biggest threats to life on the planet. The physicist was particularly fearful of a so-called tipping point, where global warming would become irreversible. He also expressed concern about America's decision to pull out of the Paris Agreement.
"We are close to the tipping point where global warming becomes irreversible. Trump's action could push the Earth over the brink, to become like Venus, with a temperature of 250 degrees, and raining sulphuric acid," he told BBC News.