By Alton Parrish (Reporter)

A Dystopian Future? Catastrophic and Existential Risk for Humans


Some 40 scholars from around the world gathered last spring at UCLA to discuss catastrophic risks facing humans that may threaten the entire future of humanity — climate change, an asteroid strike, global pandemics, artificial general intelligence, and nuclear and biological terror, among them.

Scenarios for a possible sci-fi movie about a dystopian future?

Credit: Wikimedia Commons

Actually, the event that brought them to Westwood was the first colloquium on catastrophic and existential risk hosted by the B. John Garrick Institute for the Risk Sciences at UCLA, housed in the UCLA Henry Samueli School of Engineering and Applied Science. The institute was launched in 2014 with a $9 million gift from UCLA Engineering alumnus and Distinguished Adjunct Professor B. John Garrick and his wife, Amelia.

UCLA materials science engineer Ali Mosleh, the Evelyn Knight Chair in Engineering and the inaugural director of the institute, and his staff are focused on reliability engineering, preventing failures of complex systems, and managing disruptions to society and the environment caused by such threats as major industrial accidents, natural disasters and climate change.

In light of a global ransomware attack that disrupted computer systems around the world Tuesday, is a world catastrophe just science fiction? Consider this: On June 30, designated by the United Nations as World Asteroid Day, we are being asked to think about how we can protect the Earth from an asteroid strike.

In this edited Q&A, three experts talked to UCLA Engineering about how to think realistically about these risks and how to provide sound information to policy makers. Responding to questions were UCLA Chancellor Emeritus Albert Carnesale, a professor of mechanical and aerospace engineering and public policy; Seán Ó hÉigeartaigh, executive director of the Centre for the Study of Existential Risk at the University of Cambridge; and Christine Peterson, co-founder and past president of the Foresight Institute, a think tank and public interest organization based in the San Francisco Bay Area.

All were featured speakers at the conference.

What is existential risk and what is catastrophic risk?

Carnesale: Existential risk, in the extreme case, means eliminating the human species on Earth — literally, the very existence of life on this planet. That’s the extreme case. Everything from that on down to something that would perhaps kill tens of thousands of people all qualifies in different people’s minds as catastrophic risk.

And some [of these catastrophes] don’t have to kill tens of thousands of people immediately. You can imagine a financial disaster that, over time, would affect many people.

UCLA Chancellor Emeritus Albert Carnesale and Christine Peterson of the Foresight Institute were featured speakers at the first colloquium at UCLA on catastrophic and existential risk.

UCLA

Why is it important to study these types of risks?

Carnesale: The reason it’s important to study this is so that hopefully we can find ways to reduce these risks and make them less likely to occur, and, if they do occur, to reduce their consequences. So people are studying them in order to offer better advice on how you can reduce those risks.

Ó hÉigeartaigh: When we’re talking particularly about risks in the extreme scale, we’re often talking about uncertain, low-probability or speculative scenarios. I think it’s tremendously important to look at which of these are scientifically plausible, which ones do we have good reasons to think might come about on a timescale that we might care about, and then which ones we can say are sufficiently unlikely that we don’t need to think about them that much or [they] just simply don’t look plausible and … we can relegate to the realms of science fiction for the moment.

By doing this kind of analysis, we can better focus our attention on what we really should care about and what we should invest our resources in, in trying to mitigate or prevent to the extent that we can.

What are the technologies under development that also may pose a concern?

Peterson: Probably the technology change that’s gotten the most attention at this meeting is the prospect of artificial general intelligence. This creates tremendous opportunity — think of the diseases that could be cured and the economic advances, tackling tough problems like environmental issues.

But there’s also a concern: Is this something we can control? Is this something that would take over the world in some way?

What we’ve been trying to do at this conference is come up with numbers, whether it’s timeframe, the likelihood, the cost and the payoff for preventing problems. These things are matters of tremendous debate, and they’re very unclear. But we have to try. Without numbers, you can’t make decisions.

What do you think poses the biggest risk to civilization?

Carnesale: The most easily identifiable is climate change. But there is a lot of uncertainty there. That is the classic example with the insurance analogy.

There’s no question that carbon dioxide in the atmosphere increases global warming. You don’t have to have much more than a high school education with a chemistry course to know that. Now you start to ask questions: How much of an effect does it have? Over what timescale? … How much of the CO2 will be absorbed in the oceans? Where might there be a tipping point? Where might there be uncertainties?

But the fact that you don’t know everything doesn’t mean you don’t know anything. We do know more carbon dioxide in the atmosphere is going to go in the wrong direction.

I think that’s the best example of where the scientific community is fairly well agreed that if we don’t do anything, we’re going to have a big problem. You can argue about what we should do and how fast we should do it. … But the idea of “Let’s wait — it’s only a theory” doesn’t work.

Peterson: I divide these catastrophic risks into natural and man-made.

Of the natural ones, the ones that make me the most nervous are pandemics. Looking at the history, we can say with some confidence that these will come. Hopefully, we all remember the so-called Spanish Flu of 1918. There are more people now. We live in bigger cities. Are we well prepared? I don’t think we are well prepared for this. Of course, it will hit developing nations harder. But it could be very bad all over.

Looking at man-made problems — in the near-term — the cyberattack issue. You can think about a cyberattack in terms of the Internet of Things. For example, there was a hotel where guests were locked out of their rooms. There was a hospital that was threatened. Those are real concerns, but they are not so much, I would say, catastrophic risks currently.

But I think the electrical grid is a catastrophic risk. A cyberattack on the grid could cause a lot of damage and quite a few fatalities, large numbers. From what I’ve been able to find out, it’s not being addressed quickly enough. It could take decades to fix this, and that’s not acceptable because the threat is real right now.

To read the entire Q&A, go to the UCLA Engineering website.

Contacts and sources: 

Amy Akmal

University of California Los Angeles 


Source: http://www.ineffableisland.com/2017/08/a-dystopian-future-catastrophic-and.html

