May 13, 2006
Existential Risks of Artificial Intelligence
Nick Bostrom, director of Oxford's Future of Humanity Institute, is on stage at Stanford talking about the existential risks of artificial intelligence. I talked to him in the green room before he entered stage left. He has an interesting mix of a Swedish and British (London) accent.
There are a number of Londoners among us, natives as well as those who have spent a few years studying or working there.
It made me nostalgic for Europe, an interesting pattern lately, perhaps because of my greater distance from the continent than when I lived back east. Perhaps it's just knowing that.
Bostrom talks about our experience with disasters over the years and points out the pattern: we have always faced natural and human-made disasters, so when we confront a new crisis, the situation itself is not new. And then there are the risks that emerge from our own behavior.
He walks through some of the types of existential risk, grouped into the following categories:
Bangs: earth-originating intelligent life goes extinct in a relatively sudden disaster.
Crunches: humanity's potential to develop into posthumanity is permanently lost, although human life continues in some form.
Shrieks: a limited form of posthumanity is durably attained, but it is an extremely narrow band of what is possible and desirable.
Whimpers: a posthuman civilization is attained, but it evolves in a direction that leads gradually to either the complete disappearance of the things we value or to a state where those things are realized to only a minuscule degree of what could have been achieved.
Bangs are:
--nanotechnological weapons systems
--badly programmed superintelligence
--we are living in a simulation and it gets shut down
--nuclear holocaust
--biological weapon
--nanotechnology non-weapons accident
--natural pandemic
--runaway global warming
--supervolcano eruptions
--physics disasters
--impact hazards (asteroid and comets)
--space radiation (solar flares, supernovae, black hole explosions, gamma-ray bursts, galactic center outbursts)
Crunches can include:
--resource depletion or ecological destruction
--misguided world government or another static social equilibrium stops technological progress
--dysgenic pressures (evolutionary)
--technological arrest
--social collapse
Whimper examples are:
--our potential or even our core values are eroded by evolutionary development or self modification
--killed by an extraterrestrial civilization
--loss of human fertility/escapism
WOW, a lot to capture in twenty minutes, and certainly dense material for an audience of this size and diversity. This is intense stuff; I would be particularly interested in talking to him in detail about the whimper category and a handful of the 'bangs.'
If we can reduce these risks by even a marginal amount, it's better than nothing, and it can have a more dramatic result than we might think. Even if it's 1%, it's worth it.
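To see why even a small reduction matters, here is a minimal back-of-envelope sketch in Python, my own illustration with hypothetical numbers rather than figures from the talk: under an expected-value view, cutting the probability of losing everything by one percentage point is worth one percent of everything at stake.

```python
# Back-of-envelope expected-value sketch (hypothetical numbers, not from the talk).
# Assume humanity's future potential is worth V "value units" and that an
# existential catastrophe would erase all of it.

V = 1e16                   # assumed stake, e.g. 10^16 future lives (illustrative)
p_before = 0.20            # assumed baseline probability of catastrophe
p_after = p_before - 0.01  # the "even 1%" reduction

expected_loss_before = p_before * V
expected_loss_after = p_after * V
gain = expected_loss_before - expected_loss_after  # value of the risk reduction

print(f"expected loss before: {expected_loss_before:.2e}")
print(f"expected loss after:  {expected_loss_after:.2e}")
print(f"expected value of a 1-point reduction: {gain:.2e}")
# gain == 0.01 * V, i.e. 10^14 units here: a marginal-sounding reduction
# carries an enormous expected payoff when the stakes are this large.
```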
What can we do?
--More research into methodology and into specific risk categories
--Build institutions and scientific/policy communities that work on existential risks (ERs)
Spoken like a pure academic... I'm sitting here thinking, "I'm a consumer; if I'm not into research, what can I do in my daily life?"
Tag: Singularity Summit
Tag: Singularity
Tag: Future
May 13, 2006 in Conference Highlights, Events, On Science, On Technology, San Francisco | Permalink
Comments
The comments to this entry are closed.