On July 4 I began a series of diaries on the prospects for human extinction. To be catchy, I called it "saving the world". However, since saving the world can mean different things, I'm renaming the series Human Extinction. Much clearer and to the point. I'm focusing on the threats to our very survival over the next century or so. (The scary part: these threats are not all that unlikely.)
Here I'm going to review the other DailyKos diaries I could readily find on the topic. I do this to examine how what appears to be the most important issue has already been covered here and to help us avoid repetition, as well as to highlight the DK discussion for my non-DK acquaintances. The discussion here largely mirrors today's broader public discourse: lots on climate change, nuclear weapons, and pandemics, particularly H5N1/bird flu; little on AI and "insurance policies" (space colonization, etc). Also, interesting cultural discussions. I think we would be wise to discuss all of these further, especially the "what to do about it".
In the comments, please tell me: Are my critiques fair? Am I missing anything?
The diaries here come mainly from the extinction and doomsday tags and from a quick Google search. There may be others I missed.
Technical Discussions
Diaries discussing possible causes of our extinction as well as solutions:
Timothy Scriven had an excellent series of posts in November 2006. First, in What do y'all think? The "Alliance to rescue civilization", he described a prominent proposal to use space colonization as an "insurance policy" against something very bad happening here on Earth. A lengthy discussion followed. Of note, one commenter suggested "It would be far simpler and cheaper to make a hermetically sealed colony here on earth". This is a good point, and mirrors the discussion at my Human Extinction 2: Government Response, among other places. Then, in In defence of doomsday, Scriven described some threats to our survival and argued that, given these, concern for our survival is justified. I agree. Lastly, in Doomsday news, Scriven linked to a few stories that gave him cause for concern, including one about a guy who built a cruise missile in his garage. Most of these came from the Lifeboat Foundation world news page, which still appears to be updated.
See also the space colonization tag.
In Past Seven Minutes 'til the End of the World, Nathan Jaco pulls the legendary Einstein quote "I do not know what weapons World War III will be fought with. World War IV will be fought with sticks and stones." to discuss prospects for nuclear war, including possible failures in automated launch systems. He also links to the Bulletin of the Atomic Scientists' Doomsday Clock and supports their recommendation of nuclear disarmament. While I'm inclined to agree, I'm no expert, and I know there is disagreement about the wisdom of disarmament. For example, from the Wikipedia article on Nobel Laureate game theorist Robert Aumann: "Simplistic peacemaking can cause war, while arms race, credible war threats and mutually assured destruction can reliably prevent war."
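Aumann's point can be made concrete with a toy deterrence game. The sketch below is my own illustration with made-up payoff numbers, not anything from Aumann's work: when an attack on a passive opponent goes unpunished, aggression is the best response; once retaliation is credible and assured, the incentive to attack disappears.

```python
# Toy deterrence game (illustrative payoffs only, not from Aumann's work).
# Two states each choose "attack" or "hold". Payoffs below are to state A,
# indexed by (A's move, B's move).

def best_response(payoffs, opponent_move):
    """Return the move that maximizes our payoff given the opponent's move."""
    return max(["attack", "hold"], key=lambda m: payoffs[(m, opponent_move)])

# No credible second strike: attacking a passive opponent pays off.
no_mad = {("attack", "hold"): 5, ("hold", "hold"): 0,
          ("attack", "attack"): -100, ("hold", "attack"): -50}

# Mutually assured destruction: any attack triggers mutual annihilation.
mad = {("attack", "hold"): -100, ("hold", "hold"): 0,
       ("attack", "attack"): -100, ("hold", "attack"): -100}

print(best_response(no_mad, "hold"))  # "attack" - aggression pays against a passive foe
print(best_response(mad, "hold"))     # "hold" - credible retaliation removes the incentive
```

Of course, real deterrence involves uncertainty, accidents, and irrational actors, which is exactly where the automated-launch-failure worry comes back in.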
The nuclear weapons tag has much, much, more.
A few diaries on the AI threat: Artificial Intelligence and the US Gov't. - similar models of control? by PlaneCrazy discusses J. Storrs Hall's Is AI Near a Takeoff Point?, particularly the parallels to government checks and balances. Importantly, the diary asks, "How do you control, how do you regulate such a 'self-creating, self-modifying intelligent system'?" I'm concerned that Hall is over-optimistic about our capacity to fare well alongside something (AI or otherwise) vastly more intelligent and capable than we are, much as our livestock animals don't fare well alongside us. rlepre discusses attending The Singularity Summit (2006). The 2007 Summit is September 8-9 in San Francisco. Finally, in Brother, Can You Paradigm?, Devilstower discusses the prominent "singularitarian" Ray Kurzweil. Kurzweil typically gives the impression of being unconcerned about AI as a threat, although he is on the board of the Singularity Institute for Artificial Intelligence. (SIAI aims to ensure that if an AI singularity occurs, it will be a good thing. Note: I am a guest blogger on the SIAI blog, which reminds me, I'm overdue for a post there...)
Unsurprisingly, climate change accounted for by far the most diaries connecting a threat to human extinction - and I didn't even touch the climate change tag. This is familiar territory, so I'll list some diaries but won't elaborate.
One climate change diary worth checking out is Geoengineering: The last resort option to save the Earth from runaway global warming. Here, Orangebeard discussed, well, the title speaks for itself. I think this is a story to follow over the years, and Nobel Laureate Tom Schelling does too. (Schelling shared the 2005 Econ prize with Aumann for "having enhanced our understanding of conflict and cooperation through game-theory analysis". Nice to see both popping up here.)
The pandemic tag points to a lengthy series of diaries, especially on H5N1/bird flu. I won't attempt to review these because there's so much and others here know the topic much better than I do.
In How to destroy the Earth, Max Wyvern discusses a feature on livescience.com about how to take out the whole planet, not just us. These are mostly physics-based disasters: strangelets, black holes, etc., some of which are possible outcomes of certain high-energy physics experiments. While these are not likely, they are on the list. Shutting down these experiments has been proposed and might not be a bad idea; even a one-in-a-million risk of annihilation seems much too high to me. Over the extremely long term, however, we might need the exotic physics knowledge to survive astronomical obstacles like proton decay - see Future of the Universe.
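To see why even "one in a million" is too high, here's my own back-of-the-envelope expected-loss arithmetic (world population rounded to 6.5 billion; the numbers are illustrative, not from the livescience feature):

```python
# Back-of-the-envelope expected loss from a tiny annihilation risk.
# Illustrative numbers only: population rounded to 6.5 billion.
p_disaster = 1e-6        # "one in a million" chance the experiment kills everyone
population = 6.5e9       # everyone alive today

expected_deaths = p_disaster * population
print(f"{expected_deaths:,.0f}")  # 6,500 - expected deaths from a single run
```

And that figure counts only people alive today; counting all the future generations an extinction would forfeit makes the expected loss vastly larger.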
In Why Few Discuss Catastrophic Collapse: GBB, David Sternfeld discusses the implications of "supply shortages of energy reserves". I don't typically see supply shortages on lists of events that could cause our extinction, mainly because we can survive as a species without so much energy and because I expect that price increases would help ease the societal impact. However, I could be wrong. Also, the interplay between fossil fuel depletion and climate change is interesting. How much can we damage our climate before we run out of fuel? I don't know, but my impression is a lot.
In Extinction by Inches, Nonpartisan discusses the current extinction event, i.e. the Holocene or Anthropocene. Most of the diaries on the extinction tag discussed this; I only bring this one up because of how it connected the extinction of other species to our own. Nonpartisan claimed that it is somehow "the greatest threat to humanity". I'm pretty sure this is false - we can survive and even thrive without many of these other species. While biodiversity may help us (medicinal value?), I strongly doubt it's crucial. That doesn't mean we shouldn't protect biodiversity - it just means our own survival isn't always the right reason to do so.
On that note, a few posts discussed the Norway/Doomsday/Svalbard Global Seed Vault. In Gone to Seed, Shadow of a doubt draws parallels between the vault and "the Science Fiction classic, A Mote in God's Eye". Interesting. In Global Seed Vault in Norway, smokeymonkey notes that Norway's vault "is the only facility of its kind. That's because the majority of the world's seed banks are regional, commercial facilities." and points to similar projects in the US, UK, and Russia. Lastly, the doomsday garden mentions the story without commentary.
Extinction & Culture
Several diaries intertwined extinction prospects with cultural (including political) phenomena:
In It's the American Way or the Highway: Your Extinction Will Quell Your Moral and Intellectual Confusion, Jason Miller cites simple greed as a driving factor on the path toward extinction. There may be some truth to this. We see it in, among many other places, cases where people drive an SUV knowing it's hurting others via climate change. Measures like a gas/carbon tax would be particularly useful here, as would simple social pressure: see Prius Popularity Powered by Vanity. I'm not sure how well the same measures can be extended to, say, nuclear war, but we would be wise to consider actual human psychology in our thinking.
In Has The Rapture Already Begun?, Vyan ties trends in climate change, pandemics, and war in the Middle East to the religious imagery (famine, pestilence, war) that one could, at least these days, interpret as signs of the End. I've pondered this one too, as probably have many of you. I just hope no religious fanatics actually try to hasten humanity's extinction. I've asked around a bit and am not aware of any who would, but I'm still not the religion expert. (For more, see Wikipedia: Eschatology.)
In Stephen Hawking Asks, BlueTide discusses everything from humans living for millennia to religious fundamentalism to anarcho-capitalism to the military-industrial complex to climate, all in the context of Hawking's question, "In a world that is in chaos politically, socially and environmentally, how can the human race sustain another 100 years?". Lengthy discussion follows, especially on Hawking's proposal to colonize space as an insurance policy (and more). The Double Doomsday Scenario is shorter but similar.
In W of Mass Destruction, Michael Alton Gottlieb claims that George W Bush "is the proverbial and infamous Doomsday Machine". I don't want to speak too soon, but it appears humanity will live to see the 44th US President. GWB clearly has been less effective at mitigating climate change than others would have been, and seems to have been a less effective diplomat too, though I'm sure many will haggle for years over the wisdom of GWB's foreign policy.
Finally, in The Paradigm Shift part II (see also part III and part I), pegleghippie mixes thoughts on economics (including Marx & Keynes) with veganism, transhumanism, religion, morality, and more. The series picks up on several important trends, and I like its idea of a "paradigm shift". I will admit to having had similar thoughts before. While we probably disagree on at least some details of what the shift should look like, and we might both be way off in thinking it's even remotely feasible, it does make for an interesting perspective.
And we'll end on that note. To repeat verbatim from the top: The discussion here largely mirrors today's broader public discourse: lots on climate change, nuclear weapons, and pandemics, particularly H5N1/bird flu; little on AI and "insurance policies" (space colonization, etc). Also, interesting cultural discussions. I think we would be wise to discuss all of these further, especially the "what to do about it".
In the comments, please tell me: Are my critiques fair? Am I missing anything?
Off-topic: I was surprised by the unusually large portion of diaries by real-name users here. Does human extinction somehow beckon the non-anonymous?