
We Are Not Saved

We Are Not Saved is a podcast covering eschatology. While this concept has traditionally been a religious one, concerned with the end of creation, in this podcast the study has been broadened to include secular ways the world could end (so-called x-risks) and also deepened to cover the potential end of nations, cultures, and civilizations. The title is taken from the book of Jeremiah, chapter 8, verse 20: "The harvest is past, the summer is ended, and we are not saved."

Category: Eschatology
Feb 24, 2022

At some point, in some episode (and probably several episodes) I asserted that:

The world is changing faster than we can adapt to it

Then (and now) this statement seemed obvious, so I remember being surprised when I got some pushback on it. But upon reflection it was also illuminating. Many disagreements come down to core values and assumptions which are so deeply embedded that we’ve forgotten they’re there. It’s what makes these disagreements so intractable. We’re arguing from different, unseen foundations. I decided it was past time to unearth this particular foundation, and examine its various parts. What do I mean by “the world” and “change” and “speed” and “adaptation”? And if we can come to an agreement on all of that, what are the consequences of change moving faster than our ability to adapt?

Feb 15, 2022

I take a break from talking about the collapse of society and the world to rant about reading, and in particular about all the people who say I'm doing it wrong.

Jan 31, 2022

It's time for my newsletter again, and after going step by step through the ideas of Taleb we finally arrive at his crowning idea: antifragility. Perhaps the biggest contribution Taleb makes to our understanding of the world is that by grappling with the opposite of fragility he was able to define fragility itself, and to point out that the modern world is chock full of it.

Jan 25, 2022

It's not the end of the pandemic, or even the beginning of the end, but we might be at the end of the beginning, and since I just read three books on the subject I thought I'd see what could be said at this point. Come for the discussion of school closures and why they might have seemed so important in the beginning, stay for an overview of the lab leak hypothesis. But most of all just listen to the episode!

Jan 15, 2022

I return to a discussion of Douthat's "The Deep Places", in particular what it tells us about modern epistemology, or as I like to call it, "reality construction". I examine the reality constructed by Douthat, but also the differences between how we constructed reality during the 1918 pandemic and how we construct it now. Come for the history, stay for the murderous story of aspirin.

Jan 8, 2022
  1. Why Liberalism Failed by: Patrick J. Deneen
  2. Leviathan Falls by: James S. A. Corey
  3. Termination Shock by: Neal Stephenson
  4. The Histories of Herodotus by: Herodotus 
  5. The Golden Transcendence by: John C. Wright
  6. The Boy, the Mole, the Fox and the Horse by: Charlie MacKesy
  7. Doctrine and Covenants
Dec 31, 2021

It's that time of year when people make predictions. I also make predictions, though I do them somewhat differently. Mostly I'm interested in identifying potential catastrophes and dismissing potential salvation. For example, nukes will get used again, and a benevolent AI won't save us.

The key thing is not to make accurate predictions, but to make useful predictions. And as it turns out there's a big difference between the two.

Dec 24, 2021

I decide to take the end of the year off, but I don't want to leave my loyal listeners without the normally scheduled episode. So here you go: the first ever "We Are Not Saved" Classic!

It's my review and discussion of Neil Postman's classic "Amusing Ourselves to Death". One of the best books of the last 50 years!

Dec 16, 2021

The ninth book and sixth season of The Expanse were both just released. I haven't watched much of the TV show, but I did just finish reading the final book, and as I did so it occurred to me that the way it handled the Fermi paradox might provide a useful way of understanding my own fixation on it, and why I think it presents a huge challenge to anyone who believes that humanity is on an unending upward slope that will eventually take us to the stars.

Nov 30, 2021

Lately people have been using the idea that something is a black swan as an excuse for being powerless. But this is not only a massive abdication of responsibility, it's also an equally massive misunderstanding of the moment. Preparedness has no meaning if it's not directed towards preparing for black swans. There is nothing else worth preparing for.

The future is the product of the black swans we have yet to encounter.

Nov 28, 2021

A couple of months ago Gwern published a list of improvements since 1990. I thought it gave short shrift to the many changes which have been wrought upon society by technological progress. He does include a section on "Society", but it's woefully inadequate, and despite the list's further theme of identifying "unseen" changes, he overlooks many of the intangible harms which progress might or might not have inflicted on us. To illustrate this I bring in the story of my great-grandmother, which I don't want to cheapen with a summary.

Nov 17, 2021

I got some pushback on the episodes I did about Afghanistan. Some of it was directed at the idea that "we are no longer a serious people". But this pushback, rather than talking me out of the position, made me explore it even more deeply. This episode is the result of that exploration. As part of it I bring in the recent difficulties experienced by the CIA, the Vietnam War, and the differences between right- and left-brained processing.

Oct 31, 2021

There are at least two kinds of randomness in the world: normal, as in a normal distribution or bell curve, and extreme. As humans we're used to the normal distribution; that's the kind of randomness we dealt with over the thousands of years of our existence. It's only recently that the extreme distribution has come to predominate. Nassim Taleb has labeled the first Mediocristan and the second Extremistan. In this podcast we explore the difference between the two and how the tools of Mediocristan are inadequate to the disasters of Extremistan.
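
To make the contrast concrete, here is a minimal sketch of my own (not from the episode) comparing the two regimes: in a normal sample no single draw matters much, while in a heavy-tailed (Pareto) sample one draw can dominate the total. The parameters 170, 10, and 1.1 are arbitrary illustrative choices.

  import random

  random.seed(0)

  # Mediocristan: a normally distributed quantity (height-like).
  # No single observation can dominate the total.
  normal_sample = [random.gauss(170, 10) for _ in range(10_000)]

  # Extremistan: a heavy-tailed, Pareto-distributed quantity (wealth-like).
  # A single observation can dwarf everything else combined.
  extreme_sample = [random.paretovariate(1.1) for _ in range(10_000)]

  def max_share(sample):
      # Fraction of the total contributed by the single largest observation.
      return max(sample) / sum(sample)

  print(f"normal:  largest draw is {max_share(normal_sample):.4%} of the total")
  print(f"extreme: largest draw is {max_share(extreme_sample):.4%} of the total")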

Oct 28, 2021

As I record this, Congress is debating whether they should pass a $3.5 trillion bill or only a $1.5 trillion one. The former would equal roughly $27,000 per household, while the latter would only be about $12,000 per household. And yet when people are asked whether they would pay more to deal with problems like climate change, only 34% are willing to pay more than $10 a month. People have no skin in the game on the former, but they can at least imagine they have skin in the game on the latter, and in this episode I argue that this makes all the difference.
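
For reference, the per-household figures work out if you assume roughly 128 million US households (my assumption; the episode doesn't state the divisor):

  $3,500,000,000,000 / 128,000,000 households ≈ $27,300 per household
  $1,500,000,000,000 / 128,000,000 households ≈ $11,700 per household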

Oct 15, 2021

Risk comes in lots of different forms. In Skin in the Game, Taleb's latest and underrated book, he breaks risk down into ensemble probabilities and time probabilities. On top of that he demonstrates that risk operates differently at different scales, and that if we want to avoid large-scale ruin, ruin at the level of nations or all of humanity, we should be trying to push risk down the scale.
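
A toy simulation (my own numbers, not Taleb's) shows why the two probabilities diverge: a 1% chance of ruin looks negligible across an ensemble of people who each take the risk once, but a single person who takes the same risk repeatedly is eventually almost certain to hit it, and ruin is absorbing.

  import random

  random.seed(1)

  RUIN_PER_ROUND = 0.01   # assumed 1% chance of ruin on any single exposure
  ROUNDS = 100

  # Ensemble probability: many independent people are each exposed once.
  people = 10_000
  ruined = sum(random.random() < RUIN_PER_ROUND for _ in range(people))
  print(f"ensemble: {ruined / people:.1%} of people ruined after one exposure")

  # Time probability: one person is exposed over and over.
  # Once ruined there is no coming back, so only an unbroken run of luck survives.
  def survives(rounds):
      return all(random.random() >= RUIN_PER_ROUND for _ in range(rounds))

  trials = 10_000
  survival_rate = sum(survives(ROUNDS) for _ in range(trials)) / trials
  print(f"time: one person survives {ROUNDS} exposures only about {survival_rate:.1%} of the time")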

Sep 30, 2021

As human beings we have a unique ability to recognize patterns, even when confronted with events that are completely random. In fact, sometimes it's easier to see patterns in random noise. We pull narratives out of the randomness and use them to predict the future. Unfortunately the future is unpredictable, and even when we have detected a pattern the outcomes end up being very different from what we expected.

Sep 28, 2021

The US-backed regime in Afghanistan lasted nine days from the taking of the first provincial capital to the taking of Kabul. By contrast, after the withdrawal of the Soviets in 1989, their client government lasted over three years. What was the difference? Why, after spending two trillion dollars and twice as long in the country, did we do so much worse? Francis Fukuyama has asserted that there are no ideologies which can compete with liberal democracy, and everyone seems to believe that, but if that's so why did we have such a hard time implanting it in Afghanistan?

Sep 14, 2021

Philip Tetlock has been arguing for a while that experts are horrible at prediction, but that his superforecasters do much better. If that's the case, how did they do with respect to the fall of Afghanistan? As far as I can tell they didn't make any predictions on how long the Afghan government would last. Or they did make predictions, which were just as wrong as everyone else's, and they've since buried them. In light of this I thought it was time to revisit the limitations and distortions inherent in Tetlock's superforecasting project.

Aug 31, 2021

In the ongoing discussion of dealing with an uncertain future I put forth the idea that believing in God and belonging to a religion represents "easy mode". 

Aug 28, 2021

When people consider the harms which might be caused by technology, they often point to the "precautionary principle" as a possible way to mitigate those harms. This principle seems straightforward, but once you actually try to apply it the difficulties become obvious. In particular, how do you ensure that you're not delaying the introduction of beneficial technologies? How do you ensure that the harms of delaying a technology are not greater than the harms the technology itself might cause? In this episode we examine several examples of how this principle might be applied. It isn't easy, but it does seem like something we need to master as new technologies continue to arrive.
