We Are Not Saved

We Are Not Saved is a podcast covering eschatology. While this concept has traditionally been a religious one, concerned with the end of creation, in this podcast that study has been broadened to include secular ways the world could end (so-called x-risks) and deepened to cover the potential end of nations, cultures, and civilizations. The title is taken from the book of Jeremiah, chapter 8, verse 20: "The harvest is past, the summer is ended, and we are not saved."

May 2019

May 30, 2019

I had not intended to revisit abortion so soon, but the previous post generated some interesting comments on a wide range of issues, so I decided to collect them and answer them in the form of a post. In particular, I should have paid more attention to the actual women involved in what is, objectively, a horrible decision to have to make. But there are other nuances as well that deserve more space.

May 24, 2019

I was reading the Iliad recently and was struck by the fact that while there were a lot of horses, no one rode them; they were all used to pull chariots. Horses had been domesticated for thousands of years, but no one thought to ride them, and it would be another couple thousand years before someone came up with the idea of the stirrup. This illustrates that a technology can be around for a long time before someone suddenly figures out a new way of using it that ends up being incredibly effective. Could this happen with nukes?

May 18, 2019

Abortion is back in the news, and, perhaps unwisely, I've decided to give my two cents on the subject. I think most of the things that annoy people about the recent laws are tactics in the larger game of getting the Supreme Court to overturn Roe v. Wade, though I'm of the opinion that it won't happen regardless, unless Ginsburg dies, which would bring its own level of craziness. But most importantly, I think there are genuine disagreements about the morality of abortion which are not going away, and that unless we figure out a way to "agree to disagree," things are going to get ugly.

May 14, 2019

I review the book Walls: A History of Civilization in Blood and Brick, with a particular focus on the way the history of walls has been misinterpreted and distorted by recent examples of wall-building. This is a problem, because it's more important than ever to understand the correct history of walls as we enter a second age of wall-building, though most modern walls are built to keep out immigrants, not invading armies.

May 8, 2019

At the moment it seems like nothing can stop the Democratic nominee from beating Trump, and nothing can stop Biden from being the Democratic nominee. But what are they going to do about immigration? Trump has done two things: made the issue impossible to ignore and made it utterly toxic to rational discussion. In the current framing there are only good people, who want de facto open borders, and evil people. But any rational assessment of the situation leads to the inevitable conclusion that some restrictions are needed, and, beyond that, that a majority of the country wants greater restrictions. What's a Democrat to do? Are they trapped?

May 2, 2019

When people think about AI risk, they imagine an artificial superintelligence malevolently and implacably pursuing its goals, with humans standing by powerless to stop it. But what if AI risk is more subtle? What if the same processes that have been so successfully used for image recognition are turned towards increasing engagement? What if that engagement ends up looking a lot like addiction? And what if the easiest way to create that addiction is by eroding the mental health of the user? And what if it's something we're doing to ourselves?
