It's that time of year when people make predictions. I also make predictions, though I do them somewhat differently. Mostly I'm interested in identifying potential catastrophes and dismissing potential salvation. For example, nukes will get used again, and a benevolent AI won't save us.
The key thing is not to make accurate predictions, but to make useful predictions. And as it turns out there's a big difference between the two.
I decided to take the end of the year off. But I didn't want to leave my loyal listeners without the normally scheduled episode. So here you go, the first ever "We Are Not Saved" Classic!!
It's my review and discussion of Neil Postman's classic "Amusing Ourselves to Death". One of the best books of the last 50 years!
The ninth book and sixth season of The Expanse were both just released. I haven't watched much of the TV show, but I did just finish reading the final book and as I did so it occurred to me that the way it handled Fermi's Paradox might provide a useful way of understanding my own fixation on it. And why I think it presents a huge challenge to anyone who thinks that humanity is on an unending upward slope that will eventually take us to the stars.
Lately people have been using the idea that something is a black swan as an excuse for being powerless. But this is not only a massive abdication of responsibility, it's also an equally massive misunderstanding of the moment. Because preparedness has no meaning if it's not directed towards preparing for black swans. There is nothing else worth preparing for.
The future is the product of the black swans we have yet to encounter.
A couple of months ago Gwern published a list of improvements since 1990. I thought it gave short shrift to the many changes which have been wrought upon society by technological progress. He does include a section on "Society", but it's woefully inadequate, and despite the list's stated theme of identifying "unseen" changes, he overlooks many of the intangible harms which progress might or might not have inflicted on us. To illustrate this I bring in the story of my great-grandmother, which I don't want to cheapen with a summary.
I got some pushback on the episodes I did about Afghanistan. Some of it was directed at the idea that "we are no longer a serious people". But this pushback, rather than talking me out of the position, made me explore it even more deeply. This episode is the result of that exploration. As part of it I bring in recent difficulties experienced by the CIA, the Vietnam War, and the differences between right- and left-brained processing.
There are at least two kinds of randomness in the world: normal, as in a normal distribution or a bell curve, and extreme. As humans we're used to the normal distribution. That's the kind of thing we've dealt with over the thousands of years of our existence. It's only recently that the extreme distribution has come to predominate. Nassim Taleb has labeled the first Mediocristan and the second Extremistan. In this podcast we explore the difference between the two and how the tools of Mediocristan are inadequate to the disasters of Extremistan.
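If you want to see the difference for yourself, here's a minimal Python sketch (my own illustration, not something from the episode or from Taleb): draw a large sample from a thin-tailed normal distribution and from a fat-tailed Pareto distribution, and check how much of the total the single largest observation accounts for.

```python
import random

# Illustrative sketch: how much does the single largest observation
# contribute to the total in a thin-tailed ("Mediocristan") sample
# versus a fat-tailed ("Extremistan") sample?

random.seed(42)
N = 100_000

# Mediocristan: heights-like data from a normal distribution.
normal_draws = [random.gauss(170, 10) for _ in range(N)]

# Extremistan: wealth-like data from a Pareto distribution with a low alpha.
pareto_draws = [random.paretovariate(1.1) for _ in range(N)]

for name, draws in [("Mediocristan (normal)", normal_draws),
                    ("Extremistan (Pareto)", pareto_draws)]:
    share = max(draws) / sum(draws)
    print(f"{name}: largest single draw is {share:.4%} of the total")

# Typical result: in the normal sample the biggest observation is a
# negligible fraction of the total, while in the Pareto sample a single
# draw can account for a sizable share of everything.
```

The point of the toy example is that intuitions trained on the first kind of sample fail badly on the second, which is exactly where the disasters live.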
As I record this, Congress is debating whether they should pass a $3.5 trillion bill or only a $1.5 trillion one. The former would equal $27,000 per household, while the latter would only be $12,000 per household. And yet when people are asked whether they would pay more to deal with problems like climate change, only 34% are willing to pay more than $10 a month. People have no skin in the game on the former, while they can at least imagine they have skin in the game on the latter, and in this episode I argue that this makes all the difference.
Risk comes in lots of different forms. In Skin in the Game, Taleb's last and underrated book, he breaks risk down into ensemble probabilities and time probabilities. On top of that he demonstrates that risk operates differently at different scales, and that if we want to avoid large-scale ruin, ruin at the level of nations or of all humanity, we should be trying to push risk down the scale.
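Since the ensemble/time distinction is the crux of the argument, here's a minimal Python sketch (my own illustration, not code from the book or the episode) of Taleb's casino example: a risk that looks small when averaged across many people at one moment can be near-certain ruin for one person exposed to it over and over, because ruin is absorbing.

```python
import random

# Assume each day at the casino carries a 1% chance of total ruin.
random.seed(0)
P_RUIN_PER_DAY = 0.01
TRIALS = 100_000

# Ensemble probability: 100 different gamblers each play a single day.
# Roughly 1% of them get ruined; the "average gambler" does fine.
def ensemble_ruin_rate(n_gamblers=100):
    ruined = sum(random.random() < P_RUIN_PER_DAY for _ in range(n_gamblers))
    return ruined / n_gamblers

# Time probability: one gambler plays 100 days in a row. Ruin is
# absorbing -- if you're ruined on day 28, there is no day 29.
def survives_100_days(n_days=100):
    for _ in range(n_days):
        if random.random() < P_RUIN_PER_DAY:
            return False
    return True

ruin_over_time = 1 - sum(survives_100_days() for _ in range(TRIALS)) / TRIALS
print(f"Ensemble ruin rate on one day: ~{ensemble_ruin_rate():.0%}")
print(f"Chance one gambler is ruined within 100 days: ~{ruin_over_time:.0%}")
# The second number comes out around 63%, not 1%: risks that look
# tolerable for a group at a point in time can be fatal for an
# individual (or a nation) exposed to them repeatedly.
```

That asymmetry is why pushing risk down the scale matters: what is survivable when spread across many small players is not survivable when concentrated at the level of a nation or of humanity as a whole.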
As human beings we have a unique ability to recognize patterns, even when confronted with events that are completely random. In fact sometimes it's easier to see patterns in random noise. We pull narratives out of the randomness and use them to predict the future. Unfortunately the future is unpredictable, and even when we have detected a pattern the outcomes end up being very different from what we expected.
The US-backed regime in Afghanistan lasted 9 days from the taking of the first provincial capital to the taking of Kabul. After the withdrawal of the Soviets in 1989, that government lasted over three years. What was the difference? Why, after spending two trillion dollars and twice as long in the country, did we do so much worse? Francis Fukuyama has asserted that there are no ideologies which can compete with liberal democracy. And everyone seems to believe that, but if that's so, why did we have such a hard time implanting it in Afghanistan?
Philip Tetlock has been arguing for a while that experts are horrible at prediction, but that his superforecasters do much better. If that's the case, how did they do with respect to the fall of Afghanistan? As far as I can tell they didn't make any predictions on how long the Afghan government would last. Or they did make predictions, and they were just as wrong as everyone else's, and they've buried them. In light of this I thought it was time to revisit the limitations and distortions inherent in Tetlock's superforecasting project.
In the ongoing discussion of dealing with an uncertain future I put forth the idea that believing in God and belonging to a religion represents "easy mode".
When people consider the harms which might be caused by technology, they often point to the "precautionary principle" as a possible way to mitigate those harms. This principle seems straightforward, but once you actually try to apply it the difficulties become obvious. In particular, how do you ensure that you're not delaying the introduction of beneficial technologies? How do you ensure that the harms of delaying a technology are not greater than the harms which might be caused by that technology? In this episode we examine several examples of how this principle might be applied. It isn't easy, but it does seem like something we need to master as new technologies continue to arrive.
My hot take on the situation in Afghanistan.
Highlights:
-Why couldn't we have maintained a presence at Bagram, even if we pulled out everywhere else (think Guantanamo and Cuba)?
-Biden had more flexibility than he claimed.
-It feels like this might lead to a loss of confidence similar to what we experienced after Vietnam.
-The effect on our allies may be the worst consequence of our withdrawal.
I discussed Fermi's Paradox in my last newsletter. In this one I discuss the hint it provides that technology may be inevitably linked to extinction: that the reason the universe is not teeming with aliens is that the technology required to get to that point presents insuperable risks.
As I said, this is a hint, but I think it's a hint we need to take seriously.
The massive attention being paid to UFOs in the form of the Pentagon/Naval videos has rekindled interest in the subject and, by extension, interest in Fermi's Paradox. I think people treat these subjects entirely too trivially, as a curiosity rather than as one of the most important indications of what the future has in store for humanity: either eventual doom or being terribly alone.
In a continuation of the last episode I examine my favorite explanation for the inflection point in 1971: that this is when energy decoupled from economic growth. Economic output which has no connection to energy usage is a new and strange beast, much easier to manipulate in ways that produce inequality and inflation and all the other ills which have afflicted us since the early 70s.
The website wtfhappenedin1971.com presents a series of charts which show that there was an inflection in rates of everything from inequality to obesity in 1971, in every case with things getting worse. Why would that be? In this episode I examine 8 explanations (possibly more depending on how you count). Full warning: my favorite explanation is not included. That will be the subject of my next episode.