Risk comes in many different forms. In Skin in the Game, Taleb's most recent and underrated book, he breaks risk down into ensemble probabilities and time probabilities. On top of that he demonstrates that risk operates differently at different scales, and that if we want to avoid large-scale ruin—ruin at the level of nations or all of humanity—we should be trying to push risk down the scale.
As human beings we have a unique ability to recognize patterns, even when confronted with events that are completely random. In fact, sometimes it's easier to see patterns in random noise. We pull narratives out of the randomness and use them to predict the future. Unfortunately the future is unpredictable, and even when we have detected a pattern the outcomes end up being very different from what we expected.
The US-backed regime in Afghanistan lasted nine days from the taking of the first provincial capital to the taking of Kabul. After the withdrawal of the Soviets in 1989, their client government lasted over three years. What was the difference? Why, after spending two trillion dollars and twice as long in the country, did we do so much worse? Francis Fukuyama has asserted that there are no ideologies which can compete with liberal democracy, and everyone seems to believe that, but if that's so why did we have such a hard time implanting it in Afghanistan?
Philip Tetlock has been arguing for a while that experts are horrible at prediction, but that his superforecasters do much better. If that's the case, how did they do with respect to the fall of Afghanistan? As far as I can tell, either they didn't make any predictions on how long the Afghan government would last, or they did make predictions that were just as wrong as everyone else's and they've since buried them. In light of this I thought it was time to revisit the limitations and distortions inherent in Tetlock's superforecasting project.
In the ongoing discussion of how to deal with an uncertain future, I put forth the idea that believing in God and belonging to a religion represent "easy mode".
When people consider the harms which might be caused by technology, they often point to the "precautionary principle" as a possible way to mitigate those harms. The principle seems straightforward, but once you actually try to apply it the difficulties become obvious. In particular, how do you ensure that you're not delaying the introduction of beneficial technologies? How do you ensure that the harms of delaying a technology are not greater than the harms which might be caused by that technology? In this episode we examine several examples of how the principle might be applied. It isn't easy, but it does seem like something we need to master as new technologies continue to arrive.
My hot take on the situation in Afghanistan.
Highlights:
-Why couldn't we have maintained a presence at Bagram, even if we pulled out everywhere else (think Guantanamo Bay in Cuba)?
-Biden had more flexibility than he claimed.
-It feels like this might lead to a loss of confidence similar to what we experienced after Vietnam.
-The effect on our allies may be the worst consequence of our withdrawal.
I discussed Fermi's Paradox in my last newsletter. In this episode I discuss the hint it provides that technology may be inevitably linked to extinction: that the reason the universe is not teeming with aliens is that the technology required to get to that point presents insuperable risks.
As I said, this is only a hint, but I think it's a hint we need to take seriously.
The massive attention being paid to UFOs, in the form of the Pentagon/Naval videos, has rekindled interest in the subject and, by extension, interest in Fermi's Paradox. I think people's interest in these subjects is entirely too casual, treating them as a curiosity rather than as one of the most important indications of what the future has in store for humanity — either eventual doom or being terribly alone.
In a continuation of the last episode I examine my favorite explanation for the inflection point in 1971: that this is when energy decoupled from economic growth. Economic output which has no connection to energy usage is a new and strange beast, much easier to manipulate in ways that produce inequality and inflation and all the other ills which have afflicted us since the early 70s.
The website wtfhappenedin1971.com presents a series of charts showing that in 1971 there was an inflection in rates of everything from inequality to obesity, and in every case things got worse. Why would that be? In this episode I examine eight explanations (possibly more depending on how you count). Full warning: my favorite explanation is not included. That will be the subject of my next episode.
And here is where I have cordoned off spoilers for Project Hail Mary. Listen at your own risk.
My capsule reviews for the month:
This episode is in three parts. First come the eschatological reviews:
I've been talking about the knobs of society in my newsletters. Well, one of the knobs we appear to have lost all fear of is the spending knob, and we've decided we can pretty much turn it as high as we want without consequence. And yet everyone, regardless of their economic ideology, realizes that we can't turn it up forever. The key problem is that people imagine that when the time comes to moderate our spending, the knob will be easy to turn back down. I very much doubt that.
I recently encountered the terms Wizards and Prophets as a way of describing those who are, respectively, optimistic or pessimistic about technology. I think this is a good way of thinking about things, and since the context in which I encountered these terms ended up being a full-throated defense of the Wizards, I thought it might be worthwhile to offer up a defense of the Prophets: those who contend that we are playing a dangerous game, one whose stakes the Wizards may not entirely understand. The recent resurgence of the Wuhan lab-leak theory of the pandemic's origin proved very timely.
Making any predictions about China is difficult, but that doesn't mean it's not important. It may in fact be one of the most important things we can do if we want to have some idea of what the future holds. And while predictions are difficult, it does seem like a worthwhile endeavor to look at potential inflection points: points past which we can definitely say that things are very different. In this episode I offer up some potential inflection points. I'm not sure that any of them will come to pass, to say nothing of all of them, but they provide useful markers for where China is headed and what it might mean should it arrive there.
In my last newsletter I described the temple of technology and progress, with its countless knobs that can be turned. Some of the knobs obviously inspire caution, but some seem like an unalloyed good, like the knob for safety. Accordingly, that's what we've done: we've turned the knob of safety all the way to 11. But as with all progress, the effects have not always been what we expect. For example, when you try to maximize safety you can't actually maximize safety itself, you can only maximize its perceived importance, which is how we ended up in a situation where, in the midst of a deadly pandemic, we have paused, refused to approve, or otherwise restricted vaccines, dooming thousands, because the vaccines are not entirely risk free. But is anything?
Lately there have been a lot of attempts to relitigate history. The feeling is that taking history which has been ignored and giving it new emphasis will both increase the accuracy of that history and help mitigate the negative effects of historical events. I show that this is generally not the case, and that what we choose to emphasize is based more on the narrative we're pushing than on the actual impact of the history or event in question.
There were various approaches to fighting COVID, and in retrospect we ended up with the worst of them all. It's understandable that we didn't follow China in taking the authoritarian approach, and it's also understandable that we weren't going to be as lackadaisical as we were in 1918. But what kept us from taking the technolibertarian approach of human challenge trials, first doses first, and approving the AstraZeneca vaccine as soon as Europe did? And more importantly, why are we now taking the exact opposite approach, "pausing" Johnson & Johnson while Europe restricts AstraZeneca? Why are we so bold when it comes to government spending and so timid when it comes to vaccine safety?