This book was on my summer reading list; I had enjoyed his other book Fooled by Randomness which was mostly about the fallacy of finding patterns when none exist.
This book has political implications, especially in economics; witness the dust-up between the more conservative economists who use mathematical modeling heavily (Fresh Water economists, e.g., University of Chicago) versus the more liberal economists who claim that the models don't work (Salt Water economists, e.g., Berkeley and the Ivy schools). Paul Krugman is one of the more famous Salt Water economists.
The review of The Black Swan is below the fold.
(Cross posted at my personal blog)
The short: I enjoyed the book and found it hard to put down. It challenged some of my thinking and changed the way that I look at things.
What I didn't like: the book was very inefficient; he could have conveyed the same message in about 1/3 of the pages.
But: the fluff/padding was still interesting; the author has a sense of humor and writes in an entertaining style.
What is the gist of the book? Well, the lessons are basically these:
- Some processes lend themselves to being mathematically modeled, others don't. Unfortunately, some people use mathematical models in situations where it is inappropriate to do so (e.g., making long-term forecasts about the economy). People who rely too much on mathematical modeling are caught unprepared (or just plain surprised) when some situation arises that wasn't considered possible in the mathematical model (e.g., think of a boxer getting into a fight with someone who grabs, kicks and bites).
- Some processes can be effectively modeled by the normal distribution, others can't. Example: suppose you are machining bolts and are concerned about quality as measured by, say, the width of the bolt. That sort of process lends itself to a normal distribution; after all, if the specification is, say, 1 cm, there is no way that an errant bolt will be, say, 10 cm wide. On the other hand, if you are talking about stock markets, it is possible that some catastrophic event (called a "black swan") can occur that causes the market to lose half or even two-thirds of its value. If one tried to model recent market price changes by some sort of normal-like distribution, such a large variation would be deemed all but impossible.
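The gap between the two kinds of models can be made concrete with a quick calculation. The numbers below (1% daily volatility, a Pareto tail with exponent 3) are hypothetical, chosen only to illustrate how differently a thin-tailed and a heavy-tailed model rate the same extreme event:

```python
import math

# Daily returns under a normal model (hypothetical: mean 0, std dev 1% per day).
mu, sigma = 0.0, 0.01

# How improbable is a one-day 20% crash under that model?
z = (0.20 - mu) / sigma  # a 20-standard-deviation move
p_normal = 0.5 * math.erfc(z / math.sqrt(2))  # upper tail of the normal
print(f"Normal model: a 20-sigma day has probability ~{p_normal:.3e}")

# A heavy-tailed alternative (Pareto tail, alpha = 3) gives the same event
# vastly more weight: P(X > x) = (x_min / x) ** alpha for x >= x_min.
x_min, alpha = 0.01, 3.0
p_pareto = (x_min / 0.20) ** alpha
print(f"Pareto tail:  same event has probability ~{p_pareto:.3e}")
```

Under the normal model the crash is so improbable it would effectively never happen in the lifetime of the universe; under the heavy-tailed model it is rare but entirely expectable.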
- Sometimes these extremely rare events have catastrophic outcomes. But these events are often impossible to predict beforehand, even if people do "after the fact studies" that say "see, you should have predicted this."
- The future catastrophic event is, more often than not, one that hasn't happened before. The ones that happened in the past, in many cases, won't happen again (e.g., terrorists successfully coordinating an attack that slams airplanes into buildings). But the past catastrophic events are the ones that people prepare for! Bottom line: sometimes preparing to react well is the best you can do, and trying to be proactive can, in fact, be counterproductive.
- Sometimes humans look for and find patterns that are really just coincidence, and then use faulty logic to make an inference. Example: suppose you interview 100 successful CEOs and find that all of them pray to Jesus each day. So, obviously, praying to Jesus is a factor in becoming a CEO, right? Well, you need to look at everyone in business who prayed to Jesus and see how many of them became CEOs; often that part of the study is not done. Very rarely do we examine what the failures did.
I admit that I had to laugh at his repeated slamming of academics (I am an academic). In one place, he imagines a meeting between someone named "Fat Tony" and an academic. Taleb poses the problem: "suppose you are told that a coin is fair. Now you flip it 99 times and it comes up heads. On the 100th flip, what are the odds of another head?" Fat Tony says something like "about 99 percent" while the academic says "50 percent".
Frankly, that hypothetical story is pure nonsense. In this case, the academic is really saying "if I am 100 percent sure that the coin is fair, then a run of 99 heads is just an astronomically rare event." In reality, the academic would reject the null hypothesis that the coin is fair: the probability of a fair coin coming up heads 99 times in a row is 2^{-99}, which is far inside the rejection region of any statistical test.
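One way to see that the real academic and Fat Tony end up agreeing is a small Bayesian sketch. The prior below (one-in-a-million chance the coin is rigged to always land heads) is an assumption picked for illustration:

```python
# p-value under H0 ("the coin is fair") for 99 heads in a row.
p_value = 0.5 ** 99

# Bayesian version of Fat Tony's reasoning: even a tiny prior belief
# that the coin is two-headed swamps the fair-coin hypothesis after
# observing 99 straight heads.
prior_rigged = 1e-6            # hypothetical prior: coin always lands heads
prior_fair = 1 - prior_rigged

like_rigged = 1.0              # P(99 heads | rigged)
like_fair = 0.5 ** 99          # P(99 heads | fair)

posterior_rigged = (prior_rigged * like_rigged) / (
    prior_rigged * like_rigged + prior_fair * like_fair)

# Probability the 100th flip is heads, averaging over both hypotheses.
p_next_head = posterior_rigged * 1.0 + (1 - posterior_rigged) * 0.5

print(f"p-value under 'fair'           : {p_value:.3e}")
print(f"posterior P(rigged | 99 heads) : {posterior_rigged:.12f}")
print(f"P(100th flip is heads)         : {p_next_head:.12f}")
```

After 99 heads the posterior probability that the coin is rigged is essentially 1, so the predicted chance of another head is very close to Fat Tony's answer, not 50 percent.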
Taleb also discusses an interesting aspect of human nature that I didn't believe at first... until I tried it out with friends. Here is a demonstration: ask your friend "which is more likely?
1. A random person drives drunk and gets into an auto accident, or
2. A random person gets into an auto accident.
Or you could ask: "which is more likely? A random person:
1. Is a smoker and gets lung cancer, or
2. Gets lung cancer.
Of course, the correct answer in each case is "2": the set of all auto accidents caused by drunk driving is a subset of all auto accidents and the set of all lung cancer cases due to smoking is a subset of all lung cancer cases.
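A short simulation makes the subset relationship obvious; the accident and drunk-driving rates below are hypothetical numbers chosen only for illustration:

```python
import random

random.seed(1)

# Hypothetical rates, for illustration only.
p_accident = 0.05             # chance a random person has an accident
p_drunk_given_accident = 0.3  # of those accidents, fraction involving drunk driving

trials = 100_000
accidents = 0
drunk_accidents = 0
for _ in range(trials):
    if random.random() < p_accident:
        accidents += 1
        if random.random() < p_drunk_given_accident:
            drunk_accidents += 1

# The conjunction ("drunk AND accident") can never be more frequent
# than the single event ("accident") -- it is a subset of it.
print(f"accidents:       {accidents}")
print(f"drunk accidents: {drunk_accidents}")
```

However the rates are set, the drunk-accident count can never exceed the accident count, which is exactly why answer "2" must be at least as likely as answer "1".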
But when I did this, my friend chose "1"!
I had to shake my head, but that is a human tendency.
One other oddity: toward the end of the book, Taleb discusses fitness. He mentions that he hit on the perfect fitness program by asking himself: "what did early humans do? Ans.: walk long distances to hunt, and engage in short bursts of high intensity activity". He then decided to walk long, slow distances and do sprints every so often.
Well, nature also had humans die early of various diseases; any vaccine or cure works against "mother nature". So I hardly view nature as always being optimal. But I did note with amusement that Taleb walks 10-15 hours a week, which, at a 20-minutes-per-mile pace, translates to 30-45 miles per week!
I'd say THAT is why he is fit. :)
(note: since I love to hike and walk long distances, this comment was interesting to me)