“How Not to Be Wrong: The Power of Mathematical Thinking” by Jordan Ellenberg, Part One

This book by Jordan Ellenberg really ties together a lot of ideas from other books. He touches on elements of prospect theory, as in “Thinking, Fast and Slow,” as well as rational decision making, as in “Think Like a Freak” and “Predictably Irrational.” It goes one step further because it also relies heavily on statistical thinking, but with the caution preached in “The Signal and the Noise.” Finally, there is a firm foundation of historical significance and idea development that makes it feel like “The Innovators,” but with a very practical approach in the spirit of “A Random Walk Down Wall Street.”

  1. Math is a science of not being wrong about things, its techniques and habits hammered out by centuries of hard work and argument.
  2. The winners are usually the guys who get 5% fewer of their planes shot down, or use 5% less fuel, or get 5% more nutrition into their infantry at 95% of the cost. That’s not the stuff war movies are made of, but it’s the stuff wars are made of. And there’s math every step of the way.
  3. A mathematician is always asking, “What assumptions are you making? And are they justified?”
  4. It is exceptional that one should be able to acquire the understanding of a process without having previously acquired a deep familiarity with running it, with using it, before one has assimilated it in an instinctive and empirical way.
  5. Nonlinear thinking means which way you should go depends on where you already are.
  6. Thomas Edison was a nut, Leibniz was a nut, Galileo was a nut, so forth and so on. Everybody who comes with a new idea to the conventional wisdom, comes with an idea that’s so far outside the mainstream, that’s considered nutty.
  7. That’s one reason linear thinking comes so naturally to us: our intuition about time and motion is formed by the phenomena we observe in the world. Even before Newton codified his laws, something in us knew that things like to move in straight lines, unless given a reason to do otherwise.
  8. Working an integral or performing a linear regression is something a computer can do quite effectively. Understanding whether the result makes sense—or deciding whether the method is the right one to use in the first place—requires a guiding human hand. When we teach mathematics we are supposed to be explaining how to be that guide. A math course that fails to do so is essentially training the student to be a very slow, buggy version of Microsoft Excel.
  9. Don’t talk about percentages of numbers when the numbers might be negative. (A worked example follows the list.)
  10. Dividing one number by another is mere computation; figuring out what you should divide by what is mathematics.
  11. When a company launches a mutual fund, they often maintain the fund in-house for some time before opening it to the public, a practice called incubation. The life of an incubated fund is not as warm and safe as the name might suggest. Typically, companies incubate lots of funds at once, experimenting with numerous investment strategies and allocations. The funds jostle and compete in the womb. Some show handsome returns, and are quickly made available to the public, with extensive documentation of their earnings so far. But the runts of the litter are mercy-killed, often without any public notice that they ever existed. (A toy simulation of incubation follows the list.)
  12. The Baltimore Stockbroker con works because, like all good magic tricks, it doesn’t try to fool you outright. That is, it doesn’t try to tell you something false—rather, it tells you something true from which you’re likely to draw incorrect conclusions.
  13. The universe is big, and if you’re sufficiently attuned to amazingly improbable occurrences, you’ll find them. Improbable things happen a lot.
  14. The more chances you give yourself to be surprised, the higher your threshold for surprise had better be. (A short calculation after the list shows how quickly those chances add up.)
  15. With some effort, we can shoehorn the weather into the frequentist model; maybe we mean that among some large population of days with conditions similar to this one, the following day was rainy 20% of the time. But then you’re stuck when asked, “What’s the probability that the human race will go extinct in the next thousand years?” This is, almost by definition, an experiment you can’t repeat. We use probability even to talk about events that cannot possibly be thought of as subject to chance.
  16. We find ourselves able to say, of questions like this, “It seems improbable” or “It seems likely.” Once we’ve done so, how can we resist the temptation to ask, “How likely?”
  17. The “does nothing” scenario is called the null hypothesis. That is, the null hypothesis is the hypothesis that the intervention you’re studying has no effect.
  18. It’s not enough that the data be consistent with your theory; they have to be inconsistent with the negation of your theory, the dreaded null hypothesis.
  19. Here’s the procedure for ruling out the null hypothesis, in executive bullet-point form (a simulated run with made-up numbers appears after the list):
    1. Run an experiment.
    2. Suppose the null hypothesis is true, and let p be the probability (under that hypothesis) of getting results as extreme as those observed.
    3. The number p is called the p-value. If it is very small, rejoice; you get to say your results are statistically significant. If it is large, concede that the null hypothesis has not been ruled out.
  20. So: significance. In common language it means something like “important” or “meaningful.” But the significance test that scientists use doesn’t measure importance. When we’re testing the effect of a new drug, the null hypothesis is that there is no effect at all; so to reject the null hypothesis is merely to make a judgment that the effect of the drug is not zero. But the effect could still be very small—so small that the drug isn’t effective in any sense that an ordinary non-mathematical Anglophone would call significant.
  21. Twice a tiny number is a tiny number. How good or bad it is to double something depends on how big that something is. Risk ratios are much easier for the brain to grasp than tiny splinters of probability like 1 in 7,000. But risk ratios applied to small probabilities can easily mislead you. (The arithmetic is worked out after the list.)
  22. A significance test is a scientific instrument, and like any other instrument, it has a certain degree of precision. If you make the test more sensitive—by increasing the size of the studied population, for example—you enable yourself to see ever-smaller effects. That’s the power of the method, but also its danger. The truth is, the null hypothesis, if we take it literally, is probably just about always false. (A small simulation after the list shows the same tiny effect going from invisible to “significant” as the sample grows.)
  23. If only we could go back in time to the dawn of statistical nomenclature and declare that a result passing Fisher’s test with a p-value of less than 0.05 was “statistically noticeable” or “statistically detectable” instead of “statistically significant”! That would be truer to the meaning of the method, which merely counsels us about the existence of an effect but is silent about its size or importance.
  24. Gilovich, Vallone, and Tversky are absolutely correct in their central contention that human beings are quick to perceive patterns where they don’t exist and to overestimate their strength where they do.
  25. It’s tempting to think of “very improbable” as meaning “essentially impossible,” and, from there, to utter the word “essentially” more and more quietly in our mind’s voice until we stop paying attention to it. But impossible and improbable are not the same—not even close. Impossible things never happen. But improbable things happen a lot.
  26. Prime Number Theorem: Among the first N numbers, about N/log N are prime (log here being the natural logarithm; checked numerically in the last sketch after the list).
  27. The logarithm of a positive number N, called log N, is roughly the number of digits it has.
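
A few of these notes are concrete enough to check with a few lines of Python. The sketches below use made-up numbers and are only meant to illustrate the ideas, not to reproduce anything from the book. First, note 9: when a total is small because positive and negative terms nearly cancel, any one term’s “share” of that total is close to meaningless.

```python
# Hypothetical monthly job changes by state (not figures from the book):
# two states add jobs, two lose them, so the national net change is small.
state_changes = {"A": +9_500, "B": -4_000, "C": -3_500, "D": +16_000}
national_net = sum(state_changes.values())  # +18,000

for state, change in state_changes.items():
    share = change / national_net
    print(f"State {state}: {change:+,} jobs, {share:+.0%} of the national net gain")

# States A and D together "created" about 142% of the nation's new jobs:
# arithmetically correct, practically meaningless, because the denominator
# is a small difference of large positive and negative numbers.
combined_share = (state_changes["A"] + state_changes["D"]) / national_net
print(f"States A and D combined: {combined_share:.0%}")
```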
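
Note 11 is really about selection bias, and a toy simulation (with an arbitrary fund count, horizon, and volatility) makes the point: even when every fund’s returns are pure noise, the best survivor of a large incubated litter looks like a star.

```python
import random

random.seed(0)

N_FUNDS, YEARS, VOLATILITY = 200, 3, 0.15   # arbitrary incubation parameters

def total_return(yearly_returns):
    growth = 1.0
    for r in yearly_returns:
        growth *= 1 + r
    return growth - 1

# Every fund's yearly return is random noise centered on 0%: no skill at all.
funds = [[random.gauss(0.0, VOLATILITY) for _ in range(YEARS)]
         for _ in range(N_FUNDS)]

returns = [total_return(f) for f in funds]
print(f"Average fund over {YEARS} years: {sum(returns) / N_FUNDS:+.1%}")
print(f"Best surviving fund over {YEARS} years: {max(returns):+.1%}")
# Only the best track record gets published; the runts quietly disappear.
```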
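
Note 14 can be put in numbers with one line of arithmetic: if each separate look at the data has a 1-in-20 chance of producing a “surprise” under pure noise (the usual p < 0.05 threshold), the chance of seeing at least one surprise climbs quickly with the number of looks.

```python
# Probability of at least one spurious "surprise" when each independent look
# at the data has a 5% false-alarm rate under pure noise.
p_single = 0.05
for looks in (1, 5, 20, 100):
    p_any = 1 - (1 - p_single) ** looks
    print(f"{looks:>3} looks: {p_any:.0%} chance of at least one surprise")
```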
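
Note 19’s procedure can be run end to end on an invented experiment: say we observed 60 heads in 100 flips of a coin, and the null hypothesis is that the coin is fair. The sketch below estimates the p-value by asking how often a fair coin produces something at least that lopsided.

```python
import random

random.seed(1)

observed_heads, flips = 60, 100   # invented experimental result
trials = 100_000                  # simulated repeats under the null hypothesis

# Step 2: assume the null hypothesis (a fair coin) and count how often a
# result at least as extreme as the observed one shows up.
as_extreme = 0
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    if abs(heads - flips / 2) >= abs(observed_heads - flips / 2):
        as_extreme += 1

# Step 3: the p-value. For 60 heads out of 100 it comes out near 0.057, just
# above the conventional 0.05 cutoff, so the null hypothesis is not ruled out.
p_value = as_extreme / trials
print(f"Estimated two-sided p-value: {p_value:.3f}")
```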
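
Note 21 is plain arithmetic, but it helps to see the absolute numbers next to the risk ratio. The 1-in-7,000 baseline is the figure from the note; the doubling is hypothetical.

```python
baseline = 1 / 7000        # baseline risk from the note
doubled = 2 * baseline     # a scary-sounding risk ratio of 2.0

extra = doubled - baseline
print(f"Baseline risk:     {baseline:.4%}")
print(f"Doubled risk:      {doubled:.4%}")
print(f"Absolute increase: {extra:.4%}, about 1 extra case per "
      f"{round(1 / extra):,} people")
```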
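
Note 22 can be illustrated with a coin whose true heads rate is 50.1% rather than 50%: the effect never changes, but the p-value of a study that observes exactly that rate collapses as the sample grows. The rates and sample sizes below are arbitrary, and the test is a plain two-sided normal approximation.

```python
import math

true_rate, null_rate = 0.501, 0.500   # a tiny, fixed effect

for n in (1_000, 100_000, 10_000_000):
    # z-statistic for observing the true rate exactly, tested against the null.
    z = (true_rate - null_rate) / math.sqrt(null_rate * (1 - null_rate) / n)
    p_value = math.erfc(z / math.sqrt(2))   # two-sided normal approximation
    print(f"n = {n:>10,}: z = {z:5.2f}, p = {p_value:.2g}")
# The tiny effect is invisible at n = 1,000 and "highly significant" at
# n = 10,000,000; only the instrument's sensitivity changed.
```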
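
Finally, notes 26 and 27 can be checked directly: a basic sieve counts the primes up to N for comparison with the Prime Number Theorem estimate N/log N (natural log), and log10 N stays within one of the digit count of N.

```python
import math

def count_primes(n):
    """Count the primes up to n with a basic sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(range(i * i, n + 1, i))
    return sum(sieve)

for N in (1_000, 100_000, 10_000_000):
    actual = count_primes(N)
    estimate = N / math.log(N)    # Prime Number Theorem estimate, natural log
    digits = len(str(N))          # digit count is floor(log10 N) + 1
    print(f"N = {N:>10,}: primes = {actual:>7,}, N/log N = {estimate:>9,.0f}, "
          f"ratio = {actual / estimate:.3f}, log10 N = {math.log10(N):.0f}, "
          f"digits = {digits}")
# The ratio drifts toward 1 as N grows, and log10 N is within one of the
# number of digits, which is the "roughly" in note 27.
```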

