Hot Best Seller

The Signal and the Noise: Why So Many Predictions Fail—But Some Don't

Availability: Ready to download

One of Wall Street Journal's Best Ten Works of Nonfiction in 2012. New York Times Bestseller.

"Not so different in spirit from the way public intellectuals like John Kenneth Galbraith once shaped discussions of economic policy and public figures like Walter Cronkite helped sway opinion on the Vietnam War…could turn out to be one of the more momentous books of the decade." -New York Times Book Review

"Nate Silver's The Signal and the Noise is The Soul of a New Machine for the 21st century." -Rachel Maddow, author of Drift

"A serious treatise about the craft of prediction-without academic mathematics-cheerily aimed at lay readers. Silver's coverage is polymathic, ranging from poker and earthquakes to climate change and terrorism." -New York Review of Books

Nate Silver built an innovative system for predicting baseball performance, predicted the 2008 election within a hair's breadth, and became a national sensation as a blogger-all by the time he was thirty. He solidified his standing as the nation's foremost political forecaster with his near-perfect prediction of the 2012 election. Silver is the founder and editor in chief of FiveThirtyEight.com.

Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the "prediction paradox": the more humility we have about our ability to make predictions, the more successful we can be in planning for the future.

In keeping with his own aim to seek truth from data, Silver visits the most successful forecasters in a range of areas, from hurricanes to baseball, from the poker table to the stock market, from Capitol Hill to the NBA. He explains and evaluates how these forecasters think and what bonds they share. What lies behind their success? Are they good-or just lucky? What patterns have they unraveled? And are their forecasts really right? He explores unanticipated commonalities and exposes unexpected juxtapositions. And sometimes, it is not so much how good a prediction is in an absolute sense that matters but how good it is relative to the competition. In other cases, prediction is still a very rudimentary-and dangerous-science.

Silver observes that the most accurate forecasters tend to have a superior command of probability, and they tend to be both humble and hardworking. They distinguish the predictable from the unpredictable, and they notice a thousand little details that lead them closer to the truth. Because of their appreciation of probability, they can distinguish the signal from the noise. With everything from the health of the global economy to our ability to fight terrorism dependent on the quality of our predictions, Nate Silver's insights are an essential read.


30 reviews for The Signal and the Noise: Why So Many Predictions Fail—But Some Don't

  1. 4 out of 5

    Michael Austin

    Nate Silver has done an incredible (and, quite possibly, an unpredictable) thing with _The Signal and the Noise_: He has written an extremely good book when he didn't even have to. Nothing is more common than for someone like Silver--a media phenom with a strong platform (his 538 blog)--to phone a book in to cash in on his 15 minutes. I have probably read two dozen books in the past five years that do exactly this. But _The Signal and the Noise_ is a much more substantial book than, say, _The Black Swan_ or either of the _Freakonomics_ offerings. It is a wide-ranging, in-depth look at the ways that we are wired to make predictions (and the reasons that these are so often wrong). Silver ranges over a variety of prediction environments: baseball, chess, poker, the stock market, politics, weather, and terrorist attacks, to name the most interesting. Throughout it all, he reminds us that human beings are pattern-seeking animals and that we are just as likely to build patterns where none exist as we are to find the correct patterns and harness their predictive capacity.

    Predictions work best when they are 1) probabilistic (i.e., they express a range of possibilities and assign a probability to each); 2) based on as much information--both statistical and analytical--as possible; and 3) continually revised to account for new information. As logical as these sound, human nature seems to drive us in three opposite directions: 1) we seek predictions that are definite and can be acted upon (i.e., "Obama will beat Romney," or "it will rain tomorrow"); 2) we gravitate towards methodologies that seem to discover a magic-bullet formula that guarantees success; and 3) we feel compelled to stand by our predictions even as they become increasingly unlikely. Seasoned prognosticators play a long game. Under the right circumstances (a poker game, for example), a strategy that produces only a slightly better prediction than random chance can produce huge dividends.

    Perhaps most surprisingly, Silver is a great writer (or, at least, a great explainer). As an English major with very little grounding in statistics, I could still understand everything he said. Even more importantly, his narratives are interesting. Who could have predicted that from America's most famous stat-geek?

  2. 5 out of 5

    Justin

    I had read most of this book with a fair degree of equanimity - finding some faults, but also a lot of good information in it. Then I'm jarred out of complacency by a sudden shot from nowhere, in which he says that David Hume, one of the greatest philosophers of the 18th century, is simply too 'daft to understand' probabilistic arguments. Without any introduction to the subject, he claims Hume is stuck in some 'skeptical shell' that prevents him from understanding the simple, elegant solutions of Bayes. What makes this so painful to read is that it shows Silver has never even taken the time to read Hume, at least not more than the two paragraphs he used to cite his sources. If he had even kept on for five more pages he would have found that Hume was defending the very type of probabilistic arguments that Silver said Hume was 'too daft' to understand. Nate seems to have given a cursory glance to a single page of Hume's work - "Sceptical Doubts Concerning the Operations of the Understanding" - without even bothering to proceed to the very next section, "Sceptical Solution of These Doubts", in which Hume lays a rational foundation for belief in the absence of certainty. In fact, the entire 'Enquiry Concerning Human Understanding' can be read as a treatise attempting to supplant abstract and questionable a priori proofs with more sensible arguments grounded entirely in the test of experience and probability.

    By brushing Hume aside so casually, Silver spits in the face of his own philosophical progenitor - a man who helped plant the foundations for the sort of thinking that Silver now takes for granted. I am sure the vast majority of readers will roll a bemused eye at my anger over trivial details like this - but not only does it show that Silver very often doesn't take the time to understand his sources (see Michael Mann's critique of Silver's presentation of global warming), but Silver's casual remarks could easily turn a lot of readers off to Hume before they've even read him. Trendy books like Silver's are far more popular than classic works of philosophy, and new readers are likely to take Silver's description as an accurate portrayal of that daft, old skeptic, David Hume.

  3. 4 out of 5

    Charles

    The Signal and the Noise is a very interesting book with mixed success: 3 1/2 stars, were this permitted. I found it somewhat difficult to review; however, my entire book group – without exception – had similar opinions. I would encourage you to view this as a group opinion. At its best, TSANTN is interesting, illustrative, educational, and provocative. And many chapters – including banking, the weather, volcanoes, elections, and poker – were exactly that. Four stars, without hesitation. The problem is that some chapters – including baseball, terrorists, and the last several – were dull. Either too long, or too scattered, or just not interesting. (Again, this was the unanimous opinion among my group.)

    Nate Silver is a wunderkind polymath who has scored resounding successes in statistical applications to baseball, poker, and, most recently and most impressively, politics. He emphasizes that huge bunches of data are the tools needed for predictions and that there are huge bunches of data out there. He calmly points out that some things are predictable and are predicted, using various methods with correspondingly varied success. Some things that are predictable are not predicted accurately, exactly because the wrong tools or approaches are used. He equally argues that some things are not predictable, and when predicted, have, predictably, low success. Poor predictors often share the characteristics of ignorance of facts, inappropriate application of basic probability analyses, and, especially, overconfidence. Forecasts are made more inaccurate by overfitting – confusing noise for signal. His grasp of applied math and statistics is refreshing. His application – although, perhaps, not his explanation – of Bayes' theorem is lucid. His writing style is casual, which is more impressive considering the subject material.

    As has been noted by others, the number of typographical errors is unacceptable. An even greater editorial error is letting the author ramble on (again, in some chapters). Liberal use of both a sharp red pencil and an X-Acto knife would have improved this book. So, overall, I really liked some parts. This is why I gave the book a 4-star review. (Most of my book group ended up awarding only 3 stars.) But, overall, after a few strong opening innings, the precision of text and purpose waned. In the beginning I did not want the book to end; by 2/3 of the way through, I was more than ready.
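    The review's point about overfitting - confusing noise for signal - is easy to see in a toy experiment. Below is a minimal sketch (not from the book; the data and model degrees are invented) comparing a straight-line fit with a needlessly flexible polynomial on noisy linear data:

    ```python
    # Overfitting in miniature: both models are fit to the same noisy linear data,
    # then scored on fresh data they never saw. Everything here is illustrative.
    import numpy as np

    rng = np.random.default_rng(42)
    x_train = np.linspace(0, 1, 8)
    y_train = 2 * x_train + rng.normal(0, 0.2, size=x_train.size)  # signal + noise
    x_test = np.linspace(0, 1, 200)
    y_test = 2 * x_test + rng.normal(0, 0.2, size=x_test.size)

    for degree in (1, 6):  # modest model vs. needlessly flexible model
        coeffs = np.polyfit(x_train, y_train, degree)
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")

    # The flexible fit typically wins on the training points (it chases the noise)
    # and loses on the held-out points -- the signature of overfitting.
    ```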

  4. 4 out of 5

    Ted

    4 ½ stars. Nate Silver is probably best known as the statistician who confounded the "experts" by predicting the results of the 2008 and 2012 U.S. Presidential elections. As a matter of fact, his web site (https://fivethirtyeight.com/) actually did much better than the average pollsters and media with the 2016 election as well. I was following the writing on the site right up to the night of the election. Entering the final few days, 538 was giving Trump about a 1/3 chance of winning, while most others were saying that the election was a foregone conclusion. And on election day, the 538 article which pointed out early signs that Hillary could be in trouble was so accurate that I had given up for her before 10 pm that evening. And, despite any negative impressions I may leave below about issues I previously had with Silver's writing or his style, the last few years - in which he's developed his own web site, together with the interactions he's had with the commenters and other statisticians that he's hired - have made his writing a model of clarity and concision. He also (nowadays) is very careful to refrain from making rash statements about probabilities, usually listing many reasons why the "odds" being quoted could be risky bets.

    Anyway - before Silver's election triumphs he was known to a less wide, but no less fervid, audience as a sabermetrician who, starting in 2003, contributed predicted statistical ranges of performance for major league baseball players to the Baseball Prospectus. In The Signal and the Noise, Silver discusses issues related to these foundations of his reputation in the second and third chapters. To me, the chapter on political predictions was fascinating, the chapter on baseball less so – this despite, or perhaps because of, the fact that I've been a keen consumer of sabermetric literature almost since Bill James brought it into the mainstream in the late 1970s. On balance I found the book, in terms of insights offered and simple interest, much closer to the political chapter than the baseball chapter – thus the high rating.

    I have to confess, however, that I certainly had my expectations lowered by Silver's Introduction. This impressed me as an attempt (possibly at the urging of an editor?) to present a "Big Theme" context to the book, which was described not only disjointedly, but in a manner that makes Silver look like a poor writer, which he isn't at all. For example, many statements that I would think are pretty much common knowledge are footnoted. To be fair, Silver does have a habit of putting comments in addition to source information in his footnotes. Where I believe he often errs is in footnoting statements that are pretty non-controversial and need no source; in these cases the comment could just be inserted into the text and the footnote dispensed with. There are other cases, such as "(The printing press) was a spark for the Industrial Revolution in 1775 …", in which simply removing the year from the statement removes the need for a footnote.

    The "Big Theme" that Silver talks about in the Introduction is that of Big Data inundating humankind, starting with the invention of the printing press and culminating in recent decades in the spread of powerful computers (to both hold and analyze previously unimaginable amounts of data) and the world wide web, which makes this data not merely available to almost anyone, but overwhelmingly so. But Big Data is only briefly mentioned in the book, and is brought up again in the Conclusion in a correspondingly unenlightening manner. In fact, the book's first and foremost theme is simply expressed in the book's title. The difficulty in handling large amounts of data is separating the signal from the noise. The theme, expressed in this manner, is handled more or less brilliantly throughout.

    Once past the Introduction, the book immediately improved. Silver seemed to quickly find his comfort level in treating one area after another in which we attempt to make predictions, with varying success. Besides the chapters on political forecasts and baseball, there are discussions of the economic meltdown of 2007-8; weather and earthquake predictions; economic forecasts; infectious disease (flu) forecasts; gamblers' bets; top-level chess; poker; investments; climate forecasts; and terrorism. The great majority of the chapters I found very interesting. Silver writes well, and can clearly get across his points. He shows convincingly, I think, how these fields differ from one another, and how the problems they have with making successful predictions and forecasts vary from field to field, depending on a variety of elements.

    I approached the chapter on climate prediction with some trepidation, wondering if Silver was going to somehow take the position that it was all baloney. Thankfully no, and his conclusions about climate forecasts are along the lines of "well, the forecasts of warming so far have had a rather mixed record". So he feels there is a case to be made for some skepticism regarding the accuracy of the models, and thus of the forecasts being produced by the models. He doesn't doubt for a moment the science involved, or the ultimate warming path we are on, but cautions against believing that we have a very good handle on how fast the warming will occur under different scenarios of additional heat-trapping elements being added to the atmosphere. But what Silver doesn't analyze, here or anywhere else in the book, is how risk should be accounted for in making predictions, or in acting on the predictions that we do make. I suppose this may be a bit off the track of what he's addressing in the book. But it's one thing to forecast the likelihood of my house burning down (very small), or of a young healthy person needing vast amounts of medical care in the next 12 months (also very small). It's quite another to use those forecasts to conclude that in neither one case nor the other is spending money on insurance a good idea. Most of us realize that because of the catastrophic consequences of these very unlikely events, buying insurance is rational. In the same way, it seems to me that ignoring climate change forecasts until "more evaluation" of these forecasts, and thus more fine-tuning of the models, can be done is a tremendously risky thing to do, and cannot really be rationally justified.

    I'll wind up with a brief mention of an aspect of Silver's thinking that I found more interesting than anything else. That is his interest in, and application of, Bayesian reasoning or inference. Silver is quite obviously much taken with this, and he does a good job (in my opinion) of explaining it. He doesn't really introduce it until his chapter on gambling, where he shows how it can be used to make probabilistic forecasts using several interesting (non-gambling) examples. In almost every chapter following this he refers to the way that Bayesian reasoning can be used to strengthen forecasting and to overcome some of the difficulties of predicting in that area.

  5. 4 out of 5

    David

    This is a fantastic book about predictions. I enjoyed every page. The book is filled to the brim with diagrams and charts that help get the points across. The book is divided into two parts. The first part is an examination of all the ways that predictions go wrong. The second part is about how applying Bayes' theorem can make predictions go right. The book focuses on predictions in a wide variety of topics: economics, the stock market, politics, baseball, basketball, weather, climate, earthquakes, chess, epidemics, poker, and terrorism! Each topic is covered lucidly, in sufficient detail, so that the reader gets a good grasp of the problems and issues for predictions. There are so many fascinating insights, I can only try to convey a few.

    At the present time, it is impossible to predict earthquakes, that is, to state ahead of time when and where an earthquake of a certain magnitude will occur. But it is possible to forecast earthquakes in a probabilistic sense, using a power law. Likewise, it may be possible to forecast terrorism, because that, too, follows a power law! (Well, it follows a power law in NATO countries, probably because of the efforts to combat terrorists. But in Israel, the tail of the curve falls below the power law, likely because of the stronger anti-terror emphasis there.) The accuracy of weather predictions increases slowly but steadily, year by year. Ensembles of computer model runs are part of the story, but human judgment adds value and increases the accuracy. Weather forecasts issued by the National Weather Service are unbiased in a probabilistic sense. But weather forecasts by the TV weatherman are very strongly biased--the weatherman over-predicts precipitation by a significant amount.

    Nate Silver shows that the people who are most confident are the ones that make the worst predictions. The best predictions are those that are couched in quantitative uncertainties. Silver shows how Bayes' theorem can be applied to improve predictions; it is all about probabilities. And I just love this footnote: "A conspiracy theory might be thought of as the laziest form of signal analysis." As the Harvard professor H.L. "Skip" Gates says, "Conspiracy theories are an irresistible labor-saving device in the face of complexity."
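    The probabilistic earthquake forecasting the review mentions rests on a power-law (Gutenberg-Richter) relation between magnitude and frequency. Below is a minimal sketch of that idea; the catalog counts are invented for illustration and are not taken from the book:

    ```python
    # Power-law (Gutenberg-Richter) sketch: log10(N) = a - b*M, where N is the
    # yearly count of quakes of magnitude >= M. The catalog below is invented.
    import math

    catalog = {4.0: 10000, 5.0: 1000, 6.0: 100, 7.0: 10}  # hypothetical yearly counts

    # fit a and b by simple least squares on (M, log10 N)
    xs = list(catalog.keys())
    ys = [math.log10(n) for n in catalog.values()]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my + b * mx

    # extrapolate: expected yearly frequency of a magnitude-8 event
    rate_m8 = 10 ** (a - b * 8.0)
    print(f"b-value ~ {b:.2f}; expected M>=8 events per year ~ {rate_m8:.2f}")
    # For this invented catalog the fit gives b = 1, i.e. a tenfold drop in
    # frequency per unit of magnitude -- a forecast of rates, not of dates.
    ```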

  6. 4 out of 5

    Julie

    The Signal and the Noise by Nate Silver is a 2012 Penguin publication.

    More information, more problems. This book was recommended by one of the many book-related emails I get each day. I can't remember what the particular theme was for its recommendation, although I'm sure it had something to do with how political forecasting data could fail so miserably. Nevertheless, I must have thought it sounded interesting and placed a hold on it at the library. Many of you may be familiar with the statistician Nate Silver. His blog/podcast, 'FiveThirtyEight', is quite popular, featuring talks about polls, forecasting, data, and predictions about sports and politics, and was even carried by the NYT at one point. I admit I was not familiar with his work until now. However, after reading this book, I think I will keep a closer eye on his website. This book examines the way data is analyzed, how some predictions are correct, and why some fail. "The signal is the truth. The noise is what distracts us from the truth." I'm not one to put my trust in predictions or polls. I don't bet on sports teams, and I'm even skeptical about the weather forecast. With the polls and the media thinking they had the most recent election forecasted, I think people are warier than ever. That may be why there has been a renewed interest in this book.

    The first section of the book takes a look at the various ways experts make predictions, and how they could miss something like the financial crisis, for example. Silver does speak to political predictions: thinking like the fox or the hedgehog, the bias of political polls, the media's obsession with things the public doesn't care about. Remember, this book was published in 2012, so, apparently, the media didn't learn their lesson. (Silver predicted Obama's win over Romney, much to the chagrin of 'Morning Joe', and predicted the outcome of the most recent election more accurately, or at least closer, than most.) "The fox knows many little things, but the hedgehog knows one big thing."

    The second portion of the book is where Silver really excels: baseball statistics. Now, this section really appeals to baseball fans, which I am not. But it also would appeal to those who understand math and complicated algorithms. Again, not my thing. I tried my best to understand this section, but just could not get into it, and because it was not a topic I was well versed in, much of it went over my head and, frankly, it was boring to me. So I gave up on this section and went to the next.

    Weather: This section, which deals with the prediction of major weather events such as hurricanes, was very interesting. Weather forecasting has an effect not only on safety but on our economy as well. Many times forecasters get things right and many lives are saved, but at times things are not as bad as predicted, such as the recent blizzard expected to hit NYC. Yet, as frustrating as that may be, erring on the side of caution still might be a good thing, and remember, many weather forecasters, those working behind the scenes, are not being paid exorbitant fees. Just think about the times when you made it out of the path of a tornado, and be thankful for these guys, who must decipher an incredible amount of data and unpredictable patterns, and who must deal with the human element on top of that. Raw data doesn't always translate well to the average consumer. For example: What does 'bitter cold' mean to you? But there has to be an honesty in forecasting, too. Television ratings can come into play as well, unfortunately. This was my favorite section of the book.

    Earthquake predictions, economic forecasters, sports betting and gamblers - anyone or anything that depends on statistics, data, or formulas is examined in this book. It's all interesting, for the most part, although the math equations and other information laid out went over my head. The author recommends Bayes' theorem, which I understood on one level but was overwhelmed by most of the time. But I did find the book fascinating, informative, and chock full of calculations juxtaposed against unpredictable elements that could not be foreseen, or against patterns in plain sight that were ignored, all mixed together to show why predictions and forecasts often fail, but also what makes them work! Although this book centers on events taking place throughout the economic crisis, a point the author often refers back to, the last point in the book - 'what you don't know can hurt you' - reminds us that history can repeat itself, that there is always the element of improbability, the unfamiliar, the unknown, and what we can learn from it in order to make better, more informed decisions in the future. 4 stars

  7. 4 out of 5

    Ilya

    This book was a disappointment for me, and I feel that the time I spent reading it has been mostly wasted. I will first, however, describe what I thought was good about the book. Everything in this book is very clear and understandable. As for the content, I think that the idea of Bayesian thinking is interesting and sound. The idea is that, whenever turning any hypothesis (e.g. a positive mammogram is indicative of breast cancer) into a prediction (for example, that a particular woman with a positive mammogram actually has cancer), one must not forget to estimate all three of the following pieces of information: 1. the general prevalence of breast cancer in the population (this is often called the "prior": how likely you thought it was that the woman had cancer before you saw the mammogram); 2. the chance of getting a positive mammogram for a woman with cancer; 3. the chance of getting a positive mammogram for a woman without cancer. People often tend to ignore items 1 and 3 on the list, leading to very erroneous conclusions. "Bayes' rule" is simply a mathematical gadget to combine these three pieces of information and output the prediction (the chance that the particular woman with a positive mammogram has cancer). There is a very detailed explanation of this online, no worse (if more technical) than the one in the book. If you'd like a less technical description, read chapter 8 of the book (but ignore the rest of it).

    Now for the bad. While the Bayesian idea is valuable, its description would fit in a dozen pages, and it is certainly insufficient by itself to make good predictions about the real world. I had hoped that the book would draw on the author's experience and give an insight into how to apply this idea in the real world. It does the former, but not the latter. There are lots of examples and stories (sometimes amusing; I liked the chess story in Chapter 9), but the stories lead the reader to few insights. The examples only lead to one conclusion clearly. If you need to be convinced that "the art of making predictions is important, but it is easy to get wrong", read this book. If you wonder "how can we actually make good predictions?", don't. The only answers provided are useless platitudes: for example, "it would be foolish to ignore the commonly accepted opinion of the community, but one must also be careful not to get carried away by herd mentality". While I was searching for the words to describe the book, I found the perfect description in Chapter 12 of the book itself: "Heuristics like Occam's razor ... sound sexy, but they are hard to apply.... An admonition like 'The more complex you make the model the worse the forecast gets' is equivalent to saying 'Never add too much salt to the recipe'.... If you want to get good at forecasting, you'll need to immerse yourself in the craft and trust your own taste-buds." Had this quote been from the introduction, and had the book given any insight into how to get beyond the platitudes, it would be the book I hoped to read. However, the quote is from the penultimate chapter, and there is no further insight inside this book.
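    The three ingredients the review lists combine in a single application of Bayes' rule. A minimal sketch follows; the numbers are illustrative assumptions of the kind commonly used in mammography discussions, not necessarily the book's exact figures:

    ```python
    # Bayes' rule with the reviewer's three ingredients (numbers are illustrative only):
    prior = 0.014               # 1. prevalence of breast cancer in the population ("prior")
    p_pos_given_cancer = 0.75   # 2. chance of a positive mammogram if cancer is present
    p_pos_given_healthy = 0.10  # 3. chance of a positive mammogram if no cancer (false positive)

    # total probability of seeing a positive mammogram at all
    p_pos = prior * p_pos_given_cancer + (1 - prior) * p_pos_given_healthy

    # posterior: chance the woman has cancer given the positive result
    posterior = prior * p_pos_given_cancer / p_pos
    print(f"P(cancer | positive mammogram) = {posterior:.1%}")  # ~9.6% with these inputs
    ```

    The point the reviewer makes survives the arithmetic: ignoring the prior (item 1) or the false-positive rate (item 3) makes the positive test look far more conclusive than it is.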

  8. 5 out of 5

    Kate

    I'm going to do this the Nate Silver (Bayesian) way. Kind of.

    Prior Probability. Initial estimate of how likely it is that I will buy Nate Silver a drink: x = 10%. (This may seem high, given that he is a stranger who lives in another city, but I did rely on his blog during the past two elections, so I'd at least like to.)

    New Event -- I read Nate Silver's book. Probability that I will fly to New York and track him down and thrust a drink in his hand because this was a great book and I am impressed: y = 50%. Probability that I will stay home and just remember to check FiveThirtyEight more often instead: z = 30%.

    Posterior Probability. Revised estimate of the probability that I will buy Nate Silver a drink, given that his book was illuminating and enjoyable: xy/(xy + z(1-x)) = 15.6%. Feel free to check my math.
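    Taking the reviewer up on "feel free to check my math" - a quick re-run of the update with her own numbers (the formula is the prior-times-likelihood form of Bayes' rule, as presented in the book):

    ```python
    # Checking the reviewer's Bayesian update: posterior = xy / (xy + z(1 - x))
    x = 0.10  # prior: probability of buying Nate Silver a drink
    y = 0.50  # probability of the "fly to New York" outcome, given the book was great
    z = 0.30  # probability of just checking FiveThirtyEight more often instead
    posterior = (x * y) / (x * y + z * (1 - x))
    print(f"{posterior:.1%}")  # 15.6% -- the math checks out
    ```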

  9. 4 out of 5

    Olive Fellows (abookolive)

    I was expecting a lot of data but this was...a LOT of data.

  10. 4 out of 5

    Dewey

    I wanted to like this book, as I enjoy reading Silver's blog. The majority of chapters in this book are inferior rehashes of arguments and anecdotes from other authors: see Moneyball, The Information, Fortune's Formula, A Random Walk, The Theory of Poker, etc. The book is clearly intended to capitalize on the popularity of his 538 blog, which, as John Cassidy of the New Yorker just articulated, overemphasizes the use of Monte Carlo simulations to come up with inanely precise projections, to a tenth of a point, of who will win the Presidential election. While heuristics and Monte Carlo-style simulations may provide detail given the parameters included in the model, Silver's assumptions about the usefulness of one poll over another and the averaging of prediction markets generally reach similar conclusions to what basic common sense would dictate. I happen to believe that, just as some people inevitably beat the market by looking at past historical data without actual acumen, Silver's model simply seems to have been successful. The self-aggrandizing by Silver of his own skill at poker, political forecasting, sports betting, etc., seems to belie his own understanding of Bayesian theory and at times reaches nauseating levels. I don't care to know his personal income from limit poker or about his player tracking system used by Baseball Prospectus. The book dabbles in many areas and is truly compelling in none of them. While not an awful book, a curious reader would be better served by reading separate books on areas of interest, including books that offer a stronger statistical background and fewer "pop culture" examples. I do not recommend this book to anyone. See more @ Timeisrhythm.wix.com/home

  11. 5 out of 5

    Wen

    Another classic on statistics. This one focused more on real-life applications: sports, politics, finance, weather, climate change... I assume those who have had basic statistics would enjoy it more. It was about weeding out the noise from the data and zooming in on the signals, which will improve the quality of the predictions. All easier said (or read) than done :) Here is my prediction... okay, more like a hunch: machines won't be taking over the sorting task mentioned above before humans safely land on Mars. Let's see how I did. This was my second read of the book, as part of my recent series of refreshers on statistics and data analysis. I felt I appreciated Silver's approach to the problems more this time, hence I added one star.

  12. 5 out of 5

    Amin

    "فارسی در ادامه" My actual rating would be 7/10. In general, it was an interesting and insightful read, although I have mixed feelings about some of the chapters and concepts, and sometimes the pretentious tone of presenting ideas. Let's start by two weaknesses: At some points it seems good prediction looks like a 'hammer' to see all the problems as 'needles'. So, all the problems can be interpreted as the failures of prediction. To me it does not sound very scientific (in a Popperian sense): an ' "فارسی در ادامه" My actual rating would be 7/10. In general, it was an interesting and insightful read, although I have mixed feelings about some of the chapters and concepts, and sometimes the pretentious tone of presenting ideas. Let's start by two weaknesses: At some points it seems good prediction looks like a 'hammer' to see all the problems as 'needles'. So, all the problems can be interpreted as the failures of prediction. To me it does not sound very scientific (in a Popperian sense): an 'out-of-sample' situation for Silver is close to what Talib uses to explain 'antifragility'. Or the concepts of hedgehogs and foxes are interesting, but the implications are black and white, in a gray word. Furthermore, there is too much detail and bla-blas on some of the topic such as baseball and basketball players in America, which makes the book boring or too Americanized! without a good understanding of the main points which makes some chapters very journalistic. However, it tries to highlight the importance of statistics, and the way facts less quantifiable and accessible for everyone contribute to unique predictions. The second and the more analytical half of the book was more interesting to me. Ideas such as the changing mental model towards predicting (advantages of humans over computers and the role of the chaos theory), statistical errors in everyday life (overfitting and taking noise for signal), the necessity of creating incentives for good predictions (especially in the fields of politics and economics), complexity of predictions for resolving collective and global problems, the necessity of understanding context (and still having good theories), the story of human/machine rivalry (materialized in Kasparov/Blue Deep match), the necessity of being familiar with the basic principles of human action (to live in an uncertain world), misuse of prediction in financial markets, journalistic side effects of making prediction a popular science and the necessity of understanding the world as a gray zone (in contrast to black and white, or impossible/certain situations) are among the interesting ideas for further investigation through different chapters. در کل اثری مفید و خواندنی بود. گرچه فصلها و جزئیات علمی و کاربردی شان با هم تفاوتهای چشمگیری داشتند. نیمه دوم و تحلیلی تر کتاب جذابیت بیشتری داشت، از این بابت که مفاهیم مهم و کاربردی را ارائه می کرد. مواردی مانند خطاهای آماری انسان در محاسبات، تفاوت یا رقابت انسان با کامپیوتر در پیش بینی، نیاز به آشنایی اولیه با علم پیش بینی در زندگی روزمره، اهمیت توجه به زمینه هر موضوع برای پیش بینی صحیح و غیره اما دو ایراد: اول اینکه به سبک کتابهای پرفروش علمی برای عموم، مثل کتابهای گلدول و نیکولاس طالب، مفهوم اصلی کتاب که پیش بینی صحیح است مثل چکشی است که هر چیزی را میخ می بیند و راه حل اصلی را در پیش بینی صحیح برمی شمرد. از دیدگاه پوپری این رویکرد را من خیلی علمی نمی دانم و بیشتر برایم جنبه تجاری دارد. 
نکته دوم جزئیات فراوان و شاید غیرضروری در برخی فصول است که وجهه ای آمریکایی (مثلا در فصول مرتبط با بیسبال یا بسکتبال) به کتاب میدهد یا برای خواننده ای که خیلی به موضوع خاص فصل علاقه دارد جذابیت بیشتر دارد جزئیاتی درباره برخی مفاهیم و فصول: (view spoiler)[ آغاز علم پیش بینی آنجاست که بشر دریافت برای تغییر آینده غیرجبری باید آن را پیش بینی کند. اما در دنیای پیچیده و واقعیتهای موازی، "نظریه آشفتگی" نشان داد با افزودن اثر روابط غیرخطی و پیش فرضها نسبت به وضع سیستم در گذشته چندان هم نمیشود نسبت به پیش بینی ها مغرور بود. بعلاوه، در پیش بینی های روزمره، مثل هواشناسی، کلی نگری انسان به کامپیوتر ارجحیت دارد و از طرفی سودآوری و منافع مادی بر پیش بینی صادقانه اولویت داده میشوند بعضی از معضلات روز را میشود با خطاهای آماری توضیح داد. مثلا ساختن فرضیه ای که بیش از حد با آمار تطبیق کند، بیشتر مصرف رسانه ای یا آکادمیک تا توضیح واقعیت. وقتی خطاهای محاسباتی و نویز هم وارد محاسبه شوند و عدم قطعیتها بیرون بمانند، نتیجه میشود دادن جوابی خاص به مساله ای کلی که بین مردم هم پخش میشود. بعلاوه معادلات و فرضیه هایی که همه اتفاقات را توضیح دهند، پیچیده ترند و توجه بیشتری جلب میکنند؛ حتی اگر غلط باشند چرا حوزه اقتصاد بدترین آمار پیش بینی را دارد؟ شاید از یک طرف دادن پاسخ مشخص به سوالات مبهم در آن داغ است، مانند آمار برای آینده. از طرفی وقتی سیاستمدار دست روی متغیری میگذارد، اعتبار آن متغیر برای تحلیل اوضاع پایین میرود و انگیزه هم فراوان است که پیش بینی جهتدار تولید شود. با حجم بی نهایت داده، میتوان بین هر دو متغیری ارتباط یافت. بنابرابن نیاز به نظریه های خوب و ایجاد انگیزه برای پیش بینیهای واقعی بیش از همیشه است مسائلی که می توانند مخاطرات جهانی ایجاد کنند، مثل بیماریهای مرگبار، ملتها را خواه ناخواه به هم نزدیک میکنند؛ برای فهم راه حل. اما پای انسان که به پیش بینی ها باز شود، پیچیدگی تحلیلها آن قدر بالا میرود که درک بشر کفایت نمیکند و انسان باید بجای تحلیل و محاسبه، راه حلی خلق کند که پای خودش را از محاسبات بیرون بکشد. بنابراین، مدلها بیش از ظرفیت محاسبه، به هوشمندی انسان احتیاج دارند تا مسائل پیچیده راحت تر فهمیده شوند بیشترین اطمینان به اعتقادات را کسانی دارند که بر روی آنها قمار میکنند. تجربه هم ثابت کرده که موفق ترین آنها کسانی هستند که بجز دانش فنی لازم، فهم خوبی از کانتکست پدیده ها دارند. این گونه است که تفکر بر پایه احتمالات لاپلاس، راه را برای نظریه بیز در مقابل رویکرد فیشری در آمار باز میکند. بر اساس رویکرد بیزی، از هر اعتقادی شروع کنیم و بر اساس داده های جدید آن را اندکی تغییر دهیم، نظراتمان به حقیقت همگرا خواهد شد یکی از جالب ترین داستانهای پیش بینی، تقابل انسان و کامپیوتر در شطرنج است که در مسابقه معروف کاسپاروف با دیپ بلو زیبایی خودش را آشکار میکند. جدا از اهمیت خلاقیت و تفکر استراتژیک بشر در مقابل سرعت محاسبه و تفکر تاکتیکی ماشین، نکته مهم فهم چگونگی "فکر کردن" یک ماشین است که محدودیتها و نیت سازندگان آن را آشکار میکند و برای زندگی واقعی هم پیامد دارد. واقعیت نه چندان خوشایند این است که در حوزه های مختلفی، ضعیف ترین افراد منبع درآمد بخش وسیعی از افراد متوسط هستند. یعنی صرفا در بخشهایی که رقابت وجود دارد و بازی برنده و بازنده است، ضررهای بزرگ افراد با مهارتهای پایین جا را برای بهره مندی عده زیادی همراه میکند، بدون اینکه چندان خوب و ماهر باشند. بنابراین در چنین دنیایی آشنایی با حداقلهای یک زندگی فکری ماهرانه، دیگر نه یک امتیاز، که یک ضرورت است فرضیه اولیه در تجارت این بوده که معامله صورت میگیره تا هر دو طرف نفعی ببرند. 
اما واقعیت بازارهای مالی حکایت از تفاوت در دیدگاهها و پیش بینی‌های گاه متضاد دارد و هر روز آدمهای بیشتری پیدا میشوند که فکر میکنند می‌توانند خرد جمعی را شکست دهند اصطلاح "نفرین برنده" (کسی که بالاترین ارزش گذاری را برای یک کالا میکند، بالاترین قیمت را برای آن میپردازد؛ عموما بالاتر از ارزش واقعی) نقطه شروع خوبی است برای تامل در رفتار قبیله ای در بازار، یا سرخوردگی بعد از خرید (هیرشمن)، حبابهای بازار، صحیح بودن قیمت ها یا فرضیه بازار کارا. به قول معروف بازار مالی یک جریان اصلی دارد که اقتصاد را به پپیش میبرد و یک جریان هیجانی و سریع دارد که محل بازیهای مالی است در مورد مساله مهمی مثل گرمایش زمین، تنها بر روی جنبه های خاصی از واقعیت - مثل اثر فعالیت های بشر - اجماع وجود دارد و ترجیحات سیاسی و ارزش خبری منجر به فراگیر شدن دیدگاهها میشود. اما از طرف دیگر جنبه های مهم و عموما فنی هستند که برای عموم جاذبه کمتری دارند، مثل اینکه بر مدلهای کامپیوتری محاسبه گرمایش زمین اجماعی وجود ندارد، درحالیکه بر روی نتایج این مدلها قبلا اجماع داشته ایم اگر ذهنمان را به اشتباه عادت بدهیم تا دنیا را دوقطبی بفهمد، قطعی و غیرممکن، ضروری و بیفایده، دوست و دشمن، آن وقت در مواقع ارزیابی و پیش بینی آینده احتمالا یا دچار اطمینان کاذب به یافته ها هستیم یا ندانستن نادانسته ها در جهل مرکب سرگردانمان میکند. پیچیدگی دنیا و واقعیت هایش از حدود وسط میگذرد، جایی که تحلیل و نگرش و بینش انسان بیش از هر جای دیگر به کار می آیند (hide spoiler)]

  13. 4 out of 5

    Mike Mueller

    I followed Nate Silver's blog (FiveThirtyEight) closely during the run-up to election day 2012. His premise was simple: grab every public poll possible, attempt to correct for pollsters' known biases, and produce a forecast based on the result. Somehow no one had thought to do this before. Silver simply crunched the numbers and nailed the outcomes in every state. Meanwhile, pundits, bloggers, and assorted blowhards made predictions based on nothing but gut feeling and partisan hackery, and they mostly missed the mark (often by a wide margin). I was looking forward to reading more about his methodology in this book, as well as his take on the principles involved in making predictions from noisy data. In this regard, I wasn't disappointed. Silver does a good job of laying out the rules of the road:

    * It's easy to mistake essentially random fluctuations for a meaningful pattern, and in some contexts (say, earthquake predictions), this can have devastating results.
    * Having a well-formed, testable theory is better than just looking for any correlations you can find in your data set.
    * Always make predictions and update your probability estimates like a good Bayesian. Your predictions should approach reality as you continually refine them.
    * Watch out for biases in yourself and in your data set.
    * Often overlooked: make sure incentives are aligned with the results you would like to achieve.

    Also, some specific interesting facts:

    * Making a living at poker is really hard. Without any really bad players at the table, it's nearly impossible for anyone but the top players to turn a profit.
    * The efficient market hypothesis doesn't hold up to scrutiny; however, even though the stock market has discernible patterns, it may not be possible to exploit the patterns and consistently beat the market.
    * Weather prediction has gotten a lot better in the last couple of decades, even though most people think it hasn't.
    * Both earthquakes and terrorist attacks follow a power law distribution.

    If you're a stock trader, scientist, gambler, or simply someone who wants to form an accurate picture in a noisy environment, there's something in this book for you. The book is also well cited, which helps give weight to some of the more counterintuitive claims. There was a missed opportunity to spend some time on results from the medical research industry. It's well known that publication bias and other factors result in misleadingly positive results for new treatments, which ultimately go away after independent researchers attempt (unsuccessfully) to reproduce the results. It seems like a pertinent, prototypical case of finding patterns in noise, one which could have been instructive. A final note: Silver is not the best writer; his prose is uneven and occasionally downright awkward. His casual style works fine for a blog, but here it diminishes the impact the book could otherwise have had. This is his first published book, and it shows. There are also a couple of glaring mistakes that make me think he needed a better editor.

  14. 4 out of 5

    Mehrsa

    Some interesting parts, but it's really hard to take this superforecaster seriously on political forecasting--you know what I mean? And I am sort of over the moneyball theory too. I mean, it was useful a few years ago to break free from "gut feelings", but I think the pendulum swung too far into just cold data and needs to swing back into the world of humans and fat tails and Trump getting elected.

  15. 5 out of 5

    Cameron

    This is a really amazing book - a must read for anyone who makes decisions or judgement calls. Even before I had finished the book it caused me to look at some of the assumptions and bad forecasts I was making, as well as recognising "patterns" that were noise. There is nothing "new" in this book, just well established and solid methods applied well and explained very coherently. The writing is excellent, the graphics helpful and the type not too small. There are plenty of footnotes (relevant to the page), but I didn't bother with the references at the back. All up, it was not at all the onerous read I was expecting from the size and nature of the book. What I particularly liked was that it agrees with many of my "hunches" and "gut feels" (which seem to work out mostly) but, more importantly, gives me theory that I can put to the test and use more widely. A few points raised really made me feel chuffed and not alone (a little cleverer than most): the misuse and misapplication of Occam's razor; the overfitting of models onto data; Fisherian statistical significance (particularly in medical science). There was only one "low" point: chapter 11 on free markets, "If you can't beat 'em...", kind of got off course. It started out as a slightly irked, though legitimate, response to a smart-ass comment about a free-market betting pool being a better predictor than his 538 website. It then went into stock market trading but didn't go far enough into the information inequalities of market making for my liking. The end conclusion (two streams - indexed investment on signal trading and short trading on the noise), I agree with. A final point on my own bad predictions: of the last 4 books I have read, I have judged reading time and effort by size and been wrong 3 times - twice with small novels that were philosophically challenging and unpleasant to read, and once with this behemoth of a book that was a breeze to read!

  16. 5 out of 5

    Laura Noggle

    Meh, I was hoping for more. Interesting at points, but the main message gets swallowed by the noise—almost too much random content. Basically, it's hard to predict stuff. Be careful what predictions you trust, most of them will be wrong a good portion of the time. The end.

  17. 4 out of 5

    Jonathan Mckay

    The Prior: Before reading this book, I thought there was a 70% chance I would rate this book 3 stars or higher.

    The Signal: Silver's chapter on poker was interesting, both from the perspective of statistics and for its poker tactics and the metagame. I wish this were the core of the book. Also, the explanation of Bayes' theorem was solid, as was the chapter on stocks.

    The Noise: Everything else. Superforecasting is MUCH better when talking about predictions, and much more engaging. Shiller's book Irrational Exuberance is better on stocks; even Rumsfeld's biography Known and Unknown: A Memoir is better when talking about politics. It felt like Silver took a lot of shortcuts and made claims about causality in multiple areas without sufficient evidence.

    The Result: Read chapters 8, 10, and 11. Skip the rest. Better yet, just skip this book and read Superforecasting. That's 77% of the chapters that are below three stars for me. So let's run some Bayesian inference, with the hypothesis that I would give this book >= 3 stars. P(Hypothesis given evidence) = P(Evidence given hypothesis) * P(Hypothesis) / P(Evidence): .27 = .3 * .7 / .77. Now there is only a 27% chance of >= 3 stars.
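    The closing inference above is Bayes' theorem run on the reviewer's own inputs (how he maps chapter counts onto probabilities is his choice, not the book's); re-running it confirms his arithmetic:

    ```python
    # Re-running the reviewer's Bayesian update with his numbers:
    p_h = 0.70          # prior: P(rating >= 3 stars) before reading
    p_e_given_h = 0.30  # P(evidence | the book still deserves >= 3 stars)
    p_e = 0.77          # P(evidence): share of chapters below three stars for him
    posterior = p_e_given_h * p_h / p_e
    print(f"{posterior:.0%}")  # ~27%, matching the review
    ```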

  18. 5 out of 5

    Mal Warwick

    An eminently readable book about how experts make sense of the world (or, more often, don’t). Statisticians rarely become superstars, but Nate Silver is getting close. This is the guy who writes the FiveThirtyEight.com blog for the New York Times and has correctly predicted the outcome of the last two presidential elections in virtually every one of the 50 states. But Silver is no political maven weaned on election trivia at his parents’ dinner table: he earned his stripes as a prognosticator supporting himself on Internet poker and going Billy Beane of the Oakland A’s (Moneyball) one better by developing an even more sophisticated statistical analysis of what it takes to win major league baseball games. And, by the way: Silver is just 34 years old as I write this post. The Signal and the Noise is Silver’s first book, and what a book it is! As you might expect from this gifted enfant terrible, the book is as ambitious as it is digestible. Written in an easy, conversational style, The Signal and the Noise explores the ins and outs of predicting outcomes not just in politics, poker, and sports (baseball and basketball) but also in the stock market, the economy, and the 2008 financial meltdown, weather forecasting, earthquakes, epidemic disease, chess, climate change, and terrorism. Fundamentally, The Signal and the Noise is about the information glut we’re all drowning in now and how an educated person can make a little more sense out of it. As Silver notes, “The instinctual shortcut we take when we have ‘too much information’ is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.” What else could explain why Mitt Romney was “shell-shocked” and Karl Rove was astonished by Romney’s loss in a presidential election that every dispassionate observer knew was going Obama’s way? Silver asserts that “our predictions may be more prone to failure in the era of Big Data. As there is an exponential increase in the amount of available information, there is likewise an exponential increase in the number of hypotheses to investigate . . . But the number of meaningful relationships in the data . . . is orders of magnitude smaller. Nor is it likely to be increasing at nearly so fast a rate as the information itself; there isn’t any more truth in the world than there was before the Internet or the printing press. Most of the data is just noise, as most of the universe is filled with empty space.” Sadly, it’s not just in politics that bias clouds judgment and leads to erroneous conclusions. “In 2005, an Athens-raised medical researcher named John P. Ioannidis published a controversial paper titled ‘Why Most Published Research Findings Are False.’ The paper studied positive findings documented in peer-reviewed journals: descriptions of successful predictions of medical hypotheses carried out in laboratory experiments. 
It concluded that most of these findings were likely to fail when applied in the real world. Bayer Laboratories recently confirmed Ioannidis’s hypothesis. They could not replicate about two-thirds of the positive findings claimed in medical journals when they attempted the experiments themselves.” In general, Silver’s thesis runs, “We need to stop, and admit it: we have a prediction problem. We love to predict things — and we aren’t very good at it. . . We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.” There’s more: Silver relates the work of a UC Berkeley psychology and political science professor named Philip Tetlock, who categorizes experts as either foxes or hedgehogs (in deference to an ancient Greek poet who wrote, “The fox knows many little things, but the hedgehog knows one big thing.”). Hedgehogs traffic in Big Ideas and often hew to ideologies; these are the people who talk to the press and are frequently found on TV talk shows. Foxes are cautious types who carefully examine and weigh details before reaching conclusions. Not surprisingly, Tetlock found that “The more interviews that an expert had done with the press . . . the worse his predictions tended to be.” In other words, Be afraid. Be very afraid. If the people who supposedly know what they’re talking about often really don’t, how can the rest of us figure out what’s going on?

  19. 5 out of 5

    Rick Presley

    Nate Silver does an excellent job demonstrating the different domains where statistics plays a part. More importantly, he describes why methods that proved successful in one domain are inadequate or inappropriate in another domain. The best part about the book is that he doesn't resort to math to explain these differences. The problem with the book is that he fails to take the lessons from previous chapters and apply them to subsequent chapters. I think this may have explained his hubris in mis-forecasting the 2016 election outcome. I did hear an interview in which he said his stats weren't wrong: if 2 out of 3 scenarios had Hillary winning, then 1 out of 3 scenarios had Trump winning. I think this illustrates his discussion of the difference between likelihood and probability. I would recommend this as a primer on stats for the non-mathematician, but I would caution that there are sprawling passages of boring stuff that you'll want to skip over.

  20. 4 out of 5

    Brian Clegg

    It was really interesting coming to this book soon after reading The Black Swan, as in some ways they cover similar ground – but take a very different approach. I ought to say straight away that this book is too long at a wrist-busting 534 pages, but on the whole it is much better than its rival. Where Black Swan is written in a highly self-indulgent fashion, telling us far too much about the author and really only containing one significant piece of information, Signal and Noise has much more content. (Strangely, the biggest omission is properly covering Taleb’s black swan concept.) What we’re dealing with is a book about forecasting, randomness, probability and chance. You will find plenty about all the interesting stuff – weather forecasting, the stock market, climate change, political forecasts and more, and with the exception of one chapter which I will come back to in a moment it is very readable and well-written (though inevitably takes a long time to get through). It has one of the best explanations of Bayes’ theorem I’ve ever seen in a popular science book, and (properly, to my mind) makes significant use of Bayesian statistics. What’s not to like? Well, frankly, if you aren’t American, you might find it more than a trifle parochial. There is a huge section on baseball and predicting baseball results that is unlikely to mean anything to the vast majority of the world’s readers. I’m afraid I had to skip chunks of that. And there’s a bizarre chapter about terrorism. I have two problems with this. One is the fawning approach to Donald Rumsfeld. Nate Silver seems so thrilled that Rumsfeld gives him an interview that he treats his every word as sheer gold. Unfortunately, he seems to miss that for much of the world, Rumsfeld is hardly highly regarded (that parochialism again). There is also a moment where Silver falls for one of the traps he points out that it’s easy to succumb to in analyzing data. On one subject he cherry-picks information to present the picture he wants. He contrasts the distribution of deaths in terrorist attacks in the US and Israel, pointing out that where the US numbers follow a rough power law, deaths in Israel tail off before 100 people killed in an incident, which he puts down to their approach to security. What he fails to point out is that this is also true of pretty well every European country, none of which have Israeli-style security. I also couldn’t help pointing out one of the funniest typos I have ever seen. He quotes physicist Richard Rood as saying ‘At NASA, I finally realised that the definition of rocket science is using relatively simple psychics to solve complex problems.’ Love it. Bring on the simple psychics. Overall, despite a few issues, it was a good read with a lot of meat on probability and forecasting and a good introduction to the basics of Bayesian statistics thrown in. Recommended. Review first published on www.popularscience.co.uk and reproduced with permission.

  21. 5 out of 5

    Patrick Brown

    This was a fun read that tickled the nonfiction part of my brain in pleasant ways. It felt a bit repetitive in parts, and I found myself wondering how various chapters (such as the chess chapter) related to the whole. In the end, I'll take from this book the need to think probabilistically in life, and Bayes' theorem, about which I knew little. The chapter on terrorism was an excellent ending to the book, as it not only tied the concepts together, but also made apparent the stakes in predicting the future. The McLaughlin Group, for instance, gets to keep coming back each week, even though their predictions are laughably bad. When you're trying to guess whether a terrorist might nuke New York...well, you kind of have to be more right about that. Still, I'm not sure this book quite added up to the sum of its parts. For instance, after reading about the super-skilled sports gambler, I didn't have any better idea how he did what he did than I had before reading the chapter. Perhaps he wouldn't tell Silver his secrets, I don't know. I doubt my predictions will get much better from having read this book, either (though I wonder whether that was the goal of the book or not). I'd still recommend it to anyone with a love of charts, a thirst for interesting data-driven nonfiction, or anyone looking to shake up their reading list with something a little different.

  22. 4 out of 5

    Lightreads

    Eh, underwhelmed. A survey of prediction and predictive tools, starting with failures and moving on to successes. Nothing particularly new or interesting here, and I think Silver knew it. It’s not like the premise that the strength of a prediction depends on the accuracy of the data is revelatory or anything. A lot of survey nonfiction like this can be saved with interesting collateral content. This book tours over a dozen topics, but I didn’t find much new or compelling or even particularly complex in the subjects I know something about (the efficient market hypothesis, political polling, the spread of infectious disease), and more damningly I was never engaged by his writing on subjects I don’t know much about (the weather, sports betting, baseball. Oh my God, so much baseball.) I guess what I’m saying here is that the book format reveals all of Silver’s weaknesses as a writer, and there are many. The nicest thing you can say is that when he’s really on a roll, he’s workmanlike. And that’s okay! He doesn’t have to write brilliantly; he can just keep doing statistical modeling. (Better him than me – I disliked stats so much, it doesn’t actually qualify as math in my head.) Just, turns out I prefer him doing stats in 1000-word articles and in person, where he comes across much better.

  23. 4 out of 5

    Ms.pegasus

    Yes, this book is by that guy — Nate Silver, who correctly predicted the winner of the 2008 presidential election in 49 out of 50 states. That might seem off-putting. The credentials portend a heavy tome on statistics. Those fears are quickly allayed. This book is entertaining as well as informative. Silver offers solace to those frustrated by information overload. Over-simplification on the one hand and brute-force data crunching on the other can both lead to serious errors. Of the latter he writes: “The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning. ...Data-driven predictions can succeed – and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.” This is a book that provides context as well as explanation for something called Bayes's Theorem. Silver begins by considering the many recent instances of blatantly failed prediction. These include the 2008 housing bubble, the collapse of the Soviet Union, and the Fukushima disaster. In all of these examples he probes the multiple reasons behind human error. Among these is our very human imperative to interpret through patterns. Vision and taste, for example, are perceptions derived from the brain's ability to discern pattern. In a similar way, we try to make sense of events affecting our lives. Unfortunately, all too often, we are unable to separate significant data from insignificant data. In the data-rich field of economic forecasting, it's all too easy to develop models that overfit the data, accounting for insignificant and significant data points indiscriminately. A dense layer of possibly random correlations is captured in a convoluted skein of calculations fed into a computer to generate a “pattern”: “The wide array of statistical methods available to researchers enables them to be no less fanciful – and no more scientific – than a child finding animal patterns in clouds.” A second major source of error is emotion. Experts are frequently wrong because they simply don't want to look bad. He cites the participants of the McLaughlin Group. An outlandish prediction which proves true will be remembered. If it's false, people tend to forget. There is a built-in incentive to grandstand by making outlandish predictions. Scholars may have the opposite incentive: it's safer to stay within the consensus than to risk looking foolish. Silver also points out another dichotomy. Some experts are so wedded to a pet theory or model that they are incapable of recognizing contradictory data. He characterizes such people as hedgehogs; their opposite are the nimble-minded foxes, always seeking out new information and willing to try out new frameworks for fit. Finally, he cites an innate tendency to ignore frightening signals. 
“Human beings have an extraordinary capacity to ignore risks that threaten their livelihood, as though this will make them go away.” Along the way, he redefines the problem of forecasting in today's world. We live in a world of complex and dynamic systems. A promising forecasting model must allow for adjustment through feedback. Context is always important to separate independent from dependent data points. For example, during the housing bubble, the rating agencies did not recognize that the playing field for issuing mortgages had shifted drastically. The assumption that each mortgage default within a given tranche was independent was the basis for their overly optimistic credit ratings. A corollary of this is that qualitative information must be included in the forecasting process. The problem then becomes how to quantify qualitative data. Finally, we live in a world of uncertainty. Failing to include uncertainty in forecasting calculations is a form of denial. In other words, there is a lot of noise and a sparsity of signal. How can uncertainty be expressed and used in the forecasting process? It cannot fail to astonish most readers that Silver cites weather forecasting as one of the more successful efforts in forecasting. First, meteorologists work with hypotheses that describe how weather systems work. Second, there is an enormous amount of data. Third, the models are constantly being improved as new data either affirms or disproves the latest prediction. Silver also discusses a technique called agent-based modeling, used to predict the spread of epidemics. Incorporated into the model is a sim-city of human behavior parsed by demographic details down to the minutest level. In Chapter 8 Silver finally introduces Bayes's Theorem. In addition to his own examples, he uses the classic example of how the rate of false positives in a sample of mammograms affects the actual probability that a positive test accurately predicts the presence of cancer. Rather than repeat the explanation here, I have added some useful websites in the notes section. (The reason I do this is that the more ways a math problem is explained, the likelier it is that understanding will eventually come. I admit it. I didn't understand the formula itself until I had worked through several of these alternative explanations. My favorite is the one that used decision trees). It's amusing that Silver chooses as his first example a scenario in which a woman finds a stranger's underpants in her husband's bed. Using Bayes's Theorem, he gets the probability down from 50% to only 29%! Imagine the beleaguered husband giving this explanation to his wife! Bayes's Theorem is all about conditional probabilities: There is an assumed prior probability, and a resulting posterior probability. The general idea is that even if the prior probability is a wild guess, it will be refined by repeated recalculation of the formula by applying new data successively. The result isn't a prediction – it's only a probability that a proposition is true. It's a technique for modulating new data to align its importance with older data. It's a reminder that uncertainty arises not just from the numbers we collect, but from the innate complexity of the events we are attempting to study. The method is contrasted with the more familiar bell-shaped curve assumptions of frequentism. Silver's varied interests are reflected in this book. He provides examples from Kasparov's chess match with Deep Blue, and an interview on poker strategy with Tom Dwan. 
These examples serve to illustrate the dynamic properties of applying Bayes's Theorem. Anyone interested in either of these areas should definitely take a look at Silver's commentary. Will this book leave you an expert on Bayesian theory? By no means. The book is designed to whet your appetite. Silver concludes with the final consolation: “Prediction is difficult for us for the same reason that it is so important: it is where objective and subjective reality intersect.” NOTES: Silver's formulation of Bayes's Theorem: Posterior Probability = (Prior Probability x Probability of the specified event if the hypothesis is true) / [(Prior Probability x Probability of the specified event if the hypothesis is true) + (1 - Prior Probability) x (Probability of the specified event if the hypothesis is not true)]. Additional websites that explain Bayes's Theorem: https://www.youtube.com/watch?v=aGnVj... This is a video explanation using a decision tree. https://www.youtube.com/watch?v=E4rlJ... This is a classroom video which includes a decision tree explanation. http://betterexplained.com/articles/a... This is a really detailed text explanation covering Bayes' Theorem step-by-step with interactive calculation boxes.
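    As a companion to the formulation quoted in the notes above, here is a minimal Python sketch of that calculation. The example inputs (a 4% prior and 50%/5% likelihoods) are illustrative assumptions chosen so the posterior lands near the 29% figure the reviewer mentions; they are not stated in the review itself.

        # A sketch of the Bayes's Theorem formulation quoted above (Python).
        # The inputs below are illustrative assumptions, not numbers from the review.
        def bayes_posterior(prior, p_event_if_true, p_event_if_false):
            # posterior = prior * P(E|H) / (prior * P(E|H) + (1 - prior) * P(E|not H))
            numerator = prior * p_event_if_true
            return numerator / (numerator + (1 - prior) * p_event_if_false)

        # A 4% prior updated on an event that is 50% likely if the hypothesis is true and 5% likely if not:
        print(round(bayes_posterior(0.04, 0.50, 0.05), 2))  # 0.29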

  24. 4 out of 5

    Gumble's Yard - Golden Reviewer

    Book about prediction by the author of the 538 political blog, which became particularly famous in the 2012 presidential election (after the book was written) due to the author's high confidence in an Obama victory based on polling evidence in marginals. Prior to 538 the author was spread over two jobs - online poker (until it was made illegal in the US - see below) and baseball stat evaluation (where he developed his own site, which he sold to a professional site for which he then worked). The book's central themes are the importance of Bayesian stats (as opposed to Fisher-type confidence intervals based only on data) as the optimal blend of expertise and data, and the difficulty of distinguishing the true signal from underlying noise, which can either obscure the signal or create false ones. He covers various areas in turn - all of which have their own forecasting issues, often very different ones, leading to his third point: the difficulty of drawing hard and fast rules around prediction. Generally an interesting book – more a compendium of ideas, and so lacking a really big idea/takeaway – which seems deliberate given the last point. In respect of the financial crisis, he identifies various failures of prediction (the housing bubble, the rating agencies, the failure to see how it would cause a global financial crisis, the failure to realise how big and deep the recession would be), which he largely ascribes to over-confidence and an inability to forecast out-of-sample events. In political forecasting he claims his ability to think probabilistically, revisit and alter past forecasts, and look for data consensus means he outperforms what is a poor level of competition (biased and unscientific political pundits). For baseball, again he initially competed against simple rules of thumb, but sees the real skill in continuing to combine the best of stats with properly incorporated qualitative information to keep looking for edges. Weather forecasting he sees as largely a success story, especially when you account for bias (for example, to over-predict bad weather, as that is the less catastrophic error) and allow for chaos theory, which makes precise long-range forecasts difficult. Earthquake forecasting by contrast has had almost no success (here he talks about overfitting). For economic forecasting there are lots of challenges (uncertainty-principle-type ideas such as Goodhart’s law, self-fulfilling prophecies so that talk of a recession causes one, natural biases of commentators who either don't want to stray from the herd or are deliberately provocative), not least the sheer noisiness of economic data. For infectious diseases he discusses self-cancelling prophecies (epidemic warnings change behaviour in a good way) and, although it's a challenging area, he believes practitioners in this field (perhaps due to their Hippocratic oaths) are more thoughtful about their predictions. 
In chess he discusses in detail the psychology of Kasparov’s defeat by a computer – an error it made in a losing position convinced Kasparov it could think more deeply than it actually could – as well as where humans are better or worse than computers and how blended programmes are very strong. For poker he takes the view that poker players are very natural Bayesians, adjusting their knowledge both as cards appear and by assessing the chance of different hands through an intuitive posterior analysis based on how they think their opponents would act with those hands. He also takes the view that the standard of opponents is key to whether you can make money. For stock picking he discusses the efficient market hypothesis (especially with transaction costs) and the psychology of bubbles. For climate change he discusses healthy scepticism and his conclusion that scientists are, far more than politicians, genuine seekers after the truth. For terrorist attacks he discusses power laws to extrapolate to major attacks (which actually dominate costs and deaths) and the importance of lateral and imaginative thinking around threats.

  25. 5 out of 5

    Gina

    Reading Nate Silver is like exhaling after holding your breath for a really long time. I found FiveThirtyEight back in the primary days of 2008, when it was Hillary and Barack fighting it out, and it became apparent that not one of Hillary's advisers, to whom she was presumably paying lots and lots of money, was as smart or observant as Nate Silver (or Obama's advisers). One of my favorite tweets ever (I don't read many tweets) came from Ken Jennings on election morning of 2012, something along the lines of "Obama could still lose this thing if too many Democrats write in Nate Silver with little hearts drawn around his name." He had Obama with a 90% chance of winning. And while you could find plenty of other people calling it for Romney or Obama, they are for the most part just talking heads that don't actually care about reality. When Nate Silver gives you a 90% chance of something, it means that nine times out of ten it is going to happen, and one time out of ten it won't, nothing more and nothing less. You don't have to spend energy paying attention to which station it is on and who he is catering to. He caters to reality, which is surprisingly novel. Finding someone who can do this feels like, as I said, exhaling. Of course he has biases, etc., but his job is to be aware of them. This whole book is about why making accurate predictions is extraordinarily difficult. Sometimes apparently impossible, as in the cases of trying to beat the stock market over the long term or predict earthquakes. Sometimes made extremely difficult by humans' strong tendency to not accept the truth of things that don't serve our ends, as in the case of the financial collapse of 2008 (the first chapter of this book is the absolute best summary of that whole fiasco I have ever read). Sometimes the message of people willing and able to make careful, thoughtful predictions with honest margins of error, as is the case with many climate scientists in relation to global warming, is hijacked by politics and agendas. (The chapter on climate change was also exceptionally good, and the people who are criticizing Silver for being a climate change denier or for giving legitimacy to deniers' views have very poor reading comprehension and/or are so blinded by their own religious belief in their version of climate change that they cannot accept the reality of how hard it is to make accurate predictions.) The chapter on his era as a successful online poker player was very entertaining and reinforced why I do not have the stomach to be a gambler. All that being said, be forewarned that most people will find this book extremely boring. It is in the vein of Malcolm Gladwell, but about three times as long and dense (and therefore more substantial). Also, I sadly did not feel like I had gained a very deep understanding of Bayesian thinking by the end, which is unfortunate since that is one of the main points of the book. Surely that is partly my fault, but he could have been clearer about it. 
At any rate, I think the chapters on the financial collapse and global warming should be required reading for everyone, and the rest of it for those who are interested.

  26. 5 out of 5

    Dave

    Silver's gone 99 for 100 on predicting the state winners of the last two presidential elections. Here he goes something like 7 for 13: very good in parts, solid in some, and misfires in others. It's well-researched and mostly objective (but by no means totally), but it rarely covers anything I didn't already know. If you've read Michael Lewis's The Big Short and Moneyball you can skip chapters 1 and 3, and if you've ever had a class that proves pundits are not any more accurate forecasters than the population at large you can skip chapter 2. In addition, Silver loses his way with the climate change chapter as subjectivity overcomes math, and the piece covering his online poker career is lifeless, as I expect it would be for anyone who's not a fan of the game. Silver's at his best covering the weather (temperature predictions and hurricane landfall site predictions have decreased their margin of error by significant margins in the last few decades; trust the National Weather Service and not your local newscaster for the most accurate forecast), earthquakes (impossible to predict), and Bayes' theorem, which he champions as the best model by which to live your life and conduct your business. As we learn that it's nearly impossible to beat the stock market over the long run without the benefit of inside information, it becomes clear that the best thing a reader with sound statistical analysis ability can take away from this book, other than making Bayes' theorem a default operating method, is to take that skill and apply it where the analysis to this point is weak. The stock market, baseball, poker - they've been covered, but if you can separate the signal from the noise as the availability of big data overwhelms our ability to parse the useful pieces from it, then you can gain a competitive edge in your industry. It's good advice and there are some solid parts of the book, but for such a successful guy there was not much groundbreaking material here. If I weren't a completist I would have read only the chapters that started going somewhere in the first few pages, as the correlation between the first five pages and the rest of a chapter was .92. The exception is the chapter on chess, which was fast out of the gate but faded down the stretch, especially as Silver ignored the fact that Kasparov's loss to Deep Blue was in part triggered by the unfairness of the latter's team getting to see the former's recent matches, but not the other way around. So, yes, Silver's political forecasting is exceedingly accurate and his writing is hit or miss.

  27. 4 out of 5

    Todd N

    I finished this right before the 2012 elections, and I should have written my review before then so that I could convince more people to read this book when Nate Silver got more than Internet famous for a few weeks. At its core, this is a book about how to think -- a very important topic in these times when anything less than complete certainty is viewed as a weakness and the usual response to a disagreement is to double down on the position no matter how ridiculous. In contrast to this, Mr. Silver frames issues in terms of Bayes' theorem, which is a centuries-old mathematical formula for determining how probabilities change as new information becomes available. Of course, for this concept to have any bearing on how one thinks, one would have to (1) think in terms of probabilities instead of certainties and (2) modify one's views as new information comes to light. So, maybe 5 to 10% of Americans? Mr. Silver points out that a good way to test the validity of a model (or a person) is to see if it gets more accurate as more information is made available. (I would add this corollary: If more information makes a person a worse predictor of reality, then that person is an asshat.) But he also warns about "overfitting" a model by forcing it to fit noise instead of signal. There are so many interesting topics covered in this book, such as the prediction of weather (it has gotten a lot more accurate despite the jokes), prediction of earthquakes (no progress, despite throwing those Italian scientists in jail for failing to predict an earthquake), the economics of poker sites (without a large number of dummies to feed in money they are unsustainable), and hurricane landfall predictions (getting much more accurate, as we saw with Sandy). It also has the most intelligent discussion of global warming that I have encountered. I read some of the other reviews that complained that this book is somewhat meandering, which I can see. But I really didn't mind at all. I have enough interest in the general topics of prediction, probability, and Bayes' theorem that I found the occasional digression illuminating rather than distracting. Very highly recommended.
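    The "overfitting" the reviewer mentions is easy to demonstrate with a toy sketch, not taken from the book: a high-degree polynomial fitted to a small noisy sample chases the noise and looks better in-sample than a simple fit, even though the underlying signal is linear.

        # A generic illustration of overfitting (Python, not from the book): fitting noise instead of signal.
        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 12)
        y = 2.0 * x + rng.normal(scale=0.2, size=x.size)  # linear signal plus noise

        for degree in (1, 7):
            coeffs = np.polyfit(x, y, degree)             # least-squares polynomial fit
            in_sample_error = np.sum((np.polyval(coeffs, x) - y) ** 2)
            print(f"degree {degree}: in-sample squared error {in_sample_error:.4f}")
        # The degree-7 fit reports a much smaller in-sample error, yet it generalizes worse to new data.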

  28. 4 out of 5

    Kristen

    I picked up The Signal and the Noise from the library because I thought it would be slightly boring. I've been having trouble sleeping lately, and I wanted something that would be distracting without being too stimulating. For a while it was hitting that sweet spot, but then it took a turn toward the unexpectedly awesome. Nate Silver is great at explaining things and illustrating them with compelling stories. That's what I was assuming the book would be. But what I was not expecting was the extent to which The Signal and the Noise embodies a philosophy of living. Silver is a proponent of thinking probabilistically, which means making predictions and decisions based on the most likely outcome, given the data you have. Sometimes probabilities are effectively 100% (will the sun rise again today?), but often they are not (will a hurricane hit? does my opponent have a better poker hand than me?). This book spoke to me, because I'm actually kind of bad at dealing with uncertainty. I like to think that I can control every outcome, even knowing logically that that is not possible. The Signal and the Noise is a good reminder that even when you make the best decision possible based on your data, you might be wrong. The chapter on poker was my favorite in this respect, and provided a nice metaphor for how to take a more philosophical approach to the limited predictability of life: When we play poker, we control our decision-making process but not how the cards come down. If you correctly detect an opponent's bluff, but he gets a lucky card and wins the game anyway, you should be pleased rather than angry, because you played the hand as well as you could. The irony is that by being less focused on your results, you may achieve better ones.

  29. 5 out of 5

    Susan Visser

    I really enjoyed the book, Nate's talk, and meeting him in person. The book is about predictions and goes through many world events that we can all relate to, discussing the signals and noise that surrounded these events. You'll recognize the 2008 US election, the large earthquakes, especially in Japan, swine flu (both the one in the 70s and the more recent epidemic), economic meltdowns, 9/11, Pearl Harbour, stock market fluctuations, and much more. Throughout these stories, we learn what the predictions were and why they failed or succeeded. Nate gives advice on how the predictions could have been improved in these particular incidents, but also gives the reader advice on how to create accurate predictions in similar situations. One of the most amazing things you'll learn in the book is that weather prediction is one of the best success stories. Most of us think that weather forecasters are the worst at their jobs, but we're not thinking about probability as we should. You'll learn about Bayes' theorem of probability and how to use it in fun things like winning at poker! I enjoyed the book very much and encourage you to read it!

  30. 5 out of 5

    Rose

    This book had so many parts that really captured my attention. The chapter on chess was particularly fascinating. Nate Silver did a great job of compiling vignettes about humans and our inability to see the signal through the noise. On the other hand, this book is simply a series of vignettes. And while I love that they are told in a way that conveys the point, I didn't feel that each chapter continued a journey or grew from point to point. It was just a series of points, tacked on. I like Stephen Jay Gould's books of scientific essays, but I know going in that that is what I'm getting into -- a set of essays. And then there's his problem with the word "literally." I realize that there are many who feel it is grammatically correct to use "literally" to mean the exact opposite. I do not agree, but wherever you fall on that debate, you have to admit that he overuses it to the point of literally driving me out of my mind. I'm honestly shocked that this verbal tic got through an editor. I would have probably forgotten about it if it had been every once in a while, but geez! For example, on pages 276-277, he says "literally" three times in the span of seven sentences. Literally. "[A chess opponent must] execute literally 262 consecutive moves correctly... unless a computer can literally solve the position to the bitter end, it may lose the forest for the trees... Literally all positions in which there are six or fewer pieces on the board have been solved to completion." No matter where you stand on the grammatical rules around "literally," you have to admit that this tic literally adds nothing to the text and should have been caught in editing.
