
Superforecasting: The Art and Science of Prediction Kindle Edition

4.3 out of 5 stars 4 customer reviews



Product description

New York Times Bestseller
An Economist Best Book of 2015

"The most important book on decision making since Daniel Kahneman's Thinking, Fast and Slow."
Jason Zweig, The Wall Street Journal
Everyone would benefit from seeing further into the future, whether buying stocks, crafting policy, launching a new product, or simply planning the week’s meals. Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts’ predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught?
In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters."
In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden’s compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn’t require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course.

Superforecasting offers the first demonstrably effective way to improve our ability to predict the future—whether in business, finance, politics, international affairs, or daily life—and is destined to become a modern classic.
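The "keeping score" the description mentions is typically done with Brier scores, the standard accuracy measure for probability forecasts (and the one used in forecasting tournaments). A minimal sketch, with an illustrative forecast that is not from the book:

```python
def brier_score(forecast: float, outcome: int) -> float:
    """Two-category Brier score: 0 is a perfect forecast, 2 is maximally wrong.

    `forecast` is the stated probability that the event happens;
    `outcome` is 1 if it happened, 0 if it did not.
    """
    return (forecast - outcome) ** 2 + ((1 - forecast) - (1 - outcome)) ** 2

# A confident, correct forecast scores near 0; a confident, wrong one near 2.
print(round(brier_score(0.9, 1), 2))  # 0.02
print(round(brier_score(0.9, 0), 2))  # 1.62
```

Averaging this score over many resolved questions is what makes a forecaster's track record measurable at all, which is the point of "keeping score".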

Product details

  • Format: Kindle Edition
  • File Size: 3948 KB
  • Print Length: 355 pages
  • Publisher: Crown (29 September 2015)
  • Language: English
  • ASIN: B00RKO6MS8
  • Text-to-Speech: Enabled
  • X-Ray:
  • Word Wise: Enabled
  • Enhanced Typesetting: Enabled
  • Average Customer Review: 4.3 out of 5 stars 4 customer reviews

Customer Reviews

4.3 out of 5 stars

Top Customer Reviews

Format: Kindle Edition Verified Purchase
If humility and a growth mindset are characteristics of superforecasters, as rightly asserted by the authors, then they qualify. This book is another example of the value of true expertise reflecting on the lessons of a lifetime. Readers from many expert backgrounds, for example the "lean thinking" approach to process management, will find resonance with the discussion on where knowledge is located. The content begs to be extended into the welfare [not financial] economic decision-making space, which is currently so distorted by celebrity experts being listened to by our corporatist political representatives. Highly recommended.
One person found this helpful.
Format: Kindle Edition Verified Purchase
If you're familiar with Tetlock's work on experts then you'll already be aware that, when it comes to the future, experts don't always have a great track record. In this book, Tetlock examines what makes forecasts more likely to have merit and ways in which we can learn from our mistakes.
Well worth reading!
One person found this helpful.
Format: Kindle Edition Verified Purchase
Very readable and thought-provoking take on not only forecasting, but decision making in general.
Format: Kindle Edition Verified Purchase
A well-researched assessment of prediction and those who claim to be experts.

Most Helpful Customer Reviews (may include reviews from the Early Reviewer Rewards Program): 4.3 out of 5 stars, 267 reviews
188 of 205 people found the following review helpful
2.0 out of 5 stars More about superforecasters than about superforecasting 8 October 2015
By Jackal
There are two kinds of pop-science books: one deep and thoughtful, based on years of research; the other quick and dirty, written by a ghost-writer. This book is of the latter kind. Tetlock wrote Expert Political Judgment: How Good Is It? How Can We Know? about a decade ago. That book was deep and thoughtful. I had expected his new book to be an update with ten more years of research and consulting. Sadly, I am greatly disappointed. The book could have been written entirely without additional research input. It starts with a couple of chapters on the history of the standard controlled experiment. There are about 50 pages of real content in the 330 pages of the book. There is a lot of content lifted directly from the web (e.g. Fermi forecasting, Auftragstaktik), kind of Malcolm Gladwell style: some insight and some misinterpretation.

The style is **extreme pop-science**. What do I mean by that? Far too many pages; plentiful descriptions of minute, irrelevant details of individuals (so-called human interest points, I guess that is what they teach in creative writing); never any figure or number (e.g. 67% is changed to two thirds); all difficult material removed or put in a footnote. And how come a book with two authors uses the pronoun "I" all the time?

The researcher has run a forecasting tournament for several years. He has loads of data, but he does not provide any analysis in the book. He refers to his research in footnotes, but with no explanation or description at all. Instead we get statements like "80% of superforecasters are more intelligent than average." What is wrong with running a regression to find out which characteristics are important? Why spend five chapters going through the characteristics of superforecasters? In the end, apparently, two characteristics stand out: (1) continual updating of forecasts, and (2) being intelligent. That fact is told after around 200 pages of tedious writing. Wtf? I can reluctantly accept dumbing down the book, but it is inexcusable that the footnotes do not include some further help for the reader who wants more depth.

The author likes to give minute details about the superforecasters. Personally, I don't care that Brian likes Facebook updates of cats, that John is retired because he is sick and now likes to collect stuff, or that Steve is an old colleague of the author who likes opera. Who reads and enjoys this written muzak? It goes on chapter after chapter. We "meet" 15-20 superforecasters.

There is a lot about the superforecasters in the book, but the title of the book is "Superforecasting". This is a seriously misleading title. It makes you believe that you will learn tools to become a great forecaster. You get some, mostly general, points in an eight-page appendix. Given the researcher's experience, I would have expected a lot of practical advice.

What is good about the book?
(1) The key message that experts are lousy forecasters and do not want accountability is very important, but that was already in the author's earlier book.
(2) Some useful anecdotes that you probably should pick up if you are teaching/presenting on the topic.
(3) Odd bits of information. I liked the discussion of how the German military used what we consider modern management already 100 years ago. As mentioned earlier, there are 50 pages of really good material in the book.

I bought the hard-cover edition. If you make notes with a normal pencil, be careful because it easily pierces the paper.

The book is worth two stars. If you are an educator and want a few anecdotes, read the book. Others should give it a pass. Instead, sign up for the author's forecasting tournament; you learn more by trial and error. I signed up two years ago and it is a useful experience. You can also check the video features on Then spend time reading better books. A few rigorous pop-science books:
* Another forecasting perspective is Steenbarger's Trading Psychology 2.0: From Best Practices to Best Processes (Wiley Trading). It is about trading in the market, but it covers many of the topics from a different perspective. Worth reading his earlier books too.
* And if you haven't read Thinking, Fast and Slow, that is a more important book (but also too fluffy for my liking).
* You should also read Taleb's The Black Swan: Second Edition: The Impact of the Highly Improbable: With a new section: "On Robustness and Fragility" (Incerto), but don't buy his fluffy version of the same topic Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets (Incerto)
176 of 197 people found the following review helpful
3.0 out of 5 stars Valuable lessons for forecasting, but lacks a practical recipe: 3.5 stars 14 September 2015
By Ashutosh S. Jogalekar
Amazon Vine Review
In the 1990s Philip Tetlock gathered together hundreds of experts and "ordinary", albeit extremely well-read, people and asked them to try to predict questions of global significance: What will happen to the stock market in the next year? What will be the fate of Tunisia in two years? What impact will Middle Eastern politics have on oil prices over the next six months?

He continued the contest for several years and came up with a shocking answer: the ordinary people who read the daily news and thought about it with depth and nuance were at least as good as self-proclaimed and well-known experts from the financial sector, from government and from intelligence agencies. These results of the so-called 'Good Judgment Project' were widely publicized by the media under the "there are no experts" drumroll, but as Tetlock and his co-author Gardner indicate in this book, what the media failed to report was the presence of a handful of people who were even better than the experts, albeit by modest amounts. Tetlock called these people 'superforecasters', and this is their story.

The crux of the book is to demonstrate the qualities that these superforecasters have and try to teach them to us. The narrative is packed with very interesting problems of forecasting, like figuring out if the man in a mysterious compound in Pakistan was Osama Bin Laden or whether Yasser Arafat had been poisoned by Israel. In each case Tetlock takes us through the thought processes of his superforecasters, many of whom have held non-forecasting day jobs including plumbing, office work and construction. In addition, since Tetlock is a well-known psychologist himself, he has access to leading business leaders, academics and intelligence analysts whom he can interview to probe their views.

Tetlock tries to distill the lessons that these superforecasters can teach us. Foremost among them are an almost obsessive proclivity toward probabilistic and at least semi-quantitative thinking and an almost automatic willingness to update their prior knowledge in the face of contrary opinions and new evidence. Open-mindedness, flexibility and an ability to move quickly between different viewpoints are thus essential to good forecasting. Other lessons include striking a good balance between under- and overconfidence and between under- and overreacting to the evidence, breaking down problems into smaller problems (the so-called Fermi approach to problem solving), recognizing the limits of one's prediction domain, looking for clashing or contradictory causal factors and dividing the evidence into more and less certain pieces. Finally, being part of a good team and learning from each other can often be a revelation.
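The updating habit described above is, at bottom, Bayes' rule: start from a prior probability, weigh how likely the new evidence is under each hypothesis, and report the posterior. A minimal sketch, with illustrative numbers that are not from the book:

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Posterior probability of a hypothesis after seeing one piece of evidence.

    `prior` is the forecaster's probability before the evidence;
    the other two arguments are how likely that evidence is if the
    hypothesis is true vs. false.
    """
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A forecaster at 60% who sees evidence three times more likely under
# the hypothesis than against it should move to about 82%, not to 100%.
print(round(bayes_update(0.60, 0.75, 0.25), 2))  # 0.82
```

The point of the habit is the small, repeated correction: each new clue nudges the probability rather than flipping the forecast wholesale.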

Tetlock and Gardner's book thus gives us a good prescription for confident forecasting. What I found a bit disappointing was that it does not give us a recipe - hence the 3 stars (actually 3.5, had Amazon permitted a fractional rating system). It points out the destination but not the path, and so even at the end I felt myself floundering a bit. To some extent this path is subjective, but in its absence at least some of the prescriptions (such as "break down a problem into parts" or "consider contradictory evidence") sound rather obvious. What Tetlock and Gardner could do in a forthcoming book, in my opinion, is teach us how to ingrain the valuable lessons that they learnt from superforecasters in our daily habits and thinking, perhaps with case studies. For instance, how do we start to think along the lines of superforecasters the moment we open our daily paper or flip on a news channel? How exactly do we reach a conclusion when presented with contradictory evidence? It's great to know all the qualities that forecasters could teach us, but preaching is not quite the same as practicing, so I think all of us would appreciate some help in that arena. I think there's a great self-help manual hidden in Tetlock and Gardner's book.
111 of 121 people found the following review helpful
5.0 out of 5 stars This book has a 100% probability of making you think! 29 August 2015
By Angie Boyter
Amazon Vine Review
Everyone wants to be able to predict the future, whether they are buying stocks, choosing a mate, or deciding how the next presidential election will go, but what, if anything, can we do to improve our ability to predict? Wharton School professor Philip Tetlock has been studying that question since the Reagan era and has observed forecasters from pundits and intelligence analysts to filmmakers and pipe fitters to try to learn why some people are better at making predictions than others. In this book, he describes his work and that of others and presents some techniques that may help all of us make better decisions.
As someone who enjoys reading about topics like decision-making, forecasting, and behavioral economics, I too often find myself reluctantly concluding, “That was well-presented, but there is nothing here I have not heard before.” For a reader new to the subject, it is good that Superforecasting delves into the ideas of people like psychologist Daniel Kahneman, whose description of the biases in judgment that impede our ability to make good decisions and forecasts earned him a Nobel Prize in Economics, and Tetlock appropriately covers topics like these.
I was pleased, though, that he also presented some interesting work I was not familiar with, such as the author’s own Expert Political Judgment project to study whether some people really are better predictors than others and, if so, how they differ from the less successful experts, and the Good Judgment Project that was part of an effort to improve intelligence estimating techniques funded by IARPA (the intelligence community’s equivalent of DARPA). I was also especially amused by a contest run in 1997 by the Financial Times at the suggestion of behavioral economist Richard Thaler. People were to guess a number between 0 and 100, and the winner would be the person whose guess came closest to TWO-THIRDS of the average guess of all contestants. If thinking about this contest begins to make your head spin, read this book. If it sounds pretty simple to you, then you should DEFINITELY read this book; the answer will surprise you!
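The contest described above is the classic "guess 2/3 of the average" game, and iterating the reasoning shows why naive and sophisticated answers differ so sharply. A minimal sketch (the level-0 starting guess of 50 is a conventional modeling assumption, not a detail from the book):

```python
def level_k_guess(k: int, level0: float = 50.0) -> float:
    """Guess of a player who assumes everyone else reasons k-1 levels deep.

    Level 0 guesses the midpoint of [0, 100]; each further level of
    reasoning best-responds with 2/3 of the previous level's guess.
    """
    guess = level0
    for _ in range(k):
        guess *= 2.0 / 3.0
    return guess

# The sequence 50, 33.3, 22.2, 14.8, ... shrinks toward 0, the Nash
# equilibrium, but real contestants stop after only a few steps.
for k in range(4):
    print(f"level {k}: {level_k_guess(k):.1f}")
```

The winning answer in such contests therefore depends not on the equilibrium but on how many steps of reasoning the other entrants actually take.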
The history of science was also interesting and often surprising, such as the idea of randomized controlled trials, which are taken for granted today, not being used until after World War II. The book introduces us to people like meteorologist Edward Lorenz, the author of the classic paper asking whether the flap of a butterfly’s wings in Brazil can set off a tornado in Texas, and physician Archie Cochrane, an early advocate for randomized trials and a scientific approach to medical decisions who nonetheless was driven by his human biases to make a decision about his own health that subjected him to a mutilating surgery and could have cost him his life.
After studying and identifying a group of superforecasters and their characteristics, Tetlock asked the natural question: Are superforecasters born, or can we all become superforecasters? As a good scientist, he concludes he cannot answer that question with certainty, but he does lay down some habits of mind that are very likely (Give me a probability here, Phil!) to improve anyone’s ability to make predictions and improve the resulting decisions.
If your aim is to improve your own ability to make predictions, Tetlock will both give you valuable advice and explain how following that rather simple-sounding advice may be harder than you think. I predict you’ll find the book both enjoyable and informative.
4.0 out of 5 stars I (Matt) have really enjoyed reading Superforecasting 10 March 2017
By slpleslieanne
Verified Purchase
I (Matt) have really enjoyed reading Superforecasting: The Art and Science of Prediction by Philip E. Tetlock and Dan Gardner. It's a fun book for taking a dive into forecasting.

I enjoyed reading about how forecasters in fields like weather, politics and stocks are often considered professionals even though they may actually be amateurs. I also liked learning that, even though these people are often very bad or unreliable at forecasting, it is in many ways a skill that can be learned. While the book definitely leans more towards pop-science than going academically deep into its topic, I feel like I learned some new information from it.

I do wish there was less fluff and I often found myself wishing for deeper analysis of the years of research the authors kept referencing.

If you enjoy books like Moneyball or Freakonomics then I would recommend this book to you.

**I received this book from Blogging for Books for free.**
5.0 out of 5 stars It will challenge what you think you know 21 June 2017
By RAY HINTZ
Verified Purchase
I almost gave this four stars because I got bogged down a bit in the beginning, but then the book really took off.

The concept of people forecasting future events was intriguing, but discovering how the effective ones did it was absolutely fascinating. It's not a book that is going to tell you about future events like some talking head, but you will discover some keys, based on test groups, to help you forecast more accurately. It's a great book that not only discusses people predicting the future, but also how our own bias and perspective can skew our ability to see clearly. The last portion of the book really delves into the power of teamwork and how the RIGHT team can forecast with incredible accuracy.

If you're looking for a book to engage your intellect and challenge some preconceived notions, this is a great one for you.