Expert Political Judgment: How Good Is It? How Can We Know?
List price $46.95
Additional formats available
- Paperback $27.34
- Publisher: Princeton University Press
- Format: Hardback | 352 pages
- Dimensions: 160mm x 234mm x 30mm | 612g
- Publication date: 5 July 2005
- Publication City/Country: New Jersey
- ISBN 10: 0691123020
- ISBN 13: 9780691123028
- Illustrations note: 39 line illus. 7 tables.
- Sales rank: 1,623,527
The intelligence failures surrounding the invasion of Iraq dramatically illustrate the necessity of developing standards for evaluating expert opinion. This book fills that need. Here, Philip E. Tetlock explores what constitutes good judgment in predicting future events, and looks at why experts are often wrong in their forecasts. Tetlock first discusses arguments about whether the world is too complex for people to find the tools to understand political phenomena, let alone predict the future. He evaluates predictions from experts in different fields, comparing them to predictions by well-informed laity or those based on simple extrapolation from current trends. He goes on to analyze which styles of thinking are more successful in forecasting. Classifying thinking styles using Isaiah Berlin's prototypes of the fox and the hedgehog, Tetlock contends that the fox - the thinker who knows many little things, draws from an eclectic array of traditions, and is better able to improvise in response to changing events - is more successful in predicting the future than the hedgehog, who knows one big thing, toils devotedly within one tradition, and imposes formulaic solutions on ill-defined problems. He notes a perversely inverse relationship between the best scientific indicators of good judgment and the qualities that the media most prizes in pundits - the single-minded determination required to prevail in ideological combat. Clearly written and impeccably researched, the book fills a huge void in the literature on evaluating expert opinion. It will appeal across many academic disciplines as well as to corporations seeking to develop standards for judging expert decision-making.
Philip E. Tetlock is Mitchell Professor of Leadership at the University of California, Berkeley. His books include "Counterfactual Thought Experiments in World Politics" (Princeton).
Review quotes

It is the somewhat gratifying lesson of Philip Tetlock's new book ... that people who make prediction their business--people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables--are no better than the rest of us. When they're wrong, they're rarely held accountable, and they rarely admit it, either... It would be nice if there were fewer partisans on television disguised as "analysts" and "experts"... But the best lesson of Tetlock's book may be the one that he seems most reluctant to draw: Think for yourself. -- Louis Menand, The New Yorker

Before anyone turns an ear to the panels of pundits, they might do well to obtain a copy of Philip Tetlock's new book Expert Political Judgment: How Good Is It? How Can We Know? The Berkeley psychologist has made a 20-year study of predictions by the sorts who appear as experts on TV and get quoted in newspapers, and found that they are no better than the rest of us at prognostication. -- Jim Coyle, Toronto Star

Tetlock uses science and policy to brilliantly explore what constitutes good judgment in predicting future events and to examine why experts are often wrong in their forecasts. -- Choice

[This] book ... marshals powerful evidence to make [its] case. Expert Political Judgment ... summarizes the results of a truly amazing research project... The question that screams out from the data is why the world keeps believing that "experts" exist at all. -- Geoffrey Colvin, Fortune

Philip Tetlock has just produced a study which suggests we should view expertise in political forecasting--by academics or intelligence analysts, independent pundits, journalists or institutional specialists--with the same skepticism that the well-informed now apply to stockmarket forecasting... It is the scientific spirit with which he tackled his project that is the most notable thing about his book, but the findings of his inquiry are important and, for both reasons, everyone seriously concerned with forecasting, political risk, strategic analysis and public policy debate would do well to read the book. -- Paul Monk, Australian Financial Review

Philip E. Tetlock does a remarkable job ... applying the high-end statistical and methodological tools of social science to the alchemistic world of the political prognosticator. The result is a fascinating blend of science and storytelling, in the best sense of both words. -- William D. Crano, PsycCRITIQUES

Mr. Tetlock's analysis is about political judgment but equally relevant to economic and commercial assessments. -- John Kay, Financial Times

Why do most political experts prove to be wrong most of the time? For an answer, you might want to browse through a fascinating study by Philip Tetlock ... who in Expert Political Judgment contends that there is no direct correlation between the intelligence and knowledge of the political expert and the quality of his or her forecasts. If you want to know whether this or that pundit is making a correct prediction, don't ask yourself what he or she is thinking--but how he or she is thinking. -- Leon Hadar, Business Times
Table of contents
- Acknowledgments ix
- Preface xi
- Chapter 1: Quantifying the Unquantifiable 1
- Chapter 2: The Ego-deflating Challenge of Radical Skepticism 25
- Chapter 3: Knowing the Limits of One's Knowledge: Foxes Have Better Calibration and Discrimination Scores than Hedgehogs 67
- Chapter 4: Honoring Reputational Bets: Foxes Are Better Bayesians than Hedgehogs 121
- Chapter 5: Contemplating Counterfactuals: Foxes Are More Willing than Hedgehogs to Entertain Self-subversive Scenarios 144
- Chapter 6: The Hedgehogs Strike Back 164
- Chapter 7: Are We Open-minded Enough to Acknowledge the Limits of Open-mindedness? 189
- Chapter 8: Exploring the Limits on Objectivity and Accountability 216
- Methodological Appendix 239
- Technical Appendix: Phillip Rescober and Philip E. Tetlock 273
- Index 313