"...a fine pick for any business collection strong in management issues, and addresses some of the predecessors of bad mistakes." (Bookwatch)
Reader reviews of Think Twice: Harnessing the Power of Counterintuition
I read every one of the author's articles, and this book is of course no exception. Recommended!
Think Twice represents the next step in Mauboussin's beneficial quest to help all of us identify the mistakes we make, and it provides the tools to fix them. It takes many of Mauboussin's past ideas, condenses a vast array of additional sources, and puts them in a manifesto on how to dodge the pitfalls of poor decision making. Mauboussin has managed to write a book that is interesting for everyone. Though deceptively short at 143 pages (with 33 pages of notes and references), the book rewards a slow pace; I recommend readers take time to digest it and internalize the tools and countless real-world examples used to clarify and illustrate.
"No one wakes up thinking, 'I am going to make bad decisions today.' Yet we all make them." Think Twice outlines eight common mistakes, tries to help the reader recognize these in context, and then provides ideas on how to mitigate your own tendency to repeat them. Certain ideas recur throughout the text: using data and models to inform decisions, viewing many real-world situations as complex adaptive systems, and appreciating context and luck.
Each chapter focuses on one key error we make:
+Chapter 1: Viewing our problem as unique. Others have usually faced the same decisions we face, and we can learn from their results to get to the right answer; for example, in corporate M&A, you can look at how other similar deals have performed.
+Chapter 2: We fail to consider enough alternative options under pressure because we have models in our heads that oversimplify the world. Usually that helps us make quick decisions, but often it causes us to leave out alternative choices that could be better. Incentives and unconscious anchoring on irrelevant information contribute to this tunnel vision.
+Chapter 3: An uncritical reliance on experts. Experts are people like us and are subject to all the same biases and errors. While this has been covered by Cialdini and others, Mauboussin focuses on the solution: "computers and collectives remain underutilized guides for decision making." We now see this idea in practice in prediction markets on everything from Hollywood box-office results to who will be the next Senator from North Dakota.
+Chapter 4: "Situation influences our decisions enormously." We all underestimate how much we are influenced not only by others, but by our own feelings.
+Chapter 5: Cause and effect reasoning fails when systems are complex because the whole is greater than the sum of the parts. Focusing on why individuals in a system do something - an investor in the market, an ant in a colony, or birds in a flock - does not help explain how the entire system performs. Understand the rules that govern the entire system, rather than the rules that drive the individual participants.
+Chapter 6: We try to apply general rules in contexts that are not appropriate. In real life, decisions are specific. As Mauboussin says, "it depends".
+Chapter 7: Small changes in a system (or an input) can lead to a large change in output. We mess things up by assuming the same input will always have the same output. One quotation I particularly liked in this chapter was from Peter Bernstein: "Consequences are more important than probabilities."
+Chapter 8: We forget about reversion to the mean. "Any system that combines skill and luck will revert to the mean over time." Ignoring this makes people think they are special and that the rules of probability don't apply to them. This is reinforced by the "halo effect": when someone is doing well in any field, people and the press lionize that individual and report on their supposed genius, but when their results revert to the mean, that same person is suddenly viewed as incompetent. Mauboussin's own colleague Bill Miller faced this same perception cycle, and emerged with a halo in 2009.
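The skill-plus-luck claim in chapter 8 is easy to demonstrate numerically. Below is a minimal sketch of my own (not from the book; all names and numbers are invented for illustration): each of 10,000 hypothetical performers has a fixed skill and draws fresh luck each year, and the top decile from year one falls back toward its merely above-average skill in year two.

```python
import random

random.seed(42)

# Illustrative model: yearly result = fixed skill + independent luck.
# Luck is deliberately noisier than skill, as in many investing contexts.
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
year1 = [s + random.gauss(0, 2) for s in skill]  # luck dominates skill here
year2 = [s + random.gauss(0, 2) for s in skill]  # fresh luck, same skill

# Take the top 10% of performers in year 1 and see how they do in year 2.
cutoff = sorted(year1, reverse=True)[N // 10]
winners = [i for i in range(N) if year1[i] > cutoff]

avg_y1 = sum(year1[i] for i in winners) / len(winners)
avg_y2 = sum(year2[i] for i in winners) / len(winners)
print(f"Year-1 winners averaged {avg_y1:.2f} in year 1 but {avg_y2:.2f} in year 2")
```

The year-one winners stay somewhat above average in year two (they really do have above-average skill, on average), but most of their outperformance evaporates once their lucky draws are not repeated, which is exactly the halo-effect cycle described above.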
Mauboussin concludes the book by summarizing an effective action plan - to put it simply, Mauboussin admonishes us to Think Twice before we make a serious decision to ensure we don't fall victim to any of these pernicious errors.
There are now many good books available on why we make errors in judgment and decision making. This book represents Michael Mauboussin's contribution to this genre, and I think he has done a good job in pulling together a lot of information from a diverse range of credible sources. The information he presents has broad application, though he has a slight emphasis on business and investing applications (his own area of specialization). The book is also a fairly easy and quick read.
Perhaps the best way to describe the content of the book is to summarize the key points, roughly in the order they appear in the book:
(1) "Think twice" to avoid errors in judgment and decision making, especially in situations where stakes are high.
(2) Learn from the experiences of others in similar situations (making use of statistics when possible), rather than relying only on your own perspective, and don't be excessively optimistic about expecting to beat the odds.
(3) Beware of anecdotal information, since it can paint a biased picture. Related to this point, don't infer patterns which don't exist, especially when the available data is limited, and avoid the bias of favoring evidence which supports your beliefs while ignoring contradictory evidence (deliberately seek dissenting opinions if necessary).
(4) Avoid making decisions while at an emotional extreme (stress, anger, fear, anxiety, greed, euphoria, grief, etc.).
(5) Beware of how incentives, situational pressures, and the way choices are presented may consciously or subconsciously affect behavior and shape decisions.
(6) In areas where the track record of "experts" is poor (e.g., in dealing with complex systems), rely on "the wisdom of crowds" instead. Such crowds will generally perform better when their members are capable and genuinely diverse, and if dissent is tolerated (otherwise the crowd will be prone to groupthink).
(7) Use intuition where appropriate (e.g., stable linear systems with clear feedback), but recognize its limitations otherwise (e.g., when dealing with complex systems).
(8) Avoid overspecialization, aiming to have enough generalist background to draw on diverse sources of information.
(9) Make appropriate use of the power of information technology.
(10) Overcome inertia by asking "If we did not do this already, would we, knowing what we now know, go into it?"
(11) Because complex systems have emergent properties (the whole is more than the sum of the parts), avoid oversimplifying them with reductionistic models (simulation models are often helpful), remember that the behavior of components is affected by the context of the system, and beware of unintended consequences when manipulating such systems.
(12) Remember that correlation doesn't necessarily indicate causality.
(13) Remember that the behavior of some systems involves nonlinearities and thresholds (bifurcations, instabilities, phase transitions, etc.) which can result in a large quantitative change or a qualitative change in system behavior.
(14) When dealing with systems involving a high level of uncertainty, rather than betting on a particular outcome, consider the full range of possible outcomes, and employ strategies which mitigate downside risks while capturing upside potential.
(15) Because of uncertainties and heterogeneities, luck often plays a role in success or failure, so consider process as much as outcomes and don't overestimate the role of skill (or lack thereof). A useful test of how much difference skill makes in a particular situation is to ask how easy it is to lose on purpose.
(16) Remember that luck tends to even out over time, so expect outcomes to often "revert to the mean" (eventually move close to the average). But this isn't always the case, since outliers can also occur, especially when positive feedback processes are involved (e.g., in systems in which components come to coordinate their behavior); in a business context, early advantages can compound, so remember to make a good first impression.
(17) Make use of checklists to help ensure that important things aren't forgotten.
(18) To scrutinize decisions, perform a "premortem" examination. This involves assuming that your decision hasn't worked out, coming up with plausible explanations for the failure, and then revising the decision accordingly to improve the likelihood of a better outcome.
While this book doesn't really present any new material, I still found it to be a good resource, so I recommend it. After all, this subject matter is important and practical, yet also counterintuitive, so it makes sense to read many books to help these insights sink in and actually change one's habits.
I borrowed this book's title and some of the author's advice in a speech to accounting students. My speech was titled "Think Twice ... ten illogical actions to have a successful accounting career."
Author Michael Mauboussin states on page 143 that almost everyone agrees decision-making is important yet we don't teach students how to make good decisions. I recommended Think Twice plus Decisive by Chip & Dan Heath to the students.
I was not wowed by this book (therefore 4, not 5, stars), but I had several aha moments:
1. A crowd of partially informed people is often more accurate than a handful of experts.
2. Peter Drucker's question to Campbell Soup leaders: "If we did not do this (promote tomato soup) already, would we, knowing what we do now, go into it?" Drucker was one of the best at posing questions instead of giving answers.
3. To improve a team or organization, don't rely on bringing in a star; improve the system as a whole rather than adding or subtracting one person.
This short and easy-reading book packs a real wallop when it comes to identifying and explaining common mental tendencies that can impair our judgment. These tendencies may be hard-wired into our DNA as a result of events thousands of years ago. For example, suppose that while two cavemen were walking down a path, they both heard a sound in a nearby bush. One caveman bolts immediately, while the other stops and thinks to himself, "I wonder what that is?" While he's thinking, a rattlesnake strikes out and bites him, and he dies shortly thereafter. That caveman's DNA didn't get passed on to future generations, while the DNA of the caveman who ran first and thought later did. So it may make sense that in uncertain or stressful situations, we have a tendency to react quickly. But modern society's situations may call for careful thought more than immediate reaction, so at times even smart people can be led by their instinctive responses into poor decisions. As author Michael Mauboussin puts it, "Smart people make poor decisions because they have the same factory settings on their mental software as the rest of us."
This book is filled with interesting examples of how our instinctive first responses can lead to less than optimal choices. I'll relate two of the book's examples so you can get a feel for what's in the book, and then you can better decide whether you want to buy it. The first example that comes to mind may be called the "inside versus outside" view. To illustrate "inside" thinking, consider the case of the racehorse Big Brown in 2008. He won the Kentucky Derby by four and three-quarters lengths, and then he won the Preakness Stakes by five and one-quarter lengths. Prior to Big Brown's attempt to win the Belmont Stakes (and thus capture racing's Triple Crown), he looked great, and his owner expressed a lot of confidence. On race day for the Belmont, Big Brown's odds were 3-10, making him the easy (roughly 75% likelihood of winning) favorite. That's the details-oriented "inside" view. The "outside" view is that, of the 29 prior horses to compete in the Belmont after winning both the Kentucky Derby and the Preakness, only 11 won the Belmont (about 38%). Further, since 1950 (perhaps when better training methods became more common) only 3 of 20 horses (15%) with a chance to win the Triple Crown won the Belmont. In short, the inside view was optimistic, but the outside view wasn't. It turns out that Big Brown finished ninth in the Belmont.
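For readers who want the arithmetic behind the "inside" figure: fractional odds of 3-10 mean a 10 bet returns a profit of 3, which implies a win probability of 10/13, about 77% (close to the review's rounded 75% figure, which may ignore the track's take). A quick sketch; the function name is mine, the numbers are from the review:

```python
def implied_probability(profit: float, stake: float) -> float:
    """Win probability implied by fractional odds of profit-stake,
    ignoring the track's take (the vig)."""
    return stake / (stake + profit)

# Inside view: Big Brown's 3-10 odds on Belmont race day.
inside_view = implied_probability(3, 10)

# Outside view: 3 of 20 Triple Crown bids since 1950 succeeded.
outside_view = 3 / 20

print(f"Inside view:  {inside_view:.1%}")   # 76.9%
print(f"Outside view: {outside_view:.1%}")  # 15.0%
```

The gap between the two numbers is the whole point of the example: the crowd priced the details of this horse, while history priced the reference class of all Triple Crown attempts.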
A briefer second example of how our "mental models" affect our thinking and decision making concerns French and German wines available for sale in a store. When French music was played, customers chose French wine 77% of the time, yet when German music was played in the store, customers chose German wine 73% of the time. The customers were asked whether they heard the music and whether it affected their choices. Most customers recalled hearing the music, but they denied that it had anything to do with their choices.
Okay, perhaps these examples will help you understand the kind of reasoning processes that author Mauboussin examines and discusses in Think Twice. Indeed, the book's title is the shortest advice he has to give to people facing situations where instinctive responses may impede the best decisions. As they say, forewarned is forearmed.