By Adam Alter*
Imagine a fair coin tossed three times. You have a one-in-four chance of turning up an unbroken string of three heads or three tails: two of the eight equally likely outcomes (HHH and TTT). If you make too much of that small sample, you might conclude that the coin is rigged.
If you continue to toss the fair coin, say, 1,000 times, you are far more likely to turn up a distribution that approaches 500 heads and 500 tails.
As the sample grows, your chance of turning up an unbroken string shrinks rapidly.
An unbroken string is far better evidence of bias after 20 tosses than it is after three - but if you succumb to the law of small numbers, you might draw sweeping conclusions from even tiny samples of data.
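To make the arithmetic concrete, here is a minimal Python sketch (not from the article itself; prob_unbroken_string is just an illustrative name) that computes the exact probability of an unbroken string of n tosses and then simulates 1,000 tosses of a fair coin:

```python
import random

def prob_unbroken_string(n):
    """Exact probability that n fair-coin tosses come up all heads or all tails.

    Two of the 2**n equally likely outcomes qualify, so p = 2 * (1/2)**n.
    """
    return 2 * 0.5 ** n

for n in (3, 20):
    print(f"n = {n:2d}: P(unbroken string) = {prob_unbroken_string(n):.2e}")
# n =  3: 2.50e-01  (one in four, as above)
# n = 20: 1.91e-06  (roughly one in 500,000)

# Large-sample behaviour: the share of heads in 1,000 fair tosses
# stays close to 0.5 (typically within a few percentage points).
heads = sum(random.random() < 0.5 for _ in range(1000))
print(f"Heads in 1,000 tosses: {heads} ({heads / 1000:.1%})")
```

The exact calculation shows why a run after 20 tosses is such strong evidence of bias: under the fair-coin hypothesis it would occur only about once in 500,000 trials.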
The law of small numbers explains a range of harmful behaviors: stereotyping (believing that all people with a particular trait behave the same way), relying on first impressions (concluding from one encounter or interview that someone is smart, capable or trustworthy) and basing financial decisions on transitory, short-term patterns in a market, such as one day’s uptick in a stock.
The solution to this problem is to pay attention not just to the pattern of data but also to how much data you have. Small samples aren’t just limited in value; they can be counterproductive because the stories they tell are often misleading.
Dr. Alter is a psychologist at New York University’s Stern School of Business and the author of “Drunk Tank Pink” and “Irresistible.” This article is re-posted from edge.org with permission.
Comments
Another thought-provoking essay on Edge.org, from Freeman Dyson, about faith in predictions and experts.
"In the modern world, science and society often interact in a perverse way. We live in a technological society, and technology causes political problems. The politicians and the public expect science to provide answers to the problems. Scientific experts are paid and encouraged to provide answers. The public does not have much use for a scientist who says, “Sorry, but we don’t know”. The public prefers to listen to scientists who give confident answers to questions and make confident predictions of what will happen as a result of human activities. So it happens that the experts who talk publicly about politically contentious questions tend to speak more clearly than they think. They make confident predictions about the future, and end up believing their own predictions. Their predictions become dogmas which they do not question. The public is led to believe that the fashionable scientific dogmas are true, and it may sometimes happen that they are wrong. That is why heretics who question the dogmas are needed.
As a scientist I do not have much faith in predictions. Science is organized unpredictability. The best scientists like to arrange things in an experiment to be as unpredictable as possible, and then they do the experiment to see what will happen. You might say that if something is predictable then it is not science. When I make predictions, I am not speaking as a scientist. I am speaking as a story-teller, and my predictions are science-fiction rather than science. The predictions of science-fiction writers are notoriously inaccurate. Their purpose is to imagine what might happen rather than to describe what will happen. I will be telling stories that challenge the prevailing dogmas of today. The prevailing dogmas may be right, but they still need to be challenged. I am proud to be a heretic. The world always needs heretics to challenge the prevailing orthodoxies. Since I am heretic, I am accustomed to being in the minority. If I could persuade everyone to agree with me, I would not be a heretic. "
https://www.edge.org/conversation/freeman_dyson-heretical-thoughts-abou…
This is the weakness of the "data-driven organisation". Yes, data can inform and help managers make adjustments to maximise efficiency or exploit opportunities within the range the data covers.
However, an over-dependence on data and data analytics can crowd out human ingenuity, hunches, judgments, and brand-new initiatives - or simply doing the right thing because it's ethical. Steve Jobs didn't do official market research; he had a gut feeling that the iPhone would be the big thing. Market research and 'data' would never have given us the iPhone.
Another along the same lines: "For the experts themselves – many of them, my fellow academics – this is deeply disturbing, signalling the inexorable rise of irrational, fact-free political debate. But what people have had enough of is not experts or expertise, per se; rather, it is the automatic, assumed authority that experts wield over non-experts.
The rise of “experts” to positions of authority in public life is intimately connected with the decline in popular political participation over the last few decades. Society has always needed technical experts to provide advice and implement policies, but increasingly “experts” have taken a central place in decision-making itself. A burgeoning array of issues have been removed from the domain of democratic contestation and handed over to unelected technical experts to decide. In many jurisdictions, legal changes have locked in this turn to “evidence-based policymaking”. The obvious example is the rise of independent central banks. Populated by professional economists, these now control monetary policy – once a matter of intense political contestation between forces favouring inflation control (typically, capital) and those favouring full employment at the expense of some inflation (typically, organised labour). More generally, the rise of quasi-autonomous non-governmental organisations (“quangos”), judicialised bodies, and various commissions and inquiries since the 1980s marks the depoliticisation of many areas of public policy, and the growing authority of technocrats – people whose power derives not from their popular support but their technical expertise. These technocrats have also started coordinating their work across borders, forming transnational governance networks even more remote from popular democratic control."
https://thecurrentmoment.wordpress.com/2017/01/02/why-have-people-had-e…
expecting the unexpected
https://www.bloomberg.com/view/articles/2013-02-11/expecting-the-unexpe…
The 40-70 Colin Powell rule
"On this particular subject of making tough decisions, Powell prescribes the amount of information that one needs to make the decision. He says that we need between 40 and 70 percent of the total information to make a decision.
He believes that with less than 40 percent of information, we are bound to make a wrong decision. At the same time, if we keep looking for information beyond 70 percent, then by the time we make the decision, it will be so late that others will have taken that decision and moved on. It will be too late!"
MortgageBelt,
You are quite wrong. Go and read the work of Kahneman and Tversky on this. Their paper "Belief in the Law of Small Numbers" was published in 1971. Of course, there will always be exceptions, but overwhelmingly, properly used data beats gut feeling. The problem is that this offends the sense of self-worth common to most of us.
And your misguided thinking and co-dependence on 'data' illustrate why we now have 'click-bait vehicles' instead of journalism, why we have universities more interested in boosting numbers with online 'learning' and international 'students', and why we have handed over most of our real local enterprises to global corporate entities with no concern for their customers.
There are just some things that numbers can't measure.