Monday, April 02, 2020

Unscientific management

Decision making by data collection isn’t management. It isn’t even sensible.

The current-day obsession with data and measurement is part of a supposedly “scientific” approach to management and decision making. Yet our equal obsession with speed and cutting corners ensures that choices are often made without taking enough time to weigh all the evidence, test it for validity, or even consider its true meaning. To parody Sir Winston Churchill: “Never in the history of human leadership has so much been measured by so many for so little resulting clarity.”
We live in an age that prizes data and measurement to an almost obsessive degree. Computers have increased our ability to collect and process information by many orders of magnitude. Almost every special interest group, from political parties to social action groups and trade associations, trots out yet another slew of survey results whenever it wishes to make a point or attract the attention of the media. No one seems to stop to ask what use we are making of all this data. Do we even know if it’s correct? Or what it means?

The media report all the often conflicting survey results with gleeful interest. Survey stories fill air-time and column inches. You can nearly always find some nugget in them to create a jaw-dropping headline. Never mind that today’s survey contradicts yesterday’s. The public attention span is assumed to be too short to care—or maybe even to notice.

Surveys and statistical studies have long been the stock-in-trade of academics. You publish your results, others test and criticize them, and—slowly—knowledge inches forward. If what you report fails to stand up to analysis and replication by your peers, it is rejected. You are an expert writing for experts. They demand solid evidence and unshakeable methodology. This process is the foundation of the scientific method.

Thanks to PowerPoint, presentations contain carefully chosen summaries—little more than headlines designed to produce an emotional reaction, not an analytical one.

In organizations, much of the data is collected and analyzed by amateurs. The methods used are often poorly understood. Once available, results are used more politically than scientifically: to justify individual points of view, support pet projects, or wave in the face of opponents. Whatever supports a case is seized upon. Often there is no one to question it, since any “inconvenient” findings are quietly hidden away. Thanks to PowerPoint, presentations contain carefully chosen summaries—little more than headlines designed to produce an emotional reaction, not an analytical one.

It is the aura of scientific respectability that makes the day-to-day use of numerical data and survey results so attractive—and so dangerous. The results printed in the media, or reported in tens of thousands of PowerPoint presentations in corporations every day, are not delivered to be checked, questioned, or challenged. They are to be believed. All the scientific (or pseudo-scientific) trappings are used to foster an unquestioning acceptance of the supposed findings. The hearer or reader is subtly reminded that they are ill-informed amateurs being addressed by experts possessing all the data. This isn’t science. It’s marketing and PR “spin” wrapped in scientific garb. It’s a very aggressive wolf trying to pretend it’s a harmless, scientific sheep.

In today’s hyper-competitive climate, no one wants to admit that they understood barely one word in five . . .

In the workplace, more and more data is demanded, processed, and used to justify various points of view. Do those making decisions based on presentations of this data understand it? Do they have the knowledge, or the time, to question its validity—or even reflect on what else it might be pointing to, in place of whatever they have been told to believe? Is there any opportunity given for fact-checking or attempts to replicate the findings?

The answer to all these questions is usually “no”. Haste is endemic. Executives are expected to make virtually instant decisions. Most of them are too overwhelmed with data, on top of all the other demands that they face, to do more than accept what their “experts” tell them. In today’s hyper-competitive climate, no one wants to admit that they understood barely one word in five; or that they have virtually no grasp of statistics and can be bamboozled by almost any set of plausible-seeming figures.

Worse yet, many of the “experts” producing and presenting this data are consultants, and expensive ones at that. When you pay millions to get a report from a consulting firm, you aren’t usually disposed to question or reject the results. And the more that you’ve paid for the consultants’ findings, the less willing you are even to consider that your money might have been wasted.

What does it take to ensure a sensible level of fact-checking, critical analysis, and consideration of all this data, let alone the conclusions that you are told it supports?

In management decision making, all data ought really to be presumed false or misleading until proven factual.

It takes time and the willingness to regard all proposals, however enthusiastically presented and wrapped in “scientific” analysis of data, with initial skepticism. In our judicial systems, people are presumed innocent until proven guilty (though try getting the media to respect that). In management decision making, all data ought really to be presumed false or misleading until proven factual; and all proposals supported by data, however superficially convincing, should be the subject of deep suspicion until proper independent evidence is produced.

Time and skepticism: the very heart of Slow Leadership. Without them, managers and executives are almost helpless against manipulation by special interests and confusion by data overload. A glut of macho Hamburger Managers, all primed with endless ambition and eager to appear decisive, coupled with silly workloads and a corporate obsession with instant gratification, is a terrifying prospect. It’s like putting a group of manic two-year-olds in charge of your trust fund.

Hardly a recipe for sound, truly scientific decision-making, is it?


Interesting post. I think it's important to distinguish between the data itself and the use of data. There is nothing wrong with collecting data. Used correctly, it can be a very powerful tool for making effective, sound decisions, especially when the answers aren't easy or obvious.

But it is a tool, like the Internet or computers, and thus we need to learn how to use it correctly and effectively. This is where the breakdown happens. As long as you don't know how to use something, you will be at the mercy of those who do, whether those are so-called consultants or tech support.
Yep, Shannon. Data itself is neutral.

The analysis ought to be neutral too, in the sense of looking to see what is there, whether or not the outcome is what you want.

Trouble is that too many people either have little skill with data, or set out looking for a specific answer. No surprise then if they make sure that they find whatever they wanted.

Keep reading, my friend.
