Management Is Much More Than a Science

30 August, 2017 / Articles
Underlying the practice and study of business is the belief that management is a science and that business decisions must be driven by rigorous analysis of data. The explosion of big data has reinforced this idea. In a recent EY survey, 81% of executives said they believed that “data should be at the heart of all decision-making,” leading EY to enthusiastically proclaim that “big data can eliminate reliance on ‘gut feel’ decision-making.”
Managers find this notion appealing. Many have a background in applied sciences. Even if they don’t, chances are, they have an MBA—a degree that originated in the early 20th century, when Frederick Winslow Taylor was introducing “scientific management.”
MBA programs now flood the business world with graduates—more than 150,000 a year in the United States alone. These programs have been trying to turn management into a hard science for most of the past six decades. In large measure this effort began in response to scathing reports on the state of business education in America issued by the Ford and Carnegie Foundations in 1959. In the view of the report writers—all economists—business programs were filled with underqualified students whose professors resisted the methodological rigor of the hard sciences, which other social sciences had embraced. In short, business education wasn’t scientific enough. It was in part to remedy this shortcoming that the Ford Foundation supported the creation of academic journals and funded the establishment of doctoral programs at Harvard Business School, the Carnegie Institute of Technology (the predecessor of Carnegie Mellon), Columbia, and the University of Chicago.
But is it true that management is a science? And is it right to equate intellectual rigor with data analysis? If the answers to those questions are no and no—as we will suggest in the following pages—then how should managers arrive at their decisions? We’ll set out an alternative approach for strategy making and innovation—one that relies less on data analysis and more on imagination, experimentation, and communication.
But first let’s take a look back at where—or rather with whom—science started.
Is Business a Science?
What we think of as science began with Aristotle, who as a student of Plato was the first to write about cause and effect and the methodology for demonstrating it. This made “demonstration,” or proof, the goal of science and the final criterion for “truth.” As such, Aristotle was the originator of the approach to scientific exploration, which Galileo, Bacon, Descartes, and Newton would formalize as “the Scientific Method” 2,000 years later.
It’s hard to overestimate the impact of science on society. The scientific discoveries of the Enlightenment—deeply rooted in the Aristotelian methodology—led to the Industrial Revolution and the global economic progress that followed. Science solved problems and made the world a better place. Small wonder that we came to regard great scientists like Einstein as latter-day saints. And even smaller wonder that we came to view the scientific method as a template for other forms of inquiry and to speak of “social sciences” rather than “social studies.”
But Aristotle might question whether we’ve allowed our application of the scientific method to go too far. In defining his approach, he set clear boundaries around what it should be used for, which was understanding natural phenomena that “cannot be other than they are.” Why does the sun rise every day, why do lunar eclipses happen when they do, why do objects always fall to the ground? These things are beyond the control of any human, and science is the study of what makes them occur.
However, Aristotle never claimed that all events were inevitable. To the contrary, he believed in free will and the power of human agency to make choices that can radically change situations. In other words, if people choose, a great many things in the world can be other than they are. “Most of the things about which we make decisions, and into which we therefore inquire, present us with alternative possibilities….All our actions have a contingent character; hardly any of them are determined by necessity,” he wrote. He believed that this realm of possibilities was driven not by scientific analysis but by human invention and persuasion.
We think this is particularly true when it comes to decisions about business strategy and innovation. You can’t chart a course for the future or bring about change merely by analyzing history. We would suggest, for instance, that the behavior of customers will never be transformed by a product whose design is based on an analysis of their past behavior.
Yet transforming customer habits and experiences is what great business innovations do. Steve Jobs, Steve Wozniak, and other computing pioneers created a brand-new device that revolutionized how people interacted and did business. The railroad, the motor car, and the telephone all introduced enormous behavioral and social shifts that an analysis of prior data could not have predicted.
To be sure, innovators often incorporate scientific discoveries in their creations, but their real genius lies in their ability to imagine products or processes that simply never existed before.
The real world is not merely an outcome determined by ineluctable laws of science, and acting as if it is denies the possibility of genuine innovation. A scientific approach to business decision making has limitations, and managers need to figure out where those limitations lie.
Can or Cannot?
Most situations involve some elements you can change and some you cannot. The critical skill is spotting the difference. You need to ask, Is the situation dominated by possibility (that is, things we can alter for the better) or by necessity (elements we cannot change)?
Suppose you plan to build a bottling line for plastic bottles of springwater. The standard way to set one up is to take “forms” (miniature thick plastic tubes), heat them, use air pressure to mold them to full bottle size, cool them until they’re rigid, and finally fill them with water. Thousands of bottling lines around the world are configured this way.
Some of this cannot be other than it is: how hot the form has to be to stretch; the amount of air pressure required to mold the bottle; how fast the bottle can be cooled; how quickly the water can fill the bottle. These are determined by the laws of thermodynamics and gravity—which executives cannot do a thing to change.
Still, there’s an awful lot they can change. While the laws of science govern each step, the steps themselves don’t have to follow the sequence that has dominated bottling for decades. A company called LiquiForm demonstrated as much after asking: Why can’t we combine two steps into one by forming the bottle with pressure from the liquid we’re putting into it, rather than using air? The idea turned out to be utterly doable.
Executives need to deconstruct every decision-making situation into cannot and can parts and then test their logic. If the initial hypothesis is that an element can’t be changed, the executive needs to ask what laws of nature suggest this. If the rationale for cannot is compelling, then the best approach is to apply a methodology that will optimize the status quo. In that case let science be the master and use its tool kits of data and analytics to drive choices.
In a similar way, executives need to test the logic behind classifying elements as cans. What suggests that behaviors or outcomes can be different from what they have been? If the supporting rationale is strong enough, let design and imagination be the master and use analytics in their service.
It’s important to realize that the presence of data is not sufficient proof that outcomes cannot be different. Data is not logic. In fact, many of the most lucrative business moves come from bucking the evidence. Lego chairman Jørgen Vig Knudstorp offers a case in point. Back in 2008, when he was the company’s CEO, its data suggested that girls were much less interested in its toy bricks than boys were: 85% of Lego players were boys, and every attempt to attract more girls had failed. Many of the firm’s managers, therefore, believed that girls were inherently less likely to play with the bricks—they saw it as a cannot situation. But Knudstorp did not. The problem, he thought, was that Lego had not yet figured out how to get girls to play with construction toys. His hunch was borne out with the launch of the successful Lego Friends line, in 2012.
The Lego case illustrates that data is no more than evidence, and it’s not always obvious what it is evidence of. Moreover, the absence of data does not preclude possibility. If you are talking about new outcomes and behaviors, then naturally there is no prior evidence. A truly rigorous thinker, therefore, considers not only what the data suggests but also what within the bounds of possibility could happen. And that requires the exercise of imagination—a very different process from analysis.
Also, the division between can and cannot is more fluid than most people think. Innovators push that boundary harder than most of us do, challenging the cannot.
Breaking the Frame
Imagining new possibilities first requires an act of unframing. The status quo often appears to be the only way things can be, a perception that’s hard to shake.
We recently came across a good example of the status quo trap while advising a consulting firm whose clients are nonprofit organizations. The latter face a “starvation cycle,” in which they get generously funded for the direct costs of specific programs but struggle to get support for their indirect costs. A large private foundation, for instance, may fully fund the expansion of a charity’s successful Latin American girls’ education program to sub-Saharan Africa, yet underwrite only a small fraction of the associated operational overhead and of the cost of developing the program in the first place. This is because donors typically set low and arbitrary levels for indirect costs—usually allowing only 10% to 15% of grants to go toward them, even though the true indirect costs make up 40% to 60% of the total tab for most programs.
The consulting firm accepted this framing of the problem and believed that the strategic challenge was figuring out how to persuade donors to increase the percentage allocated to indirect costs. It was considered a given that donors perceived indirect costs to be a necessary evil that diverted resources away from end beneficiaries.
We got the firm’s partners to test that belief by listening to what donors said about costs rather than selling donors a story about the need to raise reimbursement rates. What the partners heard surprised them. Far from being blind to the starvation cycle, donors hated it and understood their own role in causing it. The problem was that they didn’t trust their grantees to manage indirect costs. Once the partners were liberated from their false belief, they soon came up with a wide range of process-oriented solutions that could help nonprofits build their competence at cost management and earn their donors’ confidence.
Although listening to and empathizing with stakeholders might not seem as rigorous or systematic as analyzing data from a formal survey, it is in fact a tried-and-true method of gleaning insights, familiar to anthropologists, ethnographers, sociologists, psychologists, and other social scientists. Many business leaders, particularly those who apply design thinking and other user-centric approaches to innovation, recognize the importance of qualitative, observational research in understanding human behavior. At Lego, for example, Knudstorp’s initial questioning of gender assumptions triggered four years of ethnographic studies that led to the discovery that girls are more interested in collaborative play than boys are, which suggested that a collaborative construction toy could appeal to them.
Powerful tool though it is, ethnographic research is no more than the starting point for a new frame. Ultimately, you have to chart out what could be and get people on board with that vision. To do that, you need to create a new narrative that displaces the old frame that has confined people. And the story-making process has principles that are entirely different from the principles of natural science. Natural science explains the world as it is, but a story can describe a world that does not yet exist.