Discovery Beats Planning, so Plan to Discover

A great article by Bill Barnett on the role of discovery and learning in the execution of a strategy or plan. TAP’s Agile Strategy system does just that; contact us to learn more.

Heard at an outdoor café along University Avenue in Palo Alto: “The strategy was clear. You can’t start as a platform. You start as an application and then, when the user base is large enough to get a network effect, you can pivot into a platform.” Knowing nods around the table; wisdom understood by the cognoscenti.

I was hunkered down with a Super Tuscan at the last sidewalk table, eavesdropping on the ideas circulating among the start-up crowd. This one is a lesson from my course at Stanford. Not to imply that I’m the headwaters. To the contrary, I’ve waded into a stream of ideas cascading around the valley, ideas that change with each new, unexpected development. Even the subject of the debate I overheard, Facebook’s platform strategy, was discovered along the way. Originally, Facebook’s leaders saw it as a social network application. Only once Facebook grew large did the idea materialize to become a platform. So in 2007 the website’s APIs were opened to a world of developers who could independently create Facebook applications. Since then, a litany of reversals and changes has fueled debate among developers and users, as Facebook has tried to exert control over the platform. Some have criticized Facebook for this haphazard evolution. Turns out, that is how most strategies emerge: Discovery beats planning.

For more evidence, go back and look at the strategic plan from years ago at your favorite successful company. There is a good chance that the company’s winning strategy won’t appear in that old plan. Examples abound: Trader Joe’s, a boutique specialty retailer in the U.S., once made its money selling cigarettes and ammunition – a far cry from the microwavable organic meals and fancy cheeses one can get there today. Honda Motors, famously, planned to sell big motorcycles – “choppers” – in the U.S., and ended up discovering the market for small “minibikes.” The list of examples goes on, including many entrepreneurial firms that discover a strategy better than the plan their founders once pitched.

So how do we deal with the fact that discovery beats planning?

One common reaction is to pretend that the success was planned. Of course, after a discovery we naturally try to make sense of what we see working so well. And there is nothing wrong with retrospective rationalization; we do it all the time in business school “case studies” in an effort to learn. The problem is allowing retrospective rationalization to masquerade as a well-planned strategy, as with the young folks talking about Facebook (“The strategy was clear…”). Such a misunderstanding leads observers to think (wrongly) that great businesses result from a great plan.

Another bad reaction is to wax cynical, surmising that success really just comes down to luck. This conclusion denies the fact that some people are better than others at spotting the opportunities that (luckily) come along. There is much more to discovery than the flip of a coin. When plans produce unanticipated consequences, these look like failures. If you think that leadership means waiting to get lucky, you’ll conclude from such failure that your luck has run out – a self-fulfilling prophecy.

A better reaction is to lead your organization through the process of discovery. After all, there is information in those unanticipated consequences for those who know to seek it out. Scott Cook, Intuit’s founder, coaches his people to “savor surprises” – to see deviations from plan as the fountainhead of opportunity. Seen this way, the strategic plan is just the start of the discovery process. This process happens differently at different companies - and often happens unintentionally. But research on how companies evolve has revealed two key steps in the discovery process that are worth highlighting.

Step 1: Failed Execution

It may surprise you that the discovery process often starts with failed execution. But think it through. When execution of a strategy goes smoothly, we simply enjoy the success. It often requires failure for companies and their leaders to stop and reconsider their strategy.

In fact, rather than think in terms of "strategy execution", many prefer instead to describe the process as a hypothesis test. This approach draws from the ideas of Professor James March, who conceived of organizations as "learning" much like people do. Seen this way, a company's strategy is a theory - one that develops over time as the organization and its leaders learn. Taking this approach, the execution of strategy is really just a test of the strategy's logic. Our strategy guides and coordinates, but in the process it allows us to treat the underlying logic as a hypothesis to be tested. So it is that you will hear people advise you to "fail fast and cheap." They don't really want you to fail; they want you to learn - to develop your strategy during the process of execution.

[Image: Strategy.png]

Some readers will now be thinking of the modern approach to entrepreneurship, as summarized in the work of Eric Ries in The Lean Startup or Steve Blank in The Four Steps to the Epiphany. Understanding that an entrepreneur must learn by doing, these authors nicely outline an iterative approach to creating the start-up's strategy: starting with a "minimum viable product," using it to fail fast and cheap in a test of the "value hypothesis," and then adapting to the market's response (the "pivot"). The process of creating start-ups has been dramatically improved by this approach, since it explicitly allows start-ups to adapt their strategies based on the failures so common during execution.

But adapting strategy through failure applies far beyond the world of the start-up, to established companies and even global giants. Siemens - the German technology giant - initially failed when it attempted to enter China with its medical diagnostic imaging machines. Only over time would that company succeed in China, after adapting its approach to the very different health care markets and institutions it encountered there. More generally, whenever a company attempts to do something novel, from new product launches to internationalization, initial failure can trigger the process of strategic discovery.

Step 2: Diagnosis and Learning

Learning from failure is difficult. It requires considerable effort by leadership to correctly diagnose, to interpret and make sense of what has happened. Sometimes in the process leadership may even discover possibilities they might otherwise not have imagined. But this learning process is fraught with difficulties.

To illustrate, consider the example of music subscription. After Napster was shut down in 2001, the brand was reborn in 2003 as a subscription online-music service run by Roxio’s Chris Gorog. Chris and his team quickly amassed a large catalog of songs, enabled radio streaming, established partnerships with online platforms like Yahoo, built an entrepreneurial organization, and expanded internationally. As record stores became history, Apple’s iTunes, illegal music downloads, and a few subscription services like Napster offered different visions of the future. But by 2005 the verdict was in. Illegal downloads continued apace, iTunes was a clear success, and subscription services were not. As one Washington Post writer put it (in 2005), Napster’s subscription model was not a viable alternative to music ownership: “When music is good, you want to know that it can’t be taken away from you.” The final nail was Steve Jobs' declaration: "Nobody wants to rent their music." The experiment had been run, and the music ownership model beat subscription services.

But wait. With the explosive growth of services like Spotify and Apple Music, the pundits are now saying that subscription models are the winning logic in that business. What about the lesson we learned from the failures of just a few years ago?

The problem here is that a failure is a datum, not a logical argument. Data do not speak for themselves. Failures can have various causes, and so it takes logical reasoning to explain why failures happen. Perhaps the early subscription services were ahead of their time, such that limited bandwidth might have made them less attractive than they are today. Or maybe the smartphone is a necessary complement to such services. Whatever the diagnosis, logic is required to sort out why firms succeed and fail.

Unfortunately, most observers skip the logic part. It is mentally easier to jump to the “obvious” conclusion: If the business failed, the business model must be wrong. Full stop. You can easily tell when this skip happens. The person will name an example as if it were a reason. Is online grocery delivery a viable model? No: Webvan. Is internet search a viable business? No: Alta Vista. These examples are data, not logical reasoning. But it is hard to rebut those who argue by citing examples, because you look the fool trying to say that a failure somehow might have made sense. As in Gerald Grow’s cartoon, we replace reasoning with dueling examples: I shout “Napster!”; you reply “Spotify!”

 
[Image: dialectical discourse.jpg]

The result? We often “learn” without logic, and so we often walk away from great ideas. The Apple Newton failed, leading many to say that there was no market for smart handheld devices - yet now we all own them. Early attempts at remote alarm systems failed, leading many to conclude that such services could not be profitable; now they are commonplace. Even internet search, possibly the most lucrative business in history, was initially panned after a spate of failures among early movers – Lycos, Alta Vista, Excite, and others. Often firms fail. But that may not mean, logically, that we should abandon their business models entirely.

To diagnose well, we need to systematically contrast failures and successes - as is done in good academic research. Popular techniques such as A/B testing, agile development, root-cause analysis and similar approaches are designed to show us successes and failures without destroying the firm. These techniques are routinely used in Silicon Valley firms these days, and are making their way into the global business lexicon. Sometimes such techniques are very effective for learning. But keep in mind that these techniques simply provide us with data. It is up to us to explain the data, and that requires logic.
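
To make that concrete, here is a minimal sketch of the kind of readout an A/B test produces. The conversion counts and sample sizes below are invented for illustration; the point is that the arithmetic hands you a number, not an explanation.

```python
# Minimal sketch of an A/B-test readout (all counts invented for illustration).
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # normal tail
    return z, p_value

# Hypothetical experiment: 2,400 visitors shown each variant.
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, two-sided p = {p:.3f}")
# The test reports whether the difference is unlikely to be chance.
# It does not say *why* variant B converted better - that still takes logic.
```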

False Results

The music subscription example highlights a problem in learning known as the "false negative." The possibility of false results needs to be understood if you are to successfully lead strategy. No doubt you've heard insanity defined as "doing the same thing over and over and expecting different results." And yet we know from science that advances often come when experiments fail to replicate a result. Einstein himself, said to be the aphorism's author, often repeated experiments. After all, experiments sometimes produce false results. You don't have to be Einstein to know it is a good idea to run the test again.

Yet if you are in business, you probably live by the insanity aphorism - insisting that no test be repeated. How often have you said "But we already tried that, and look how it turned out!" Calling others insane is an effective way to shut down further experimentation (and thinking).

Fortunately for us all, the world re-runs experiments all the time, and often gets different results. Webvan failed. Now Amazon, Google, and others are delivering to your door. EachNet (and its acquirer, eBay) failed to make cash-on-delivery work in Chinese C-to-C e-commerce. Now Taobao's cash-on-delivery system is thriving. The failures of Alta Vista, Excite, Lycos, and others led many to conclude that internet search could not be a business. Now, well, you know.

You're probably already trying to explain the differences in all these examples. Slow down; the broader issue here is the problem of false results.

Sometimes experiments generate false negatives - they tell you "no" when the real answer is "yes." And sometimes experiments generate false positives, telling you "yes" when the real answer is "no." You of course know about false positives and false negatives in medicine. We worry about them a lot, which is why we often go back for a re-test when things get medically serious. But for whatever reason, we don't think about false results nearly enough in business.

For instance, I recall one of the early movers in digital medical diagnostic imaging. Their system was rapidly adopted by several hospitals, leading to a lot of excitement, including executives quitting their jobs and joining the company. Then growth abruptly ended. It turns out the early wins were a false positive. (Many more examples of false positives can be found in Geoffrey Moore's "Crossing the Chasm" books, enough to have created a consulting juggernaut.)

False negatives in business are common too, as in the examples of search, delivery, and Chinese COD - but they are often harder to spot. The problem is that false positives are self-correcting, but false negatives are not. When you get a positive result from a business experiment, typically you'll keep at it. If it turns out to have been a false result, the world will make that clear enough. But if you get a false negative, you'll be inclined to "pivot." And you'll never know that you were on to something good - unless somebody else tries it again.

Perhaps you are thinking: "Hey, in all your examples, there were some variables that changed. A good test would take into account all the variables that matter." Sure, Amazon and Google now know some things that Webvan did not; they have adjusted those variables accordingly, and that's why their experiments are working when Webvan's did not.

Here's the problem: Often we don't know all the variables that matter. This problem is well understood in science. Good scientists know that two seemingly identical experiments can produce different results, since often there are variables operating that are unknown to the scientist at the time. In fact, even random chance can produce odd results. That's why a good scientist knows that there is a lot she does not know; so she runs the test again.
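
As a rough illustration of that point, the sketch below (with invented numbers) simulates a small market test for a product whose true conversion rate genuinely clears the bar we care about. Noise alone makes a single test say "no" a sizable fraction of the time; running the test again sharply cuts the chance of walking away from a good idea.

```python
# Rough illustration (all numbers invented) of how a genuinely good idea can
# still return a "no" in a single experiment, and how re-running the test
# shrinks the odds of abandoning something that works.
import random

random.seed(7)

def market_test(true_rate=0.065, n=300, required_rate=0.06):
    """Simulate one small market test: the product's real conversion rate
    clears the bar we care about, but a noisy sample may not show it."""
    conversions = sum(1 for _ in range(n) if random.random() < true_rate)
    return conversions / n >= required_rate

trials = 10_000
miss_once = sum(not market_test() for _ in range(trials)) / trials
miss_twice = sum(not (market_test() or market_test()) for _ in range(trials)) / trials

print(f"Chance a single test says 'no' despite a real effect: {miss_once:.0%}")
print(f"Chance two independent tests both say 'no':           {miss_twice:.0%}")
```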

The lesson: Insane though it may seem, don't just pivot. Run the test again.

Are the Dogs Eating the Dog Food?

The ultimate aim of the discovery process is to find "product-market fit," that situation where customers are clearly engaged with your product or service. When I get to know a business leader, I typically ask "Do you have product-market fit?" Normally she will say "yes," and then I'll ask "How do you know?" Then the hand-waving begins, featuring a lot of talk about "value propositions" and such. I listen for the evidence of product-market fit. Are the dogs eating the dog food? If so, can we be sure about why? Or are we misreading the signal?

[Image: dog.jpg]

There are many ways of misreading a signal. For instance, just recently a software entrepreneur in Santa Clara told me, "Alibaba is a potential customer, and they want to invest!" OK, this remark raises the obvious problem that customers and investors have very different motivations. But the real problem is the executive. He's getting excited for the wrong reason. An offer to invest is not a purchase order. It is not evidence of product-market fit. It is not "the signal."

What's more, an investment like this not only gets misread as the signal, it also buys the firm more time to keep doing what they are doing - even though the dogs are not eating the dog food. Investments often kill firms by cushioning their management teams, allowing them to feel like they are doing a good job even when they should be desperately reconsidering their strategy.

Ultimately, strategic discovery is about finding out what it takes to achieve product-market fit. We start with a strategy and test that theory by going to market, all the while trying to find our way to product-market fit. If all goes well, we may end up discovering a strategy much more valuable than what we could have imagined when the process began.

Courtesy Bill Barnett http://www.barnetttalks.com/2019/11/discovery-beats-planning-so-plan-to.html

Gary Sarson