Back in the 1960s and 1970s, strategic planning was regarded as one of the finest tools for ensuring high levels of employee effectiveness and corporate profitability. The underlying assumption was that the decision-making aspects of strategic planning could be quantified and the resulting measurements fed into quantitative models, which would then produce the best possible strategies. It was during this period that HBS professors Andrews and Christensen asserted that strategy could become a powerful tool by linking it to business functions and by using it to assess a firm's strengths and weaknesses in relation to those of its rivals. General Electric (GE) emerged as a pioneer in corporate strategic planning: with the assistance of the consulting firm McKinsey, GE was organized into strategic business units (SBUs). During the same period, the Boston Consulting Group popularized a number of its own strategic approaches, including the "experience curve" and the "growth and market-share matrix."3 Strategic planning gained further regard and popularity among executives during the 1970s, peaking in the early 1980s with the publication of HBS scholar Michael Porter's seminal book Competitive Strategy.

In the early 1980s, a number of executives began voicing concerns about their investments in strategic planning processes. Their concerns stemmed from dramatic changes in the now-globalized landscape, as well as from rapid technological developments that increased the complexity of the marketplace. It was once again GE that led the way; its charismatic chairman Jack Welch championed the dismantling of his own firm's planning departments, and other corporate executives followed his lead throughout the 1980s and 1990s. In many ways, strategic planning was replaced by notions of improving quality and productivity through operational innovation, including the quality philosophies of Deming, Juran, and Crosby. In the 1990s, firms shifted their focus and attention to improving efficiency,4 resulting in the emergence of "strategic" tools such as delayering, business process reengineering (BPR), downsizing, and rightsizing.5 Yet strategic planning also experienced a renaissance in the 1990s. Specifically, new strategies emerged, focusing upon growth through joint ventures and mergers and acquisitions, the generation of innovative ideas through decentralized strategic endeavors within the firm, emergent strategies, and the leveraging of core competencies to create strategic intent.6

The dominant theme for firms in the early days of this new millennium has been strategic and organizational innovation. Current issues include reconciling a firm's size with its flexibility and responsiveness.7 Strategic alliances imply cooperative strategies, added complexity, and changing commitments to corporate social responsibility (CSR). Today's strategic planning requires new forms and models of leadership, more flexible organizational structures, and an increased commitment to self-direction.8


Some management scholars contend that traditional strategic management models have failed for a variety of reasons.9 First, traditional models do not distinguish between strategic thinking and strategic planning. Traditional models rely heavily upon scientific and quantitative analyses, whereas strategic thinking focuses upon the synthesis of a decision-maker's creativity, intuition, and experience in the selection of strategies. Second, traditional models overemphasize strategy definition and formulation at the expense of strategy implementation, execution, and evaluation. This is particularly evident in business school curricula, which focus heavily on strategy articulation and definition rather than on the actual execution and evaluation of selected strategies. Moreover, the individuals traditionally tasked with translating strategy into workable tactics and operational action plans were largely removed from organizational hierarchies in the 1990s and beyond. The "delayering" phenomenon promised many organizational benefits yet, as we have come to understand, left a deep vacuum in the translation and implementation of strategy.10 Additionally, because traditional strategic planning occurs at the very top of organizations, often with the guidance of consultants, strategic plans are frequently handed down to managers with little or no material input and buy-in from lower-ranked employees. Deep commitment to the successful execution of a chosen strategy, especially among lower-level managers and nonmanagerial employees, therefore remains questionable.

Management writer Mintzberg posits further reasons why traditional strategic planning efforts have failed, namely, the fallacy of prediction, the fallacy of detachment, and the fallacy of formalization:11

The fallacy of prediction: Traditional strategic planning is based on the premise that all variables relevant to the future of a business are measurable, analyzable, and predictable. Strategies could then be based upon those predictions, thereby ensuring future success. However, even the most sophisticated predictive models are unable to foresee economic, industry, market, and social shifts, and economic cycles do not behave in a linear fashion. The fallacy of prediction, according to Mintzberg, has contributed extensively to the downfall of traditional strategic planning because planning was unable to deliver the predictable success it promised.

The fallacy of detachment: Traditional strategic planning is based on the notion that strategists ought to remain detached from middle managers and employees when analyzing data, in order to stay objective and prevent bias. However, this approach decontextualizes relevant data and separates the strategy champions from the strategy implementers. Such detached, scientific analysis also tends to ignore qualitative information, creating blind spots in the overall strategic plan.

The fallacy of formalization: Traditional strategic planning is based on the belief that formal systems for information processing and decision making are superior to human systems. Although computerized systems can process large quantities of data, it is individuals who integrate and synthesize the results and who discern new directions, patterns, and trends from such analyses.

Naturally, other management writers have also theorized about the failure of traditional strategic planning. For instance, the Icarus Paradox, a neologism coined and popularized by Danny Miller, refers to the Icarus of Greek mythology, who flew so close to the sun that his wings melted. The Icarus Paradox epitomizes an observed business phenomenon whereby the strengths and apparent victories of successful firms can become the very cause of their strategic failures. Indeed, the paradox of Icarus was that the skill and technology that carried him to freedom ultimately also led to his death.12

Clayton M. Christensen, in his book The Innovator's Dilemma, reported that even firms that follow established management principles and practices are nonetheless exposed to events, problems, and complexities that can cause strategic failures. Christensen posits that the innovator's dilemma is this: the logical, competent management decisions that are critical to a firm's success are also the reasons why it loses its position of leadership. He asserts that "good" management practice involves sustaining the success of existing services, products, and processes, and that firms generally succeed at this. These same companies, however, become vulnerable to the emergence of disruptive technologies, which at first appear harmless to the successful firm. Because they do not pose an immediate threat, they are ignored. Disruptive technologies may thus grow into powerful forces, leaving successful firms ill prepared to respond to the changed competitive landscape. Christensen affirms that successful firms are caught in the routine of maintaining the status quo (i.e., their current success) and often fail to perceive or understand the threat of disruptive technologies. The objective, then, is to build and sustain successful services, products, and processes while possessing the ability to recognize, evaluate, and develop disruptive technologies.13
