Portfolio Models: The Ansoff Matrix & Boston Consulting Group Matrix

During the 1960s and 1970s when conglomerate organisations were still the mainstay of Western economies, a range of portfolio models were developed to simplify the process of strategically managing a broad and complex range of products and businesses. The two most famous and widely used models were the Ansoff Product/Market Matrix (1965) and the Boston Consulting Group (BCG) Matrix (Grant 2016).

Ansoff’s (1965) original work viewed the organisation as consisting of an assembly of strategic business areas (SBAs), equivalent to what are now commonly called strategic business units (SBUs). Each SBA offered different future growth and profitability opportunities and therefore required a different competitive approach, which Ansoff categorised as market penetration, market development, product development and diversification.

Market penetration involved the organisation selling more of its existing products/services to its existing customers, thereby increasing its market share. Market development occurred when the company identified new markets and new customers for its existing products/services. Product development involved creating new products to replace existing ones in anticipation of changes in customer needs. Finally, diversification involved a firm moving into both new products/services and new markets (Ansoff 1965).
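The four growth strategies above are determined by two binary dimensions: whether the product is new and whether the market is new. A minimal sketch of this logic follows (the function and parameter names are illustrative, not part of Ansoff’s original text):

```python
def ansoff_strategy(new_product: bool, new_market: bool) -> str:
    """Classify a growth move on the Ansoff Product/Market Matrix."""
    if not new_product and not new_market:
        return "market penetration"    # existing products, existing markets
    if not new_product and new_market:
        return "market development"    # existing products, new markets
    if new_product and not new_market:
        return "product development"   # new products, existing markets
    return "diversification"           # new products, new markets

print(ansoff_strategy(new_product=False, new_market=True))  # market development
```

The sketch makes the matrix’s simplifying assumption explicit: every strategic move must fall cleanly into one of four cells, which is precisely what becomes problematic for Internet firms whose product and market boundaries blur.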

However, the application of the Ansoff Matrix to modern ICT firms can be problematic, particularly for large Internet-based firms with very large product portfolios. E-commerce technology giants such as Amazon and Alibaba have millions of products on their respective websites; Amazon, for example, now has more than 200 million items listed. So although the Ansoff Matrix (1965) was designed to simplify the analysis of conglomerate businesses, the framework lacks the sophistication required to handle such huge product portfolios. Chris Anderson (2008), in an article entitled ‘The End of Theory’ in Wired magazine, went as far as to say that due to the data deluge, the scientific method was becoming obsolete and theories and models were becoming irrelevant.

Anderson (2008) was referring to the way that computers, algorithms and big data can potentially generate more insightful, useful, accurate or true results than specialists or domain experts who traditionally craft carefully targeted hypotheses and research strategies. According to Anderson (2008), semantic or causal analysis was no longer required, since technology and correlation provided the means to spot patterns, trends and relationships. Although controversial, this approach has been adopted by large, data-rich Internet firms with massive product portfolios and strong ‘cloud’ computing capabilities, such as Amazon and Alibaba.

Additional complications arise from the difficulty of defining product and market boundaries in the face of industry convergence and product bundling (Bakos and Brynjolfsson 2000). Because the Internet cuts across market boundaries, it is difficult to determine whether a product strategy constitutes market penetration, product development, market development or even diversification. The digital nature of products adds to this complexity, and geographic boundaries and standard industrial classifications (SIC) consequently have very little relevance.

The Boston Consulting Group (BCG) matrix was a two-by-two matrix developed by Bruce Henderson (the firm’s founder) that classified businesses, divisions or products according to their market share and the potential future growth of the market. Growth was seen as the best measure of market attractiveness, and market share was seen to be a good indicator of competitive strength. Based on this there were four possible classifications.

  • A ‘Cash Cow’ had a high relative market share in a low-growth market and would generate substantial cash inflows and profits as well as support the growth of other company products.
  • A ‘Star’ had a high relative market share in a high-growth market. A star would normally be cash-neutral despite its strong position, as large amounts of cash would need to be spent to defend the organisation’s position.
  • A ‘Question Mark’ was characterised by a low market share in a high-growth market. Substantial net cash input was required to maintain or increase market share. The company would face a ‘double or quit’ decision: do nothing (and continue to absorb cash), market more intensively (requiring substantial investment), or get out of the market.
  • The ‘Dog’ product would have a low relative market share in a low-growth market. Such a product tended to have a negative cash flow that would likely continue. An organisation with such a product could attempt to appeal to a specialised market, delete the product or harvest profits by cutting back support services to a minimum.

The BCG analysis is designed to reveal whether the organisation has too many declining products or services, too few products or services with growth potential, or insufficient profit-generating products and/or services to maintain present organisational performance.
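The quadrant logic described above can be sketched as a simple classifier. Note that the thresholds used here (relative share of 1.0 against the largest rival, 10% annual market growth) are common textbook conventions, not figures given in this chapter:

```python
def bcg_quadrant(relative_share: float, market_growth: float) -> str:
    """Place a product on the BCG matrix from its relative market
    share and the annual growth rate of its market."""
    high_share = relative_share >= 1.0   # share relative to largest rival
    high_growth = market_growth >= 0.10  # assumed 10% growth cut-off
    if high_share and high_growth:
        return "Star"
    if high_share:
        return "Cash Cow"
    if high_growth:
        return "Question Mark"
    return "Dog"

# Illustrative (hypothetical) portfolio: (relative share, market growth)
portfolio = {"Product A": (1.8, 0.02), "Product B": (0.4, 0.25)}
for name, (share, growth) in portfolio.items():
    print(name, "->", bcg_quadrant(share, growth))
# Product A -> Cash Cow
# Product B -> Question Mark
```

The sketch assumes each product has a single, well-defined market against which share and growth can be measured, which is exactly the assumption the following discussion of platform firms calls into question.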

The extent to which the BCG matrix remains relevant to the ICT sector today (in particular to platform-based firms) is highly debatable. Companies such as Google do, however, appear to adopt a portfolio approach to the management of their services. Google’s product portfolio is a balanced mixture of relatively mature businesses such as AdWords and AdSense, rapidly growing products such as Android, and more nascent ones such as its ‘Moonshot’ products, including its driverless car (Financial Times 2015). Google’s well-known exploratory culture ensures that a large number of ideas get generated. From these ‘Question Marks’, a few are selected on the basis of rigorous and deep analytics. They are then tried out on a restricted basis before being scaled up: Gmail and Glass, for instance, were launched among a select group of enthusiasts. Such early testing not only kept the costs of the ‘Question Mark’ products down but also helped the company reduce the risk of new-product launches. After launch, Google leverages deep analytics to continuously monitor portfolio health and move products around the matrix. As a result, it is able to launch and divest approximately 10 to 15 projects every year (The Economist 2016).

By contrast, large e-commerce platforms such as Amazon and Alibaba are able to maintain a very large portfolio of products and do not need to spend as much time considering investment vs. divestment decisions for products and product categories (Brynjolfsson et al. 2010). This is due to the concept known as the ‘long tail’ (Anderson 2008). According to Chris Anderson (2007), manufacturers and retailers in brick-and-mortar industries traditionally focused on producing and selling those products and product ranges that were in high demand and largely ignored the low-demand items. This was driven by the need to maintain economies of scale and high asset utilisation in terms of production runs, inventory and retail shelf space.

The Long Tail theory is based on the belief that modern culture and the modern economy are moving away from a focus on a relatively small number of ‘hits’ (mainstream products and markets) at the head of the demand curve towards a huge number of niches in the tail. The key factor that determines whether a sales distribution has a long tail is the total cost of inventory storage and distribution (Anderson 2007). When this is insignificant, as it is with digital services and distribution, it becomes economically viable to target many small market segments (Brynjolfsson et al. 2006). Where there are no constraints on physical shelf space and no distribution bottlenecks, narrowly targeted goods and services can be as attractive as mainstream ones. Since shelf space was expensive, brick-and-mortar retailers only stocked the popular items, whereas online retailers (from Amazon to iTunes) can now stock virtually everything. More than half of Amazon’s book sales come from outside its top 130,000 titles, which is more than most bookstores will typically stock as physical inventory. Available niche products therefore outnumber the hits by several orders of magnitude. Those millions of niches are the ‘Long Tail’, which had until recently been largely neglected in favour of the ‘Short Head’ of hits (Anderson 2007).
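The long-tail claim can be illustrated numerically. Under a Zipf-like demand curve (a standard model of sales-rank distributions, not a figure from this chapter), the many low-ranked ‘tail’ items can jointly account for as much demand as the top-ranked ‘head’. The catalogue size, head size and exponent below are assumptions chosen for illustration, not Amazon’s actual figures:

```python
def tail_share(n_items: int, head_size: int, exponent: float = 1.0) -> float:
    """Fraction of total demand coming from items ranked below head_size,
    assuming demand at rank r is proportional to 1 / r**exponent (Zipf)."""
    weights = [1 / (rank ** exponent) for rank in range(1, n_items + 1)]
    total = sum(weights)
    tail = sum(weights[head_size:])  # everything outside the head
    return tail / total

# With a hypothetical 1,000,000-item catalogue and a 'head' of the
# top 1,000 titles, the tail accounts for roughly half of all demand.
print(round(tail_share(1_000_000, 1_000), 2))  # 0.48
```

This is the economics behind the chapter’s Amazon example: once stocking the tail costs almost nothing, the aggregate demand outside the head becomes commercially significant.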

Finally, a further problem in applying the BCG matrix is that many products provided by technology firms in the ICT ecosystem are offered at cost price or free (Anderson 2009), either to entice traffic onto Internet platforms (as with Amazon, Google and Facebook) or because apps and content act as complements that drive sales of hardware (as with Apple and Google smartphones). This use of product bundling (e.g. Amazon Prime) and ‘freemium’ strategies makes it very difficult to apply portfolio theories in their traditional sense, because product and market boundaries and economics become distorted.
