Capacity utilization is a measure of how close the nation’s manufacturing sector is to running at full capacity. Formally, it is the ratio of the industrial production index to an index of full capacity.
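The ratio described above can be sketched in a few lines of Python; the index values below are hypothetical and chosen only to illustrate the arithmetic.

```python
def capacity_utilization(production_index: float, full_capacity_index: float) -> float:
    """Capacity utilization as a percentage: the industrial production
    index divided by an index of full capacity."""
    return 100.0 * production_index / full_capacity_index

# Hypothetical index values for illustration:
rate = capacity_utilization(production_index=102.5, full_capacity_index=125.0)
print(f"{rate:.1f}%")  # 82.0%
```

A reading of 82 percent, as in this example, would indicate that the manufacturing sector is operating well below its full-capacity pace but above the sub-80 territory associated with recessions.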
Economists, particularly central bankers, look at the total capacity utilization rate to discern trends in production, general economic activity, manufacturing conditions, and inflation. In addition, the rates for particular industries can pinpoint areas of overutilization (production that pushes capacity to its limit) that could become manufacturing bottlenecks, constraining production farther down the line and possibly pushing up prices. Such information is useful not only to economists but also to company managers trying to forecast costs and plan production schedules.
Low levels of capacity utilization—79 percent or below—indicate that the economy is headed toward, or already in, a recession. In fact, as Exhibit 9.4 illustrates, each of the last seven economic recessions was characterized by utilization rates in that range. This relationship is logical: Subpar economic conditions simply don’t warrant strong production.
EXHIBIT 9.3 Industrial Production versus Manufacturing Payrolls
EXHIBIT 9.4 Capacity Utilization, Fed Funds Rate
When demand and commerce are booming, on the other hand, factories tend to ramp up and produce at rates closer to their capacity. The downside to this is that the higher production rates tend to stoke inflation.
Exhibit 9.4 also shows that from 1970 through 2003, whenever the capacity utilization rate rose into the high 80s, the federal funds rate rose as well.
Similarly, when the capacity utilization rate fell sharply, as in the 1990–1991, 2001, and 2007–2009 recessions, the Federal Reserve reacted by reducing its federal funds target. The conclusion: Although the relationship has loosened in recent years, the Federal Reserve clearly believes that capacity utilization is still a powerful inflation marker and watches the reported rate carefully.