# Econometrics and A-theoretical Empiricism

The idea that economic issues should be studied by analysing quantitative relations between different variables is an old one, dating back at least to William Petty's political arithmetic. Although the subsequent centuries saw it superseded by the idea of political economy as a moral science, as maintained by Smith, Marshall and Keynes, it came to the fore once again in recent decades.

Wassily Leontief (1905-1999, Nobel Laureate 1973) developed input-output tables, a representation of the economy through matrices: each column indicates the means of production utilised in a given sector, distinguished by sector of origin; each row indicates the sector-by-sector destination of the product of a given sector (cf. Leontief 1941). Leontief's tables are a tool for empirical analysis, used for instance to study differences in the productive structures of various countries, or technical change. Moreover, statistical information organised in input-output tables has been used within linear programming, under the assumption of constant returns to scale in all sectors of the economy, to compute technical production coefficients (that is, the quantity of each means of production required per unit of product) and thus to compute the gross outputs of the different sectors corresponding to a given set of net products. At the theoretical level, the system of determination of gross production levels thus arrived at turns out to be the dual (in the mathematical meaning of the term) of a system of determination of relative prices based on the relative difficulties of production of the various commodities.^{[1]}
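The computation described above, and its dual, can be sketched numerically. The following is a minimal illustration with an invented three-sector coefficient matrix (all numbers are hypothetical, chosen only so that the system is productive): gross outputs solve x = Ax + y for a target net product y, while the dual price system solves p = pA + wl for given labour coefficients l and wage w.

```python
import numpy as np

# Hypothetical technical coefficient matrix A for three sectors:
# A[i, j] = units of sector i's product required per unit of sector j's output.
A = np.array([
    [0.2, 0.1, 0.0],
    [0.1, 0.3, 0.2],
    [0.0, 0.2, 0.1],
])

# Target net products (final demand) for each sector.
y = np.array([100.0, 50.0, 80.0])

# Quantity system: gross outputs x satisfy x = A x + y,
# i.e. (I - A) x = y.
I = np.eye(3)
x = np.linalg.solve(I - A, y)

# Dual price system: with labour coefficients l and wage w = 1,
# prices satisfy p = p A + l, i.e. p (I - A) = l.
l = np.array([0.5, 0.8, 0.3])
p = np.linalg.solve((I - A).T, l)

print(x)  # gross outputs consistent with the target net products
print(p)  # prices reflecting direct plus indirect labour requirements
```

The duality noted in the text shows up in the code as the same matrix (I - A) appearing once untransposed, for quantities, and once transposed, for prices.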

Another tool of empirical analysis is the system of national accounting, developed by Richard Stone (1913-1991, Nobel Laureate 1984) among others, under the stimulus of Keynesian theory and the macroeconomic categories it used. The national accounting system offers a set of categories, defined in such a way as to be susceptible to precise statistical computation and to accord with the principles of double-entry bookkeeping, which represent the working of the economic system as a web of flows of goods and money connecting different groups of economic agents. Under the auspices of the United Nations and the direction of Stone, a system of national accounts (SNA) was devised (first in 1953 and subsequently revised a number of times) to constitute a compulsory reference point for national statistical institutes.

The increasing availability of statistical information, sufficiently reliable and organised in well-defined categories, favoured the development of applied economic research. But developments in statistical theory, and in particular inferential statistics (Savage 1954), also played an important role. These elements (and others including, in particular, advances in information technology) combine to account for the rapid development of econometrics (from the Greek *metron,* ‘measurement’): the science that aims at identifying quantitative relations among economic variables.

Rodolfo Benini (1862-1956), statistician, demographer and economist, was among the first (cf. Benini 1907) to utilise advanced statistical methods such as multiple regression in economic analysis. Henry Moore (1869-1958) and his pupils (among them Paul Douglas (1892-1976) and Henry Schultz (1893-1938)) worked systematically on statistical estimates of economic relationships.

Ambitious prospects for the newborn econometric science were evoked by Ragnar Frisch (1895-1973, Nobel Laureate 1969) in his editorial for the first issue of *Econometrica* (Frisch 1933), the organ of the Econometric Society (founded in 1930). According to Frisch, econometrics constitutes the unification of statistics, economic theory and mathematics necessary ‘for a real understanding of the quantitative relations in modern economic life’.

Contributions to the development of new econometric techniques came from economists grouped in the Cowles Commission, including Jacob Marschak (1898-1977), Jan Tinbergen (1903-1994, Nobel Laureate 1969), Tjalling Koopmans (1910-1985, Nobel Laureate 1975), Don Patinkin (1922-1995) and Lawrence Klein (1920-2013, Nobel Laureate 1980). Trygve Haavelmo (1911-1999, Nobel Laureate 1989), in an essay published in 1944, proposed estimating econometric relations within a stochastic context, thus defending the econometric approach against the criticism that Keynes (1973, pp. 295-329) had levelled at the construction of macroeconomic models.

In the United States, the development of quantitative analysis received a boost from its utilisation in support of the war effort during the Second World War, although mainly in the field of operational research, solving planning problems in transport and the like. Modern econometrics, aimed at constructing large econometric models, only emerged in the immediate aftermath of the war, at the Cowles Commission; the first econometric model of the US economy was devised by Klein. Increasing public intervention in the economy entailed the need to anticipate macroeconomic trends, thus favouring the development of the new analytical methods. Cold War political tensions, and expectations of a new Great Crisis in market economies once war expenditure dried up, created an atmosphere in which the optimistic forecasts of the Cowles Commission economists came to constitute a crucial test for the new analytical techniques, which were soon widely adopted.

Noteworthy, too, were certain developments in the methods of time series analysis, with the ARMA models (autoregressive moving average: cf. Box and Jenkins 1970) and with the VAR method (vector autoregression: cf. Sims 1980, 1982), proposed as an alternative to traditional econometrics. The latter had come in for radical criticism; in particular, Lucas (1976) had maintained that the structural parameters of macroeconomic models are subject to change when confronted with discretionary economic policy measures, so that the models themselves cannot be used to predict the consequences of adopting policy measures. Moreover, econometric enquiries cannot verify (or reject) theories, since verification would simultaneously concern the theory itself and the auxiliary assumptions needed to translate it into an econometric model. Sims, on the other hand, proposed an a-theoretical econometrics, in which the structure of the model is not predetermined: econometric analysis is meant to specify case by case the most suitable model.^{[2]} Thus, the distance between economic theory and econometrics widens, since economic theory seems to be losing the role of prompter of hypotheses to submit to econometric testing.
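The a-theoretical flavour of the VAR approach can be illustrated with a minimal sketch: every variable is regressed on the past values of all variables, with no theory-driven restrictions on the coefficient matrix. The data, lag length and coefficient values below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-variable VAR(1): z_t = B z_{t-1} + e_t.
# B is left unrestricted -- no a priori structural assumptions.
B_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
T = 5000
z = np.zeros((T, 2))
for t in range(1, T):
    z[t] = B_true @ z[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate B by ordinary least squares: regress z_t on z_{t-1}.
X, Y = z[:-1], z[1:]
B_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

print(B_hat)  # estimated coefficient matrix, close to B_true
```

The point of the sketch is methodological: nothing in the estimation step encodes an economic theory, which is precisely the feature footnote 2 takes issue with.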

- [1] Duality between the price and quantity systems lay at the centre of the model of homothetic growth proposed by von Neumann (1937), also based on the assumption of constant returns to scale; in addition, it stressed another correspondence - between profit rate and rate of growth.
- [2] In maintaining this theory, Sims is oblivious of the fact that conceptualisation, underlying the categories utilised for collecting statistical data, constitutes a stage of theorising.