Implementation and Outcomes of Excellence-Driven Policies and Initiatives
When considering the implementation mechanisms of excellence-driven policies and initiatives, a surprising fact emerged: the participating universities, which are supposed to be the leaders of the higher education system, face more restrictions on their autonomy than other ("normal") universities. This is a key characteristic of the implementation approach used by governments. The allocation of large sums of money makes governments worry about their efficient use. To ensure efficiency and effectiveness, governments build complicated instruments to control the universities. For many centuries, the autonomy and internal energy of universities were the main sources of higher education development. The excellence initiatives represent a different approach, in which the push for excellence comes from outside, from above the universities. The challenge for governments is to find push instruments that preserve the flexibility and internal motivation of participating universities.
The following questions were considered to elaborate on the governments' approaches to developing specific implementation instruments:
• Who is in charge? What is the role of the government or the Ministry of Higher Education/Education in the implementation of excellence-driven policies and initiatives?
• How do governments allocate money? What is the degree of freedom?
• How do central authorities monitor results? How do they measure the effectiveness of policy implementation? How do they evaluate progress?
• Do the authorities intervene, and if so, how?
In all the cases, the Ministries of Education (and Science or Higher Education) are in charge of the implementation process. In most cases, they partner with a national higher education or research funding agency. They usually delegate day-to-day operational support to a designated program implementation agency (PIA). The role of such an agency is to interpret the Ministry's policies, collect data, provide logistical support for expert evaluation, and ensure communication among the universities and between the Ministry and the universities. In all the cases, these agencies were involved (if only through monitoring) in the internal affairs of the universities. This manifests a new modality in the relationship between universities and the government.
In most cases, such agencies adopt business approaches in implementing the program. They use key performance indicators (KPIs) to evaluate universities' progress and encourage universities to hire consultancy companies to build effective management structures. The Russian agency, for example, hired a consulting company to teach universities how to use project management in their operations. These details confirm that the excellence initiatives are linked with the economic mobilization of higher education systems under New Public Management frameworks (Bleiklie 1988; Stech 2011).
In some cases, such an agency reports not to the Ministry unit responsible for higher education policy but to a special project unit. This means that the implementation of the excellence initiative is becoming a separate stream within higher education policy implementation. In a number of countries, such as Germany or Russia, program implementation agencies serve as technical support organizations carrying out selection or monitoring procedures. Furthermore, PIAs exert a significant influence on resource allocation: relying on decisions made by agency experts, Ministries grant, extend, or cut off funding.
In the majority of the initiatives, the governments allocate special development grants to the participating universities, which often means that universities can spend the grant only on specific types of expenses. Moreover, some governments, such as Canada in 2014, set research and development priorities in a top-down way based on their own views when allocating the money.
The accuracy of the spending is carefully monitored by the governments or the program implementation agencies. Interviews conducted during the research suggest that the intervention of the Ministry of Finance (or an equivalent agency) is quite a common feature of the implementation process. This is another manifestation of the limits imposed on university autonomy by the excellence policies.
According to Salmi's (2009) calculation of resource allocation per university by excellence initiative, the amount of money provided to universities differs significantly from country to country. While Australia allocated from $1 million to $4 million to each Centre of Excellence in 2003, the Chinese government devoted nearly $300 million to Peking University and Tsinghua University in 1999. France provided its "Opération Campus" with nearly $620 million in 2008.
It should be emphasized that monitoring the implementation, as well as monitoring the results of the program, is a difficult task. First, the implementation agency must find the right balance when increasing bureaucratic pressure on universities by asking them for regular reporting; second, the timeframe of such projects is too short to see the final fruits of the intervention. This means that the monitoring system inevitably uses short-term indicators to evaluate progress.
In many countries, academics complain that the implementation agencies or the ministries press the universities for more reports (Hazelkorn 2011). In almost all cases, the monitoring systems include annual or even semi-annual scanning of changes in the universities' characteristics and in the criteria used by the international rankings. Therefore, universities feel constant pressure to publish more, and in better journals, and to attract more international students and research contracts.
Moreover, mostly process indicators and parameters (such as the number of international students), rather than outcome indicators, are used to evaluate excellence policies. In Russia, such indicators include the number of joint programs and the number of international researchers hired by universities (Froumin and Povalko 2014).
Thus, the monitoring systems are becoming an instrument for influencing the internal policies of universities. As shown above, almost all excellence initiatives require the development of a strategic plan (program, action plan) by the participating universities. The PIA follows the implementation of these plans through the reporting and monitoring systems. These strategic plans or "roadmaps" are usually based on specific activities or strategic projects. The example of the Russian Federation illustrates the significance of such "roadmaps" not so much for the universities as for the program implementation agencies. Fifteen Russian universities were asked to develop and present their roadmaps to the PIA. One university of the whole group was expelled from the excellence program because of an unsatisfactory "roadmap".
The criteria used to assess universities competing for excellence status serve as formal guidelines in many cases. Moreover, practice shows that universities reorganize their activities to comply with these criteria, while their real performance could remain the same or even decline.
Several universities participating in excellence-driven programs were examined in the research. The analysis showed that within a year or two many activities carried out by these universities become bureaucratized. Formal performance indicators imposed by international rankings, such as the number of publications or the share of foreign students, lead higher education institutions to introduce cumbersome systems of internal control in order to become top rated. To achieve their goals, which are sometimes overly ambitious, university administrators build a hierarchy to control the performance of each organizational unit or even each research or teaching employee. Our respondents complained that reporting is sometimes more time consuming than doing their primary job. All the countries without exception use international review as an important instrument for evaluating progress. The Ministries recommend that universities create their own international expert panels to review progress.
The discussion of the outcomes of the excellence initiatives is limited by the available data. Three types of outcomes are usually discussed in the literature and in governments' reports: changes in the ranking positions of participating universities; changes in other indicators used by the ministries in monitoring the initiatives; and internal changes at the universities. It could be argued that changes in the higher education system as a whole should also be considered an outcome of an excellence-driven policy or initiative. However, the analysis of changes in ranking positions does not show a sustainable impact of such policies and initiatives (Table 2).
National reports on the excellence initiatives provide information about other changes in the productivity and quality of the participating universities. They report increases in the quality of incoming students, new facilities (mainly research facilities), and more international partnerships (Hazelkorn 2007; Salmi and Froumin 2013).
The interviews also reveal significant innovations in the management structures and processes of the participating universities. These include new incentives for professors and researchers, and interdisciplinary research centres and graduate programs. Units dealing with international publications, public relations, and links with industry have grown in scale and quality. In many cases, universities reformed their governance structures, giving more power to committees formed with external (international) experts. Many of these changes reflect a move of university management toward a business model. Unfortunately, in some instances this business-type behaviour leads to questionable practices.
There are striking examples of universities going to great lengths to comply with ranking criteria. Adventurous universities offer large sums of money to highly cited and internationally recognized scholars to change their affiliation. There are even examples of universities paying journals indexed in Scopus or Web of Science to publish their papers.
Table 2 Universities in the TOP 100 of world university rankings

No | Country | 2008 ARWU | 2008 QS/THE | 2011 ARWU | 2011 QS | 2011 THE | 2014 ARWU | 2014 QS | 2014 THE
1 | United States | 54 | 38 | 53 | 31 | 51 | 52 | 28 | 45
2 | United Kingdom | 11 | 17 | 10 | 19 | 12 | 8 | 19 | 11
3 | Australia | 3 | 7 | 4 | 8 | 4 | 4 | 8 | 5
4 | Netherlands | 2 | 4 | 2 | 3 | 4 | 4 | 6 | 6
5 | Canada | 4 | 4 | 4 | 4 | 5 | 4 | 5 | 4
6 | Germany | 6 | 3 | 6 | 4 | 4 | 4 | 3 | 6
7 | Switzerland | 3 | 3 | 4 | 3 | 3 | 5 | 4 | 3
8 | Japan | 4 | 4 | 5 | 6 | 2 | 3 | 5 | 2
9 | France | 3 | 2 | 3 | 2 | 3 | 4 | 2 | 2
10 | Sweden | 4 | 2 | 3 | 2 | 3 | 3 | 2 | 3
11 | China | 0 | 2 | 0 | 2 | 2 | 0 | 3 | 2
12 | Russia | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0

Sources: Academic Ranking of World Universities (2014), QS World University Rankings (2014), The World University Rankings, Times Higher Education (2014)
These findings make it reasonable to conclude this part of the paper by arguing that the design of excellence-driven policies and initiatives based on clear formal indicators provides universities with "guiding stars". It is clear to universities what should be done to perform well in terms of the excellence programs. But the question of how it should be done remains open, and by no means do all universities answer it to the real benefit of their development. Usually, changes in ranking position are considered the main measure of success or failure, for the university as well as for the state. An even more challenging issue is the real impact of excellence-driven policies on universities and on higher education systems as a whole, particularly in the context of the Bologna Process.