The proliferation and diversification of HERS

Today, many international ranking systems include subject-specific rankings and/or regionally focused rankings in addition to their world university rankings. The three biggest university ranking agencies, ARWU, THE and QS, have diversified their rankings portfolios to produce specialist university rankings, rankings by subject area, rankings of universities in a specific region, and rankings of universities under the age of 50. These specialist rankings may be a subset of the outcomes generated by the respective world university rankings tables, a subset of the universities with different weights applied to the indicators/metrics, or the product of a completely new methodology or weighting matrix. In addition to providing some opportunity for less prominent universities to shine in certain areas, this diversification and proliferation has allowed the major rankings agencies to appeal to an even larger potential customer base by including universities which are not prominent in the global HERS.

The following table shows the range of rankings published annually by the Big Three ranking systems:

Table 3.2 The annual rankings published by QS, THE and Shanghai Rankings (ARWU) as on October 2017 (QS Quacquarelli Symonds Limited, 2017; Times Higher Education, 2017; ShanghaiRanking Consultancy, 2017)

| | Times Higher Education (THE) | Quacquarelli Symonds (QS) | Shanghai Rankings (ARWU) |
|---|---|---|---|
| World university rankings | THE World University Rankings (WUR) | QS World University Rankings (WUR) | Academic Ranking of World Universities (ARWU) |
| Region focused rankings | THE US College Rankings; THE Latin America Rankings; THE Japan University Rankings; THE Asia University Rankings | QS Asia Rankings; QS Latin America Rankings; QS Arab Region Rankings; QS Emerging Europe and Central Asia (EECA) Rankings | |
| Subject focused rankings | THE WUR by Subject | QS Rankings by Subject | Global Ranking of Academic Subjects; Academic Ranking of World Universities by Subject Field |
| Rankings by age | THE Young University Rankings (YUR) | QS Top 50 under 50 | |
| Rankings by economic classification | THE Emerging Economies | QS Rankings: BRICS | |
| Other rankings | THE World Reputation Rankings; THE Global University Employability Rankings | QS System Strength Rankings; QS Graduate Employability Rankings; QS Best Student Cities | |

In addition to the rankings in the table above, in February 2018 THE announced the development of a new sub-ranking, the ‘THE Innovation Rankings’, which was launched at THE’s Innovation and Impact Summit in South Korea in April 2019. The new sub-ranking is focused on the impact that a university has on the economy and wider society through its innovations and inventions (THE Reporters, 2018).

Many higher education institutions have developed, or are beginning to develop, their own systems for assessing the quality of learning and teaching at a departmental level, incorporating the best of observed global practices whilst ensuring these meet individual local and regional requirements (Downing, 2013). Definitions of ‘quality’ in learning and teaching are varied and elusive across the globe, and even between universities in the same country or sector. Consequently, assessing this aspect of university performance, even at a local level, is often fraught with difficulty, and disagreement amongst academics about what constitutes ‘quality’ is often rooted in their own approach to learning and teaching. A globally satisfactory resolution to assessing the quality of learning and teaching is therefore almost certainly an impossibility, and HERS employ various inadequate proxy measures, such as faculty-student ratios, instead. This causes much controversy and is probably one of the most frequently voiced criticisms of the major ranking systems. However, Downing (2013) suggests that this trend should not necessarily lead to a lack of differentiation, because universities will always interpret best practice in terms of their local and regional requirements and contexts. Furthermore, most countries have reasonably well-developed quality assurance systems of their own with which universities are obliged to comply. Sowter (2015) theorises that, with time, HERS will become more established and the various methodologies will start to settle. It is also possible that the multiplicity of different types of comparative and transparency tools may eventually diminish the authority of the current Big Three market leaders (Hazelkorn, 2013).
For the publishers, high-profile ranking systems have become highly profitable products, just as transparency and accountability tools (and, in particular, research assessment) have increased the profitability of scientific publishing (Scott, 2013). Consequently, the number and type of commercial offerings to universities (from reports detailing where respondents to the various surveys originate, to conferences and consultancies) have grown exponentially, adding to the profitability of the HERS as well as to the controversy which surrounds them. The increase in external scrutiny means that universities have had to reorganise and build a distinct identity and reputation in order to compete for the best students, faculty and funding (Steiner, Sammalisto, & Sundstrom, 2012).
