Contemporary International University Rankings or League Tables

Many countries publish national rankings which are tools for their own students, faculty and funding bodies. An example is the ranking of top Chinese universities from the Research Center for Chinese Science Evaluation (RCCSE) at Wuhan University and the Network of Science & Education Evaluation in China (nseac.com). ARWU's Resource page provides a listing of local rankings from 28 different countries. With the internationalization of education at an organizational level, institutions and even countries compete for students and researchers, and not surprisingly, this has led to international ranking systems.

Fig. 2 Growth of Asian-Pacific Articles in Web of Science, 1900–2010. (Extracted from Web of Science, August 2013)

Commercial sources, universities, evaluation authorities and scientometric research organizations compile today's university rankings. The rankings may incorporate bibliometric data from Thomson Reuters or Scopus, peer review or "reputational surveys". Some research institutions are creating new algorithms from bibliometric sources or from web metrics.

Some of the better-known rankings include:

• ARWU (Academic Ranking of World Universities), from 2003; Center for World-Class Universities at Shanghai Jiao Tong University (Center, 2013)

• National Taiwan University Rankings, "Performance Rankings of Scientific Papers of World Universities", from 2012; formerly HEEACT (2007–2011) (National, 2012-)

• THE World University Rankings, from 2011 (Times, 2013–14)

• Leiden Rankings, from 2008; Center for Science and Technology Studies (CWTS, 2013)

• SIR (SCImago Institutional Rankings), from 2009 (SCImago, 2013)

• QS World University Rankings, from 2004 (Quacquarelli Symonds 2013); republished by U.S. News and World Report as World's Best Colleges and Universities from 2008

The University of Zurich (2013) presents a clear overview of the rankings listed above. Chen and Liao (2012) statistically analyze the data and calculate correlations among the rankings, especially ARWU, HEEACT (now NTU) and THE.
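A Spearman rank correlation is the kind of statistic behind such comparisons. As a minimal sketch only, with invented rank positions rather than Chen and Liao's data, the calculation might look like this:

```python
# Spearman rank correlation between two ranking schemes.
# The rank positions below are invented placeholders, not published data.
from scipy.stats import spearmanr

arwu_rank = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]   # hypothetical positions in ARWU
the_rank = [2, 1, 5, 3, 4, 8, 6, 7, 10, 9]    # hypothetical positions in THE

rho, p_value = spearmanr(arwu_rank, the_rank)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```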

Shanghai Jiao Tong's Center for World-Class Universities produces the Academic Ranking of World Universities (ARWU). It ranks the world's top 500 universities overall and the top 200 in five fields and five subjects. Nobel Prize winners (counted in two indicators), Thomson Reuters bibliometric data and articles from Nature and Science make up the rankings for all but those schools strongest in the social sciences (Liu and Cheng 2005). ARWU is published and copyrighted by Shanghai Ranking Consultancy, which is not affiliated with any university or government agency. Billaut et al. (2010) take a critical look at ARWU, while DoCampo (2011) examines ARWU relative to university systems and country metrics.

Similar to, but not as well known as, ARWU is the former HEEACT (Higher Education Evaluation and Accreditation Council of Taiwan) ranking, now published by National Taiwan University and renamed the NTU Ranking. It presents a worldwide ranking of 500 universities and rankings by six fields and 14 subjects. All the rankings are based on data from Thomson Reuters Essential Science Indicators.

CWTS at Leiden and SCImago expand the measurements used for rankings by experimenting with new metrics. Leiden University's Center for Science and Technology Studies (CWTS) developed its own ranking system using bibliometric indicators from Thomson Reuters to measure the scientific output of 500 major universities worldwide. It uses no reputational data or data collected from the universities themselves. The researchers normalize the underlying data to create comparable scores and continue to experiment with new measures. The website provides an overall ranking, and users can select field, region, country and indicator. These rankings receive little attention in the international press, but the Leiden researchers publish the most papers about "bibliometrics", based on searches in WOS and SCOPUS (searched 10 September 2013).
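The field normalization mentioned here can be illustrated with a short sketch; the field baselines and paper records below are invented, and the real CWTS indicators (such as the mean normalized citation score) are built from Web of Science field and publication-year averages:

```python
# Sketch of a field-normalized citation score in the spirit of CWTS's
# mean normalized citation score (MNCS). All figures are invented.
field_baseline = {"chemistry": 12.4, "economics": 4.1}  # assumed world averages per field

papers = [
    {"field": "chemistry", "citations": 25},
    {"field": "chemistry", "citations": 6},
    {"field": "economics", "citations": 8},
]

# Divide each paper's citations by the world average for its field,
# then average the ratios over the institution's output.
ratios = [p["citations"] / field_baseline[p["field"]] for p in papers]
mncs = sum(ratios) / len(ratios)
print(f"MNCS = {mncs:.2f}")  # 1.0 would mean citation impact at the world average
```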

SIR, SCImago's Institutions Rankings, uses metrics from SCOPUS. It ranks over 2,700 organizations, including research and medical institutions. Rankings are worldwide, by region and by country. Measures include output, percent of international collaboration, normalized citations and the percent of articles published in the first quartile of their categories using SJR, SCImago's own journal impact score. SCImago states that SIR reports are not league tables and that the goal is to provide policy makers and research managers with a tool to evaluate and improve their research results. Reports are all in PDF format.

THE and QS have broader target markets, with a focus beyond the research community. Originally published as part of the QS rankings, THE began publishing its own rankings, powered by Thomson Reuters, in 2010–2011. It ranks 400 universities worldwide. Its ranking metrics include teaching, research, knowledge transfer and international outlook. There are rankings by region and broad subject area, and separate rankings by reputation and for universities less than 50 years old.

QS continues to publish its rankings, with less emphasis on evidence-based bibliometrics and more emphasis on qualitative "academic reputation". Recognizing the need to internationalize the market for North American college-bound students, U.S. News and World Report began republishing the then THE-QS rankings in 2008 and continues to republish the QS rankings. According to Robert Morse (2010), U.S. News is working together with QS. A noticeable difference in the QS rankings is that 20 of the top 50 universities are from Commonwealth or former Commonwealth countries.

The Berlin Principles emphasize the importance of accountability for the rankers, not only the institutions they are ranking. Enserink (2007), in his article in Science “Who Ranks the University Rankers”, examines the various international rankings. Other authors from such prestigious journals as Chronicle of Higher Education, Nature and Science have examined the effect of rankings on university behavior (Declan 2007; Labi 2008; Saisana et al. 2011).

Table 6 Comparison of Methodology of Two Research Rankings

Table 7 Comparison of THE and QS. (© Pagell 2009; updated 2013)

Tables 6 and 7 summarize the methodologies of the selected international rankings described above. They illustrate the differences in metrics and in the weights of the various indicators. More information on methodology is available from the websites listed in the last row of each table.

QS modifies its metric weightings for rankings by subject and field, putting even more weight on reputation for the social sciences and humanities.

In addition to the modifications of existing metrics from Thomson Reuters and Scopus by Leiden and SCImago, the use of web data is now receiving serious consideration. Consejo Superior de Investigaciones Científicas (CSIC) first issued the semi-annual Ranking Web of Universities in 2004. CSIC describes it as an independent, objective, free and open scientific exercise providing reliable, multidimensional, updated and useful information about the performance of universities worldwide based on their web presence and impact. Built from publicly available web data, it includes almost 12,000 institutions arranged by world, region and country. Rankings are based on impact (external links), presence, openness (including repositories) and one bibliometric element, excellence: the top 10 % of scholarly output, with data from SCImago available for about 5,100 institutions and weighted at about 17 % (CSIC, csic.es).
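As a rough illustration of how such a web-based composite could be assembled, the sketch below combines an institution's rank positions on several indicators using fixed weights. Only the roughly 17 % weight on excellence comes from the description above; the other weights and the rank positions are assumptions made for the example.

```python
# Illustrative Webometrics-style composite built from indicator rank positions
# (lower is better). Weights other than "excellence" are assumed.
indicator_weights = {
    "impact": 0.50,      # assumed
    "presence": 0.16,    # assumed
    "openness": 0.17,    # assumed
    "excellence": 0.17,  # roughly as described in the text
}

institution_ranks = {"impact": 120, "presence": 80, "openness": 200, "excellence": 150}

composite = sum(indicator_weights[k] * institution_ranks[k] for k in indicator_weights)
print(f"Weighted composite score: {composite:.1f}")
```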

Comparing a variety of rankings and ranking criteria clarifies the importance of understanding the different metrics and weightings.

Table 8 uses 2013 Shanghai Jiao Tong (ARWU) as the basis for the top ten, and compares them to the top ten from the 2013 rankings from THE, QS, SCImago, Leiden and Webometrics and the 2012 rankings from NTU.

Eighteen universities make up the top 10 on the four main lists (ARWU, NTU, THE and QS). Harvard, Stanford, MIT and Oxford are in the top ten on all of them; Harvard leads the pack across all the rankings. It is interesting to note the similarities and differences among the schemes and between the international lists and Hughes' original 1925 rankings. Of Hughes' top 10 in 1925, only one school, the University of Wisconsin, is not in the top ten of one of the selected rankings, and 16 of the 19 are on at least one top-30 list. Internationalization brings UK universities into the top 20, and time has shifted the U.S. balance away from public institutions in the mid-west. Two top technology universities are in the top tier.
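The overlap counts behind Table 8 are essentially set comparisons across the top-10 lists. A small sketch with truncated placeholder lists (not the full 2013 data) shows the kind of computation involved:

```python
# Counting how many distinct universities appear across several top-10 lists,
# and how many on each list also appear elsewhere. Lists are shortened placeholders.
top10 = {
    "ARWU": {"Harvard", "Stanford", "MIT", "Berkeley", "Oxford"},
    "THE": {"Harvard", "Stanford", "MIT", "Oxford", "Caltech"},
    "QS": {"Harvard", "MIT", "Cambridge", "Oxford", "Stanford"},
}

all_universities = set().union(*top10.values())
print(f"Distinct universities across the lists: {len(all_universities)}")

for name, schools in top10.items():
    shared = {u for u in schools if any(u in top10[other] for other in top10 if other != name)}
    print(f"{name}: {len(shared)} of its schools also appear on another list")
```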

Another interesting factor in the tables is the difference between the SCImago and Leiden rankings for top papers, highlighting differences between the contents of SCOPUS and WOS. Webometrics' top four are the same as the top four in ARWU's research rankings.

The evaluating bodies list universities by their rank, based on an underlying scoring system. Table 9 shows the importance of checking the underlying scores to get a better understanding of what it means to be number one or number 100. It shows the scores for the universities ranked one, two and 100 and the percent of separation between 1st and 100th. For example, in the QS rankings the first and 100th universities show a 31.6 % difference, while in the NTU rankings the first and 100th universities are over 79 % apart. Only U.S. and U.K. universities appear in the top ten lists. The number of Asian universities in the top 100 has been growing. Table 10 lists Asia's top ten from four bibliometric rankings and Webometrics. There are a total of 24 universities on the list, and the majority are now ranked in the top 100 in the world. The strongest showings are from Japan and China.
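The separation percentages in Table 9 follow from simple arithmetic on the reported scores, for example:

```python
# Percent separation between the top score and a lower-ranked score,
# as used in Table 9: (score_1 - score_100) / score_1 * 100.
def percent_separation(score_top: float, score_lower: float) -> float:
    return (score_top - score_lower) / score_top * 100

# Scores for ranks 1 and 100 as reported in Table 9.
print(f"QS:  {percent_separation(100.0, 68.4):.1f} %")   # about 31.6 %
print(f"NTU: {percent_separation(96.36, 19.85):.1f} %")  # about 79.4 %
```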

An interesting, specialized addition to scholarly rankings comes from Nature, which publishes a rolling one-year ranking of Asia-Pacific institutions and countries based on its own publications. The ranking includes only total publications and uses two calculations for giving an institution credit when there are multiple authors. The University of Tokyo is the standout in the Nature ranking for Asia, which is comparable to those listed above but includes more countries (Nature 2013).

Table 8 World Rankings

Other top 10 on two lists: Imperial College; on one list: Yale, UCLA, U Washington, Johns Hopkins, UC San Francisco, University College London and U Michigan

a NTU Rankings due out later in October

b Rankings for normalized impact and top quartile


Table 9 Scoring Differences among Ranking Schemes for Universities 1, 2 and 100. (Extracted from rankings 2 October 2013)

Rank/Score    1        2        100      % from 1–2    % from 1–100
THE           94.9     93.9     52.6     1.05 %        44.57 %
ARWU          100      72.6     24.3     27.40 %       75.70 %
QS            100      99.2     68.4     0.80 %        31.60 %
NTU/2012      96.36    51.2     19.85    46.87 %       79.40 %

Table 10 Top 10 Asian Universities (ex. Israel) in 2013

Rank      ARWU 2013          THE 2013                             NTU (HEEACT) 2012   QS 2013             Webometrics 2013
1         Tokyo              Tokyo                                Tokyo               NUS                 NUS
2         Kyoto              NUS                                  Kyoto               U Hong Kong         Tsinghua
3         Osaka              U Hong Kong                          Osaka               Tokyo               Tokyo
4         Hokkaido           Seoul National                       Seoul Ntl U         HKUST               NTU Taiwan
5         Kyushu             Peking                               NUS                 Kyoto               Peking
6         Nagoya             Tsinghua                             Tohoku              Seoul               Zhejiang
7         NUS Singapore      Kyoto                                Peking              Chinese U (HK)      Wuhan
8         Ntl U Taiwan       KAIST                                Tsinghua            NTU Singapore       Shanghai Jiao Tong
9         Seoul Ntl U        HKUST                                NTU Taiwan          Peking              Fudan
10        Tokyo Inst Tech    Pohang U of Science and Technology   Zhejiang            Tsinghua            Seoul Ntl U
Top 100   3                  All                                  8                   All                 8
Overlap   6                  8                                    9                   8                   7

Extracted from sources in Table 7

 