Making sense of international league tables
David Jobbins is a writer and editor specialising in international higher education. He was international editor of The Times Higher Education Supplement from 1992 to 2007 and writes regularly for University World News. He is also a special professor in international education at the University of Nottingham. Below he explains how those considering an education abroad can use international rankings to guide them.
International league tables are fast becoming just as essential a tool for university applicants as national league tables such as The Complete University Guide and its competitors.
Partly because of the increase in tuition fees, a growing number of UK students are considering a university course overseas. According to a study conducted by Prospects in May, 24% of school leavers are planning to study abroad, while 73% are at least considering the idea. So it makes sense for these students to look at the international rankings, which they are likely to encounter frequently in promotional material from institutions on their wish list, alongside other information about the country and university being considered.
But how accurate a picture do they give would-be students and others of the health of a country’s higher education system, and of the institutions within it? And how good a guide are they to an individual’s university experience, when an important choice (where shall I study?) is complicated by a further question: in which country?
Universities that perform well in national rankings frequently barely register in the global league tables. Essentially this is because the international rankings use criteria such as academic and employer surveys, citations per faculty member, the proportion of international staff and students, and prizes won by faculty and alumni. National rankings tend to give more prominence to the undergraduate student experience, together with the academic quality of a university’s intake, graduate employment, research quality and dropout rates.
There are three major international rankings that applicants are likely to come across, all published annually, and all freely available to the user.
The oldest is the Academic Ranking of World Universities (ARWU), published by the Centre for World-Class Universities and the Institute of Higher Education at Shanghai Jiao Tong University in China.
First published in 2003, it has since been updated annually, with the latest edition appearing on 15 August 2012. ARWU publishes the best 500 out of the 1,000+ universities it ranks each year.
ARWU’s approach differs from that of other rankings in its historical focus and its emphasis on research publications. Its methodology includes the following indicators (a sketch of how indicators of this kind combine into a single score follows the list):
- Alumni of an institution winning Nobel Prizes and Fields Medals
- Staff of an institution winning Nobel Prizes and Fields Medals
- Highly cited researchers in 21 broad subject categories
- Papers published in the journals Nature and Science
- Papers indexed in Science Citation Index-expanded and Social Science Citation Index
- Per capita academic performance of an institution
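To make the mechanics concrete, here is a minimal Python sketch of how indicator scores of this kind can be folded into a single composite. The weights follow ARWU’s published weightings, but the universities and all of their figures are invented for illustration; ARWU’s actual calculation works from raw counts and its own normalisation.

```python
# Illustrative ARWU-style composite score. The weights follow ARWU's
# published scheme (Alumni 10%, Award 20%, HiCi 20%, N&S 20%, PUB 20%,
# PCP 10%); the universities and raw figures below are invented.

WEIGHTS = {
    "alumni": 0.10,  # alumni winning Nobel Prizes and Fields Medals
    "award": 0.20,   # staff winning Nobel Prizes and Fields Medals
    "hici": 0.20,    # highly cited researchers
    "ns": 0.20,      # papers published in Nature and Science
    "pub": 0.20,     # papers indexed in SCIE and SSCI
    "pcp": 0.10,     # per capita academic performance
}

# Hypothetical raw indicator values for three imaginary universities.
raw = {
    "University A": {"alumni": 30, "award": 40, "hici": 80, "ns": 70, "pub": 90, "pcp": 60},
    "University B": {"alumni": 10, "award": 5, "hici": 60, "ns": 40, "pub": 95, "pcp": 50},
    "University C": {"alumni": 0, "award": 0, "hici": 20, "ns": 10, "pub": 70, "pcp": 40},
}

def composite(scores: dict[str, dict[str, float]], weights: dict[str, float]) -> dict[str, float]:
    """Scale each indicator so the best performer scores 100, then apply weights."""
    best = {ind: max(u[ind] for u in scores.values()) or 1 for ind in weights}
    return {
        name: sum(weights[ind] * 100 * vals[ind] / best[ind] for ind in weights)
        for name, vals in scores.items()
    }

for name, score in sorted(composite(raw, WEIGHTS).items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.1f}")
```

Scaling each indicator against the best performer before weighting is one reason a single research powerhouse can anchor the top of such a table year after year.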
Universities with a bias towards the sciences and with a history of research excellence have an inbuilt advantage over competitors with a mix of arts, humanities and sciences and a broader approach to their activities. The over-reliance on citation indices and publication rates, together with the inclusion of Nobel Prizes won, has led to criticism in the past, and to accusations of double and triple counting.
ARWU’s origins lay in a wish on the part of the Chinese government to benchmark its universities against the best in the world. Its roots within the university culture mean that it has been regarded with less suspicion than its competitors.
Because of its methodology, there are fewer changes year on year in the ARWU compared with its rivals. Harvard led the rankings in 2012, as it has done every year since 2003. The top 10 list is completely unchanged from the previous year.
Consistency can be a virtue in organisations that change their nature as slowly as complex universities do. But research excellence is concentrated in that top 10, so the ranking’s usefulness as a guide to the undergraduate experience is limited.
A number of others, including the Leiden Ranking, the Scimago Institutions Rankings and the High Impact Universities Research Performance Index, follow a similar approach and are therefore of similarly limited value to potentially mobile students.
In contrast, the Times Higher Education Supplement’s World University Rankings, which first appeared in 2004, broadened the reliance on research metrics into the more subjective field of reputation. It used a system of peer review to identify the leading institutions in the eyes of the academic community, allocating 40% of the potential score to the results.
It also sought to measure the international character of universities through the proportion of academic staff of other nationalities.
The exercise was supported by the educational and careers advice company QS, which entered into a partnership with the THES to supply the data. The partnership continued after the THES was renamed Times Higher Education (THE) in 2008 and rankings were published annually until 2009.
Then, dramatically, shortly after the 2009 rankings were published, THE split with QS and entered into a partnership with Thomson Reuters, publishing for the first time in September 2010 and annually thereafter.
The rankings methodology has been modified since 2010, however, making year-on-year comparisons problematic. The latest version can be found here.
The main ranking lists a global top 200 in order, and a further 200 universities in broader bands. Universities can also be ranked by geographic region or by six broad subject areas. So it would be possible to establish the leading university in Asia or Europe, or the best university for life sciences or arts and humanities, but not the best for biology or for philosophy.
THE sets out to assess research-led universities across core missions such as teaching, research, knowledge transfer and international outlook.
It uses 13 performance indicators to provide what it calls the most “comprehensive and balanced” comparisons, using a methodology outlined here. This year’s ranking places the California Institute of Technology first in the world, ahead of Stanford in the US and Oxford in the UK.
After its divorce from THE, QS continued to publish its World University Rankings. The research behind the rankings currently considers over 2,000 universities and ranks over 700. The top 400 are ranked individually, while those placed 401 and beyond are grouped in bands.
The rankings are based on data covering four key areas of concern for students: research, employability, teaching and internationalisation. Academic reputation is assessed through a peer review of research activity, drawing on survey returns from academics.
A key feature that QS believes is increasingly relevant to students is prospective universities’ reputation among employers. Its employer reputation indicator draws on a global online survey of employers, using three years’ worth of “latest response” data and totalling over 25,000 responses in 2012.
Employers are asked to identify the universities that produce the best graduates. A guide to the methodology can be seen here.
Both THE and QS have now branched out from monolithic global rankings. In 2012 THE launched the 100 Under 50, a ranking of the top 100 universities that have been operating for less than 50 years. It uses the same 13 indicators as the World University Rankings, but with a methodology recalibrated to reflect the special characteristics of younger universities.
And QS has drilled down into the broad subject areas to provide rankings of the leading universities for specific subjects. It has also developed regional rankings for Asia and Latin America.
For those who find rankings of individual universities too narrow a lens, the global higher education network Universitas 21 has developed a ranking of countries’ whole higher education systems. It aims to highlight the importance of creating a strong environment in which higher education institutions can contribute to economic and cultural development, provide a high-quality experience for students and compete for overseas applicants.
The Universitas 21 ranking of higher education systems draws on data from 48 countries and territories across 20 different measures, grouped under four headings (a toy aggregation in this style appears below):
- Resources: investment by government and the private sector
- Output: research and its impact, as well as the production of an educated workforce that meets labour market needs
- Connectivity: international networks and collaboration, which protect a system against insularity
- Environment: government policy and regulation, diversity and participation opportunities
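As a rough illustration of the two-stage aggregation this implies, the Python sketch below averages a country’s measures within each heading and then combines the four module scores with weights. The module weights and every figure here are hypothetical; Universitas 21 publishes its own weightings and data.

```python
# Illustrative sketch of a system-level score built the way Universitas 21
# describes it: individual measures grouped under four headings. Each
# heading simply averages its measures before weighting. The weights and
# all figures are invented for illustration.

from statistics import mean

MODULE_WEIGHTS = {"resources": 0.25, "output": 0.40, "connectivity": 0.10, "environment": 0.25}

# Hypothetical measures (already scaled 0-100) for one country, by heading.
country = {
    "resources": [80, 75, 90],       # e.g. public and private investment measures
    "output": [85, 70, 60, 88],      # e.g. research impact, graduate supply
    "connectivity": [65, 72],        # e.g. international collaboration
    "environment": [90, 85, 78],     # e.g. policy, diversity, participation
}

overall = sum(MODULE_WEIGHTS[m] * mean(vals) for m, vals in country.items())
print(f"Overall score: {overall:.1f}")
```

Repeating this for each of the 48 systems and sorting the results would yield a country-level league table of the kind described here.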
Overall, the top five higher education systems were found to be the United States, Sweden, Canada, Finland and Denmark.
Concern at the freebooting nature of academic rankings (not necessarily the ones mentioned in this article) led to a three-year review of league tables’ fitness for purpose, culminating in the 2006 publication of a report from an expert group convened by the Unesco-European Centre for Higher Education and the Institute for Higher Education Policy in Washington DC.
The group, formalised as the International Observatory on Academic Ranking and Excellence (IREG), set out the so-called Berlin Principles on Ranking of Higher Education Institutions. The 16 Berlin Principles focus on good practice that will be useful for the improvement and evaluation of ranking systems over time.
Among its recommendations, the group said rankings should:
- Use transparent methodology
- Measure outcomes, not inputs
- Use audited and verifiable data whenever possible
- Take into account the different missions and goals of institutions.
The expert group said: “It is important that those producing rankings and league tables hold themselves accountable for quality in their data collection, methodology and dissemination.”
As yet no rankings — national or international — have won IREG’s kite mark. But an audit exercise that may lead to the first award is under way. Watch out for the ranking of rankings.