
University and subject league tables methodology

Find out where our university league table data comes from, as well as information about the measures and the methodology behind them.

CONTENTS

  1. Where does the data come from?

  2. What measures do we use?

  3. 2021 league table scores

  4. Arts, Drama and Music institutions

  5. How we ensure accuracy

  6. Notes

Where does the data come from?

All the data in our university league tables comes from sources in the public domain.

The National Student Survey and the 2014 Research Excellence Framework are overseen by the funding councils for UK universities.

The Higher Education Statistics Agency (HESA) – the official agency for the collection, analysis and publication of quantitative information about UK universities – provided data for:

  • Entry standards
  • Student-staff ratios
  • Spending on academic services
  • Facilities spending
  • Good honours degrees
  • Graduate prospects
  • Degree completion
  • International student enrolments

In a few cases where the source data wasn't available, it was obtained directly from the individual universities. We have a protocol for handling queries and corrections to the league tables.

What measures do we use?

Entry Standards (maximum score n/a)

The average UCAS tariff score of new undergraduate students.

Where does it come from?

HESA data for 2018–19.

How does it work?

Each student's exam results were converted to a numerical score (UK A Level A*=56, A=48 ... E=16, etc.; Scottish Highers A=33, B=27, etc.) and added up to give a score total. HESA then calculated an average for all students at the university. The results were then adjusted to take account of the subject mix at the university. Students on a foundation year were excluded.
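The conversion-and-averaging step above can be sketched as follows. This is a minimal illustration using the published A Level point values; the subject-mix adjustment applied by HESA is omitted, and the cohort data is invented.

```python
# Sketch of the entry-standards calculation described above.
# Tariff values are the published UCAS points for A Levels (A* = 56 ... E = 16);
# HESA's subject-mix adjustment is omitted here.

A_LEVEL_POINTS = {"A*": 56, "A": 48, "B": 40, "C": 32, "D": 24, "E": 16}

def student_tariff(grades):
    """Total tariff score for one student's A Level grades."""
    return sum(A_LEVEL_POINTS[g] for g in grades)

def average_entry_standard(students):
    """Mean tariff score across all new undergraduates."""
    totals = [student_tariff(grades) for grades in students]
    return sum(totals) / len(totals)

# Illustrative cohort: A*AA, AAB and BBC students
cohort = [["A*", "A", "A"], ["A", "A", "B"], ["B", "B", "C"]]
print(round(average_entry_standard(cohort), 1))  # → 133.3
```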

What should you look out for?

A high average score (over 200, or A*AAA, at some universities) doesn't mean all students score so highly or that you need to take lots of A Levels to get in. The actual grades needed will vary by subject and few, if any, courses will ask for grades in more than three subjects. Universities that have a specific policy of accepting students with low grades as part of an access policy will tend to have their average score depressed.

The UCAS tariff does not include some international qualifications that are frequently taken by international students. This means that the average tariff score for providers with international students can be lower than it would have been if these qualifications were assigned tariff points by UCAS.

Student satisfaction (maximum score 5.00)

A measure of student views of the teaching quality at the university.

Where does it come from?

The National Student Survey (NSS), a survey of final-year undergraduate students in 2019.

How does it work?

The NSS asked questions about a variety of aspects of teaching. The average satisfaction score for all questions, except the students’ union question, was calculated and then adjusted for the subject mix at the university. Due to the distribution of the data, and to avoid this measure having an undue influence on the overall ranking, the z-score is divided by three.

What should you look out for?

The survey is a measure of student opinion, not a direct measure of quality, so it may be influenced by a variety of biases such as the effect of prior expectations. A highly ranked university that's expected to deliver excellent teaching could score lower than a lower-ranked university that, while offering lower-quality teaching, exceeds its students' expectations.

Research quality (maximum score 4.00)

A measure of the quality of the research undertaken in the university.

Where does it come from?

The 2014 Research Excellence Framework (REF) undertaken by the funding councils.

How does it work?

Each university department entered in the assessment exercise achieved a quality profile that gave the proportion of research in each of four categories from 4* to 1* (with any remaining activity being unclassified). For the research assessment measure, the categories 4* to 1* were given a numerical value of 4 to 1, which allowed a grade point average to be calculated. An overall average was then calculated weighted according to the number of staff in each department.
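The grade-point-average calculation above can be sketched like this. The quality profiles, departments and staff numbers are invented for illustration; only the arithmetic (star values 4 to 1, unclassified scoring 0, staff-weighted averaging) follows the description.

```python
# Sketch of the research-quality GPA described above. Each department's REF
# quality profile gives the percentage of research at each star level;
# unclassified work scores 0. Departmental GPAs are averaged, weighted by
# submitted staff numbers. All figures here are illustrative.

STAR_VALUES = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "unclassified": 0}

def department_gpa(profile):
    """Grade point average for one department's quality profile (percentages)."""
    return sum(STAR_VALUES[star] * pct for star, pct in profile.items()) / 100

def university_gpa(departments):
    """Staff-weighted average GPA across (profile, staff FTE) pairs."""
    total_staff = sum(staff for _, staff in departments)
    return sum(department_gpa(p) * staff for p, staff in departments) / total_staff

physics = {"4*": 30, "3*": 50, "2*": 15, "1*": 5}
history = {"4*": 20, "3*": 40, "2*": 30, "1*": 10}
print(round(university_gpa([(physics, 40), (history, 25)]), 2))  # → 2.92
```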

What should you look out for?

Universities could decide who they wanted to return for the REF. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile.

Research intensity (maximum score 1.00)

A measure of the proportion of staff involved in research.

Where does it come from?

The 2014 Research Excellence Framework and HESA data from October 2013. Both are due to be updated in 2022.

How does it work?

The number of staff submitted to the REF was divided by the number who were eligible.

What should you look out for?

Universities could decide who they wanted to return for REF. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile and so the research intensity measure can be an underestimate of the actual research intensity.

Graduate prospects (maximum score 100.0)

A measure of the success in employability or further study of graduates completing their first degree.

Where does it come from?

HESA data for 2016–17.

How does it work?

The number of graduates who take up employment or further study is divided by the total number of graduates with a known destination, before being expressed as a percentage. Only employment in an area that normally recruits graduates is included. The results are then adjusted to take account of the subject mix at the university.

This set of data is due to be replaced in 2021 by the new Graduate Outcomes Survey (GOS), which collects information on graduate destinations in a different way and at a different point after graduation.

What should you look out for?

A relatively low score on this measure doesn't mean many graduates were unemployed. It may be that some had low-level jobs that don’t normally recruit graduates. Some universities recruit a high proportion of local students and so if they're located in an area where graduate jobs are hard to come by, this can depress the outcome.

Student-staff ratio (maximum score n/a)

A measure of the average staffing level in the university.

Where does it come from?

Calculated using HESA data for 2018–19.

How does it work?

A student-staff ratio (the total number of undergraduate and postgraduate students divided by the number of academic staff) was calculated, with the results adjusted for subject mix.

What should you look out for?

A low student-staff ratio (a small number of students for each member of staff) doesn't guarantee good quality of teaching or good access to staff.

Academic services spend (maximum score n/a)

The expenditure per student on all academic services.

Where does it come from?

HESA data for 2016–17, 2017–18 and 2018–19.

How does it work?

A university's expenditure on library and computing facilities (staff, books, journals, computer hardware and software, but not buildings), plus museums, galleries and observatories, was divided by the number of full-time equivalent students in the latest year. Expenditure over three years was averaged to allow for uneven spending.
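The three-year averaging described above can be sketched in a few lines. The spend and student figures are invented; the calculation simply averages the yearly totals and divides by the latest year's full-time-equivalent student count.

```python
# Sketch of the per-student spend measure described above: expenditure is
# averaged over three years to smooth uneven spending, then divided by the
# number of full-time-equivalent students in the latest year.

def spend_per_student(yearly_spend, latest_fte):
    """Three-year average expenditure divided by latest-year FTE students."""
    avg_spend = sum(yearly_spend) / len(yearly_spend)
    return avg_spend / latest_fte

# Illustrative: £18m, £21m and £24m of spend, 12,000 FTE students
print(spend_per_student([18_000_000, 21_000_000, 24_000_000], 12_000))  # → 1750.0
```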

What should you look out for?

Some universities are the location for major national facilities, such as the Bodleian Library in Oxford and the national computing facilities in Bath and Manchester. The local and national expenditure is very difficult to separate and so these universities will tend to score more highly on this measure.

Facilities spend (maximum score n/a)

The expenditure per student on staff and student facilities.

Where does it come from?

HESA data for 2016–17, 2017–18 and 2018–19.

How does it work?

A university's expenditure on student facilities (e.g. sports, careers services, health, counselling) was divided by the number of full-time equivalent students in the latest year. Expenditure over three years was averaged to allow for uneven expenditure.

What should you look out for?

This measure tends to disadvantage some collegiate universities as it mostly includes central university expenditure. In Oxford and Cambridge, for example, a significant amount of facilities expenditure is by the colleges, but it's not yet been possible to extract comparable data from the college accounts.

Good honours (maximum score 100.0)

The percentage of first-degree graduates achieving a first or upper second-class honours degree.

Where does it come from?

HESA data for 2018–19.

How does it work?

The number of graduates with a first or upper second-class degree was divided by the total number of graduates with classified degrees. Unclassified enhanced first degrees, such as an MEng awarded after a four-year engineering course, were treated as equivalent to a first or upper second for this purpose, while Scottish Ordinary degrees (awarded after three years rather than the usual four in Scotland) were excluded. The results were then adjusted to take account of the subject mix at the university.

What should you look out for?

Degree classifications are controlled by the universities themselves, though with some moderation by the external examiner system. It can be argued that they're not a very objective measure of quality. However, degree class is the primary measure of individual success in UK higher education and will have an impact elsewhere, such as employment prospects.

Degree completion (maximum score 100.0)

A measure of the completion rate of first-degree undergraduates studying at the university.

Where does it come from?

HESA performance indicators based on data for 2018–19 and earlier years.

How does it work?

HESA calculated the expected outcomes for a cohort of students based on what happened to students in the current year. The figures in the tables show the percentage of students who were expected to complete their course or transfer to another institution.

What should you look out for?

This measure of completion is a projection based on a snapshot of data, so it's vulnerable to statistical fluctuations.

Summary of measures

Measure                  Data source  Years                      Subject mix adjustment?  Weight
Entry standards          HESA         2018–19                    Yes                      1.0
Student satisfaction     NSS          2019                       Yes                      1.5
Research quality         REF          2014                       No                       1.0
Research intensity       HESA         2013                       No                       0.5
Graduate prospects       HESA         2016–17                    Yes                      1.0
Student-staff ratio      HESA         2018–19                    Yes                      1.0
Academic services spend  HESA         2016–17, 2017–18, 2018–19  No                       0.5
Facilities spend         HESA         2016–17, 2017–18, 2018–19  No                       0.5
Good honours             HESA         2018–19                    Yes                      1.0
Degree completion        HESA         2017–18, 2018–19           No                       1.0

How we compile the tables

  • A statistical technique called the z-transformation is applied to each measure to create a score – this ensures each measure contributes the same amount to the overall score and so avoids any need for scaling
  • Five of the measures are then adjusted to take account of the subject mix at a university – the adjustment removes this subject effect (a side-effect of this is that it’s impossible to recalculate the total score in the tables using the published data as you'd need full access to all the raw data)
  • The z-scores (adjusted z-scores where the subject mix was taken into account) on each measure are then weighted by 1.5 for student satisfaction, 0.5 for research intensity and the two spend measures, and 1.0 for the rest, and totalled to give an overall score for the university
  • These total z-scores are then transformed to a scale where the top score was set at 1,000, with the remainder being a proportion of the top score – this scaling doesn't affect the overall ranking but it avoids giving any university a negative overall score
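The steps above can be sketched as follows. The subject-mix adjustment is omitted (as noted, it requires the full raw data), the weights follow the summary of measures, and the student-satisfaction z-score is divided by three as described in the satisfaction section. The input figures are illustrative, and the simple proportion-of-top scaling shown doesn't reproduce the published method's guarantee against negative scores.

```python
# Sketch of the table compilation described above: z-transform each measure,
# damp satisfaction, apply weights, total, then scale so the top score is 1,000.
from statistics import mean, pstdev

# Measures not listed here carry the default weight of 1.0
WEIGHTS = {"satisfaction": 1.5, "research_intensity": 0.5,
           "academic_spend": 0.5, "facilities_spend": 0.5}

def z_scores(values):
    """Standardise one measure so each contributes comparably before weighting."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

def overall_scores(measures):
    """measures: {name: [one value per university]} -> scores with top = 1000."""
    n = len(next(iter(measures.values())))
    totals = [0.0] * n
    for name, values in measures.items():
        zs = z_scores(values)
        if name == "satisfaction":  # divided by three to limit its influence
            zs = [z / 3 for z in zs]
        w = WEIGHTS.get(name, 1.0)
        totals = [t + w * z for t, z in zip(totals, zs)]
    top = max(totals)
    # Scale as a proportion of the top score; the published method also
    # guarantees positive scores, which this simple ratio doesn't.
    return [round(1000 * t / top) for t in totals]

# Three illustrative universities, three measures
scores = overall_scores({
    "entry": [134, 180, 110],
    "satisfaction": [4.0, 4.2, 3.9],
    "research_quality": [2.7, 3.3, 2.0],
})
print(scores)
```

The second university leads on every measure, so it scores exactly 1,000 regardless of the weights.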

2021 league table scores

Measure                   Mean   Max   Min
Entry standards            134   212    93
Student satisfaction      4.04  4.34  3.80
Research quality          2.70  3.36  1.40
Research intensity        0.47  0.95  0.07
Graduate prospects        75.5  94.0  56.5
Student-staff ratio       16.1  24.5  10.4
Academic services spend   1715  2982   884
Facilities spend           684  1939   189
Good honours              76.2  94.2  58.2
Degree completion         84.7  98.8  58.5

The measures for the subject tables are the same as for the main university league table, except only five measures are used: student satisfaction, research quality, research intensity, entry standards and graduate prospects.

  • Two years of graduate prospects data were aggregated to make the data more reliable and scores were withheld where the number of students was too small to calculate a reliable percentage
  • The calculation of the overall score is also the same, except there's no need for any subject mix adjustment and weights are 1.0 for student satisfaction, entry standards and graduate prospects, 0.67 for research assessment, and 0.33 for research intensity
  • In the tables for Dentistry and Medicine, the graduate prospects score is ignored because it's almost identical for all institutions. In the Nursing table it's halved for similar reasons
  • To qualify for inclusion in a subject table, a university has to have data for student satisfaction and at least one of entry standards or graduate prospects 
  • A blank in one of the columns isn't a zero score but indicates that no valid data was available, and the final score was calculated using the data that was available
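The subject-table rules above can be sketched as follows. The z-score values are invented and assumed precomputed; the weights, the graduate-prospects exceptions for Dentistry, Medicine and Nursing, and the skipping of missing measures follow the bullets above.

```python
# Sketch of the subject-table score described above. Weights differ from the
# main table; graduate prospects is ignored for Dentistry and Medicine and
# halved for Nursing. Input z-scores are illustrative.

SUBJECT_WEIGHTS = {
    "satisfaction": 1.0, "entry": 1.0, "prospects": 1.0,
    "research_quality": 0.67, "research_intensity": 0.33,
}

def subject_score(z, subject):
    """Weighted total of a university's z-scores for one subject table."""
    weights = dict(SUBJECT_WEIGHTS)
    if subject in ("Dentistry", "Medicine"):
        weights["prospects"] = 0.0   # nearly identical everywhere, so ignored
    elif subject == "Nursing":
        weights["prospects"] = 0.5   # halved for similar reasons
    # A missing measure (None) is skipped, not counted as a zero score
    return sum(w * z[m] for m, w in weights.items() if z.get(m) is not None)

z = {"satisfaction": 0.5, "entry": 1.2, "prospects": -0.3,
     "research_quality": 0.8, "research_intensity": None}
print(round(subject_score(z, "Medicine"), 2))  # → 2.24
```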

Arts, Drama and Music institutions

Measure                   Mean   Max   Min
Entry standards            138   168   117
Student satisfaction      4.03  4.27  3.73
Research quality          2.85  3.49  2.29
Research intensity        0.46  0.96  0.15
Graduate prospects        75.6   100  48.8
Student-staff ratio       12.4  23.3   7.4
Academic services spend   1339  4224     0
Facilities spend           340   898    90
Good honours              81.8  98.5  60.3
Degree completion         91.1  97.2  80.9

  • Our Arts, Drama & Music league table includes a number of specialist colleges that don't meet the full criteria for inclusion in the main table – some of these institutions will also be listed in their relevant subject tables
  • Note that other institutions also offer courses in Arts, Drama and Music – you can find them in the relevant subject tables and on the main universities ranking table
  • The methodology used is exactly the same as for the main table

How we ensure accuracy

Each university is provided with complete sets of its own HESA data well in advance of publication. In addition, where we find anomalous figures in the HESA data, universities are given a further opportunity to check for and notify any errors.

Notes

Main university table

  • Birkbeck, University of London declined to release its data for use in all UK league tables
  • Highlands & Islands: given its unique collegiate structure with 13 academic partners, it'd be inappropriate for the university to appear in the table
  • The Open University doesn't appear in the table because its students are distant learners and so data are mostly unavailable
  • NSS data not available: Cambridge and Oxford 
  • Entry standards data provided by the university: Cardiff Metropolitan and Kent
  • SSR data provided by the university: Canterbury Christ Church, Exeter, Salford, Suffolk, Trinity Saint David
  • Staff and student facilities spend provided by the university: Manchester Metropolitan (2015–16); historical datasets unreliable for Suffolk, 2016–17 and 2017–18 used
  • Academic services spend provided by the university: Edinburgh Napier (2015–16 and 2016–17), Manchester Metropolitan (2015–16), Salford (2015–16) and Central Lancashire (2015–16 and 2016–17); historical datasets unreliable for Suffolk, 2016–17 and 2017–18 used
  • Graduate prospects data provided by the university: Newcastle
  • Completions data not available: Buckingham (2016–17 data used)
  • Research GPA data not available: Buckingham, Leeds Arts University, Plymouth Marjon, Ravensbourne, Suffolk (minimum used)
  • Research intensity data not available: Buckingham, Leeds Arts University, Plymouth Marjon, Ravensbourne, Suffolk (minimum used); data provided by the university: Staffordshire
  • The rankings in the Complete University Guide are based on data from before the start of the coronavirus pandemic, so they do not take into account university performances during the pandemic.

Arts, Drama & Music league table

  • NSS data not available: Courtauld Institute of Art, Royal Central School of Speech and Drama, and Royal Conservatoire of Scotland (2016 score plus average sector change from 2016–2018 used)
  • Academic services spend data not available: Conservatoire for Dance and Drama, and Royal Conservatoire of Scotland (minimum used)
  • Facilities spend data not available: Royal Conservatoire of Scotland (minimum used)
  • Research GPA data not available: Conservatoire for Dance and Drama, Liverpool Institute for Performing Arts (minimum used)
  • Research intensity data not available: Conservatoire for Dance and Drama, Liverpool Institute for Performing Arts (minimum used)
  • The rankings in the Complete University Guide are based on data from before the start of the coronavirus pandemic, so they do not take into account university performances during the pandemic.

Advisory group

We also consult the universities on methodology. Once a year, an advisory group of university experts meets to discuss the methodology and how it can be improved.

We've made every effort to ensure accuracy but can't take responsibility for errors or omissions. The data providers don't necessarily agree with the data aggregations or manipulations we use, and aren't responsible for any inferences or conclusions derived from them.
