University and subject league tables methodology

Find out where our university league table data comes from, as well as information about the measures and the methodology behind them.

CONTENTS

  1. Where does the data come from?

  2. What measures do we use?

  3. 2022 league table scores

  4. Arts, Drama and Music institutions

  5. How we ensure accuracy

  6. Notes

Where does the data come from?

All the data in our university league tables comes from sources in the public domain.

The National Student Survey and the 2014 Research Excellence Framework are overseen by the funding councils for UK universities.

The Higher Education Statistics Agency (HESA) – the official agency for the collection of quantitative information about UK universities – collected data for:

  • Entry standards
  • Student-staff ratios
  • Spending on academic services
  • Facilities spending
  • Graduate prospects
  • Degree completion
  • International student enrolments

The data were supplied to the Guide by JISC.

In a few cases where the source data wasn't available, we obtained it directly from the individual universities. We have a protocol for queries and corrections to the league tables.

What measures do we use?

Entry standards (maximum score n/a)

The average UCAS tariff score of new undergraduate students.

Where does it come from?

HESA data for 2019–20.

How does it work?

Each student's exam results were converted to a numerical score (UK A Level A*=56, A=48 ... E=16, etc.; Scottish Highers A=33, B=27, etc.) and added up to give a score total. HESA then calculated an average for all students at the university. The results were then adjusted to take account of the subject mix at the university. Students on a foundation year were excluded.
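
As a rough illustration of this calculation, here's a minimal sketch using the UCAS tariff values quoted above; the student grade profiles are invented, not real HESA data, and the subject-mix adjustment is omitted.

```python
# Sketch of the entry standards calculation: convert each student's grades
# to UCAS tariff points, total them per student, then average across students.
# Tariff values are those quoted above; grade profiles are invented examples.

A_LEVEL_TARIFF = {"A*": 56, "A": 48, "B": 40, "C": 32, "D": 24, "E": 16}

students = [
    ["A*", "A", "A"],  # one student's A Level grades -> 152 points
    ["B", "B", "C"],   # -> 112 points
]

totals = [sum(A_LEVEL_TARIFF[g] for g in grades) for grades in students]
average_tariff = sum(totals) / len(totals)
print(average_tariff)  # 132.0
```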

What should you look out for?

A high average score (over 200, or A*AAA, at some universities) doesn't mean all students score so highly or that you need to take lots of A Levels to get in. The actual grades needed will vary by subject and few, if any, courses will ask for grades in more than three subjects. Universities with a specific access policy of admitting students with lower grades will tend to have their average score depressed.

The Scottish education system makes it easier for students to accumulate a large number of subjects and this tends to boost the scores for universities which admit a large proportion of Scottish students.

The UCAS tariff does not include some international qualifications that are frequently taken by international students. This means that the average tariff score for providers with international students can be lower than it would have been if these qualifications were assigned tariff points by UCAS.

Student satisfaction (maximum score 5.00)

A measure of student views of the teaching quality at the university.

Where does it come from?

The National Student Survey (NSS), a survey of final-year undergraduate students in 2020.

How does it work?

The NSS asked questions about a variety of aspects of teaching. The average satisfaction score for all questions, except the Students’ Union question, was calculated and then adjusted for the subject mix at the university. Due to the distribution of the data, and to avoid this measure having an undue influence on the overall ranking, the z-score is divided by three.
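
A minimal sketch of the averaging step, using invented question scores (the question names only gesture at NSS themes and aren't the survey's exact wording):

```python
# Average the per-question satisfaction scores (1-5 scale), excluding the
# Students' Union question, to get the headline satisfaction measure.
question_means = {
    "teaching_on_my_course": 4.2,
    "learning_opportunities": 4.0,
    "assessment_and_feedback": 3.8,
    "students_union": 3.5,  # excluded, per the methodology
}

included = [v for q, v in question_means.items() if q != "students_union"]
satisfaction = sum(included) / len(included)
print(satisfaction)  # 4.0

# After the subject-mix adjustment, the z-score for this measure is divided
# by three before it enters the overall ranking: damped_z = z_score / 3
```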

What should you look out for?

The survey is a measure of student opinion, not a direct measure of quality, so it may be influenced by a variety of biases, such as the effect of prior expectations. A highly ranked university expected to deliver excellent teaching could score lower than a lower-ranked university that, while offering lower-quality teaching, exceeds its students' expectations.

Research quality (maximum score 4.00)

A measure of the quality of the research undertaken in the university.

Where does it come from?

The 2014 Research Excellence Framework (REF) undertaken by the funding councils.

How does it work?

Each university department entered in the assessment exercise achieved a quality profile that gave the proportion of research in each of four categories from 4* to 1* (with any remaining activity being unclassified). For the research assessment measure, the categories 4* to 1* were given a numerical value of 4 to 1, which allowed a grade point average to be calculated. An overall average was then calculated weighted according to the number of staff in each department. If no submission to the REF was made then the minimum score was used.
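
A minimal sketch of the grade point average calculation, assuming two departments with invented quality profiles and staff counts:

```python
# REF GPA: weight each quality category (4* down to unclassified) by the
# proportion of research in it, then average across departments by staff count.
STAR_VALUE = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "unclassified": 0}

departments = [
    # (staff submitted, % of research in each quality category)
    (40, {"4*": 30, "3*": 45, "2*": 20, "1*": 5, "unclassified": 0}),
    (15, {"4*": 10, "3*": 40, "2*": 35, "1*": 10, "unclassified": 5}),
]

def gpa(profile):
    return sum(STAR_VALUE[cat] * pct / 100 for cat, pct in profile.items())

total_staff = sum(n for n, _ in departments)
overall = sum(n * gpa(p) for n, p in departments) / total_staff
print(round(overall, 2))  # 2.84, out of a maximum of 4.00
```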

What should you look out for?

Universities could decide who they wanted to return for the REF. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile.

Research intensity (maximum score 1.00)

A measure of the proportion of staff involved in research.

Where does it come from?

The 2014 Research Excellence Framework and HESA data from October 2013. These data are due to be updated in 2022.

How does it work?

The number of staff submitted to the REF was divided by the number who were eligible. If no submission to the REF was made then the minimum score was used.
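
In code, the measure reduces to a single ratio (figures invented):

```python
# Research intensity: staff submitted to the REF divided by eligible staff.
submitted, eligible = 180, 300
research_intensity = submitted / eligible
print(research_intensity)  # 0.6, on a 0-1 scale
```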

What should you look out for?

Universities could decide who they wanted to return for REF. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile and so the research intensity measure can be an underestimate of the actual research intensity.

Graduate prospects – outcomes (maximum score 100.0)

A measure of the success in employability or further study of graduates completing their first degree.

Where does it come from?

The Graduate Outcomes survey, which surveys graduates fifteen months after they complete their course, using HESA data for 2017–18.

How does it work?

The number of graduates who take up employment or further study is divided by the total number of graduates with a known destination, and the result is expressed as a percentage. Only highly skilled employment (i.e. jobs that normally recruit graduates) is included. The results are then adjusted to take account of the subject mix at the university.
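
A short worked example with invented figures (the subject-mix adjustment is omitted):

```python
# Graduate prospects - outcomes: positive outcomes as a percentage of
# graduates with a known destination. Only highly skilled employment or
# further study counts as a positive outcome.
positive_outcomes = 1_450
known_destinations = 1_900
prospects = 100 * positive_outcomes / known_destinations
print(round(prospects, 1))  # 76.3
```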

What should you look out for?

A relatively low score on this measure doesn't mean many graduates were unemployed. It may be that some had low-level jobs that don’t normally recruit graduates. Some universities recruit a high proportion of local students and so if they're located in an area where graduate jobs are hard to come by, this can depress the outcome.

Graduate prospects – on track (maximum score 100.0)

A measure of the proportion of graduates who agree that their activity is on track with their future plans.

Where does it come from?

The Graduate Outcomes survey, which surveys graduates fifteen months after they complete their course, using HESA data for 2017–18.

How does it work?

The proportion of graduates who agreed or strongly agreed with the statement ‘My [activity] fits with my future plans’.

What should you look out for?

Graduates' plans vary, and this is a measure of how happy they are with their current situation in relation to those plans. The graduate prospects – outcomes score provides a measure of absolute success, while this measure captures how satisfied graduates are with their situation, regardless of how successful it might appear to be.

Student-staff ratio (maximum score n/a)

A measure of the average staffing level in the university.

Where does it come from?

Calculated using HESA data for 2019–20.

How does it work?

A student-staff ratio (the total number of undergraduate and postgraduate students divided by the number of academic staff) was calculated, with the results adjusted for subject mix.

What should you look out for?

A low student-staff ratio (a small number of students for each member of staff) doesn't guarantee good quality of teaching or good access to staff.

Academic services spend (maximum score n/a)

The expenditure per student on all academic services.

Where does it come from?

HESA data for 2016–17, 2017–18 and 2018–19.

How does it work?

A university's expenditure on library and computing facilities (staff, books, journals, computer hardware and software, but not buildings), plus museums, galleries and observatories, was divided by the number of full-time equivalent students in the latest year. Expenditure over three years was averaged to allow for uneven spending.
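
A small sketch of the calculation, with invented spending figures and student numbers:

```python
# Academic services spend: average three years of expenditure, then divide
# by the latest year's full-time equivalent (FTE) student count.
spend = [18_200_000, 21_500_000, 19_700_000]  # 2016-17, 2017-18, 2018-19
fte_students = 11_000                         # latest year

spend_per_student = (sum(spend) / len(spend)) / fte_students
print(round(spend_per_student))  # 1800 per FTE student
```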

What should you look out for?

Some universities are the location for major national facilities, such as the Bodleian Library in Oxford and the national computing facilities in Bath and Manchester. The local and national expenditure is very difficult to separate and so these universities will tend to score more highly on this measure.

Facilities spend (maximum score n/a)

The expenditure per student on staff and student facilities.

Where does it come from?

HESA data for 2016–17, 2017–18 and 2018–19.

How does it work?

A university's expenditure on student facilities (e.g. sports, careers services, health, counselling) was divided by the number of full-time equivalent students in the latest year. Expenditure over three years was averaged to allow for uneven expenditure.

What should you look out for?

This measure tends to disadvantage some collegiate universities as it mostly includes central university expenditure. In Oxford and Cambridge, for example, a significant amount of facilities expenditure is by the colleges, but this does not appear in the university’s accounts.

Degree completion (maximum score 100.0)

A measure of the completion rate of first-degree undergraduates studying at the university.

Where does it come from?

HESA performance indicators based on data for 2018–19 and 2019–20.

How does it work?

HESA calculated the expected outcomes for a cohort of students based on what happened to students in the current year. The figures in the tables show the percentage of students who were expected to complete their course or transfer to another institution.

What should you look out for?

This measure of completion is a projection based on a snapshot of data, so it's vulnerable to statistical fluctuations.

Summary of measures

Measure                        Data source  Years                      Subject mix adjustment?  Weight
Entry standards                HESA         2019–20                    Yes                      1.0
Student satisfaction           NSS          2020                       Yes                      1.5
Research quality               REF          2014                       –                        1.0
Research intensity             HESA         2013                       –                        0.5
Graduate prospects – outcomes  HESA         2017–18                    Yes                      0.67
Graduate prospects – on track  HESA         2017–18                    Yes                      0.33
Student-staff ratio            HESA         2019–20                    Yes                      1.0
Academic services spend        HESA         2016–17, 2017–18, 2018–19  –                        0.5
Facilities spend               HESA         2016–17, 2017–18, 2018–19  –                        0.5
Degree completion              HESA         2018–19, 2019–20           –                        1.0

How we compile the tables

  • A statistical technique called the z-transformation is applied to each measure to create a score – this ensures each measure contributes the same amount to the overall score and so avoids any need for scaling
  • Five of the measures are then adjusted to take account of the subject mix at a university (a side-effect of this is that it’s impossible to recalculate the total score in the tables using the published data as you'd need full access to all the raw data)
  • The z-scores (adjusted z-scores where the subject mix was taken into account) on each measure are then weighted (see table above) and totalled to give an overall score for the university
  • These total z-scores are then transformed to a scale where the top score is set at 1,000, with the remainder being a proportion of the top score – this scaling doesn't affect the overall ranking but it avoids giving any university a negative overall score (a short sketch of these steps follows below)
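
The sketch below walks through these steps for two measures and three invented universities. The weights follow the summary table; the exact transformation the Guide uses to avoid negative scores isn't spelled out above, so shifting the minimum to zero before scaling the top to 1,000 is an assumption.

```python
# Sketch of the compilation pipeline: z-transform each measure, apply the
# published weights, total per university, then rescale so the top is 1,000.
from statistics import mean, pstdev

scores = {  # measure -> one (already subject-mix adjusted) value per university
    "entry_standards": [132.0, 208.0, 95.0],
    "student_satisfaction": [4.03, 4.30, 3.72],
}
weights = {"entry_standards": 1.0, "student_satisfaction": 1.5}

def z_transform(values):
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

zs = {k: z_transform(v) for k, v in scores.items()}
# Per the student satisfaction section, its z-score is divided by three.
zs["student_satisfaction"] = [v / 3 for v in zs["student_satisfaction"]]

n_unis = len(scores["entry_standards"])
totals = [sum(weights[k] * zs[k][i] for k in scores) for i in range(n_unis)]

# Assumed anti-negative shift: set the minimum to zero, then scale to 1,000.
lo, hi = min(totals), max(totals)
final = [1000 * (t - lo) / (hi - lo) for t in totals]
print([round(f) for f in final])  # the ranking is unchanged by the rescaling
```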

2022 league table scores

                               Mean    Max    Min
Entry standards                132     208    95
Student satisfaction           4.03    4.30   3.72
Research quality               2.70    3.36   1.40
Research intensity             0.47    0.95   0.07
Graduate prospects – outcomes  71.6    95.1   53.1
Graduate prospects – on track  76.7    90.0   60.5
Student-staff ratio            16.2    26.3   10.1
Academic services spend        1715    2982   884
Facilities spend               684     1939   189
Degree completion              84.8    99.1   53.1

The measures for the subject tables are the same as for the main university league table, except only six measures are used: student satisfaction, research quality, research intensity, entry standards, graduate prospects – outcomes and graduate prospects – on track.

  • If subject-level NSS data were not available, aggregated data across similar subjects were used; if these were not available, data from the previous year were used
  • If insufficient graduate prospects data were available, aggregated data were considered to see if that generated sufficient data; scores were withheld where the number of students was too small to calculate a reliable percentage
  • The calculation of the overall score is also the same, except there's no need for any subject mix adjustment and weights are 1.0 for student satisfaction and entry standards, 0.67 for research assessment and graduate prospects – outcomes, and 0.33 for research intensity and graduate prospects – on track
  • In the tables for Dentistry and Medicine, the graduate prospects score is ignored because it's almost identical for all institutions. In the Nursing table it's halved for similar reasons
  • To qualify for inclusion in a subject table, a university has to have data for student satisfaction and at least one of entry standards or graduate prospects 
  • A blank in one of the columns isn't a zero score but indicates that no valid data were available; the final score was calculated using the data that were available

Arts, Drama and Music institutions

                               Mean    Max    Min
Entry standards                137     167    118
Student satisfaction           3.98    4.24   3.62
Research quality               2.85    3.49   2.29
Research intensity             0.46    0.96   0.15
Graduate prospects – outcomes  72.5    96.7   42.8
Graduate prospects – on track  79.5    95.7   61.8
Student-staff ratio            12.2    18.0   8.0
Academic services spend        1442    4224   157
Facilities spend               340     898    90
Degree completion              92.3    98.9   83.4

  • Our Arts, Drama & Music league table includes a number of specialist colleges that don't meet the full criteria for inclusion in the main table – some of these institutions will also be listed in their relevant subject tables
  • Note that other institutions also offer courses in Arts, Drama and Music – you can find them in the relevant subject tables and on the main universities ranking table
  • The methodology used is exactly the same as for the main table

How we ensure accuracy

Each university is provided with complete sets of its own HESA data well in advance of publication. In addition, where we find anomalous figures in the HESA data, universities are given a further opportunity to check for and notify any errors.

Notes

Main university table

  • Birkbeck, University of London declined to release its data for use in all UK league tables
  • Highlands & Islands: given its unique collegiate structure with 13 academic partners, it'd be inappropriate for the university to appear in the table
  • The Open University doesn't appear in the table because its students are distant learners and so data are mostly unavailable
  • NSS data not available: Cambridge and Oxford 
  • Entry standards data provided by the university: Cardiff Metropolitan and Kent
  • Student-staff ratio (SSR) data provided by the university: Canterbury Christ Church, Exeter, Salford, Suffolk, Trinity Saint David
  • Staff and student facilities spend provided by the university: Manchester Metropolitan (2015–16); historical datasets unreliable for Suffolk, 2016–17 and 2017–18 used
  • Academic services spend provided by the university: Edinburgh Napier (2015–16 and 2016–17), Manchester Metropolitan (2015–16), Salford (2015–16) and Central Lancashire (2015–16 and 2016–17); historical datasets unreliable for Suffolk, 2016–17 and 2017–18 used
  • Graduate prospects data provided by the university: Newcastle
  • Completions data not available: Buckingham (2016–17 data used)
  • Research GPA data not available: Buckingham, Leeds Arts University, Plymouth Marjon, Ravensbourne, Suffolk (minimum used)
  • Research intensity data not available: Buckingham, Leeds Arts University, Plymouth Marjon, Ravensbourne, Suffolk (minimum used); data provided by the university: Staffordshire
  • The rankings in the Complete University Guide are largely based on data from before the start of the coronavirus pandemic, so they do not take into account university performances during the pandemic.

Arts, Drama & Music league table

  • NSS data not available: Courtauld Institute of Art, Royal Central School of Speech and Drama, and Royal Conservatoire of Scotland (2016 score plus average sector change from 2016–2018 used)
  • Academic services spend data not available: Conservatoire for Dance and Drama, and Royal Conservatoire of Scotland (minimum used)
  • Facilities spend data not available: Royal Conservatoire of Scotland (minimum used)
  • Research GPA data not available: Conservatoire for Dance and Drama, Liverpool Institute for Performing Arts (minimum used)
  • Research intensity data not available: Conservatoire for Dance and Drama, Liverpool Institute for Performing Arts (minimum used)
  • The rankings in the Complete University Guide are largely based on data from before the start of the coronavirus pandemic, so they do not take into account university performances during the pandemic.

Advisory board

We also consult the universities on our methodology. Once a year, an advisory board of university experts meets to discuss the methodology and how it can be improved.

We've made every effort to ensure accuracy but can't take responsibility for errors or omissions. The data providers don't necessarily agree with the data aggregations or manipulations we use, and aren't responsible for any inferences or conclusions derived from them.
