How the League Table Works

A key to the columns and what to look out for – the League Table measures nine key aspects of university activity using the most recent data available at the time of compilation.

  • As we have mentioned (see Methodology), a statistical technique called the Z-transformation was applied to each measure to create a score for that measure.
  • The Z-scores on each measure were then weighted (1.5 for Student Satisfaction and Research Assessment, 0.5 for the two spend measures, and 1.0 for the rest) and summed to give a total score for the university.
  • Finally, these total scores were transformed to a scale where the top score was set at 1,000, with the remainder being a proportion of the top score. This scaling does not affect the overall ranking but it avoids giving any university a negative overall score. (A sketch of this calculation follows the summary table below.)
  • In addition, some measures (Student Satisfaction, Entry Standards, Student/Staff Ratio, Good Honours, and Graduate Prospects) have been adjusted to take account of the subject mix at the institution. A summary of the 2015 League Table scores is given below:
 

Measure                       Mean     Max      Min
Student Satisfaction           4.0     4.3      3.8
Research Assessment            2.2     3.0      1.1
Entry Standards                358     614      225
Student–Staff Ratio           17.5    30.2     10.2
Academic Services Spend       1174    5280      331
Facilities Spend               453    1291       77
Good Honours                  67.5    91.6     43.5
Graduate Prospects            64.7    88.8     42.7
Completion                    86.3    98.9     67.5
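To make the scoring steps above concrete, here is a minimal Python sketch of the z-transformation, weighting, and 1,000-point scaling. All names are illustrative, and the final rescaling is an assumption: the description above only says that the top score is set to 1,000 and that no university receives a negative overall score.

```python
import statistics

# Weights described above: 1.5 for Student Satisfaction and Research
# Assessment, 0.5 for the two spend measures, and 1.0 for everything else.
WEIGHTS = {
    "student_satisfaction": 1.5,
    "research_assessment": 1.5,
    "entry_standards": 1.0,
    "student_staff_ratio": 1.0,
    "academic_services_spend": 0.5,
    "facilities_spend": 0.5,
    "good_honours": 1.0,
    "graduate_prospects": 1.0,
    "completion": 1.0,
}

def z_scores(values):
    """Standardise one measure across all universities (the z-transformation)."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def league_scores(measures):
    """measures: {measure name: list of values, one per university}.
    Returns the scaled total score for each university."""
    n = len(next(iter(measures.values())))
    totals = [0.0] * n
    for name, values in measures.items():
        for i, z in enumerate(z_scores(values)):
            totals[i] += WEIGHTS[name] * z
    # Assumption: shift the totals so none is negative, then scale the top
    # total to 1,000; this preserves the ranking order.
    low, high = min(totals), max(totals)
    return [1000 * (t - low) / (high - low) for t in totals]
```

For example, league_scores({"student_satisfaction": [4.1, 3.9, 4.3], "completion": [90.0, 85.5, 96.2]}) would rank three hypothetical universities on just two of the nine measures.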

  • The details of how the measures were compiled, together with some advice about their interpretation, are given below.
  • You should note that the Green Score is not used in the compilation of the University League Table. However, it is becoming an increasingly important benchmark upon which universities are measured by prospective students and we are pleased to include it, courtesy of People & Planet Green League.

Student Satisfaction (maximum score 5.00)

What is it?

  • A measure of the view of students of the teaching quality at the university.

Where does it come from?

  • The National Student Survey, a survey of final-year students in 2013.

How does it work?

  • The National Student Survey asked questions about a variety of aspects of teaching. The average satisfaction score for all questions except the three about learning resources was calculated and then adjusted for the subject mix at the university.  Due to the distribution of the data, and to avoid this measure having an undue influence on the overall ranking, the z-score is divided by three. 
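A minimal sketch of that arithmetic, assuming a simple dictionary of per-question averages; the exact NSS question set and the subject-mix adjustment are omitted.

```python
def satisfaction_score(question_averages, learning_resources_questions):
    """Average the per-question NSS satisfaction scores, excluding the three
    questions about learning resources (subject-mix adjustment omitted)."""
    included = [avg for question, avg in question_averages.items()
                if question not in learning_resources_questions]
    return sum(included) / len(included)

# When this measure feeds the overall table, its z-score is divided by three,
# so its effective contribution is 1.5 * (z / 3).
```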

What should you look out for?

  • The survey is a measure of student opinion, not a direct measure of quality. It may therefore be influenced by a variety of biases, such as the effect of prior expectations. A top university expected to deliver excellent teaching could score lower than a weaker university which, while offering lower-quality teaching, nonetheless exceeds its students' expectations.
  • Some Scottish universities were not included in the survey and were given the average outcome for all universities in the survey.

Research Assessment (maximum score 4.00)

What is it?

  • A measure of the average quality of the research undertaken in the university.

Where does it come from?

  • The 2008 Research Assessment Exercise undertaken by the funding councils.

How does it work?

  • Each university department entered in the assessment exercise achieved a quality profile which gave the proportion of research in each of four categories from 4* to 1* (with any remaining activity being unclassified).
  • For the research assessment measure, the categories 4* to 1* were given a numerical value of 4 to 1 which allowed a grade point average to be calculated. An overall average was then calculated weighted according to the number of staff in each department.
  • Then a measure of research intensity (the proportion of staff in the university undertaking the research that contributed to the research quality rating) was calculated. About half of universities agreed to release this information; for the other half it was estimated using a statistical transformation of HESA staff data.
  • The two measures were combined in a ratio of 2:1 with research quality more heavily weighted.
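A minimal sketch of the calculation described above. The grade-point averaging follows the description directly; how quality and intensity are reconciled onto one scale is not spelled out, so the 2:1 combination below (with intensity rescaled to the 0–4 range) is an assumption.

```python
def department_gpa(profile):
    """profile: proportions of research rated 4* to 1*, e.g.
    {"4*": 0.25, "3*": 0.40, "2*": 0.25, "1*": 0.10}; unclassified counts 0."""
    points = {"4*": 4, "3*": 3, "2*": 2, "1*": 1}
    return sum(points[star] * share for star, share in profile.items())

def university_gpa(departments):
    """departments: list of (profile, staff_submitted) pairs.
    Average of departmental GPAs, weighted by staff numbers."""
    total_staff = sum(staff for _, staff in departments)
    weighted = sum(department_gpa(profile) * staff for profile, staff in departments)
    return weighted / total_staff

def research_measure(quality_gpa, intensity):
    """Combine research quality (0-4 GPA) and research intensity (a 0-1
    proportion of staff) in the 2:1 ratio described above; rescaling
    intensity onto the GPA scale is an assumption."""
    return (2 * quality_gpa + 4 * intensity) / 3
```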

What should you look out for?

  • Universities could decide who they wanted to return for assessment. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile.

Entry Standards (maximum score n/a)

What is it?

  • The average UCAS tariff score of new students.

Where does it come from?

  • HESA data for 2012–13.

How does it work?

  • Each student's examination results were converted to a numerical score (A-level A=120, B=100 ... E=40, etc; Scottish Highers A=72, B=60, etc) and summed to give a total score. HESA then calculated an average for all students at the university. The results were then adjusted to take account of the subject mix at the university.
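For illustration, a sketch of the tariff arithmetic for A-level grades only; the tariff values are those quoted above, and other qualifications and the subject-mix adjustment are omitted.

```python
# A-level tariff points quoted above: A = 120 down to E = 40 in steps of 20.
A_LEVEL_TARIFF = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40}

def tariff_score(grades):
    """Total tariff points for one student, e.g. tariff_score(["A", "A", "B"]) == 340."""
    return sum(A_LEVEL_TARIFF[g] for g in grades)

def entry_standards(all_students_grades):
    """Average tariff score across all new students at the university."""
    return sum(tariff_score(g) for g in all_students_grades) / len(all_students_grades)
```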

What should you look out for?

  • A high average score (it is over 400, or more than three As at A-level, at some universities) does not mean that all students score so highly or that you need to take lots of A-levels to get in.
  • The actual grades needed will vary by subject and few if any courses will ask for grades in more than three subjects (even if some students do take more).
  • Universities which have a specific policy of accepting students with low grades as part of an access policy will tend to have their average score depressed.

Student–Staff Ratio (maximum score n/a)

What is it?

  • A measure of the average staffing level in the university.

Where does it come from?

  • Calculated using HESA data for 2012–13.

How does it work?

  • A student–staff ratio (i.e. the number of students divided by the number of staff) was calculated in a way designed to take account of different patterns of staff employment in different universities. Again, the results were adjusted for subject mix.

What should you look out for?

  • A low student–staff ratio, i.e., a small number of students for each member of staff, does not guarantee good quality of teaching or good access to staff.

Academic Services Spending (maximum score n/a)

What is it?

  • The expenditure per student on all academic services.

Where does it come from?

  • HESA data for 2010–11, 2011–12, and 2012–13.

How does it work?

  • A university's expenditure on library and computing facilities (books, journals, staff, computer hardware and software, but not buildings), museums, galleries and observatories was divided by the number of full-time equivalent students.
  • Libraries and information technology are becoming increasingly integrated (many universities have a single Department of Information Services encompassing both) and so the two areas of expenditure have both been included alongside any other academic services.
  • Expenditure over three years was averaged to allow for uneven expenditure.
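A minimal sketch of the spend-per-student arithmetic. Whether the averaging happens before or after dividing by student numbers is not spelled out, so averaging the per-year ratios here is an assumption; the same arithmetic applies to the Facilities Spending measure below.

```python
def spend_per_student(annual_spend, annual_fte):
    """Average expenditure per full-time-equivalent student over three years
    of HESA data, smoothing out uneven spending patterns."""
    per_year = [spend / fte for spend, fte in zip(annual_spend, annual_fte)]
    return sum(per_year) / len(per_year)

# e.g. spend_per_student([11.8e6, 12.4e6, 13.1e6], [10200, 10450, 10600])
```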

What should you look out for?

  • Some universities are the location for major national facilities, such as the Bodleian Library in Oxford and the national computing facilities in Bath and Manchester.
  • The local and national expenditure is very difficult to separate and so these universities will tend to score more highly on this measure.

Facilities Spending (maximum score n/a)

What is it?

  • The expenditure per student on staff and student facilities.

Where does it come from?

  • HESA data for 2010–11, 2011–12, and 2012–13.

How does it work?

  • A university's expenditure on student facilities (sports, careers services, health, counselling, etc) was divided by the number of full-time equivalent students.
  • Expenditure over three years was averaged to allow for uneven expenditure.

What should you look out for?

  • This measure tends to disadvantage some collegiate universities, as it mostly includes central university expenditure. In Oxford and Cambridge, for example, a significant amount of facilities expenditure is by the colleges but it has not yet been possible to extract comparable data from the college accounts.

Good Honours (maximum score 100.0)

What is it?

  • The percentage of graduates achieving a first or upper second class honours degree.

Where does it come from?

  • HESA data for 2012–13.

How does it work?

  • The number of graduates with first or upper second class degrees was divided by the total number of graduates with classified degrees.
  • Enhanced first degrees, such as an MEng awarded after a four-year engineering course, were treated as equivalent to a first or upper second for this purpose, while Scottish Ordinary degrees (awarded after three years rather than the usual four in Scotland) were excluded altogether. The results were then adjusted to take account of the subject mix at the university.
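A minimal sketch of the percentage described above, before the subject-mix adjustment; the parameter names are illustrative.

```python
def good_honours_rate(firsts, upper_seconds, enhanced_firsts, classified_total):
    """Percentage of classified degrees awarded as a first, an upper second,
    or an enhanced first degree such as an MEng. Scottish Ordinary degrees
    are excluded from classified_total before this is calculated."""
    return 100 * (firsts + upper_seconds + enhanced_firsts) / classified_total
```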

What should you look out for?

  • Degree classifications are controlled by the universities themselves, though with some moderation by the external examiner system.
  • It can be argued, therefore, that they are not a very objective measure of quality. However, degree class is the primary measure of individual success in British higher education and will have an impact elsewhere, such as employment prospects.

Graduate Prospects (maximum score 100.0)

What is it?

  • A measure of the employability of a university's graduates.

Where does it come from?

  • HESA data for 2011–12.

How does it work?

  • The number of graduates who took up employment or further study was divided by the total number of graduates with a known destination, and expressed as a percentage.
  • Only employment in an area that normally recruits graduates was included. The results were then adjusted to take account of the subject mix at the university.
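As with Good Honours, this is a simple percentage; the sketch below uses illustrative parameter names and omits the subject-mix adjustment.

```python
def graduate_prospects(graduate_level_jobs, further_study, known_destinations):
    """Percentage of graduates with a known destination who entered
    graduate-level employment or went on to further study."""
    return 100 * (graduate_level_jobs + further_study) / known_destinations
```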

What should you look out for?

  • A relatively low score on this measure does not mean that many graduates were unemployed. It may be that some had low-level jobs such as shop assistants, which do not normally recruit graduates.
  • Some universities recruit a high proportion of local students. If they are located in an area where graduate jobs are hard to come by, this can depress the outcome.
  • A measure of the employability of graduates is included in the HEFCE performance indicators, but it is only available at institution level. The HESA data was used instead so that a subject-mix adjustment could be made.

Completion (maximum score 100.0)

What is it?

  • A measure of the completion rate of those studying at the university.

Where does it come from?

  • HESA performance indicators, based on data for 2012–13 and earlier years.

How does it work?

  • HESA calculated the expected outcomes for a cohort of students based on what happened to students in the current year.
  • The figures in the tables show the percentage of students who were expected to complete their course or transfer to another institution.

What should you look out for?

  • This measure of completion is a projection based upon a snapshot of data. It is therefore vulnerable to statistical fluctuations.

Green Score

What is it?

  • A comprehensive and independent ranking of universities by environmental and ethical performance. Remember that the Green Score is not one of the nine measures used to compile our University League Table rankings.

Where does it come from?

  • The Green Scores come from the award-winning Green League, provided courtesy of People & Planet, the UK's largest student campaigning network.

How does it work?

  • The People & Planet Green League ranking combines data obtained directly from the universities with Estates Management statistics data from HESA.
  • It takes a dual approach to environmental management – looking both at universities’ commitment to systemic improvement and at their actual performance. Both are essential indicators of universities’ commitment to and actual transition towards a low-carbon, post-oil future which will require resilience and innovation from all sectors of society. Read more about the Green League’s methodology.

What should you look out for?

  • Studying or working at the greenest UK universities will provide you with skills and knowledge to get the green jobs of the future; going to the least green universities can dramatically increase your own carbon footprint.

Conclusions

Universities' positions in the tables inevitably reflect more than their performance over a single year.

  • Many of those at the top have built their reputations and developed their expertise over many decades or even centuries, while some of those at the bottom are still carving out a niche in the unified higher education system.
  • Perhaps the least surprising conclusions to be drawn from the tables are that Oxbridge and parts of the University of London remain the dominant forces in British higher education and that, on the measures adopted here, the newer universities still have ground to make up on the old.
  • However, many of the latter have different priorities from those of any of their more established counterparts and can often demonstrate strengths in other areas.

In an exercise such as this, some distortions are inevitable and the main ones have been identified in the What should you look out for? sections above. The use of a variety of indicators is intended to diminish such effects, but they should be borne in mind when making comparisons.
