Methodology

Where do the data come from?

All the data come from sources in the public domain:

  • The National Student Survey and the 2014 Research Excellence Framework are overseen by the funding councils for UK universities: the Higher Education Funding Council for England (HEFCE), the Scottish Higher Education Funding Council (SHEFC) and the Higher Education Funding Council for Wales (HEFCW).
  • The Higher Education Statistics Agency (HESA) provided data for entry standards, student-staff ratios, spending on academic services, facilities spending, good honours degrees, graduate prospects, completion and international student enrolments. HESA is the official agency for the collection, analysis and dissemination of quantitative information about the universities.
  • In just a few cases the source data were not available from these bodies and were obtained directly from the individual universities (see notes below).
  • You can read about the background to the league tables and the inclusion criteria; how to use the tables; and the press releases relating to the 2016 league tables.

What are the measures that we use?

Entry Standards (maximum score n/a)

What is it? 

  • The average UCAS tariff score of new undergraduate students.

Where does it come from? 

  • HESA data for 2013–14.

How does it work? 

  • Each student's examination results were converted to a numerical score (A level A=120, B=100 ... E=40, etc; Scottish Highers A=72, B=60, etc) and summed to give a total tariff score. HESA then calculated an average for all students at the university. The results were then adjusted to take account of the subject mix at the university. Students on a foundation year were excluded.
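
To make the conversion concrete, here is a minimal sketch in Python. It uses the pre-2017 A level tariff values quoted above, invented sample students, and omits the subject-mix adjustment (which requires the full raw data).

```python
# Minimal sketch of the entry standards average, using the pre-2017 UCAS tariff
# values quoted above. The sample students are invented and the subject-mix
# adjustment is omitted.
A_LEVEL_TARIFF = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40}

def tariff_total(grades):
    """Sum the tariff points for one student's A level grades."""
    return sum(A_LEVEL_TARIFF[g] for g in grades)

# Hypothetical entrants: each inner list is one student's A level grades.
students = [["A", "A", "B"], ["B", "B", "C"], ["A", "B", "B"]]

average_entry_standard = sum(tariff_total(s) for s in students) / len(students)
print(f"Average entry standard: {average_entry_standard:.0f}")  # about 313 for this sample
```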

What should you look out for? 

  • A high average score (it is over 400, or more than three As at A level, at some universities) does not mean that all students score so highly or that you need to take lots of A levels to get in. The actual grades needed will vary by subject, and few if any courses will ask for grades in more than three subjects (even if some students do take more). Universities that deliberately accept students with lower grades as part of an access policy will tend to have their average score depressed.

Student Satisfaction (maximum score 5.00)

What is it?

  • A measure of student views of the teaching quality at the university.

Where does it come from? 

  • The National Student Survey (NSS), a survey of final-year undergraduate students in 2014.

How does it work? 

  • The National Student Survey asked questions about a variety of aspects of teaching. The average satisfaction score for all questions except the three about learning resources was calculated and then adjusted for the subject mix at the university. Due to the distribution of the data, and to avoid this measure having an undue influence on the overall ranking, the z-score was divided by three.
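
As an illustration of the damping step, the sketch below computes a z-score for each university and divides it by three. The satisfaction values are invented, the population standard deviation is used for simplicity, and the subject-mix adjustment is omitted.

```python
# Illustrative sketch of the satisfaction z-score damping described above.
import statistics

# Hypothetical institution-level NSS satisfaction averages on a 1-5 scale.
satisfaction = {"Uni A": 4.21, "Uni B": 4.05, "Uni C": 3.88}

mean = statistics.mean(satisfaction.values())
sd = statistics.pstdev(satisfaction.values())  # population SD, for simplicity

# z-score for each university, divided by three to limit its influence on the ranking.
damped_z = {uni: ((score - mean) / sd) / 3 for uni, score in satisfaction.items()}
print(damped_z)
```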

What should you look out for? 

  • The survey is a measure of student opinion, not a direct measure of quality, so it may be influenced by a variety of biases, such as the effect of prior expectations. A university expected to deliver excellent teaching could score lower than a weaker university which, despite offering lower-quality teaching, nonetheless does better than its students expect of it. A few Scottish universities were not included in the survey and were given the average outcome for all universities in the survey.

Research Assessment (maximum score 4.00)

What is it? 

  • A measure of the quality of the research undertaken in the university.

Where does it come from? 

  • The 2014 Research Excellence Framework (REF) undertaken by the funding councils.

How does it work? 

  • Each university department entered in the assessment exercise achieved a quality profile which gave the proportion of research in each of four categories from 4* to 1* (with any remaining activity being unclassified).  For the research assessment measure, the categories 4* to 1* were given a numerical value of 4 to 1 which allowed a grade point average to be calculated. An overall average was then calculated weighted according to the number of staff in each department. 
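
A small sketch of that calculation, with an invented quality profile and invented staff numbers for just two departments, might look like this:

```python
# Sketch of the research assessment grade point average described above.
# The quality profiles (% of research at each star level) and staff numbers are invented.

def gpa(profile):
    """Grade point average: 4* counts 4, 3* counts 3, ..., unclassified counts 0."""
    return sum(stars * share / 100 for stars, share in profile.items())

departments = {
    "Physics": {"profile": {4: 30, 3: 45, 2: 20, 1: 5}, "staff": 40},
    "History": {"profile": {4: 20, 3: 40, 2: 30, 1: 10}, "staff": 25},
}

# Overall average weighted by the number of staff in each department.
total_staff = sum(d["staff"] for d in departments.values())
overall = sum(gpa(d["profile"]) * d["staff"] for d in departments.values()) / total_staff
print(f"Research assessment score: {overall:.2f} (maximum 4.00)")  # about 2.88 here
```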

What should you look out for?

  • Universities could decide who they wanted to return for the REF. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile.

Research Intensity (maximum score 1.00)

What is it? 

  • A measure of the proportion of staff involved in research.

Where does it come from? 

  • The 2014 Research Excellence Framework and HESA data at October 2013.

How does it work? 

  • The number of staff submitted to the Research Excellence Framework was divided by the number of staff eligible for submission, giving the proportion who were submitted.
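
As a minimal illustration, with invented staff numbers:

```python
# Research intensity: the proportion of eligible staff submitted to the REF.
# Staff numbers are invented for illustration.
staff_submitted = 65
staff_eligible = 90

research_intensity = staff_submitted / staff_eligible
print(f"Research intensity: {research_intensity:.2f} (maximum 1.00)")  # 0.72 here
```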

What should you look out for? 

  • Universities could decide who they wanted to return for the REF. In some cases, quite good researchers were omitted as a way of getting the best possible quality profile, so the research intensity measure can be an underestimate of the actual research intensity.

Graduate Prospects (maximum score 100.0)

What is it? 

  • A measure of the employability of a university's first degree graduates.

Where does it come from? 

  • HESA data for 2012–13.

How does it work? 

  • The number of graduates who took up employment or further study was divided by the total number of graduates with a known destination and expressed as a percentage. Only employment in an area that normally recruits graduates was included. The results were then adjusted to take account of the subject mix at the university.
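
A rough sketch with invented destination counts (and no subject-mix adjustment) might look like this:

```python
# Sketch of the graduate prospects percentage described above, with invented counts.
# Only graduate-level work counts towards the numerator; only known destinations
# appear in the denominator; the subject-mix adjustment is omitted.
graduate_level_employment = 1450
further_study = 420
other_known_destinations = 630   # e.g. non-graduate jobs or unemployed

known_destinations = graduate_level_employment + further_study + other_known_destinations
graduate_prospects = 100 * (graduate_level_employment + further_study) / known_destinations
print(f"Graduate prospects: {graduate_prospects:.1f}%")  # 74.8% here
```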

What should you look out for? 

  • A relatively low score on this measure does not mean that many graduates were unemployed. It may be that some took jobs, such as shop assistant, that do not normally recruit graduates. Some universities recruit a high proportion of local students, so if they are located in an area where graduate jobs are hard to come by this can depress the outcome.

Student–Staff Ratio (maximum score n/a)

What is it? 

  • A measure of the average staffing level in the university.

Where does it come from? 

  • Calculated using HESA data for 2013–14.

How does it work? 

  • A student–staff ratio (i.e. the total number of undergraduate and postgraduate students divided by the number of academic staff) was calculated. Again, the results were adjusted for subject mix.
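
For example, with invented headcounts and the subject-mix adjustment omitted:

```python
# Minimal sketch of the student-staff ratio, with invented figures.
undergraduates = 14200
postgraduates = 3800
academic_staff = 1200

student_staff_ratio = (undergraduates + postgraduates) / academic_staff
print(f"Student-staff ratio: {student_staff_ratio:.1f}")  # 15.0 students per staff member
```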

What should you look out for? 

  • A low student–staff ratio, i.e. a small number of students for each member of staff, does not guarantee good quality of teaching or good access to staff.

Academic Services Spend (maximum score n/a)

What is it? 

  • The expenditure per student on all academic services.

Where does it come from?

  • HESA data for 2011–12, 2012–13, and 2013–14.

How does it work?

  • A university's expenditure on library and computing facilities (staff, books, journals, computer hardware and software, but not buildings), museums, galleries and observatories was divided by the number of full-time equivalent students in the latest year. Libraries and information technology are becoming increasingly integrated (many universities have a single Department of Information Services encompassing both) and so the two areas of expenditure have both been included alongside any other academic services. Expenditure over three years was averaged to allow for uneven expenditure.
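
One plausible reading of the averaging, sketched below with invented figures, is to take the mean expenditure over the three years and divide by the number of full-time equivalent students in the latest year; the Facilities Spend measure follows the same shape of calculation.

```python
# Illustrative sketch of a spend-per-student measure. The figures are invented, and
# the averaging follows one plausible reading: mean spend over three years divided
# by the latest year's full-time equivalent student numbers.
spend_by_year = {"2011-12": 14.2e6, "2012-13": 15.1e6, "2013-14": 16.3e6}  # £ per year
fte_students_latest = 18_500

average_spend = sum(spend_by_year.values()) / len(spend_by_year)
spend_per_student = average_spend / fte_students_latest
print(f"Academic services spend per student: £{spend_per_student:,.0f}")  # about £822 here
```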

What should you look out for?

  • Some universities are the location for major national facilities, such as the Bodleian Library in Oxford and the national computing facilities in Bath and Manchester. The local and national expenditure is very difficult to separate and so these universities will tend to score more highly on this measure.

Facilities Spend (maximum score n/a)

What is it? 

  • The expenditure per student on staff and student facilities.

Where does it come from? 

  • HESA data for 2011–12, 2012–13, and 2013–14.

How does it work? 

  • A university's expenditure on student facilities (sports, careers services, health, counselling, etc) was divided by the number of full-time equivalent students in the latest year. Expenditure over three years was averaged to allow for uneven expenditure.

What should you look out for? 

  • This measure tends to disadvantage some collegiate universities, as it mostly includes central university expenditure. In Oxford and Cambridge, for example, a significant amount of facilities expenditure is by the colleges but it has not yet been possible to extract comparable data from the college accounts.

Good Honours (maximum score 100.0)

What is it? 

  • The percentage of first degree graduates achieving a first or upper second class honours degree.

Where does it come from? 

  • HESA data for 2013–14.

How does it work? 

  • The number of graduates with first or upper second class degrees was divided by the total number of graduates with classified degrees. Unclassified enhanced first degrees, such as an MEng awarded after a four-year engineering course, were treated as equivalent to a first or upper second for this purpose, while Scottish Ordinary degrees (awarded after three years rather than the usual four in Scotland) were excluded altogether. The results were then adjusted to take account of the subject mix at the university.
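
As a rough sketch with invented award counts, treating enhanced unclassified degrees such as the MEng as good honours and excluding Scottish Ordinary degrees (subject-mix adjustment omitted):

```python
# Sketch of the good honours percentage described above, with invented counts.
firsts_and_upper_seconds = 2100
enhanced_unclassified = 150   # e.g. MEng, treated as equivalent to a first or 2:1
other_classified = 1050       # lower seconds, thirds, etc.
# Scottish Ordinary degrees are excluded from both numerator and denominator.

good = firsts_and_upper_seconds + enhanced_unclassified
good_honours = 100 * good / (good + other_classified)
print(f"Good honours: {good_honours:.1f}%")  # about 68.2% here
```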

What should you look out for? 

  • Degree classifications are controlled by the universities themselves, though with some moderation by the external examiner system. It can be argued, therefore, that they are not a very objective measure of quality. However, degree class is the primary measure of individual success in British higher education and will have an impact elsewhere, such as employment prospects.

Degree Completion (maximum score 100.0)

What is it? 

  • A measure of the completion rate of first degree undergraduates studying at the university.

Where does it come from? 

  • HESA performance indicators, based on data for 2013–14 and earlier years.

How does it work? 

  • HESA calculated the expected outcomes for a cohort of students based on what happened to students in the current year. The figures in the tables show the percentage of students who were expected to complete their course or transfer to another institution.

What should you look out for? 

  • This measure of completion is a projection based upon a snapshot of data. It is therefore vulnerable to statistical fluctuations.

Summary of measures

Measure                    Data Source   Years                        Subject Mix Adjustment?   Weight
Entry standards            HESA          2013–14                      Yes                       1.0
Student satisfaction       NSS           2014                         Yes                       1.5
Research assessment        REF           2014                         No                        1.0
Research intensity         HESA          2013                         No                        0.5
Graduate prospects         HESA          2012–13                      Yes                       1.0
Student–staff ratio        HESA          2013–14                      Yes                       1.0
Academic services spend    HESA          2011–12, 2012–13, 2013–14    No                        0.5
Facilities spend           HESA          2011–12, 2012–13, 2013–14    No                        0.5
Good honours               HESA          2013–14                      Yes                       1.0
Degree completion          HESA          2012–13, 2013–14             No                        1.0

How did we compile the tables?

The main League Table measures ten key aspects of university activity using the most recent data available at the time of compilation. 

  • A statistical technique called the z-transformation was applied to each measure to create a score for that measure. This ensures that each measure contributes the same amount to the overall score and so avoids any need for scaling. (For the statistically minded, it involves subtracting the mean score from each individual score and then dividing by the standard deviation of the scores.)
  • Five of the measures were then adjusted to take account of the subject mix at a university. A university with a medical school, for example, will tend to admit students with a higher tariff score than one without, simply because of its subject mix. The adjustment removes this subject effect. (A side-effect of this is that it is impossible to recalculate the total score in the tables using the published data, as you would need full access to all the raw data to be able to do that.)
  • The z-scores (adjusted z-scores where the subject mix was taken into account) on each measure were then weighted (1.5 for Student Satisfaction, 0.5 for research intensity and the two spend measures, and 1.0 for the rest) and summed to give a total score for the university.
  • Finally, these total z-scores were transformed to a scale where the top score was set at 1,000, with the remainder being a proportion of the top score. This scaling does not affect the overall ranking but it avoids giving any university a negative overall score. A worked sketch of the whole calculation follows below.
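
The sketch below walks through those steps for three invented universities and just three of the ten measures. The subject-mix adjustment and the division of the satisfaction z-score by three are omitted, and because the exact final rescaling is not spelled out above, the sketch shifts the totals so the lowest is zero before scaling the top to 1,000, which preserves the ranking and avoids negative scores.

```python
# Minimal sketch of the scoring pipeline described above: z-transform each measure,
# weight it, sum the weighted z-scores, then rescale so the top university gets 1,000.
# The universities, values and choice of three measures are all invented; the
# subject-mix adjustment and the satisfaction damping are omitted.
import statistics

measures = {
    # measure: (weight, {university: raw score})
    "entry_standards":      (1.0, {"Uni A": 520, "Uni B": 410, "Uni C": 330}),
    "student_satisfaction": (1.5, {"Uni A": 4.10, "Uni B": 4.20, "Uni C": 3.95}),
    "facilities_spend":     (0.5, {"Uni A": 900, "Uni B": 600, "Uni C": 450}),
}

def z_scores(values):
    """z-transformation: subtract the mean, then divide by the standard deviation."""
    mean = statistics.mean(values.values())
    sd = statistics.pstdev(values.values())
    return {uni: (v - mean) / sd for uni, v in values.items()}

totals = {uni: 0.0 for uni in ["Uni A", "Uni B", "Uni C"]}
for weight, values in measures.values():
    for uni, z in z_scores(values).items():
        totals[uni] += weight * z

# Rescale so the top score is 1,000. The exact published rescaling is not specified;
# shifting so the lowest total is zero is one way to avoid negative overall scores.
low, high = min(totals.values()), max(totals.values())
league_scores = {uni: round(1000 * (t - low) / (high - low)) for uni, t in totals.items()}
print(league_scores)  # e.g. {'Uni A': 1000, 'Uni B': 871, 'Uni C': 0}
```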

Summary of the 2016 League Table scores

 

Measure                    Mean    Max     Min
Entry standards            352     601     220
Student satisfaction       4.05    4.27    3.81
Research quality           2.70    3.36    1.40
Research intensity         0.47    0.95    0.07
Graduate prospects         66.3    89.9    44.6
Student–staff ratio        16.8    28.6    10.3
Academic services spend    1232    4280    355
Facilities spend           500     1434    76
Good honours               70.3    92.1    49.5
Degree completion          86.5    98.4    67.5

The measures for the Subject Tables are the same as for the main University League Table, except that only five measures are used: Student Satisfaction, Research Quality, Research Intensity, Entry Standards and Graduate Prospects.

  • Two years of Graduate Prospects data were aggregated to make the data more reliable and scores were withheld where the number of students was too small to calculate a reliable percentage. The calculation of the overall score is also the same except that there is no need for any subject mix adjustment and weights are 1.0 for student satisfaction, entry standards and graduate prospects, 0.67 for research assessment and 0.33 for research intensity. In the tables for Dentistry and Medicine the graduate prospects score is ignored because it is almost identical for all institutions.
  • To qualify for inclusion in a subject table, a university had to have data for at least two of the measures, one of which had to be student satisfaction (with the two research measures counting as one for this purpose). A blank in one of the columns is not a zero score but rather indicates that no valid data were available. Where no data were available, the final score was calculated using the data that were available, as in the sketch below.
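
For illustration, the sketch below scores one university in one subject using invented adjusted z-scores, one of which is missing. The reading here, which is an assumption, is that a missing measure simply drops out of the weighted sum.

```python
# Sketch of the subject-table weighting described above, with invented z-scores.
# A value of None marks a measure with no valid data; it is left out of the sum,
# reflecting the note that a blank is not a zero score.
SUBJECT_WEIGHTS = {
    "student_satisfaction": 1.0,
    "entry_standards": 1.0,
    "graduate_prospects": 1.0,
    "research_assessment": 0.67,
    "research_intensity": 0.33,
}

z_scores = {  # hypothetical adjusted z-scores for one university in one subject
    "student_satisfaction": 0.8,
    "entry_standards": 0.3,
    "graduate_prospects": None,   # no valid data available
    "research_assessment": 1.1,
    "research_intensity": 0.9,
}

total = sum(SUBJECT_WEIGHTS[m] * z for m, z in z_scores.items() if z is not None)
print(f"Subject total before rescaling: {total:.2f}")  # 2.13 here
```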

How do we ensure the tables are accurate?

Each university was provided with complete sets of its own HESA data well in advance of publication. 

  • In addition, where we found anomalous figures in the HESA data, universities were given a further opportunity to check for and notify any errors.

Notes

  • Birkbeck is not included in the table as the available datasets are not reflective of the recently restructured institution.
  • Highlands & Islands: given its unique collegiate structure with thirteen academic partners, it would be inappropriate for the university to appear in the table.
  • Wolverhampton refused to release its data, the only university to do so, and is absent from the table.
  • Entry standards data provided by the university: Liverpool Hope; Queen's, Belfast; Trinity Saint David; West of England, Bristol.
  • Student–staff ratio data provided by the university: Aston; Canterbury Christ Church; Central Lancashire; Edinburgh; Exeter; Northumbria; St. George's, University of London; Strathclyde; Trinity Saint David; University of the Arts, London.
  • Student facilities spend provided by the university: Lancaster (2011–12 and 2012–13); Loughborough (2011–12 and 2012–13); Manchester Metropolitan (2011–12 and 2012–13); St Mark & St John (2011–12 and 2012–13). Revised student data provided by Central Lancashire.
  • Spend: historical data unreliable for Trinity Saint David; only 2013–14 used.
  • Research assessment data not available: Buckingham and St Mark & St John (minimum used).
  • Research intensity data provided by the university: Staffordshire.

Similarly, we consulted the universities on methodology. Once a year an Advisory Group of university experts meets to discuss the methodology and how it can be improved.

  • Thus, every effort has been made to ensure accuracy, but no responsibility can be taken for errors or omissions. The data providers do not necessarily agree with data aggregations or manipulations we use and are also not responsible for any inferences or conclusions thereby derived.

On Monday 11 May 2015 we published some revisions to the tables.

  • We brought the Education table in line with the other 66 subject tables by using the NSS rather than Ofsted data.
  • We also made changes to the following subject tables, chiefly because of late reporting of REF mappings by universities: Archaeology, Biological Sciences, Chemistry, Drama, Economics, Geography, Geology, History, History of Art, Mathematics, Medicine, Nursing, Physics & Astronomy, Sports Science.
  • As a consequence of the UCL/Institute of Education merger, and its adjusted REF score, we republished the main League Table to accommodate small shifts in the ranking of four universities around the top ten.

The Green Score

The Green Score is not used in the compilation of the Complete University Guide's University League Table. However, it is another benchmark by which universities can be judged by prospective students.

  • View the Green Score.
  • The 2014 report Student attitudes towards and skills for sustainable development, published by the Higher Education Academy, the NUS and Change Agents UK, states that 'eight in every ten students consistently believe that sustainable development should be actively incorporated and promoted by universities, and this increases as respondents progress through their studies.' Sustainable development, in all its facets, is increasingly important to those going into higher education.

What is it? 

  • A comprehensive and independent ranking of universities by environmental and ethical performance.

Where does it come from?

  • The People & Planet Green League ranking (see below).

How does it work? 

  • The People & Planet Green League ranking combines data obtained directly from the universities with Estates Management statistics data from HESA. It takes a dual approach to environmental management – looking both at a university’s commitment to systemic improvement and at its actual performance. Both are essential indicators of a university’s commitment to and actual transition towards a low-carbon future which will require resilience and innovation from all sectors of society. Read more about the Green League’s methodology.

What should you look out for?   

  • Going to a university which ranks highly in the Green League may reduce your own carbon footprint; conversely, attending one of the least green universities can dramatically increase it.
  • Some 44% of all UK universities now demonstrate a commitment to integrate sustainable development across all aspects of teaching and learning. If this is important to you, the Green League may give an indication of how actively a university is working to reduce its environmental impact.