Methodology

The raw data for the league tables all come from sources in the public domain.

  • The Higher Education Statistics Agency (HESA) provided data for entry standards, student-staff ratios, spending on academic services, facilities spending, good honours degrees, graduate prospects, completion and overseas student enrolments. HESA is the official agency for the collection, analysis and dissemination of quantitative information about the universities.
  • The Higher Education Funding Council for England (HEFCE), the Scottish Higher Education Funding Council (SHEFC) and the Higher Education Funding Council for Wales (HEFCW) are the funding councils whose remit is to develop policy and allocate public funds to the universities.
  • The 2008 Research Assessment Exercise, conducted by the funding councils, provides the data for the research assessment measure used in the tables.
  • The funding councils also have a statutory responsibility to assess the quality of learning and teaching in the UK universities they fund. In England, Wales and Northern Ireland, the funding councils oversaw the National Student Survey, a major survey of final-year students' views on the quality of the courses they were studying. We use the outcomes of this survey as a measure of student satisfaction.
  • In a few cases the source data were not available from these bodies and were obtained directly from the individual universities.

Consultations

All universities were provided with complete sets of their own HESA data well in advance of publication.

In addition, where anomalous figures were identified in the HESA data, institutions were given a further opportunity to check for and notify any errors. The outcome of that consultation gave rise to the following:

  • Liverpool Hope, Newport, Swansea Metropolitan, Trinity St David and Wolverhampton requested not to be included in the tables.
  • Highlands and Islands – excluded because the unique collegiate structure of the University of the Highlands and Islands, with thirteen academic partners, makes it inappropriate for the university to appear in the tables.
  • Birkbeck and South Wales – excluded due to incomplete data sets.
  • Norwich University of the Arts – excluded as a single subject university.
  • Entry standards data provided by the University – Greenwich, London South Bank.
  • SSR data provided by the University – Bedfordshire, Central Lancashire, Exeter, Glyndŵr & Liverpool John Moores.
  • Academic services spend provided by the University – Anglia Ruskin (2009–10 and 2010–11), Bolton (2009–10), Edge Hill (2009–10), Queen Mary (2010–11).
  • Staff and student facilities spend provided by the University – Anglia Ruskin (2009–10 and 2010–11), Edge Hill (2009–10).
  • Good honours data provided by the University – Glyndŵr.
  • Completion data not available, last year's data used – London South Bank.
  • RAE data not available – Buckingham and University College, Birmingham (minimum used).
  • RAE intensity data not available – Leeds Trinity (minimum used).
  • Graduate prospects data not available, last year's data used – London South Bank.

Similarly, we consulted the universities on methodology.

  • Once a year an Advisory Group with university representatives meets to discuss the methodology and how it can be improved. Every effort has been made to ensure accuracy, but no responsibility can be taken for errors or omissions. The data providers do not necessarily agree with the data aggregations or manipulations we use, and they are not responsible for any inferences or conclusions derived from them.

This analysis of the results of the Research Assessment Exercise 2008 makes use of contextual data supplied by the Higher Education Statistics Agency (HESA), which has required that we publish the following statement:

"HESA holds no data specifying which or how many staff have been regarded by each institution as eligible for inclusion in RAE 2008, and no data on the assignment to Units of Assessment of those eligible staff not included. Further, the data that HESA does hold is not an adequate basis on which to estimate eligible staff numbers, whether for an institution as a whole, or disaggregated by Units of Assessment, or by some broader subject-based grouping."

Statistical Methods

A particular feature of the tables is the way the various measures are combined to create a total score.

  • All the scores have undergone a Z-transformation, a statistical method that ensures each measure contributes the same amount to the overall score and so avoids the need for scaling. (For the statistically minded, it involves subtracting the mean score from each individual score and then dividing by the standard deviation of the scores.)
  • Another feature of the tables is that five of the measures have been adjusted to take account of the subject mix at a university. A university with a medical school, for example, will tend to admit students with a higher tariff score than one without, simply because it has a medical school. The adjustment removes this subject effect. A side-effect is that the total score in the tables cannot be recalculated from the published data alone; full access to all the raw data would be required.
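
The Z-transformation described above can be sketched in a few lines. The measure name and scores below are invented purely for illustration; this is a minimal sketch of the standardisation step, not the guide's actual computation.

```python
# Minimal sketch of the Z-transformation: subtract the mean from each raw
# score, then divide by the standard deviation of the scores. The raw
# entry-standards values below are hypothetical, not real tariff data.
from statistics import mean, stdev

def z_scores(values):
    """Standardise raw scores so they have mean 0 and standard deviation 1."""
    m = mean(values)
    s = stdev(values)
    return [(v - m) / s for v in values]

# Hypothetical entry-standards scores for five universities
raw = [420.0, 380.0, 350.0, 310.0, 290.0]
standardised = z_scores(raw)
print([round(z, 2) for z in standardised])
```

Because every standardised measure has mean 0 and standard deviation 1, each contributes equally when the measures are summed into a total score, which is why no further scaling is needed.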

Apart from noting the overall position of any one university of interest, you can home in on a particular measure of importance to you such as entry standards or graduate prospects.

  • But bear in mind that the composite table says nothing about specific subjects at a university and so should be scrutinised in conjunction with the Subject Tables and University Profiles.

Subject League Tables

Here you can view subject-specific tables showing the ranking of particular universities (as well as university colleges and other HE institutions) in the subjects they teach.

  • Knowing where a university stands in the pecking order of higher education is a vital piece of information for any prospective student, but the quality of the course is what matters most. The most modest institution may have a centre of specialist excellence and even famous universities have mediocre departments. The Subject League Tables offer some pointers to the leading universities in a wide range of subjects.
  • The data for the Subject League Tables are the same as for the main University League Tables, except that only four measures are used: Student Satisfaction (Ofsted assessments in the case of the Education table), Research Assessment, Entry Standards and Graduate Prospects. The calculation of the overall score is also the same except that there is no need for any subject mix adjustment and all four measures are given equal weight.
  • To qualify for inclusion in a subject table, a university had to have data for at least two of the four measures. A blank in the Entry Standards or Graduate Prospects column is not a zero score but denotes that no valid data were available; in such cases the final score was calculated from the data that were available. Two years of Graduate Prospects data were aggregated to make the measure more reliable, and scores were withheld where the number of students was too small to calculate a reliable percentage.
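
The subject-table qualification rule above can be sketched as follows. The measure names and z-score values are illustrative assumptions only; the point is the handling of missing data, where `None` marks a blank column rather than a zero.

```python
# Sketch of the subject-table scoring rule: four equally weighted z-scored
# measures, averaged over only those measures with valid data. A university
# needs data for at least two measures to qualify. Values are invented.

def subject_score(measures):
    """Average the available z-scores; require at least two valid measures."""
    valid = [v for v in measures.values() if v is not None]
    if len(valid) < 2:
        return None  # does not qualify for inclusion in this subject table
    return sum(valid) / len(valid)

example = {
    "student_satisfaction": 0.8,
    "research_assessment": 1.2,
    "entry_standards": None,      # blank column: no valid data, not a zero
    "graduate_prospects": -0.4,
}
print(subject_score(example))  # averages the three available measures
```

Treating a blank as "excluded from the average" rather than as zero matters: a zero would drag a university's score toward the mean, whereas averaging only the available measures leaves the other scores undistorted.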

Table Key

  • Full details of how to use the league tables and exactly what each column means can be found in the League Table Key.