Guide to institutions’ self-assessments and commentaries for monitoring
What are the self-assessments and commentaries?
Universities and colleges set targets and milestones for themselves in their access agreements. These targets must be agreed with the Office for Fair Access (OFFA), although OFFA does not itself set any targets.
As part of our monitoring process for access agreements we asked universities and colleges to:
- assess their progress against each target they set themselves in their access agreements
- provide data showing their progress against targets for each academic year for the past five years
- provide a commentary setting their access and (for 2012-13 access agreements and later) student success work in context, highlighting any particular challenges they had faced and, if they had not made as much progress as they had wished, explaining the reasons for this.
The tables shown in each institution’s self-assessment reproduce the information we received from the institution as part of its monitoring return. Therefore the tables reflect institutions’ own analysis of their performance.
What is OFFA doing with the self-assessments and commentaries?
The self-assessments and commentaries help us to judge how well universities and colleges are meeting their access challenges. They also help to inform our continuing dialogue with individual institutions, and with the sector as a whole, about what works best to widen access and improve retention and student success.
To provide additional context, we publish this information alongside institutions’ HESA Performance Indicator (PI) data, where available. For 2015-16, the most recent year for which we have data, we have created a new institutional performance tool (XLS) that collates this information into one file.
Guide to Table 1
Table 1 is split into two parts:
Table 1a shows the progress made by the institution against the high-level outcome targets/milestones it set itself in its access agreement relating to the institution’s applicants, entrants or student body. Institutions often use their own internal data when setting such targets, but most institutions also use data from the Higher Education Statistics Agency (HESA), UCAS, or another source.
Table 1b shows the other activity-based milestones and targets that the institution chose to include, relating to outreach, lifelong learning, or institutional management and mission. For example, these might relate to the number of schools an institution is aiming to work with, or the number of students involved in outreach activities. The table also includes information on the measurable success of those activities: for example, how well an institution’s work with schools has raised aspiration and attainment, influenced GCSE subject choices, or increased the number of young people considering whether to enter higher education and where to study.
In some cases data may not be given for a particular year because it was not available at the time the institution filled in the monitoring return. These instances are marked ‘N/A’ in the table or left blank.
Guide to column headings in Table 1a
Milestone/target type: The entries in this column describe the category the milestone/target fits into. Some institutions, for example, may have chosen a target that relates to ‘POLAR3 low participation neighbourhoods (HESA Table T1b)’. This is the figure reported by HESA showing what percentage of the institution’s students are from low participation neighbourhoods (POLAR3 Quintile 1). Further information on the definition of low participation neighbourhoods is available on the HESA website.
Description: This column gives a more detailed description of the milestone/target. Here, for example, the institution may explain that its target is that ‘20 per cent of young entrants from the UK should be from low participation neighbourhoods (POLAR3 Quintile 1) by a specified academic year’.
Is this a collaborative target? (2012-13 monitoring and later): We asked institutions to specify which targets were set collaboratively with other providers of higher education.
Baseline data: This is the starting point against which the institution is measuring its progress. It may be a whole number, a percentage, or a percentage expressed as a decimal.
Baseline year: This is the academic year in which the baseline data was measured. If only a single year is given this will be the first year of the two calendar years in which the academic year falls, e.g. ‘2013’ represents 2013-14.
Target: What the institution is aiming to achieve. For example, the institution may have set itself a target to achieve ‘22 per cent young UK entrants from low participation neighbourhoods (POLAR3 Quintile 1)’.
Target year: The academic year by when the institution aims to achieve its target. If only a single year is given this will be the first year of the two calendar years in which the academic year falls, e.g. ‘2019’ would represent 2019-20.
Progress to date: Data for each academic year for the past five years. This is likely to be expressed in the same format as the baseline data.
Performance summary: The institution’s assessment of its own performance. The entries in this column were selected by the institution from a list of pre-determined categories. For example, an institution may have chosen ‘No progress made against baseline data’ because the data for the year being monitored is the same as that for the baseline year. The ‘Progress to date’ columns provide a more detailed picture, where you can see what progress may have been made against the target in previous years. This shows why it is important to look at performance over time rather than focus on the change in a single year. Institutions also provide some context about certain targets in their institutional commentary (Table 3, see below).
Guide to column headings in Table 1b
Table 1b uses the same headings, but these targets relate to outreach activity and outcomes. For example, an institution may have a target to increase its partnerships with schools and colleges by a specified year. Its performance summary shows whether it is on course to meet this target.
Targets around outreach activity and outcomes demonstrate the efforts and successes of institutions to raise aspirations and attainment levels well before application and entry to higher education. Much of this work benefits the sector generally and is not of sole benefit to the institution providing the activity. This work is critical to widening participation and fair access.
Guide to Table 2
Table 2 gives a reminder of the institution’s progress against six HESA widening participation performance indicators. This is not OFFA data, nor is it new: it has been available from HESA for some time before we publish our monitoring outcomes.
What are performance indicators?
HESA’s widening participation performance indicators give information about the participation of certain groups that are under-represented in higher education, relative to the HE population as a whole.
More information about HESA’s widening participation performance indicators
What do the ‘+’ and ‘-’ mean?
The ‘+’ or ‘-’ in Table 2 indicates how the higher education institution (HEI) is performing against its HESA location-adjusted benchmark:
- if the indicator is significantly lower than the benchmark, it is classed as ‘-’
- if it is significantly higher, it is classed as ‘+’
- if there is no significant divergence, neither symbol appears.
(“Significantly” higher or lower means that the difference between the indicator and benchmark percentages is greater than 3 percentage points and greater than three times the standard deviation of the indicator.)
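The significance rule above can be sketched in code as follows. This is an illustrative sketch only, not HESA’s published methodology; the function name and inputs are hypothetical, and the thresholds simply restate the definition just given.

```python
def flag(indicator: float, benchmark: float, std_dev: float) -> str:
    """Classify an indicator against its benchmark (illustrative only).

    indicator, benchmark -- percentages, e.g. 88.5
    std_dev -- standard deviation of the indicator, in percentage points
    Returns '+', '-', or '' (no significant divergence).
    """
    diff = indicator - benchmark
    # Significant only if the gap exceeds BOTH 3 percentage points
    # and three times the indicator's standard deviation.
    if abs(diff) > 3 and abs(diff) > 3 * std_dev:
        return "+" if diff > 0 else "-"
    return ""
```

For example, an indicator of 95.0 against a benchmark of 90.0 with a standard deviation of 2.0 would receive no symbol: the gap of 5 percentage points exceeds 3, but not three times the standard deviation (6).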
What are benchmarks?
Benchmarks are not targets. They enable comparisons to be made between institutions which are sufficiently alike to be compared – for example, they take account of the entry qualifications of students, the subjects they study, and the students’ age. It is important to note that benchmarks change each year and, because they are averages, there will always be some institutions that are below the benchmark and some that are above it.
More information about how HESA’s benchmarks are calculated
Guide to the indicators used in Table 2
The first three indicators used are for young (that is, aged up to 20 when they start their course) full-time students only and show:
- State School (%): The percentage of students who attended a school or college in the state sector.
- NS-SEC 4-7 [socio-economic class] (%): The percentage of students from categories 4 to 7 of the National Statistics Socio-Economic Classification. Note that HESA no longer publishes NS-SEC data; 2014-15 is the final year for which it was published.
- Low participation neighbourhoods (young) (%) (POLAR3): The percentage of students who come from a neighbourhood in which there is low participation in higher education.
Since 2012-13 three additional indicators have been included to reflect OFFA’s responsibility for part-time students and our whole student lifecycle approach.
The fourth indicator used covers part-time students only and shows:
- Part-time undergraduate entrants by age marker and low participation marker (POLAR3): The percentage of part-time students who come from a neighbourhood in which there is low participation in higher education.
The final two indicators used are for all entrants and young full-time students and show:
- Non-continuation following year of entry (all entrants no longer in HE): The percentage of full-time students on first degrees who are no longer in HE after their first year of study.
- Non-continuation following year of entry (young full-time first degree entrants from low participation neighbourhoods no longer in HE): The percentage of young full-time first degree students from low participation neighbourhoods who are no longer in HE after their first year of study.
Why are further education colleges not included in Table 2?
No data appears in Table 2 for further education colleges, because HESA does not publish performance indicators for them.
Guide to Table 3
Table 3 gives the institution’s commentary on the data it has reported. This allows the institution to put the raw data into context and to give more details. In this table we asked institutions to:
- comment on the level of progress made against targets
- set the figures in context (for example, say whether there were any external factors to explain certain figures)
- provide explanations where targets have not been met or progress has been less than anticipated
- comment on key factors that have led to successful outcomes.