Joint Negotiating Committee for Higher Education Staff
Role Analysis and Job Evaluation
Guidance for Higher Education Institutions
January 2004
As part of the national negotiations that established the Joint Negotiating Committee for Higher Education Staff (JNCHES), concluded in June 2001, it was agreed that a joint working group would be set up to identify the arrangements needed to ensure that the sector's pay systems deliver equal pay for work of equal value.
The resulting guidance on role analysis and job evaluation was agreed and published by JNCHES in March 2002. It included advice to institutions on the selection of a suitable system to underpin pay and grading arrangements to ensure 'equal pay' criteria are met. In December 2003 JNCHES agreed this updated and extended version which includes additional advice on implementation (see the latter part of this Preamble and Appendix D). The AUT did not agree the guidance on either occasion.
The guidance should be viewed alongside the new Framework Agreement on Modernisation of HE Pay Structures which:
- establishes a single pay spine covering all staff;
- provides for the application of pay and grading structures, linked with that spine, within the framework of nationally agreed principles; and,
- supports the achievement of equal pay for work of equal value, with the application of pay points to staff being transparent, consistent and fair.
JNCHES recommends that Higher Education Institutions (HEIs) adopt a role analysis/job evaluation system which can be applied across all staff groups, so that relativities across different occupational and job groups can be assessed effectively and to help ensure equal pay for work of equal value.
JNCHES notes that various schemes are available for this purpose. A large number of HEIs are members of the Educational Competences Consortium (ECC), which has developed the Higher Education Role Analysis (HERA) scheme specifically to cover the wide range of roles in HE. An increasing number of institutions are using the HERA scheme to help address equal pay concerns, and ECC continues to work with the HE trade unions to clarify issues relating to the implementation of the scheme at local level.
The Universities and Colleges Employers Association (UCEA) and the nationally recognised HE trades unions recommend that HEIs enter into the processes of role analysis and job evaluation in partnership with their recognised trade unions.
The Framework Agreement contains provision for the development of a library of indicative role profiles which can be used to inform grading decisions for academic staff. These profiles will be capable of being used in conjunction with a number of different role analysis schemes and will be consistent with the principles of equal pay for work of equal value. The library of grade profiles will be accompanied by detailed guidance (in the form of a 'Tool Kit') which sets out the recommended approach for the use of grade profiles within a role analysis framework.
A partnership approach is likely to enhance the quality of the process and promote commitment to its objectives. It is vital that staff have confidence in the role analysis process and are confident that their recognised trade union representatives are both well informed and able to advise and support them throughout each stage of the role analysis process.
Therefore JNCHES strongly recommends that institutions and their recognised trade unions agree on the amount of facility time that would reasonably be needed for local trade union representatives to participate fully in partnership working. A need for additional facility time is likely to be most acute when considering the demands of implementing role analysis and new pay structures. However, it is also recognised that the Framework Agreement places heavy burdens on local representatives in relation to other aspects of pay modernisation.
Additional facility time related to the introduction of role analysis and new pay structures should ensure that:
- local representatives are able to take part from the beginning in joint discussions about the implementation of the Framework Agreement locally
- local representatives are able to receive sufficient and appropriate training to enable them to work to implement the Agreement. Such training will include training by their union(s) and joint training in some aspects of the role analysis scheme
- all aspects of implementing the Framework Agreement can be jointly agreed locally including the aspects not included in the job evaluation scheme itself (eg use of contribution-related progression)
- local representatives have time to brief members on progress in local discussions and on how members should ensure the Framework Agreement is implemented effectively locally.
An institution-wide facilities agreement should provide for a reasonable reduction in workload where appropriate for the initial period of implementation of the Framework Agreement, with access to facilities time available to the union representatives involved irrespective of discipline, campus or part-time or contractual status.
INTRODUCTION
This guidance deals with the basic considerations affecting the choice and development of processes for role analysis and job evaluation. It is set out under the following headings:
• the need for role analysis and job evaluation
• meeting the need
- definition of terms
- aims of role analysis and job evaluation
- features of role analysis and job evaluation processes
- design and operational considerations
- criteria for choice
- implementing role analysis and job evaluation.
Appendix A Types of non-analytical job evaluation schemes
Appendix B Assessing job evaluation schemes
Appendix C Guidelines on job evaluation
Appendix D Implementation framework
THE NEED FOR ROLE ANALYSIS AND JOB EVALUATION
In its Good Practice Guide on Job Evaluation Schemes Free of Sex Bias the Equal Opportunities Commission (EOC) states that:
Non-discriminatory job evaluation should lead to a payment system which is transparent and within which work of equal value receives equal pay regardless of sex.
If an employer wishes to defend an equal pay claim, equal pay legislation requires the job evaluation study to have been undertaken by an analytical method, i.e. with a view to evaluating jobs 'in terms of the demands made on a worker under various headings (for instance, effort, skill, decision)'. In the leading case of Bromley v Quick (1988) the Court of Appeal ruled that a job evaluation system can provide a defence only if it is analytical in nature. The employer must demonstrate the absence of sex bias in the job evaluation scheme, and jobs will be held to be covered by a job evaluation scheme only if they have been fully evaluated using the scheme's factors. Slotting whole jobs against benchmarks is insufficient.
Employers must also comply with the General Statutory Duty placed by the Race Relations (Amendment) Act 2000 to promote race equality in all relevant functions, as explained in the draft Statutory Code of Practice on the Duty to Promote Race Equality produced by the Commission for Racial Equality, and with the requirements of disability discrimination legislation as explained by the Disability Rights Commission. The equality of treatment between different racial groups, and between those with and without disabilities, required by this legislation includes the need to provide equal pay for work of equal value.
From December 2003 legislation extends these provisions to cover discrimination on grounds of sexual orientation or religion and belief, and they will be further extended in 2006 to cover age. These issues and their links with equal pay requirements are addressed more fully in the JNCHES guidance "Partnership for Equality: Action for Higher Education" (published in February 2003).
The Bett Report identified job evaluation which satisfactorily accommodates the full range of duties and responsibilities found in higher education as a necessary requirement for ensuring equal pay for work of equal value.
The EOC has expressed the view that HEFCE, as the largest funding body for Higher Education, is obliged to ensure that its funds are not spent in a discriminatory way. This requirement has been taken into account by HEFCE in its formulation of the specific areas which HR strategies in HE institutions should cover in order to be eligible for special funding under its Rewarding and Developing Staff initiative. The first phase of this specifically required HEIs to:
"Develop equal opportunities targets, with programmes to implement good practice throughout the institution. This should include ensuring equal pay for work of equal value, using institution-wide systems of job evaluation."
The criteria for the second phase are due to be published shortly and are expected to include equivalent requirements.
These HEFCE requirements explicitly cover equal pay for those from different racial groups and those with disabilities as well as between men and women. The funding bodies in Scotland and Wales have likewise introduced conditions of grant related to equal opportunities and pay.
MEETING THE NEED
It is clear that to meet the need, an analytical scheme is necessary which, as defined by the EOC, is one 'where jobs are broken down into components (known as factors) and scores for each factor are awarded with a final total giving an overall rank order'. In this definition, the EOC is referring to what is commonly known as a point-factor scheme. However, as long as the scheme requires jobs to be evaluated in terms of the various demands made on them (i.e. is analytical), a scoring system may not be essential.
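To illustrate the mechanics of a point-factor scheme as described above, the sketch below shows how levels awarded against a set of factors might be weighted and summed to produce total scores and a rank order. The factor names, weights, levels and role names are purely illustrative assumptions and are not drawn from HERA or any other particular scheme.

```python
# Illustrative point-factor evaluation: each role is assessed against the same
# factors, the level awarded on each factor is converted to points, weighted and
# summed, and the totals give a rank order. All names and values are hypothetical.

FACTOR_WEIGHTS = {              # relative weight of each factor (illustrative)
    "knowledge_and_skills": 3,
    "decision_making": 2,
    "communication": 2,
    "physical_effort": 1,
}

POINTS_PER_LEVEL = 10           # points awarded per level step on a factor


def total_score(levels):
    """Weighted sum of the factor levels awarded to one role."""
    return sum(
        FACTOR_WEIGHTS[factor] * level * POINTS_PER_LEVEL
        for factor, level in levels.items()
    )


# Hypothetical factor levels awarded to three roles.
roles = {
    "Library Assistant": {"knowledge_and_skills": 2, "decision_making": 1,
                          "communication": 2, "physical_effort": 2},
    "Electronics Technician": {"knowledge_and_skills": 3, "decision_making": 2,
                               "communication": 2, "physical_effort": 2},
    "Departmental Administrator": {"knowledge_and_skills": 3, "decision_making": 3,
                                   "communication": 3, "physical_effort": 1},
}

# Rank order, highest total first.
for name, levels in sorted(roles.items(), key=lambda item: total_score(item[1]),
                           reverse=True):
    print(f"{name}: {total_score(levels)} points")
```

The point to note is that the rank order is driven entirely by the factor-level judgements and the weightings, which is why both must be free of bias.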
DEFINITIONS
Role analysis The process of collecting, analysing and recording information about the requirements of roles in order to provide the basis for a role profile. Role analysis focuses on the demands made on role holders in terms of what they need to know and be able to do to deliver the expected level of performance (competency).
Role analysis is based on the concept of a role. This can be defined as the part played by people in fulfilling the purposes of their work by operating effectively and flexibly within the context of the institution's purposes, structure and processes. The concept of a role can be distinguished from that of a job in which the duties are fixed, irrespective of who is carrying out the work. Both roles and jobs can be analysed systematically to determine their relative size, a process normally termed job evaluation as defined below.
Job evaluation A systematic process for defining the relative worth or size of jobs or roles within an organisation in order to establish internal relativities and provide the basis for designing an equitable grade structure, grading jobs in the structure and managing relativities. The terms job evaluation and role evaluation are often used interchangeably although it could be argued that if the focus is on roles as defined above rather than jobs, then the term role evaluation would be more appropriate. In this paper, however, the common parlance term job evaluation is used to cover both role and job evaluation.
As the Equal Opportunities Commission (EOC) points out in its Good Practice Guide on Job Evaluation Schemes Free of Sex Bias:
The aim is to evaluate the job, not the job holder, although it is recognised that to a certain extent any assessment of a job's total demands relative to another will always be subjective.
Job evaluation can take the form of:
• An Analytical Scheme in which decisions about the relative value or size of jobs or roles are based on an analysis of the degree to which various defined elements or factors are present in the form of demands made on the job or role holder.
• A Non-analytical Scheme in which whole jobs or roles are described and compared in order to place them in rank order or slot them into a grade without analysing them into their constituent parts or elements (Appendix A contains a description of the main types of non-analytical schemes). The EOC states that 'these types of schemes are particularly prone to sex discrimination because where whole jobs are being compared (rather than scores on components of jobs) judgements made by the evaluators can have little objective basis other than the traditional value of the job'. This point made by the EOC applies equally to discrimination on grounds of race or disability. What is sometimes called a 'felt-fair' comparison between jobs is in real danger of simply reproducing the existing hierarchy - the sex or race of the job holder may well have been a contributory factor to the placing of a job in that hierarchy.
AIMS OF ROLE ANALYSIS AND JOB EVALUATION
The aims of role analysis and job evaluation are to:
• establish the relative value or size of jobs or roles, i.e. internal relativities
• produce the information required to design and maintain equitable grade and pay structures
- provide as objective as possible a basis for placing jobs or roles within a grade structure
- enable consistent decisions to be made about grading jobs or roles
- ensure that the organisation meets the legal and ethical requirements to provide equal pay for work of equal value and not to discriminate on grounds of race, disability, sexual orientation or religion.
FEATURES OF ROLE ANALYSIS AND JOB EVALUATION PROCESSES
The main features of role analysis and job evaluation processes are that they:
- attempt as far as possible to enable objective judgements to be made about relative job size and gradings
- enhance objectivity by providing factual evidence (role analysis) on which informed judgements can be based rather than relying on opinion or pre-conceptions
- provide a framework of defined yardsticks which will help to channel judgements - to achieve as high a degree of objectivity and consistency as possible, these are based on an analysis of job demands under different headings
- evaluate the job not the person - evaluations take no account of the personal characteristics or performance of individuals, although it has to be recognised that where there is some flexibility, the content of the role can be influenced by the role holder
- do not directly take into account the volume of work
- are solely concerned with internal relativities - account is not taken of market rates.
DESIGN AND OPERATIONAL CONSIDERATIONS
When considering what schemes to use, HEIs should distinguish between the design of the scheme and the process of operating it. Equal pay considerations have to be taken into account in both design and process.
Design principles
For an analytical scheme, the design principles are that:
• the scheme should be thorough in analysis and capable of impartial implementation
• the elements used in the scheme should cover the whole range of jobs to be evaluated at all levels without favouring any particular type of job, role or occupation and without discriminating on the grounds of gender, race, disability, sexual orientation, religion or for any other reason - the scheme should fairly measure features of female dominated jobs as well as male dominated jobs and those jobs carried out mainly by one or more racial groups or those with disabilities
• through the use of common elements and methods of analysis and evaluation, the scheme should enable comparison to take place of the relativities between jobs in different functions or job families
• the elements should be clearly defined and differentiated - there should be no double counting
• the levels should be defined and graduated carefully
• bias by reference to gender, race or disability must be avoided in the choice of elements, the wording of element and level definitions and the element weightings - statistical checks should be carried out to identify any bias.
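As one illustration of the kind of statistical check referred to in the final point above, the sketch below compares the average evaluation scores of predominantly female and predominantly male jobs and flags a large unexplained gap for investigation. The job titles, scores and the 10 per cent trigger are hypothetical and are offered only as a sketch of the approach, not as a prescribed test.

```python
# Illustrative bias check: compare mean evaluation scores for predominantly
# female and predominantly male jobs. All job titles and figures are hypothetical.
from statistics import mean

evaluated_jobs = [
    # (job title, total evaluation score, dominant gender of current job holders)
    ("Catering Assistant", 195, "F"),
    ("Cleaner", 180, "F"),
    ("Administrative Assistant", 230, "F"),
    ("Security Officer", 210, "M"),
    ("Grounds Operative", 205, "M"),
    ("Maintenance Technician", 240, "M"),
]

female_scores = [score for _, score, gender in evaluated_jobs if gender == "F"]
male_scores = [score for _, score, gender in evaluated_jobs if gender == "M"]

gap = mean(male_scores) - mean(female_scores)
print(f"Mean score, male-dominated jobs:   {mean(male_scores):.1f}")
print(f"Mean score, female-dominated jobs: {mean(female_scores):.1f}")
print(f"Difference: {gap:+.1f} points")

# Hypothetical trigger: flag the scheme for further review if the gap exceeds
# 10% of the overall mean score. The threshold is illustrative, not prescriptive.
overall_mean = mean(score for _, score, _ in evaluated_jobs)
if abs(gap) > 0.10 * overall_mean:
    print("Gap exceeds the illustrative threshold - review factor definitions and weightings.")
```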
Process principles
The process principles are that:
- the scheme should be transparent, everyone concerned should know how it works - the basis upon which the evaluations are produced
- appropriate proportions of men and women (regardless of sexual orientation), people from different racial groups and people with disabilities, across the range of contractual statuses, should be able to participate in the process of job evaluation
- the quality of role analysis should be monitored to ensure that analyses produce accurate and relevant information which will inform the job evaluation process and will not be unjustifiably biased
- consistency checks should be built into operating procedures
- the outcomes of evaluations should be examined to ensure that gender, racial or any other form of unjustifiable bias has not occurred
- particular care is necessary to ensure that the outcomes of job evaluation do not simply replicate the existing hierarchy - it is to be expected that a job evaluation exercise will challenge present relativities where they cannot be justified
- all those involved in role analysis and job evaluation (including local union representatives) should be thoroughly trained in the operation of the scheme and in how to avoid bias because of sex, race, disability, sexual orientation or religion
- special care should be taken to ensure that grade boundaries are placed appropriately and that the allocation of jobs to grades is not in itself discriminatory
- there should be provision for the review of evaluations and for appeals against gradings
- the scheme should be reviewed regularly to ensure that it is being operated properly and that it is still fit for its purpose.
- both employers and trade union representatives should work in partnership, as defined in the Framework Agreement.
Appendices B and C contain checklists covering the overall approach to job evaluation and design and operating requirements respectively.
CRITERIA FOR CHOICE
The main criteria for selecting a scheme are that it should be:
- Analytical - it should be based on the analysis and evaluation of the degree to which various defined elements or factors constituting demands on the job holder are present in a job.
- Appropriate - it should cater for the particular demands made on all the jobs or roles to be covered by the scheme.
- Comprehensive - the scheme should be capable of application to all the jobs or roles in the organisation covering all categories of staff, and the factors should be common to all those jobs. There should ideally be a single scheme which can be used to assess relativities across different occupations or job families and to enable benchmarking to take place as required.
- Thorough in analysis and capable of impartial application - the scheme should have been carefully constructed to ensure that its analytical framework is sound and appropriate in terms of all the jobs or roles it has to cater for. It should also have been tested and trialled to check that it can be applied impartially to those jobs or roles.
- Transparent - the processes used in the scheme from the initial role analysis through to the grading decision should be clear to all concerned. Information should not be perceived as being processed in a 'black box'.
- Non-discriminatory - the scheme must meet equal pay for work of equal value requirements and not discriminate in any way on grounds of sex, race, disability, sexual orientation or religion.
IMPLEMENTING ROLE ANALYSIS AND JOB EVALUATION
Should role analysis and job evaluation processes lead to the adoption of new grade structures, the agreed framework for implementing them will need to cover assimilation policies. These should cover:
• Policies on the pay point in the new grade to which staff should be assimilated.
• Protection - 'red circling' individuals whose jobs have been downgraded and who are therefore paid above the upper limit of the new grade for their job. The agreed assimilation arrangements should limit the duration of the protection period, as extended red-circling can lead to pay inequities which may have equal value implications if a higher proportion of either sex, of members of a racial group or of those without disabilities has been protected for some time.
• Policies on green-circling - bringing staff who are under-graded and are therefore paid less than the minimum for the new grade up to the minimum rate for the grade or appropriate pay point as determined by the assimilation policy. Movement to the new rate of pay should be in line with any agreed assimilation arrangements.
Role analysis and job evaluation programmes always generate costs. These can be classified under the following headings:
• the cost of purchasing a ready-made job evaluation scheme
• the cost of any consultancy advice obtained to help develop or introduce a scheme
• the opportunity cost of the time spent by HR staff, line managers, staff and union representatives in developing and introducing a scheme, and in role analysis and the evaluation of jobs and roles when it is in operation
• the cost of dealing with anomalies (bringing the pay of staff up to their new pay range) - this will depend upon the number of such anomalies but can be at least 3 per cent of payroll (a rough illustration follows this list)
• the cost of pay protection for staff whose posts are down-graded.
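As a rough worked illustration of the anomaly-cost heading above, the sketch below applies the 3 per cent lower bound quoted in this guidance to an invented institutional pay bill; the payroll figure is an assumption used purely for illustration.

```python
# Rough illustration of the anomaly-cost heading above. The 3% lower bound is the
# figure quoted in this guidance; the institutional pay bill is invented.
annual_payroll = 40_000_000      # hypothetical annual pay bill (GBP)
anomaly_rate = 0.03              # "at least 3 per cent of payroll"

print(f"Anomaly cost at 3% of payroll: £{anomaly_rate * annual_payroll:,.0f} per year")
```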
Further guidance on implementation is set out in Appendix D.
APPENDIX A
TYPES OF NON-ANALYTICAL JOB EVALUATION SCHEMES
All non-analytical schemes are based on a process of comparing whole jobs with one another or against some form of scale, i.e.:
- job to job in which a job is compared with another job to decide whether it should be valued more, less or the same (ranking and 'internal benchmarking' or job matching processes)
- job to scale in which judgements are made by comparing a whole job with a defined hierarchy of job grades (job classification) - this involves matching a job description to a grade description.
Job ranking
Ranking is the process of comparing jobs with one another and arranging them in order of their perceived value to the organisation. In one sense, all evaluation schemes are ranking exercises because they place jobs in a hierarchy. The difference between ranking and analytical methods such as point-factor rating is that job ranking does not attempt to break down jobs into factors or elements although, explicitly or implicitly, the comparison may be based on some generalised concept such as the level of responsibility.
Paired comparison is a statistical technique which is sometimes used to provide a more sophisticated method of job ranking. It is based on the assumption that it is always easier to compare one job with another than to consider a number of jobs and attempt to build up a rank order by multiple comparisons.
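A minimal sketch of how paired comparison totals might be tallied is shown below: each job is compared with every other job, the job judged the more demanding receives 2 points, both receive 1 point where they are judged equal, and the totals give the rank order. The jobs, the judgements and the 2/1/0 convention are illustrative assumptions rather than a prescribed method.

```python
# Illustrative paired comparison ranking: every pair of jobs is compared once,
# the job judged more demanding receives 2 points, both receive 1 point if judged
# equal, and the totals give the rank order. Jobs and judgements are hypothetical.
from itertools import combinations

jobs = ["Clerk", "Technician", "Supervisor", "Porter"]

# Hypothetical pairwise judgements: "a" means the first job of the pair is judged
# the more demanding, "b" the second, "equal" neither.
judgements = {
    ("Clerk", "Technician"): "b",
    ("Clerk", "Supervisor"): "b",
    ("Clerk", "Porter"): "a",
    ("Technician", "Supervisor"): "b",
    ("Technician", "Porter"): "a",
    ("Supervisor", "Porter"): "a",
}

scores = {job: 0 for job in jobs}
for a, b in combinations(jobs, 2):
    verdict = judgements[(a, b)]
    if verdict == "a":
        scores[a] += 2
    elif verdict == "b":
        scores[b] += 2
    else:                       # judged equal in demand
        scores[a] += 1
        scores[b] += 1

# Resulting rank order, highest total first.
for job, score in sorted(scores.items(), key=lambda item: item[1], reverse=True):
    print(f"{job}: {score}")
```

As the guidance above makes clear, the comparisons remain whole-job judgements, so the technique is still non-analytical however systematically the totals are compiled.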
Job ranking is a simple process which reflects what people tend to do when comparing jobs, but:
• there are no defined standards for judging relative worth and there is therefore no rationale to defend the rank order - it is simply a matter of opinion (although it can be argued that even analytical schemes do no more than channel opinions in certain directions)
• ranking is not acceptable as a method of determining comparable worth in equal value cases
• evaluators need an overall knowledge of every job to be evaluated and ranking may be more difficult when a large number of jobs are under consideration
• it may be difficult if not impossible to produce an appropriate ranking for jobs in widely different functions where the demands made upon them vary significantly
• it may be hard to justify slotting new jobs into the structure or to decide whether or not there is a case for moving a job up the rank order, i.e. re-grading
• the division of the rank order into grades is likely to be somewhat arbitrary.
Internal benchmarking or job matching
Evaluation by internal benchmarking or job matching simply means comparing the job under review with any internal benchmark job which is believed to be properly graded and paid and slotting the job under consideration into the same grade as the benchmark job. The comparison is usually made on a whole job basis without analysing the jobs factor by factor.
Internal benchmarking is simple and quick and is perceived by those who practise it as a natural approach to valuing jobs. It is therefore commonly used, often in conjunction with job classification. But:
- it relies on judgements which may be entirely subjective and could be hard to justify
- it is dependent on the identification of suitable benchmarks which are properly graded and such comparisons may only perpetuate existing inequities
- it is not acceptable as a defence in equal value cases.
Job classification
Job classification is the process of slotting jobs into grades by comparing the whole job with a scale in the form of a hierarchy of grade definitions. It is based on an initial definition of the number and characteristics of the grades into which jobs will be placed. The grade definitions may refer to such job characteristics as skill, decision making and responsibility. Job descriptions may be used which include information on the presence of those characteristics but the characteristics are not assessed separately when comparing the description with the grade definition.
Job classification is the most used form of non-analytical job evaluation because it is simple, easily understood and, at least in contrast to whole-job ranking, provides some standards for making judgements in the form of the grade definitions. But:
- it cannot cope with complex jobs which will not fit neatly into one grade
- the grade definitions tend to be so generalised that they may not be much help in evaluating border-line cases
- it fails to deal with the problem of evaluating and grading jobs in dissimilar occupational or job families where the demands made on job holders are widely different
- grade definitions tend to be inflexible and unresponsive to changes affecting roles and job content
- the grading system can perpetuate inappropriate hierarchies
- because it is not an analytical system, it is not effective as a means of establishing comparable worth and does not provide a defence in equal value cases.
APPENDIX B
ASSESSING JOB EVALUATION SCHEMES
Is the scheme:
• Thorough in analysis and capable of impartial application?
• Analytical - jobs are valued in terms of demands under various headings?
• Appropriate for the type and range of jobs it has to cover?
• Transparent - the process of evaluating jobs is clear?
• Easy to understand?
• Reasonably easy to administer?
Has the scheme:
• Been thoroughly researched?
• Been systematically tested?
• Been introduced only after comprehensive training?
• Been monitored for consistency and lack of bias in design and application?
Does the scheme:
• Meet the equal value criteria of the EOC and the European Commission?
• Comply with the General Statutory Duty placed by the Race Relations (Amendment) Act 2000 to promote race equality in all relevant functions as explained in the draft Statutory Code of Practice on the Duty to Promote Race Equality produced by the Commission for Racial Equality?
• Comply with the requirements of discrimination legislation on gender, marital status, race, disability, religion or belief, and sexual orientation and take account of the proposed legislation on age discrimination?
• Cover all the jobs without favouring any?
• Enable equitable and consistent decisions to be made on relativities and gradings?
• Provide for appeals?
(See also the design and process principles set out under Design and Operational Considerations above.)
APPENDIX C
GUIDELINES ON JOB EVALUATION
(Adapted from EOC and EC Guidelines and also taking into account the General Statutory Duty to Promote Race Equality and the provisions of the legislation relating to discrimination because of disability.)
Design guidelines
• The design and development project team should be representative of the spread of jobs or roles to be covered by the scheme and should include an appropriate representation of women as well as men, the main racial groups and those with disabilities.
• Job holders selected for interviews should be of the predominant gender or racial group for each job where there is a clear gender or racial dominance.
• All jobs or roles should be covered, regardless of whether they are carried out on a full-time or part-time basis (but it is not necessary separately to evaluate identical jobs).
• The test or benchmark sample should be fully representative and should include an appropriate proportion of predominantly female as well as predominantly male jobs, of jobs mainly held by people in different racial groups, and any jobs mainly carried out by those with disabilities.
• All those concerned should have been trained in equal value issues and awareness of how bias occurs.
• The factor plan should be non-discriminatory (the list of factors should favour neither men nor women, nor any racial group, and it should not discriminate against those with disabilities).
• No important job demands should be omitted from the factor plan which should be representative of the whole range of work to be evaluated.
• Factor plan definitions should be precise and unambiguous.
• There should be no double counting of factors.
• The number of factor levels should be realistic and points gaps should reflect real steps in demands.
• To avoid biased implicit weighting, factors which are characteristic of jobs or roles held largely by one sex or one racial group should not unjustifiably have greater numbers of levels than factors which are contained in jobs or roles held mainly by the other sex or other racial groups.
• The knowledge and skill factor should not operate unfairly against women, members of different racial groups or those with disabilities by an undue emphasis on qualifications or experience.
• There should be a rationale for any factor weightings which should reflect the importance of the demands for the whole range of jobs or roles in the organisation and which should not contribute to the perpetuation of the existing hierarchy or be biased with regard to either women or men, any racial group or those with disabilities.
Operating guidelines
• All those involved in analysing and evaluating jobs and roles should be thoroughly trained in the skills involved and the operation of the analysis and evaluation process as well as in equal value issues and awareness of how bias occurs. Recognised trades union representatives should be included in appropriate training.
• Job descriptions or role profiles should be written to an agreed format to enable jobs to be assessed to a common standard.
• Job or role analysts should be provided with a comprehensive list of the elements they should cover in the jobs to be analysed.
• If job evaluation panels are used they should include a representative sample of people from the spread of jobs to be covered by the scheme.
• The chairs or facilitators of job evaluation panels should be selected for their knowledge of job evaluation, their impartiality and their concern that decisions of the panel are not discriminatory.
• Over-reliance on generic job descriptions should be avoided, especially when there are significant variations in job duties.
• Checks should be made to ensure that job descriptions or role profiles are completed to a uniformly high standard.
• Gender, race and individual identification should be removed from job descriptions or role profiles.
• The outcomes of a job evaluation exercise should be assessed to ensure that there has been no bias.
• The operation of the scheme should be monitored to ensure that discrimination has not taken place.
Grade structure design guidelines
• Grade boundaries should not be placed so as to unjustifiably segregate jobs or roles mainly held by men from those mainly held by women, or jobs or roles which are predominantly carried out by one racial group from those carried out by other racial groups.
• So far as possible, grade boundaries should be placed where there are gaps in the rank order of scores (a simple illustration follows this list).
• If jobs or roles are re-evaluated because their score brings them to just below the grade boundary, care must be taken not to allocate additional points in a discriminatory way.
• Wherever appropriate the design of grading structures should take account of national agreements and examples of 'best practice' elsewhere.
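As an illustration of the earlier point in this list about placing boundaries at gaps in the rank order, the sketch below sorts a set of evaluation scores and suggests boundary positions at the largest gaps. The scores, the number of grades and the use of midpoints are hypothetical assumptions; any boundaries arrived at in this way would still need the equality checks described in these guidelines.

```python
# Illustrative placement of grade boundaries at the largest gaps in a rank order
# of evaluation scores. The scores and the number of grades are hypothetical.
scores = sorted([212, 225, 238, 298, 305, 310, 390, 415, 450, 460])

target_grades = 3               # hypothetical number of grades wanted

# Gap between each pair of adjacent scores, with the midpoint of that gap.
gaps = [
    (scores[i + 1] - scores[i], (scores[i] + scores[i + 1]) / 2)
    for i in range(len(scores) - 1)
]

# Place boundaries at the midpoints of the (target_grades - 1) largest gaps.
boundaries = sorted(midpoint for _, midpoint in sorted(gaps, reverse=True)[:target_grades - 1])

print("Sorted scores:       ", scores)
print("Suggested boundaries:", boundaries)
```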
APPENDIX D
IMPLEMENTATION FRAMEWORK
The first stage in implementation is to determine:
• the objectives of the exercise and what resources are available to undertake it,
• who is locally responsible for implementing the selected role analysis scheme, and
• whether there will be a Steering Group to monitor progress.
Issues which need to be considered include the availability of existing staff, the importance of involving recognised trade unions in the process, the employment of additional staff, and the use of external consultants. JNCHES recommends that HEIs, working in partnership, consider establishing a joint steering group which includes representatives from employers and trade unions.
Other issues to be taken into account include the financial resources for implementation and an overall time-scale which is practicable and achievable.
Institutions will also need to address at an early stage, in partnership with their recognised unions, whether and to what extent they wish to make use of the planned national library of indicative role profiles for academic staff. The associated guidelines on the use of those profiles will indicate particular approaches to some of the issues described below, especially as regards selection of the roles to be analysed (section 6) and grading (section 8).
1) Communication
Communications need to reach all those who are covered by the role analysis scheme and involved in the process. In particular, briefings and, where appropriate, more formal training will need to be provided for the following:
i Those whose roles are to be analysed. The objectives of the exercise, how the process will work, their involvement in it, and the feedback they will receive at the end will all need to be explained to them. They may also require reassurance about how the data will be used and who will have access to it.
ii The trade unions who are recognised to represent the staff will need to be involved in the process at the earliest possible stage and briefed regularly.
iii Those who will be verifying the information provided by role-holders will need to be briefed about the objectives, and their involvement in the verification process. This can be done through a briefing session, guidance notes or a written explanation (or any combination of these). The more they understand about how the scheme works the better they will be able to confirm, or seek revision of, the evidence. They need also to be clear about the need to distinguish between role requirements and what role-holders are actually doing. If there are areas of mismatch in this respect, these will need to be sorted out before the role is scored.
Consideration needs to be given to the publication of general articles about the scheme in the institution's newsletter, and the publication of details, including copies of documents, on an appropriate website.
2) Selection and Training of Role Analysts
Once the number of roles to be analysed has been agreed it will be possible to determine how many role analysts are needed, and who should act as role analysts. Depending on the scheme used and the nature of the roles being analysed, it may be necessary to allow up to half a day of an analyst's time for each role to be analysed, including preparation and scoring time.
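As a rough planning illustration of the half-day-per-role allowance mentioned above, the sketch below estimates the number of analysts an exercise might require. The number of roles and the days each analyst can be released are invented figures; only the half-day allowance comes from the guidance.

```python
# Rough capacity estimate for a role analysis exercise. The half-day-per-role
# allowance comes from the guidance above; the other figures are hypothetical.
import math

roles_to_analyse = 300              # hypothetical number of roles in the sample
days_per_role = 0.5                 # preparation, interview and scoring time
days_per_analyst = 15               # hypothetical release from normal duties

total_analyst_days = roles_to_analyse * days_per_role
analysts_needed = math.ceil(total_analyst_days / days_per_analyst)

print(f"Total analyst time required: {total_analyst_days:.0f} days")
print(f"Analysts needed at {days_per_analyst} days each: {analysts_needed}")
```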
Consideration needs to be given to the data collection and the data analysis processes, and local agreement should be reached on this. Analysts may be drawn from line managers, trade union representatives, personnel staff, or other individuals. Scoring may be done by the analyst or by using existing mechanisms, such as a grading committee. Where existing mechanisms are utilised all the individuals involved need to have received the same training as the analysts. Whatever mechanisms are established must ensure consistency of approach. Different mechanisms or grading committees for different staff groupings are unlikely to do this.
In order to eliminate the possibility of unjustified bias, once the data on a role has been collected and verified, it should be scored in accordance with the operating specifications of the chosen role analysis scheme.
Once analysts have been selected they must receive training in the appropriate role analysis techniques. Before undertaking role analysis, each analyst also needs to have received thorough training in interview skills and in diversity/equal opportunities. Where an individual has not already received this training, arrangements will need to be made for it to take place.
Local trade union representatives should be offered local role analysis training even if they are not going to undertake any role analyses, so that they are fully aware of how the scheme will operate within their Institution.
3) Data collection
This can be done in a variety of ways: for example, an interview between the analyst and the role-holder, facilitated functional workshops, or completion by the role-holder of a written record or electronic pro-forma (either of which may be supplemented by an interview). If the role-holder is to complete a written record themselves (assuming that written communication is a required competency for the role), more detailed training for the role-holder will be required. Alternatively, the analysis can be done directly from a job description. This can be useful if the role is new or vacant, but wherever possible the job description should be verified by the role-holder and their line manager.
Each method has its pros and cons, and more than one approach may be appropriate within an institution. Data obtained from the role-holder, through an interview with a trained analyst, is likely to produce the most accurate picture of the role, but this can be time-consuming for both parties. Completion by the role-holder of a written record can save the analyst's time, but may involve more of the role-holder's time. Group interviews need careful facilitation, but can be particularly helpful where there are a number of employees in one role. Analysis of the role from a job-description of the traditional kind is the least satisfactory. Whilst it has the benefit of involving considerably less of the role-holder's or analyst's time, it can result in a reduction in the quality and quantity of the information obtained.
In deciding which method, or combination of methods, to use, consideration needs to be given to the resources available and the overall objectives of the role-analysis exercise.
4) Verification
Data collected must provide an accurate record of the role, ensuring that nothing relevant has been omitted. It will be necessary to put in place a verification process whereby the immediate supervisor, or some other designated person who is familiar with the role, signs off the evidence as correct at that time, distinguishing as necessary between the requirements of the role and current practice.
Only when the role requirements have been verified, and any mismatches between them and actual activity have been clarified, can the role be scored. If there is disagreement this needs to be resolved through an appropriate local mechanism, including the scope for recourse to the institution's grievance procedure.
5) Scoring
Scoring should only be undertaken by someone who has been trained in how to do it. Good equal opportunities practice dictates that each role should be scored more than once, to eliminate any possibility of bias, with the results compared and reconciled.
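One minimal way of operationalising the double-scoring check described above is sketched below: two independent scorings of the same role are compared factor by factor and any difference above a tolerance is flagged for reconciliation. The factor names, scores and tolerance are hypothetical.

```python
# Illustrative comparison of two independent scorings of the same role.
# Factor names, scores and the tolerance are hypothetical.
first_scoring = {"knowledge_and_skills": 40, "decision_making": 30,
                 "communication": 20, "physical_effort": 10}
second_scoring = {"knowledge_and_skills": 40, "decision_making": 20,
                  "communication": 20, "physical_effort": 10}

tolerance = 5   # maximum acceptable per-factor difference before reconciliation

for factor in first_scoring:
    difference = abs(first_scoring[factor] - second_scoring[factor])
    if difference > tolerance:
        print(f"Reconcile '{factor}': the two scorings differ by {difference} points")
```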
Each role should be treated in the context of the whole institution. Individual institutions will want to add local guidelines to any produced nationally to ensure that local language, definitions and priorities are properly taken into account. Such local additions should only provide interpretation and clarification within any national notes for guidance for the selected scheme - in order not to risk distorting the scheme and thus increasing vulnerability to equal value claims. After scoring, the role should be viewed in relation to the overall rank order and any surprises or anomalies examined.
To ensure consistency of approach it is recommended that the team of analysts as a whole meets regularly to resolve any local difficulties and establish local implementation guidelines and issues of interpretation.
6) Selecting the roles to be analysed
Some institutions may wish to evaluate every role. Most will wish to select a sample for analysis. Consideration needs to be given to how to select this sample. Possible approaches are:
• to include roles at all levels in a school, department or section, covering both span and range; or
• to include a small number of roles from every department or section across the institution to provide a representative sample of the institution as a whole.
Where it is decided to adopt a sample approach, care needs to be taken in deciding on the sample size. A representative benchmark sample for a medium size university, covering all roles, might be a minimum of 10% of all roles (bearing in mind that some roles are occupied by a lot of employees whilst other roles are unique). However, the more roles that are included, the more accurate the overall picture will be. Within each identified element of the sample, it will be necessary to determine which role-holders are included in the analysis. This can be done by seeking volunteers from role-holders or by selecting those that are deemed to be most typical or appropriate.
In selecting the roles to be analysed, and the role-holders to be interviewed, it will be necessary to take account of the balance of gender, ethnic origin, disability, and employment status amongst the work force, and to explore any differences which might lead to unfair discrimination.
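To illustrate one way of combining the 10 per cent sample figure above with the need to reflect the balance of the workforce, the sketch below draws a simple stratified sample of roles by staff group, with at least one role from each group. The staff groups, role lists and the treatment of the 10 per cent figure are illustrative assumptions only; in practice the sample would also need to be checked against the gender, ethnicity, disability and employment-status balance described above.

```python
# Illustrative stratified sample of roles for benchmark analysis. The 10% figure
# reflects the guidance above; staff groups, role lists and counts are hypothetical.
import math
import random

random.seed(1)                      # reproducible illustration only

roles_by_group = {
    "academic": ["Lecturer", "Senior Lecturer", "Reader", "Professor", "Research Assistant"],
    "technical": ["Electronics Technician", "Laboratory Technician", "IT Officer"],
    "administrative": ["Administrator", "Finance Officer", "HR Adviser", "Registry Clerk"],
    "manual": ["Porter", "Cleaner", "Grounds Operative"],
}

sample_fraction = 0.10              # minimum proportion of roles suggested above

benchmark_sample = []
for group, roles in roles_by_group.items():
    # At least one role per group, and at least 10% of the roles in that group.
    k = max(1, math.ceil(sample_fraction * len(roles)))
    benchmark_sample.extend(random.sample(roles, k))

print("Benchmark sample:", benchmark_sample)
```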
Once data has been gathered on a representative sample of roles, these can be used as benchmark comparators against which the size of comparable roles can be measured. For example, where a points range has been established for a role such as Electronics Technician, it may be possible to use this as a means of drawing up a profile for others in very similar posts. Care needs to be taken in establishing that the posts really are alike and that they do not just carry the same job-title, as these can sometimes be misleading. The fundamental issue is what each role-holder is required to do. It is essential to include in such a process the opportunity for an individual role-holder to request a full analysis of their role where they consider that it does not match the benchmark being used.
Extensive use of benchmark profiles (properly validated by detailed role analysis) can helpfully reduce the burden on institutional HR resources and role-holders. The planned library of academic role profiles is intended to assist in delivery of such "light touch" approaches. However, a full analysis of individual roles will continue to be needed in respect of atypical roles, roles at the cusp between grades, and where a role-holder seeks a review - as well as for validating the benchmark profiles used for other staff.
7) Feedback and Review
It will be necessary to consider what feedback should be given to role-holders whose roles are analysed, both in terms of their own scores, and in terms of the overall objectives of the exercise. Under the Data Protection Act 1998 and the associated Code of Practice 2002, employees have a right to receive information held about them, and this is likely to include scores for role assessment.
A role-holder who is dissatisfied with their grading outcome must have access to a review, and a local mechanism needs to be established. In the event that a role has been evaluated by a mechanism other than a one-to-one interview, it would be normal for a review to require such an interview to take place, and for the role to be re-evaluated on the basis of this. Whilst in many cases there will be a joint review mechanism, the precise nature of this will need to be determined in partnership with recognised unions - taking account of existing local arrangements, the requirements of the Framework Agreement, and the need to enable the employee to query both how the role analysis scheme was applied in their case and the verification of their present duties.
Those involved in the review process should be fully trained in role analysis in the same way as other role analysts (see section 2 above).
8) Fitting the rank order to a Grading Structure
Role analysis differentiates between roles to produce a rank order which can be used to inform decisions on pay. Once a points score has been agreed for each role, institutions (in partnership with local trade unions) will need to determine how this links with a defined pay range - through the usual local procedures and by reference to the new Framework Agreement and any associated JNCHES guidance. Reference should also be made, where appropriate, to the forthcoming library of academic role profiles and accompanying guidelines.
The provisions of the agreed institutional policies on grading and assimilation (including for posts that are green-circled and red-circled) should be clearly explained to staff at the outset of the process. Minimum provisions are detailed in Appendix F of the Framework Agreement.
9) Role of trades unions
Effective implementation of the role analysis arrangements is likely to be achieved where HE institutions work in partnership with their recognised trades unions. JNCHES therefore recommends that institutions agree appropriate facility time for this purpose, including for necessary training of the union representatives involved in the role analysis process.



