Monitoring the Monitors
With over 15,000 visits carried out each year on a variety of sites of different sizes, durations and types, it is vital that the Scheme has a robust process in place to monitor its Monitors’ activities and to ensure that it is delivering a consistent approach to the very highest of standards.
The quality of monitoring activity and the consistency of application of the Code and Checklist is fundamental to the integrity of the Scheme. Over the years, the Scheme has developed a number of systems to monitor key aspects of this work and has procedures in place to take investigatory/remedial action where required.
The Scheme has a Review Group (RG), made up of a selection of the Scheme’s most experienced Monitors, whose role is to monitor the Scheme’s activities, design and deliver training, issue guidance notes, update documentation and presentations, and much more.
Review Group (RG)
The Scheme’s RG is a working group of experienced Monitors who assist the administration office in undertaking much of the review work. The RG is ideally placed to assess the quality of Monitor activity and to establish whether it is meeting the required standards.
The role of the RG is to:
- maintain and improve Monitor performance standards through a consistent and coherent approach to Monitor selection, training and ongoing review and feedback;
- provide professional and technical advice to the administration office (AO), including the development and revision of documentation;
- identify potential procedural improvements to the monitoring and administrative processes.
The AO is responsible for the efficient delivery of the Considerate Constructors Scheme as outlined in the Scheme’s Business Plan and in line with predefined Key Performance Indicators. The RG acts as an aid to the AO as agreed by the Scheme’s Board.
The RG will comprise a core team of a Chairman and several Team Leaders, all experienced and competent Monitors, for whom current or recent Board experience is an advantage. Each Team Leader will oversee a specific function within the RG and will liaise directly with the appropriate member of the AO team, keeping the Chairman informed where appropriate.
The Chairman will ensure that the Review Group performs its role well, will intervene where standards or expectations are not being met and, in conjunction with the administration office, will recommend any training thought appropriate for new or existing group members.
Each of these functions will be owned by a Team Leader and overseen by the Chairman. Each will also require additional resources, which will be provided by other Scheme Monitors supporting the Team Leader.
The Review Group also oversees the Quality Control function of the Scheme’s Best Practice Hub, though this is not a separate function within the Group, nor is it headed by one individual.
Selection and Appointment
The Scheme recruits Monitors from the senior ranks of all disciplines within the construction industry. Each Monitor must have a high level of understanding of all aspects of the UK construction industry and, importantly, must demonstrate a desire to see it improve.
When a potential Monitor expresses an interest in joining the Scheme, they are put through a rigorous selection and assessment process.
After a prospective Monitor has been identified, a Monitor Trainer (MT) is selected to take the candidate on a number of visits to assess their suitability as a Monitor. The first visits allow the candidate to shadow the MT, to better understand what a Monitor does and to get an overview of the Scheme. Subsequent visits provide an opportunity for the prospective Monitor to conduct the visits and produce the reports themselves, with the assistance of the MT.
Typically, the type of person considered for the role will have an extensive industry background, ideally with ‘hands on’ construction knowledge, and will be involved with other industry organisations or bodies.
Only individuals who show an aptitude for the role and for the production of the Monitor’s Reports will be invited to join the Scheme.
It is vital that the MT is satisfied that the prospective Monitor not only has an understanding of the Scheme and its purpose, but is also an appropriate representative for the Scheme and the industry.
Once a candidate has been selected as a Monitor, the MT will then act as a mentor. The purpose of this mentoring is to assist with any issues or queries that may arise.
Once three or more new Monitors have been taken on, the Scheme will arrange an induction meeting at its office, which will be led by an MT. This also gives the new Monitors the opportunity to meet each other and the Scheme’s administration team.
From the appointment of a new Monitor, the RG will also review every report that is produced, and provide feedback on the overall quality of the reports, until it is satisfied that they are meeting the required standards.
The Monitor’s Report is the written confirmation of the meeting with the site manager and should properly reflect the matters that have been discussed. It should recognise the positive measures that are being undertaken whilst also identifying those areas that, when addressed, will help to raise standards.
Increasingly, Monitors’ Reports are used by contractors as a measure of performance across companies locally, regionally and nationally, and by clients when compiling lists of potential contractors. It is therefore essential that reports be of a consistently high standard between Monitors across the organisation.
Reports should recognise that certain sites, of limited involvement and low value, will not have the scope of the larger, more involved ones. This should be taken into consideration and be evident in the report.
The Review Group’s role also includes a regular report-reviewing process to check that report writing is in line with the standards set out in the Monitors’ Handbook. Other Monitors may be asked to assist the RG with this review. The administration office will select Monitors for report checking based on their key performance indicators (KPIs). As such, Monitors may be contacted by members of the RG to discuss their reports.
The RG member carefully studies the reports to ensure that the agreed criteria are being achieved and speaks personally to the Monitor, commending him/her on the positive aspects of the report and identifying ways in which the reports can be improved.
Monitors who, in the opinion of the reviewer, require further training will have their reports looked at on a more frequent basis than those who are achieving a satisfactory, or better than satisfactory, standard.
When assessing the contents of reports, whether as part of a formal report review, a high/low scoring report review or the checking of an innovation, the RG decision is final. Any disagreements should be escalated to the relevant Team Leader or, if necessary, the RG Chairman.
Each year a selection of Monitors will meet with an MT who will formally review their performance on a site visit.
The area of performance being reviewed falls into seven categories: initial impression; visit arrangements; attitude when monitoring; visit requirements; interaction with site; general understanding; and report overview.
The purpose of the exercise is to praise a Monitor for good work, or to highlight areas where any improvements could be considered necessary. Should the MT have any serious concerns about a Monitor’s performance then these will be highlighted to the administration office and appropriate action will be taken.
Report return times
The Scheme will endeavour to send completed reports to the site manager as quickly and efficiently as possible.
On a regular basis, the Scheme looks at Monitors’ average report turnaround times to ensure reports are completed and submitted promptly.
Scoring patterns
Monitors’ scoring patterns are reviewed on a regular basis to assess and aid overall consistency.
Monitors are emailed with their personal average score compared with an individually estimated average score, which takes into account the region in which the Monitor operates and the contractors he or she has visited.
This provides a statistically grounded guide as to whether a Monitor is scoring higher or lower than would be expected, based on the overall performance of the contractors visited.
Each time the averages are emailed to Monitors, they are issued only with the overall estimated average score for the companies they have visited and their own comparable average score.
For any given period, the estimated average score represents the average the Monitor would have been expected to give those sites, based on the scores of other Monitors visiting the same contractors in the same regions.
This calculation shows how a Monitor is scoring compared to other Monitors. The estimated average is not a target to aim for, but should be used to judge whether scores should generally be increased or decreased to come into line with overall scoring patterns.
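The Scheme does not publish the exact calculation behind the estimated average. As a minimal sketch under one plausible interpretation — where each visit’s expected score is the mean score given by other Monitors to the same contractor in the same region — the comparison might look like this (the data model, names and scores are all hypothetical):

```python
from statistics import mean

# Hypothetical visit records; the Scheme's real data model is not published.
visits = [
    {"monitor": "A", "contractor": "BuildCo", "region": "North", "score": 36},
    {"monitor": "A", "contractor": "RoadCo",  "region": "North", "score": 32},
    {"monitor": "B", "contractor": "BuildCo", "region": "North", "score": 40},
    {"monitor": "B", "contractor": "RoadCo",  "region": "North", "score": 38},
    {"monitor": "C", "contractor": "BuildCo", "region": "North", "score": 38},
]

def score_comparison(monitor, visits):
    """Return (actual average, estimated average) for one Monitor.

    The estimated average for each visit is the mean score given by
    *other* Monitors to the same contractor in the same region.
    """
    own = [v for v in visits if v["monitor"] == monitor]
    actual = mean(v["score"] for v in own)
    estimated = mean(
        mean(p["score"] for p in visits
             if p["monitor"] != monitor
             and p["contractor"] == v["contractor"]
             and p["region"] == v["region"])
        for v in own
    )
    return actual, estimated

actual, estimated = score_comparison("A", visits)
# A positive gap suggests scoring above the peer norm; a negative gap, below.
gap = actual - estimated
```

A Monitor whose gap is consistently negative, as in this toy data, would be scoring lower than peers visiting the same contractors, which is the kind of pattern the comparison is designed to surface.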
Key Performance Indicators (KPIs)
To assist the Scheme in measuring the performance of Monitors, a number of KPIs are considered. These help the Scheme and the RG to track Monitor performance and highlight if additional help and guidance is required. KPIs include:
- Percentage of 7 and 8 point scores
- Percentage of 9 point scores
- Percentage of 40 point scores
- Percentage of 40+ point scores
- Percentage of non-compliant reports
- Percentage of complaints received
- Percentage of award-winning sites, based on the most recent National Site Awards
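Each of these KPIs is simply a count expressed as a percentage of the relevant total. As a minimal illustration — the Scheme’s actual score structure (five section scores out of 10, report totals out of 50) and KPI definitions are assumptions here — the figures could be derived like this:

```python
# Hypothetical report records; the Scheme's real KPI definitions are not
# published, so the score structure assumed here is for illustration only.
reports = [
    {"sections": [9, 9, 8, 8, 8], "total": 42, "complaint": False},
    {"sections": [9, 9, 9, 9, 9], "total": 45, "complaint": False},
    {"sections": [7, 7, 7, 7, 7], "total": 35, "complaint": True},
    {"sections": [8, 8, 8, 8, 8], "total": 40, "complaint": False},
]

def pct(hits, total):
    """Express a count as a percentage of a total."""
    return 100 * hits / total

# Flatten every individual section score across all reports.
all_sections = [s for r in reports for s in r["sections"]]

kpis = {
    "pct_7_8_scores": pct(sum(s in (7, 8) for s in all_sections), len(all_sections)),
    "pct_9_scores":   pct(sum(s == 9 for s in all_sections), len(all_sections)),
    "pct_40_scores":  pct(sum(r["total"] == 40 for r in reports), len(reports)),
    "pct_40_plus":    pct(sum(r["total"] > 40 for r in reports), len(reports)),
    "pct_complaints": pct(sum(r["complaint"] for r in reports), len(reports)),
}
```

Tracked over time, movement in any one of these percentages is what would flag a Monitor for additional help and guidance.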
Reviewing high and low scoring reports
Reports that confirm a score that is exceptionally high or low, outside of a predetermined range, are automatically reviewed by the RG before they are issued to the contractor, to ensure fair and accurate reporting.
As part of the Scheme’s scoring system, Monitors look for activities or initiatives that they consider to be innovative within the Scheme’s scoring definition. As each Monitor can only make such a judgment based on their own experience, all innovative submissions are reviewed by the Review Group to ensure consistency across all sites, companies and suppliers visited.
Any concerns or complaints that a contractor or client may have regarding the content of a Monitor’s Report should be forwarded to the Scheme’s administration office in writing and within 30 days of the report being issued.
In the first instance, the administration office will read the report and assess the issues raised. The RG will be asked to review the complaint and liaise with the relevant Monitor where appropriate before preparing a suitable response to the complainant.
Where the RG and the Monitor agree that the report is inaccurate, the Monitor will be asked to make the appropriate changes and the report will be re-issued.
Where the RG and the Monitor are satisfied that the report is accurate, a letter will be issued explaining that the Scheme is happy that the report is a fair reflection of the site’s performance and offering, for a fee, a re-visit to the site should it be required.
If the administration office has any concerns that there may have been an error by the Monitor, the details of the site may be issued to another Monitor to re-visit the site. If their visit highlights a scoring inconsistency by the original Monitor, then further investigatory work will be conducted including the possibility of a report review or a feedback assessment.
To allow sites and companies to provide instant feedback on the monitoring process, the email sent with their reports now includes a link to an online survey where the recipient is asked a number of questions about their experience.
The information received gives the Scheme an indication as to how its Monitors are performing and allows the Scheme to identify trends or patterns as well as individual concerns. As each question is scored out of 10, this provides an indication as to how certain areas improve, or otherwise, over time.
Monitors’ disciplinary procedure
All aspects of a Monitor’s performance are constantly monitored by the Scheme’s administration office. The areas of monitoring include:
- scoring patterns and averages;
- quality, style and content of reports;
- punctuality in producing reports;
- frequency of aborted or missed site visits;
- frequency of incident reports logged against a Monitor;
- feedback from contractors;
- general attitude.
If the Scheme has a concern about any aspect of a Monitor’s performance, it will contact the Monitor to discuss the issue.
Where the concern relates to site or company monitoring, or report production, the administration office may decide to stop the Monitor from arranging or making any further visits so that the concern can be addressed. In this case, the office will arrange for an MT to meet with the Monitor to discuss the issues identified.
If the MT and the office are happy that the Monitor understands the issues raised, and will act to resolve them, they will be allowed to continue making visits. The office will continue to track the performance of the Monitor until they are happy that any issues have been resolved.
If the MT and the office agree that it is not possible to resolve the issues, the office will write to the Scheme’s chairman, suggesting that the Monitor should not be issued with any further visits. If the chairman agrees, the Monitor will be written to, explaining why the decision has been taken.
Where the concerns do not relate directly to site or company monitoring, or report production, and the office does not feel re-training the Monitor will help address the issue, the office will highlight the concerns to the chairman with a recommendation that the Monitor be immediately removed from the Scheme. In severe cases, for example where unacceptable complaints have been received about a Monitor’s conduct, the administration office may stop issuing visits to the Monitor prior to contact with the chairman.
In all cases, the Monitor will be written to explaining the decision that has been made, which may be that the Scheme no longer requires them to act as a Monitor on its behalf.
Meetings and training
Monitors’ Regional Meetings
The Monitors’ Regional Meetings take place in the first half of the year in various regions across the UK. They provide an opportunity for all Scheme Monitors to meet other Monitors who visit sites within the same region and to hear about current Scheme news and proposed future developments. Each meeting is chaired by a Scheme Director and the Chief Executive of the Scheme.
Each meeting is an interactive session with lively discussion around the content, style and scoring of reports. The meetings also serve to ensure continued consistency of scoring in every site report and to share examples of good practice that Scheme Monitors have witnessed on site visits.
Monitor Development Workshops
Monitor Development Workshops are run throughout the year and are used to discuss all aspects of monitoring to help ensure consistency across the Scheme. Every Monitor will attend at least one workshop every year as part of their training. The workshops are presented by a Monitor Trainer (MT) who will discuss topics such as Monitor visits, report writing and scoring. During the workshops, a number of exercises will be carried out by the Monitors which will lead to discussions chaired by the MT.
Annual Conference
The Scheme hosts an Annual Conference where all Monitors and Scheme administration staff meet for a day. This provides an ideal opportunity for Monitors from all regions of the UK to meet and discuss monitoring, as well as to speak with administration staff about processes and performance. The conference includes a presentation from the chairman on the latest Scheme initiatives and developments, and presentations from MTs on various aspects of monitoring. Monitors are also given exercises to complete throughout the day, which ask them to discuss various topics and scenarios, with advice and suggestions offered by the MTs.