Attribute Sampling: A New Approach to Service Contracts Surveillance

This article describes a simplified service contract surveillance technique called "Attribute Sampling." An attribute is a characteristic or trait of a service output that can be measured or compared to a performance standard the contractor must meet. If all attributes of a service deliverable are satisfactory, then the service is satisfactory. If a reasonable sample of attributes is evaluated as satisfactory, then it is reasonable to assume that the overall service is satisfactory. The technique formalizes what are frequently called "walk-through inspections," "spot checks," or "periodic validation." These informal inspection techniques generally provide a low confidence level in the overall results of service delivery, except where the contractor has demonstrated over a considerable period the ability to control the quality of services at a consistently high level. They arose as alternatives to resource-intensive traditional inspection methods after years of cuts in staffing and funding for service contracting and the expectation to "do more with less." Attribute Sampling is based on a simple concept and can be applied to any service contract. The technique may at first appear complicated, but it involves no more effort than that required to set up any reasonable surveillance plan. A well-formed Attribute Sampling plan is an all-inclusive, easy-to-use Quality Assurance Surveillance Plan (QASP).

The Basis for Inspection

In Attribute Sampling, as with the inspection of any requirement, the basis for inspection must first be determined. Some requirements have clearly defined units of service that become the basis for inspection. For example, a custodial services contract may require daily cleaning in 20 nearly identical offices; each office can then be inspected independently as a unit of service. In another example, the contractor may be required to perform elevator maintenance, and the service delivered at each elevator would be a reasonable basis for inspection. In most facility operation and support services contracts, however, the contractor delivers many different services, and the services are not the same each day. In these contracts there is no single unit of service that can clearly be used as the basis for inspection, so it is reasonable to evaluate a day of all services delivered to judge overall performance for that day. There are nominally 20 working days in a month. Depending on contractor performance, the number of days in the month to be evaluated will vary. In a new contract, or when performance is poor, every day may need to be inspected; otherwise, a sample of 5 to 10 days may be adequate. Such a sample may be determined randomly or selected by the Contracting Officer's Representative (COR). When using Random Sampling, normally only five days out of the month need to be evaluated. Thus, a "day of service" is the basis for inspection in Attribute Sampling.
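To illustrate, the random selection of evaluation days can be scripted in a few lines. The sketch below is a hypothetical illustration in Python, not a prescribed tool; the 20 working days and the sample of five are taken from the discussion above.

```python
import random

# Hypothetical list of the month's working days (nominally 20 per the text).
working_days = list(range(1, 21))

# Per the discussion above, a random sample of five days is normally
# adequate once the contractor has demonstrated satisfactory performance.
rng = random.Random()
evaluation_days = sorted(rng.sample(working_days, 5))
print(evaluation_days)  # five randomly selected days of service to evaluate
```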

Attributes to be Inspected

When evaluating a day of service, the COR must determine from the contract and the contractor's schedule which services are to be delivered and where. From the contract, the COR must determine all of the attributes of the services to be delivered and the standards that must be met for the service to be acceptable. Again, depending on contractor performance, all attributes can be inspected, or a sample may be sufficient to determine whether the day of service is satisfactory. If sampling is used, the sample size may vary, again depending on the contractor's performance. With Planned Sampling, a sample of around 30% of the services delivered should be adequate; the sample should be larger if contractor performance is marginal or unsatisfactory and considerably smaller if performance is superior. If Random Sampling is used, the sample size is dictated by the Random Sampling application, and it will be much smaller than that required by Planned Sampling. With 500 to 1,000 attributes to be evaluated in a single day, the Random Sampling sample size will vary between 50 and 80, while the Planned Sampling sample size will vary from 150 to 300. Random Sampling thus requires less inspection effort and yet yields a significantly higher confidence level in the overall results.

The process described in the preceding paragraph is referred to as attribute sampling. Clearly, if all attributes of a day's service are inspected and found satisfactory, then the overall service for the day is satisfactory. When only a sample of the attributes is inspected, the question becomes: if the sample is satisfactory, is it reasonable to assume that the service for the day in all areas is also satisfactory? The answer is that if the sample is large enough, the results of the sample can be used to represent the overall service delivery for the day with a reasonable level of confidence. A properly drawn random sample provides a quantifiable, and generally high, confidence level in the results even at modest sample sizes.
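The statistical reasoning behind that confidence can be illustrated with a short calculation. The sketch below is a simplified model, not the formal acceptance-sampling tables a QASP would cite; it assumes defects are spread randomly through the day's attributes and computes the chance that a random sample catches at least one of them.

```python
from math import comb

def detection_probability(population: int, defective: int, sample: int) -> float:
    """Probability that a simple random sample of `sample` attributes
    contains at least one of `defective` bad attributes among `population`."""
    # P(no defective attribute in the sample) follows the hypergeometric
    # distribution for sampling without replacement.
    p_none = comb(population - defective, sample) / comb(population, sample)
    return 1.0 - p_none

# Using the figures from the text: 500 attributes delivered in a day and a
# random sample of 60 drawn; assume (hypothetically) that 50 attributes, or
# 10%, are defective. The sample is then almost certain to catch a defect.
print(detection_probability(population=500, defective=50, sample=60))  # ~0.999
```

For comparison, a 30% Planned Sampling sample of the same 500 attributes would require 150 inspections, consistent with the figures in the preceding discussion.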

Inventory of Services

The first step in developing an attribute sampling plan is to create an inventory of services attributes to be evaluated, based on the contract. Services will be found in the Performance Work Statement (PWS) and may also be identified in a Services Summary (SS) document. In most contracts there will be additional requirements in various contract clauses and reference documents that must also be evaluated from time to time; these are generally referred to as Non-Services Summary (NSS) items. Each required service will have one or more attributes with performance standards that must be met by the contractor for the service delivery to be satisfactory. This step is an essential part of the development of any QASP, regardless of contract value, scope, complexity, or type, and it becomes the basis for the surveillance checklists that must be prepared to document the results of inspections performed. The listing should show the required service, the attributes of the service, and the standards that must be met for the service delivery to be acceptable, and each required service attribute should be assigned a unique identification number. For example, a grounds maintenance contract may have on the order of 15 to 20 required services, each with one or more attributes of service output, and each attribute should have a standard specified in the PWS. The table below shows representative examples of required services, attributes, and standards for grounds maintenance. These attributes are evaluated against the standards each time the contractor delivers the service. One hundred percent inspection is not required; instead, a plan is created for evaluating some of the services in order to conclude whether the overall service is satisfactory. Only three required services are shown in the table. Each service has multiple attributes, each attribute is assigned a unique attribute number, and for easy reference the performance standards from the PWS are paraphrased on the inventory form.


Table 4-1: EXAMPLE OF AN INVENTORY OF SERVICES ATTRIBUTES
Att # Required Service: Maintain Improved Grounds (IG)
IG 1 Mow grass height 2 to 4 inches, uniform in appearance, free of skips, gaps, rutting, or scalping. No visible debris, grass clippings, or leaves.
IG 2 Edge hard-surfaced areas to within ½ inch of the edge, even and uniform in appearance, free of scalping, rutting, and uneven or rough cutting; max overhang 1.5 inches; joints and cracks free of vegetation
IG 3 Trim around fixed obstacles to match surrounding grass height, clippings removed or mulched
IG 4 Maintain, adjust, and repair existing lawn irrigation systems. Irrigate to maintain health of turf, no pooling or excessive runoff. Operate automated irrigation system.
Att # Required Service: Maintain Semi-Improved Grounds (SG)
SG 1 Mow grass height 4-10 inches, uniform in appearance, adjacent areas same height. Pick up natural and manmade debris immediately prior to mowing, mulch or remove debris or trimmings after mowing.
SG 2 Trim grass and weeds around fixed obstacles and temporary obstacles or objects to match surrounding area.
Att # Required Service: Indefinite Delivery Services (ID)
ID 1 Remove leaves in improved grounds to maintain a neat and professional appearance.
ID 2 Prune Trees: meet Delivery Order standards, safety clearance 14 feet over streets, 12 feet over driveways, 8 feet over walk areas, and 4 feet from buildings
ID 3 Tree Removal and Stump Grinding, meet Delivery Order standards
ID 4 Remove debris and police grounds: Police improved grounds to maintain a neat and professional appearance. Police other grounds to meet safety, security, and operational requirements.
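For CORs who keep their surveillance plans electronically, an inventory like Table 4-1 can be captured in a simple data structure. The sketch below is one hypothetical representation in Python, with the standards paraphrased from the table above.

```python
# Hypothetical electronic version of the Table 4-1 inventory. Keys are the
# unique attribute numbers; values are the paraphrased PWS standards.
inventory = {
    "IG 1": "Mow grass 2-4 inches; uniform; no skips, rutting, scalping, or debris",
    "IG 2": "Edge hard-surfaced areas within 1/2 inch; max overhang 1.5 inches",
    "IG 3": "Trim around fixed obstacles to match surrounding grass height",
    "IG 4": "Maintain and operate irrigation; no pooling or excessive runoff",
    "SG 1": "Mow grass 4-10 inches; uniform; remove or mulch debris",
    "SG 2": "Trim grass and weeds around obstacles to match surrounding area",
    "ID 1": "Remove leaves in improved grounds; neat, professional appearance",
    "ID 2": "Prune trees to Delivery Order standards and safety clearances",
    "ID 3": "Tree removal and stump grinding to Delivery Order standards",
    "ID 4": "Police grounds for debris, safety, security, and operations",
}
print(f"{len(inventory)} attributes in the inventory")
```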
Service Delivery Area Locations

The next step is to identify the areas where services will be delivered. Most contracts will include site plans and area descriptions showing where services are to be delivered. Use this information to identify the areas where each required service is to be delivered. The area descriptions should be clear in the contract and should also be the basis for the contractor’s scheduling of performance. If this is not the case, it will be necessary to adjust the area descriptions to reflect the contractor's performance plan.


Table 4-2: SERVICE DELIVERY AREAS
IMPROVED GROUNDS
Area Grounds Description
A Major Command Buildings Area
B North Dormitory Area
C Bldgs 3001-3005
D South Dormitory Area
E Office Complex Area
F Bldgs 130, 1002, and 14C
SEMI-IMPROVED GROUNDS
Area Grounds Description
M Shop area C and F
N Fence Line, Highway 101
O Warehouse area G
P Lay down areas B and D

Add other areas as necessary to accommodate IDIQ orders not otherwise included.

INDEFINITE DELIVERY AREAS
Area Grounds Description
X Unimproved Grounds, drainage ditches
Y Indefinite Delivery tree removal and stump grinding facility wide

Attribute Sampling Form

The next step is to prepare an attribute sampling form for the contract. It should identify each attribute of each service in each location where the service can be provided. The form should include three columns. The first column identifies each attribute with a unique identification number. The second column is for the sequence number assigned to each attribute of service scheduled for delivery on the day to be evaluated. Normally, the sequence number begins with "1" and continues sequentially until the last attribute scheduled for delivery on the day being evaluated has been assigned a number. To assign these numbers, the COR must go through the contract and the contractor's performance schedule and identify the areas and attributes that are to be, or have been, performed on the day being evaluated. The third column identifies the attributes selected as a sample for inspection. The sample can be selected by Random Sampling, by Planned Sampling, or arbitrarily by the COR. Random Sampling is normally preferred because it generally requires a smaller sample size and provides a higher confidence level in the overall results. The attributes chosen for inspection are assigned sample numbers, normally starting with "1" and continuing sequentially; the numbers need not appear in order on the form, but they must come from a sequential list and be unique. The sampling form may contain multiple sets of columns on a single page if desired; the example below uses three sets of the three columns. The objective is to develop a complete listing of all services and all attributes in the PWS in all areas where they apply. Attributes for infrequent indefinite delivery items are either treated as a single attribute for evaluation purposes or taken from the indefinite delivery ordering document. The form that follows is an example using the contents of Tables 4-1 and 4-2.

SAMPLING FORM
Area/Att. #  Seq #  Samp     Area/Att. #  Seq #  Samp     Area/Att. #  Seq #  Samp
A-IG 1                       D-IG 2                       N-SG 1
A-IG 2                       D-IG 3                       N-SG 2
A-IG 3                       D-IG 4                       O-SG 1
A-IG 4                       E-IG 1                       O-SG 2
B-IG 1                       E-IG 2                       P-SG 1
B-IG 2                       E-IG 3                       P-SG 2
B-IG 3                       E-IG 4                       ID 1
B-IG 4                       F-IG 1                       ID 2
C-IG 1                       F-IG 2                       ID 3
C-IG 2                       F-IG 3                       ID 4
C-IG 3                       F-IG 4
C-IG 4                       M-SG 1
D-IG 1                       M-SG 2
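A blank form like the one above can also be generated programmatically by pairing each area with the attributes that apply to it. The sketch below is a hypothetical illustration; the areas and attribute numbers come from Tables 4-1 and 4-2.

```python
# Hypothetical generation of the blank form's Area/Att. # rows.
improved_areas = ["A", "B", "C", "D", "E", "F"]   # IG areas (Table 4-2)
semi_improved_areas = ["M", "N", "O", "P"]        # SG areas (Table 4-2)
ig_attributes = ["IG 1", "IG 2", "IG 3", "IG 4"]
sg_attributes = ["SG 1", "SG 2"]
id_attributes = ["ID 1", "ID 2", "ID 3", "ID 4"]  # facility-wide, no area prefix

form_rows = (
    [f"{area}-{att}" for area in improved_areas for att in ig_attributes]
    + [f"{area}-{att}" for area in semi_improved_areas for att in sg_attributes]
    + id_attributes
)
print(len(form_rows))  # 6*4 + 4*2 + 4 = 36 rows, matching the form above
```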

Use of the Sampling Form

For illustration purposes, this form contains only a small number of requirements, attributes, and areas for service delivery; an actual contract would have a great many more of each. The form is used by first identifying, from the contractor's performance schedule, the services that are to be delivered on the day the COR will make inspections. Starting with the number 1, and continuing with 2, 3, and so on, sequence numbers are assigned until every service scheduled for performance in every area has one. In this example, assume the contractor plans for the day are to perform IG1, IG2, and IG3 in areas B, C, and D; IG4 in areas A and F; SG1 and SG2 in areas M and O; and ID3. This is a total of 16 attributes. The Sampling Form would then be marked up as follows to show the scheduled services:

SAMPLING FORM WITH SEQUENCE NUMBERS ASSIGNED
Area/Att. #  Seq #  Samp     Area/Att. #  Seq #  Samp     Area/Att. #  Seq #  Samp
A-IG 1                       D-IG 2       8               N-SG 1
A-IG 2                       D-IG 3       9               N-SG 2
A-IG 3                       D-IG 4                       O-SG 1       14
A-IG 4       10              E-IG 1                       O-SG 2       15
B-IG 1       1               E-IG 2                       P-SG 1
B-IG 2       2               E-IG 3                       P-SG 2
B-IG 3       3               E-IG 4                       ID 1
B-IG 4                       F-IG 1                       ID 2
C-IG 1       4               F-IG 2                       ID 3         16
C-IG 2       5               F-IG 3                       ID 4
C-IG 3       6               F-IG 4       11
C-IG 4                       M-SG 1       12
D-IG 1       7               M-SG 2       13

The method of surveillance to be used determines the next step. If using 100% Inspection, all 16 attributes would be inspected. If using Random Sampling, the random sample drawn would determine which attributes to inspect. If using Planned Sampling, the COR would select the sample based on the desired sample size; for a 30% sample, that would be 5 attributes. For Periodic Validation, the COR would likewise select a small sample of 4 or 5 attributes. Suppose the sample selected is sequence numbers 10, 2, 4, 12, and 16. This is shown on the Sampling Form by assigning numbers to the sample, starting with 1 and ending with 5, as shown in the form following.

SAMPLING FORM WITH SEQUENCE NUMBERS AND SAMPLE NUMBERS ASSIGNED
Area/Att. #  Seq #  Samp     Area/Att. #  Seq #  Samp     Area/Att. #  Seq #  Samp
A-IG 1                       D-IG 2       8               N-SG 1
A-IG 2                       D-IG 3       9               N-SG 2
A-IG 3                       D-IG 4                       O-SG 1       14
A-IG 4       10     1        E-IG 1                       O-SG 2       15
B-IG 1       1               E-IG 2                       P-SG 1
B-IG 2       2      2        E-IG 3                       P-SG 2
B-IG 3       3               E-IG 4                       ID 1
B-IG 4                       F-IG 1                       ID 2
C-IG 1       4      3        F-IG 2                       ID 3         16     5
C-IG 2       5               F-IG 3                       ID 4
C-IG 3       6               F-IG 4       11
C-IG 4                       M-SG 1       12     4
D-IG 1       7               M-SG 2       13
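The two numbering steps just illustrated lend themselves to a short script. The sketch below is hypothetical; it assigns sequence numbers to the day's scheduled attributes from the example and then draws a random sample of five (the example above happened to draw sequence numbers 10, 2, 4, 12, and 16).

```python
import random

# The day's scheduled services from the example above, in schedule order.
scheduled = (
    [f"B-IG {n}" for n in (1, 2, 3)]
    + [f"C-IG {n}" for n in (1, 2, 3)]
    + [f"D-IG {n}" for n in (1, 2, 3)]
    + ["A-IG 4", "F-IG 4", "M-SG 1", "M-SG 2", "O-SG 1", "O-SG 2", "ID 3"]
)

# Step 1: assign sequence numbers 1..16 in schedule order.
sequence = {seq: att for seq, att in enumerate(scheduled, start=1)}

# Step 2: randomly draw five sequence numbers and assign sample numbers 1..5.
rng = random.Random()
drawn = rng.sample(sorted(sequence), 5)
for samp_no, seq_no in enumerate(drawn, start=1):
    print(f"Samp {samp_no}: Seq {seq_no} -> {sequence[seq_no]}")
```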

Next, a Surveillance Checklist is prepared for the sample as shown in the following form:

Inspection Checklist: Grounds Maintenance Attributes
COR Name: __________________          Date: ____________
Samp #   Area-Att. #   S/U   Comments
1        A-IG 4
2        B-IG 2
3        C-IG 1
4        M-SG 1
5        ID 3

Each of these attributes is inspected as it is completed and the inspection results recorded on the form. As long as the nature of the service is such that the evidence of quality does not change significantly with time, the inspections may be performed on the following day. Otherwise, the sample attributes must be evaluated while the objective evidence of quality is still apparent.


Analysis of Results

From the Services Summary in the PWS, the Performance Threshold should show the percentage of service delivery that must be satisfactory for the overall service delivery to be evaluated as within an acceptable quality level. (The details of how the Performance Threshold is established are addressed in Chapter 3.) For non-mission-critical requirements, the Performance Threshold should be no more than 90%: as long as 90% or more of the sample inspected is satisfactory, the entire service for the day can be considered satisfactory. Any unsatisfactory work observed must be addressed in accordance with the contract corrective action program, which normally requires the contractor to reperform the defective work.
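The pass/fail decision against the Performance Threshold reduces to simple arithmetic. The sketch below is a hypothetical illustration using the 90% threshold discussed above; in practice, the threshold for each required service comes from the Services Summary.

```python
def day_is_satisfactory(results: list[bool], threshold: float = 0.90) -> bool:
    """True when the satisfactory share of inspected attributes meets or
    exceeds the Performance Threshold (0.90 per the discussion above)."""
    return sum(results) / len(results) >= threshold

# Example: the five-attribute checklist above with one unsatisfactory finding.
# 4 of 5 satisfactory is 80%, below the 90% threshold, so the day fails.
print(day_is_satisfactory([True, True, True, False, True]))  # False
```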

Contractor Quality Control

Quality control and quality assurance should be a partnership between the contractor and the Government. It is true that the contractor remains responsible for quality control and the Government must initiate corrective action whenever it discovers defects in service delivery. However, under these procedures, the partnership with the contractor begins even before award by working with industry to discover the best of commercial practices that may be applied to contract requirements. The Government continues to work closely with the contractor both during source selection and after award to establish meaningful performance objectives and thresholds to satisfy the Government's specified needs. During the initial contract performance period, Government Functional Commanders or Directors, the COR, and customers must work together with the contractor to validate and modify as needed the performance objectives, standards, and thresholds. Once the contractor has achieved a satisfactory level of performance, the primary element of Government surveillance then is to continuously evaluate the contractor's control of quality.

Evaluation of the contractor's quality control program is an ongoing effort, and the nature of the validation effort changes with time. The service delivery environment changes continuously, and these changes affect the contractor's ability to control quality. The contractor's management and labor mix changes as employees are reassigned, quit, retire, or are hired and fired to account for increases and decreases in workload. Worn and broken tools and equipment are replaced, often with newer, state-of-the-art items that must be addressed in the processes, procedures, and training programs underlying the contractor's service delivery. Facilities likewise change with time, along with regulatory requirements, the availability of supplies and materials, the contractor's financial condition, and a host of other elements, any of which can significantly affect the quality of service output. A contractor with a successful past performance record will routinely account for such changes in the continuing effort to control quality. The Government must be assured that the contractor is successful in this effort and must insist that the contractor maintain the system in an acceptable manner at all times during the contract performance period. To do this, the Government measures the contractor's control of quality by establishing a performance threshold for each required service. Service output that does not meet the established performance threshold is unsatisfactory, meaning that the contractor's control of quality for that service, during that observation period, is unacceptable. When performance is found to be unacceptable, the Government should identify the nonconformances to the contractor and require corrective action, both for the observed nonconformances and, more importantly, for the root cause of the problem.

The primary element of the Government Contract Quality Assurance Program is simply to validate the contractor's quality control system and monitor contractor metrics. This approach to Quality Assurance requires less effort on the part of the Government COR and yet still presents acceptable risk. The risk of accepting unsatisfactory service delivery is greatest during the initial phase-in period of contract performance. It is during this period that the Government must work diligently with the contractor to establish meaningful expectations and metrics for the major required services.

Attribute sampling is the most effective approach to evaluating most support services contracts. If Random Sampling is used to determine which days of the month to evaluate, only five days are inspected, and not full time on each of those days. On the other days of the month, the COR should conduct "walk-throughs" of the areas where services are being delivered in order to obscure the actual Random Sampling schedule; the contractor should not be able to guess which days "count" for evaluation purposes. During these walk-throughs, the COR should, of course, document any nonconformances found and initiate appropriate corrective action. In addition, the COR should periodically evaluate all elements of the Contractor Quality Control Program (CQCP). All elements should be evaluated on a continuing basis, but the most important are the processes and procedures used to deliver the required services. The COR should routinely observe various contractor work crews and verify that they are properly trained for the work they are performing, have the tools, equipment, and materials called for in the contractor's process control procedures, and are performing in accordance with those procedures. Any departure from the CQCP should be raised with contractor managers for resolution and documented in the COR file.