Select Committee on Social Security Second Report



7. DESIGN OF THE PILOT

Brief overall view of the whole process undertaken in the exercise, including its management and fieldwork

Management of Pilot

7.1  The design of the pilot evolved after consultation at all levels, including Ministerial level. The FAMC Unit, FAMC Fraud and DSS Policy Group were consulted at the earliest stage of the planning process and involved throughout. A Project Initiation Document (PID) was produced setting out the objectives and scope of the review and the responsibilities for all the major work activities, and a study timetable was drawn up. The Benefit Review Project Steering Committee was responsible for high-level management of the review.

7.2  A working group was set up comprising members of the Benefit Review Team (BRT), Business Systems Security (BSS), Analytical Services Division (ASD2H), SS Policy Group (FS1), FAMC Unit, FAMC Fraud Section, InDepth Consulting Group and Benefit Fraud Investigation Service (BFIS). The Working Group was responsible for designing the review methodology and planning the implementation of the pilot. The methodology was specific to FAMC; however, experience gained from previous reviews of other benefits was taken into account.

Methodology

7.3  Broadly the methodology was as follows:

Sampling

7.3.1  A sampling strategy was agreed with statisticians. A random sample of 200 employed and 100 self-employed FAMC customers, living within the areas covered by 15 local offices, was chosen. To ensure that cases were reviewed as close as possible to the date of claim, sampling was based on the date of award. The sample was chosen in three batches in weeks 0, 4 and 8 of the fieldwork.
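By way of illustration only, the sketch below (in Python) shows one way the stratified, batched random draw described above might be carried out. The report specifies only the stratum totals, the 15 offices and the three batch weeks; the even split across batches, the field names and the data layout are assumptions made for the sketch, not details taken from the review.

    import random

    # Hypothetical sketch of the pilot draw: 200 employed and 100 self-employed
    # FAMC customers, selected in three batches (weeks 0, 4 and 8), stratified
    # by employment type. The even split across batches is an assumption.
    EMPLOYED_TOTAL = 200
    SELF_EMPLOYED_TOTAL = 100
    BATCH_WEEKS = (0, 4, 8)

    def split_across_batches(total, n_batches):
        # Divide a stratum total across the batches as evenly as possible.
        base, remainder = divmod(total, n_batches)
        return [base + (1 if i < remainder else 0) for i in range(n_batches)]

    def draw_pilot_sample(awards_by_week, seed=1999):
        # awards_by_week maps each batch week to the list of recent FAMC awards
        # (dicts with an 'employment_type' key) made in the 15 pilot offices.
        rng = random.Random(seed)
        employed_targets = split_across_batches(EMPLOYED_TOTAL, len(BATCH_WEEKS))
        self_emp_targets = split_across_batches(SELF_EMPLOYED_TOTAL, len(BATCH_WEEKS))
        sample = []
        for i, week in enumerate(BATCH_WEEKS):
            cases = awards_by_week[week]
            employed = [c for c in cases if c["employment_type"] == "employed"]
            self_emp = [c for c in cases if c["employment_type"] == "self-employed"]
            # Simple random sample within each employment stratum for this batch.
            sample += rng.sample(employed, employed_targets[i])
            sample += rng.sample(self_emp, self_emp_targets[i])
        return sample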

Interviewers

7.3.2  Fraud Area Managers (FAMs) were asked to provide a list of experienced officers prepared to undertake the fieldwork. A total of 15 interviewers were recruited.

Training

7.3.3  All interviewers attended a training course providing instruction on the Benefit Review methodology and research-style interviewing techniques, together with technical training on FAMC.

Questionnaire

7.3.4  A specially designed questionnaire was used to capture data from all customers interviewed. It was designed to structure the interview and collect data that could be coded, input onto a database and analysed.

Penpictures

7.3.5  A comprehensive penpicture was produced by the FAMC Unit for each case selected for a visit. This provided the interviewer with details of the current claim and relevant claims history, and indicated the main points requiring checking or clarification.

Helpdesk

7.3.6  A helpdesk was set up to advise and support interviewers during the fieldwork, check returned questionnaires, forward cases for adjudication where appropriate and input outcomes onto the database. The helpdesk was staffed by BA Security and the FAMC Unit.

Fieldwork

7.3.7  After attending the training course, interviewers embarked on the fieldwork, which broadly comprised the following:

  • a full preview of individual cases,
  • an unnotified visit to the home of each customer,
  • a full interview using the specially designed questionnaire, including a statement from the customer and a report,
  • a visit to the customer's and/or partner's employer,
  • any post-interview follow-up work that seemed appropriate, for instance surveillance, and
  • reference to the helpdesk (FAMC adjudication experts) to decide whether a review of entitlement was appropriate.

Adjudication

7.3.8  After the above actions had been completed, interviewers sent their completed questionnaires and other documentation to the helpdesk. All cases were checked by adjudication experts and, where a review of entitlement appeared appropriate, the case was referred to the FAMC Unit. Any adjudication resulting from Benefit Review action was carried out by a team of independent Adjudication Officers at the FAMC Unit.

Data Entry

7.3.9  After all visiting and adjudication action had been completed, all questionnaires and other documentation, including the outcome of any adjudication, were held at the helpdesk. Cases were then classified as "no change", "case correct", "departmental error", "customer error", "fraud", "suspected fraud" or a combination of these outcomes. These details, together with the information contained in the completed questionnaires, were input onto the database for subsequent analysis by Analytical Services Division.
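The outcome classifications listed above lend themselves to a simple coding scheme. The sketch below (Python) is illustrative only: the category labels are taken from the report, but the record structure and field names are assumptions and do not describe the database actually used by Analytical Services Division.

    from dataclasses import dataclass, field
    from typing import List

    # Outcome categories as listed in the report; a case may carry more than one.
    OUTCOME_CATEGORIES = {
        "no change",
        "case correct",
        "departmental error",
        "customer error",
        "fraud",
        "suspected fraud",
    }

    @dataclass
    class CaseRecord:
        # Hypothetical record for one reviewed case, as it might be keyed onto
        # the database for later analysis.
        case_id: str
        outcomes: List[str] = field(default_factory=list)

        def add_outcome(self, outcome: str) -> None:
            # Reject labels outside the agreed classification scheme.
            if outcome not in OUTCOME_CATEGORIES:
                raise ValueError(f"Unknown outcome classification: {outcome!r}")
            if outcome not in self.outcomes:
                self.outcomes.append(outcome)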

Analysis

7.3.10  Once the outcomes from the above action had been entered onto the database, ASD carried out an analysis in order to produce the findings.

Debrief

7.3.11  A number of the pilot interviewers were invited to attend a debrief to gauge their reaction to the exercise, to obtain ideas on how to improve the methodology and to inform the planning of the main exercise.


 
