UKPS Biometrics Enrolment Trial Comments
on Final Report (version of 25.02.05)
Dr Tony Mansfield, National Physical Laboratory,
10 March 2005
OVERALL COMMENTS
How does the trial "help inform Government
plans to introduce biometrics"
Some of the report might be better
focussed if the management summary provided a list of questions
that the trial was trying to answer, and then the report provided
detail on what the trial indicated in response to these questions.
(I am sure that many of these questions are listed in documents
at the commencement of the contract.)
I feel the analysis does not quite
go far enough to answer many of the questions that need to be
answered (even though the trial may provide the required data).
Recommendations
I find the recommendations rather bland; many would have been obvious prior to the trial. There are more substantial recommendations to be made:
I believe the trial has provided data to indicate how long biometric enrolment ought to take, taking account of both public acceptability and the capabilities of the technology.
There could be clearer recommendations
about the adequacy of current technology. All the systems need
improvement, especially in the user and operator interfaces.
The environment recommendations ought
to note that the current enrolment pod did not sufficiently control
the environment.
The report (and previous versions)
made a number of observations where causes of problems merited
further investigation. These should be listed.
A categorisation of types of exception
cases, and their extent, would lead to a recommendation that future
tests include such cases to check that necessary system improvements
have been made.
I would have expected a number of recommendations in regard to any future trial; surely many lessons were learned from the problems encountered during the trial setup!
Recommendations that are made could
also have more substance.
E.g. the trial shows that allowing longer enrolment times with the existing system will not achieve the required improvement in enrolment success rate, even if publicly-acceptable enrolment times are exceeded. However, the recommendation for a secondary enrolment system with different interfaces is made without this justification.
Report structure
Section 1 is somewhat long for an executive summary. Could this section commence with a shorter overview (of a page or so) that provides highlights of the main findings?
Some key findings, such as identifying
demographic groups with greater concerns and greater difficulties
in using the systems, are missing from the summary.
Annexes should not contain key findings or recommendations beyond those in the body of the report. Could Annex C2 be moved to the main part of the report? Alternatively there could be sections listing all recommendations and all key findings together.
Additional sections
The summary should include some details of the effect of demographics on performance and attitudes.
I could not see a section discussing
the views of the enrolment staff as to the ease of use or problems
with the systems used (as per a previous suggestion).
Add an annex providing a copy of
the questionnaire.
Add an annex explaining acronyms.
Wording
The report could be more carefully worded in several places. To take one case, the majority of trial participants were unconcerned about booth privacy; what is written is "booth privacy is not an issue across all groups", which seems rather dismissive of the views of the minority. Factors which affect only a few individuals may still be an issue that has to be addressed in deploying biometrics. Also, it is unclear whether "all groups" refers here to the Quota, Disabled, and Opportunistic groups or to all demographic groups. There is similar wording in many places in the report.
Section on Standards
The review of international standards in Annex C3 omits the work of the international standards committee SC37 (apart from referring to the US shadow committee INCITS M1, and it would have been more appropriate to refer to the equivalent UK committee BSI/IST44). This annex does not add anything to the report, and does not relate to any of the trial findings; I suggest removing this section from the released version of the report.
DETAILED COMMENTS
p10, §1.1.1, para 1, sentence 1
I thought ATOS designed the trial (as stated
in 1.1.2).
Change "designed by UKPS" to
"commissioned by UKPS"
p10§1.1.1para 3 "specific
objectives",
What is listed is the work to be done. I would
have thought the specific objectives would have been the list
of questions to be answered by the trial.
p10, §1.1.1, para 2, last sentence
Trial results about attitudes are independent
of software and hardware used.
Change "All the trial results"
to "The trial results".
p13, §1.1.4, Recording the iris biometric, 1st paragraph
As stated, the right iris would be enrolled
only once the left iris is successfully enrolled. This does not
accord with my recollection of enrolment.
p13, §1.1.4, Recording the iris biometric, and recording the fingerprint biometric, last lines
How many times a duplicate enrolment was detected should be stated.
p14, §1.1.4, Recording electronic signature, para 2
This paragraph refers to all the previous steps,
not just recording signature.
Remove the indentation.
p15, §1.2.2, Booth privacy
Awkward wording and poor grammar. Suggest changing
to:
"Within the locations and environments used
in the trial. few participants had concerns over booth privacy.
94% of the quota group, 89% of the opportunistic group, and 87%
of the disabled group were not at all or not very concerned about
privacy in the booth during the enrolment process".
Similar changes are required elsewhere:
Level of intrusion
Ease and speed of verification
and other places too (search for "not
an issue", "not a concern")
p15, §1.2.2, Time taken
Awkward wording.
" [Enrolment] . . . was some 8 minutes on
average. The figures were not equal across the three biometrics
. . . ". This implies that 8 minutes is the average enrolment
time for a single biometric! I suggest changing to:
"The total time needed to enrol face, iris
and fingerprint biometrics was about 8 minutes on averaged. For
each of the quota, disabled and opportunistic groups, the time
required for biometric enrolment was generally as expected, or
faster than expected. The iris biometric had the greatest number(18%-21%)
of participants who found the experience slower than expected.
p15, §1.2.2, Time taken, line 1
Delete "experienced"
p21, §1.3.2, Facial enrolment success, last line
First attempt enrolment success rate is a rather
esoteric performance metric, which is not as operationally relevant
as the overall enrolment success rate, or the time required. I
suggest it is inappropriate to introduce this metric in the management
summary.
p21, §1.3.2, Iris enrolment success, last sentence
The analysis mentioned here is not described
in the body of the report. My inclination would be to delete this
sentence, or replace it with "enrolment operators felt
that the lack of feedback from the iris camera made it difficult
for them to establish reasons for enrolment failure and to advise
corrective action."
p23, §1.3.3, Iris verification success, 2nd paragraph
As it is not clear whether this observation is significant (i.e. whether it is removing glasses that improves iris verification success, or taking a 2nd attempt that does so), it should be omitted from the summary.
p23, §1.3.3, Fingerprint verification success, 2nd sentence
Poor grammar.
"One of the factors causing a failure
at verification was that the single-finger sensor used for verification
occasionally did not record sufficient detail from the fingerprint.
p23, §1.4.1, Main recommendation
I don't see this as the most important recommendation
from the trial. Perhaps this should be titled "Back-up
solution/secondary capture devices"
p25, §1.5
Surely there should be a conclusion on the adequacy
of the technologies used.
p25, §1.5, last sentence
Change to "Room or pod design should
be thoroughly reviewed . . ."
p29, §2.2
I think that the equipment and technology used
is relevant enough to include in the body of the report. Perhaps
summarise in the form of a table, and refer to the Annex for full
details.
p78, §5.2, paragraph 1, last 2 lines
The opportunistic sample was not a randomly
picked sample of the general public as stated here. (By design,
the quota sample should be a good indicator for a random sample!)
p297, Time taken vs participant response graphs
The way these graphs are plotted hides any information that might be gleaned from such graphs. The tails (long enrolment times) appear as zero; the heights of the curves are not comparable, as results are not normalised against the number of responses in each category; and the bin sizes for enrolment times are too large, so we cannot really see any difference in the shape and skew of the curves.
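As an illustration only, the following is a minimal sketch (in Python) of how such graphs might be re-plotted with per-category normalisation and finer bins; it assumes the raw per-participant enrolment times and response categories are available, and the data file and column names used here are hypothetical.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical data: one row per participant, with total enrolment time in
# seconds and the participant's "as expected / slower / faster" response.
df = pd.read_csv("enrolment_times.csv")

# Finer, fixed-width bins (30 s) so the shape and skew of the distributions,
# including the long tails, remain visible.
bins = np.arange(0, df["enrol_time_s"].max() + 30, 30)

fig, ax = plt.subplots()
for response, group in df.groupby("response"):
    # density=True normalises each curve by the number of responses in that
    # category, making the heights of the curves comparable.
    ax.hist(group["enrol_time_s"], bins=bins, density=True,
            histtype="step", label=f"{response} (n={len(group)})")
ax.set_xlabel("Enrolment time (s)")
ax.set_ylabel("Proportion of participants per bin")
ax.legend()
plt.show()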
p314, §3.1
"98" (which looks like 9 to the power 8): move the footnote mark (8) from the "9" to "failed".
Next Page: Extract from a submission to Andy
Burnham on the publication of the UKPS trial final report.
Note that paragraph 5 in this submission refers
to the decision to delay publication of the final report. The
UKPS had planned to publish the Biometric Enrolment Trial report
on 28 April 2005 alongside other Home Office research scheduled
for that day. However, that date fell in the week prior to the
General Election. In line with the General Election guidance (published by the Cabinet Office), UKPS sought advice as to whether publication in April would be acceptable. The advice received from the Cabinet
Office was that publication should be delayed until after the
election.
From: Rob Bowley, Director Identity Projects, UK Passport Service, 8th Floor, Tel: 020 7901 XXXX
To: Andy Burnham
cc: Home Secretary, Tony McNulty, John Gieve, Helen Edwards, Paul Wiles, Katherine Courtney, Stephen Harrison, Robert Raine, Henry Bloomfield, Paul Wylie, Christine Nickles, Vivienne Parsons, Peter Wilson, Special Advisers
Date: 12 May 2005
UKPS BIOMETRIC ENROLMENT TRIAL REPORT
ISSUE
1. When to publish the UKPS Biometric Enrolment Trial
Report
TIMING
2. Immediate
RECOMMENDATION
3. The report of the trial to be published on 26th May
2005. Research reports are normally published on the last Thursday
of the month unless there are good reasons for publishing on another
date.
SUMMARY
4. Contractors (Atos Origin) completed the report of
the Biometrics Trial and submitted it to UKPS for quality assurance
on 25 February. We have previously stated in a PQ to Mark Oaten
that the report would be published by the end of March. That date was not met because of the work needed to ensure that the findings were statistically robust and presented to best effect. There was a particular issue with the findings for people with disabilities, for whom the success in enrolling in the scheme and the results in the attitude survey were relatively less favourable than for the rest of the population. MORI and Disability Matters were consulted and their views incorporated.
5. The report was not taken to a conclusion for publication
in April because of the General Election.
6. The quality assurance of the final report is now complete and, in line with the Home Secretary's preference for research to be published on the last Thursday in the month, the report should be scheduled for publication on 26 May 2005. Any further delay will cause adverse comment.
7. A handling submission will follow once the date for
publication has been agreed.
SUPPORTING INFORMATION
8. The UKPS biometric trial began on April 14th last
year and was closed on 24th December 2004 after over 10,000 enrolments
had been completed.
9. Its objectives were to:
test the use of biometrics through a simulation
of the passport process
include exception cases, e.g. people who may have
difficulties in enrolment
measure the process times
assess customer perceptions and reactions
assess practical aspects of incorporation of biometrics
into a biometric database
test fingerprint and iris biometrics for one-to-many
identification and facial recognition for one-to-one verification
Although the trial was run by UKPS, we have been explicit
that it is also preparatory to the introduction of ID Cards, and
the bulk of its cost has been met from Home Office budgets.
10. It was not a technology trial, i.e. we did not try a range of equipment and compare the performance of each unit, nor test the interoperability of various systems.
11. The trial engaged with 10,000 volunteers from across
the UK providing a cross section of the UK population. The participants
fell into three broad categories:
Quota sample of 2,000 (a representative cross-section
of the UK population)
Opportunistic sample of 7,250 (people who self-selected
their participation)
Disabled sample of 750, which was collected against the original requirement of 1,000
12. Statistically this reduction in sample size did not have any material effect on the results. Disability Matters, a leading disabled community interest group, have contributed to the quality assurance of the report. They were also engaged by Atos Origin, the contractor, to help in the recruitment of the sample of disabled people.
13. To achieve trial objectives four fixed sites and
one mobile unit were employed during the trial. The four fixed
sites were located at:
UK Passport Office London
Newcastle Registrars Office
Leicester Post Office
DVLA Local Office Glasgow
14. The route of the Mobile Unit was chosen to allow participants from the disabled community as well as able-bodied participants to experience the process. Venues included Highbury College Portsmouth,
RNIB Redhill, National Association for Epilepsy Chalfont St. Peters,
Enham Alamein Andover and the St. Loyes Foundation Exeter.
15. Overall, the trial indicated a positive participant
response to both the process and concepts of biometrics. The majority
of participants were in favour of the adoption of biometrics as
a means of identification, believing that it would help prevent
identity theft and prevent illegal immigration. The majority of
participants in all sample groups also successfully enrolled on
all three biometrics, with success rates of 89% for quota participants,
90% for opportunistic participants and 61% for disabled sample
participants. 100% of quota and opportunistic sample participants and over 99% of disabled sample participants were able to enrol successfully on at least one biometric, i.e. face, fingers or iris.
16. The trial was particularly designed to find out which groups of the UK population may have difficulties with biometric enrolment. Although the trial was not a trial of the technology or equipment, and all results need to be seen in the context of the particular hardware and software configurations used, it is clear that some equipment used in the trial gave better results than others. The equipment was however able to test processes, customer reactions and perceptions as intended.
17. Across the trial results the sectors experiencing
most problems with biometrics enrolment in general were the elderly,
disabled and participants drawn from black and other minority
ethnic (BME) groups.
Some specific issues identified are:
Facial biometrics:
18. 99.74% of participants in each sample group successfully
enrolled a facial biometric. However, the failure rate for the
disabled sample group was significantly higher than that for the
quota and opportunistic groups. White participants had a higher
first attempt facial enrolment success rate than black participants
while facial verification success rate was higher for participants
aged under 60 than it was for those aged over 60. The exact reasons
for these inconsistencies are not currently evident from the report
of the trial and require further investigation, which we have
requested.
19. Maintaining the correct position for facial biometric
enrolment was a problem for some disabled sample participants
with a physical impairment or with learning disabilities and also
some elderly persons.
Iris biometrics:
20. 88.36% of participants successfully enrolled their irises. The disabled group achieved significantly lower results at 61%. It appears that this was mainly due to positioning and behavioural issues, i.e. where the participant could not place themselves in the correct relationship to the camera or could not follow the camera and operator instructions. Some participants also volunteered existing medical conditions that apparently adversely affected their ability to enrol.
21. Iris enrolment success, and success at the first attempt, also varied according to the participant's ethnic group and age. Asian and white participants had higher success rates
than black participants. Younger participants had higher success
rates than older participants. Additional work is required to
ascertain the exact reasons behind these findings and to test
their statistical validity.
Finger biometrics:
22. 99.03% overall successfully provided finger scans. This fingerprint enrolment success was lower for black participants than for other groups (97.72% of the sample tested). Participants with a learning disability and participants with a physical impairment had lower overall finger success and first-time success than other participants in the disabled sample and those from the quota and opportunistic samples.
23. The 55+yr age group found it more difficult to position
themselves for the finger biometric than the 18-34yr and 35-54yr
age groups. Some disabled customers could not physically position
their fingers correctly and/or position their wheelchairs close
enough to the machine.
General:
24. A small percentage (0.62%) of disabled sample participants failed to enrol on any of the biometrics. In these cases biometrics could have been gathered but the process was halted at the operator's discretion for the comfort of the individual.
25. Over 80% of participants supported the use of biometrics and over 90% found the process not to be intrusive.
26. The sectors most likely to believe biometrics are an infringement on their civil liberties are the 18-34yr age group, the C2DE social group and the BME sectors.
FURTHER WORK
RECOMMENDED FROM
THE TRIAL
27. Additional work should be undertaken to further investigate the findings surrounding the enrolment of the BME sector and the elderly. Also, specific work addressing the needs of the disabled community should be carried out in co-operation with appropriate specialist organisations.
28. UKPS and the ID Cards Programme are currently drawing up plans to follow up these recommendations, with the emphasis on comparative equipment trials, large database trials and later (possibly tied in to the UKPS Authentication by Interview project) larger scale public trials.
4. ADVICE FROM
BIOMETRIC EXPERTS
The Biometrics Experts Group is a group of Home Office and
external experts which meets approximately once a month. Its role
is to actively contribute to the biometrics requirements of the
programme, in contrast with the Biometrics Assurance Group, which
provides assurance.
Minutes are taken of meetings but contain details of the proposed testing and evaluation of vendors' biometric solutions, and their release may compromise the procurement process.
When it met on 26/27 January the attendees were:
Tony Mansfield, NPL
Jim Wayman, San Jose State University
Philip Statham, CESG
Chris White, CESG
Bill Perry, UKPS
Marek Rejman-Green, Home Office (27th only)
Kok Fu Pang, Home Office
Zoe Nicholson, Home Office IDCP (26th pm only)
Duncan Westland, Home Office IDCP
Peter Durant, Home Office IDCP
5. THE BIOMETRICS
ASSURANCE GROUP
You asked for the membership of the Biometrics Assurance Group, the meetings it has held and for copies of any reports produced. Current membership of the group is reproduced below. In addition to the Chair, Professor Sir David King, the Government's Chief Scientific Adviser and head of the Office of Science and Technology, there are due to be 10 members; of these, the 7 below have taken up their role. The Biometrics Assurance Group met on 24 November 2005 and again on 20 February 2006. It is next due to meet on 15 May, with further meetings in July, September and December of this year.
The Biometric Assurance Group is to issue regular reports,
possibly to a June/December timetable.
It is worth noting that some of these details may change. At its last meeting the Biometrics Assurance Group elected to form a number of sub-groups, each with a tighter remit, which may require additional resource. This may lead to an increase in membership and group activity.
Biometric Assurance Group Members
Chair: Sir David King, UK Government Chief Scientific Adviser
Member: Angela Sasse, University College London
Member: Dick Mabbott, APACS
Member: John Daugman, University of Cambridge
Member: Mike Fairhurst, University of Kent
Member: Peter Hawkes, British Technology Group
Member: Peter Waggett, IBM
Member: Valorie Valencia, Authenti-Corp
The functions of the Biometric Assurance Group are:
Ensuring the Programme's requirements for biometrics, biometrics testing and biometrics procurement are adequately specified.
Evaluating the biometrics elements of proposed
solutions offered by suppliers and integrators.
There may be some work on reviewing and interpreting
the outcomes of testing.
Reviewing the advice the Programme receives from
its experts and offering advice in those areas that are unclear.
Reporting to the director, SRO and the Programme
board.
Identifying emerging issues.
6. ADVICE ON
IT
External advice is provided by:
Other parts of government, e.g. other departments, CSIA (Central Sponsor for Information Assurance), CESG (Communications-Electronics Security Group)
Market sounding exercises
We have adopted for reference and evaluation purposes a modular
architectural design approach based on a "Service-Oriented
Architecture" where module requirements can be met wherever
possible by customised versions of systems commonly found in the
marketplace and where communications between modules takes the
form of simple, highly-structured service-oriented messages. These
reference solutions are for the purposes of developing requirements
and evaluating proposals. The eventual system design will be done
by the suppliers, chosen through open competition.
This modularisation is intended to simplify, and hence help
de-risk, IT system delivery, and allow easier substitution of
any modules that fail to meet our capability, performance and
resilience requirements without a complete re-engineering of the
solution. It should also permit the simpler "debugging"
of problems and attempted security violations since information
flows between systems will be visible and auditable.
For biometric matching systems, we have conducted detailed
market soundings about the types of systems typically used by
biometric suppliers for this purpose (eg, general purpose blades,
conventional rack-mount servers with custom ASIC/FPGA-based PCI
cards) and performed a space-and-power requirements analysis within
the limits of the information provided by suppliers.
We are currently considering how to specify, and validate delivery of, IT systems, both individually and operating in unison, that are tolerant of unpredictable load conditions (including major overloads), ensuring service continuity is maintained at all times, e.g. by slowing down rather than crashing. We also
are examining the most appropriate replicated system configuration
across multiple sites to ensure minimal service disruption in
the event of a "disaster".
Assurance on IT is provided by the programme team, by our independent assurance panel and by external reviews, e.g. by the Home Office Science and Technology Reference Group.
7. PA CONSULTING
Note on the role, involvement and responsibilities of PA Consulting Ltd in its work for the Home Office Identity Cards Programme.
It was identified in 2004 that the Home Office did not have
ready access to certain skill sets and resources necessary for
implementation of a large and complex project such as Identity
Cards. In addition it was seen that it would be beneficial to
have a mixture of public and private sector expertise in the programme.
To address this need a client-side "Development Partner"
was procured to bring in support to determine the best way of
designing and implementing the proposed scheme. In line with normal
practice on procurement of consulting services of this type, approaches
were made to a number of companies who have framework arrangements
with the Office of Government Commerce to provide management and
business consultancy to Government departments (details of the framework are at www.s-cat.gov.uk).
The contract was let as a result of evaluation of proposals
received, which were assessed on a Value for Money basis. This
resulted in a contract being awarded to PA Consulting Ltd for
the development and procurement phases of the Programme.
PA provides expertise and resource to the programme covering
a number of different skills. This is in the form of embedded resource: PA work as part of the programme team alongside civil servants and seconded staff to deliver programme outputs.
PA support the design, feasibility testing, security accreditation,
business case and procurement elements of the proposed scheme.
The specialist skills include project and programme management,
procurement, smart cards and biometrics, business process design,
financial modelling and business case development.