Biometrics: guilty until proven innocent

Fundamental issues in biometric performance testing: A modern statistical and philosophical framework for uncertainty assessment
Is there any hope of inductively extending the results of our technical test more broadly to any other algorithms or databases? A Type B systematic uncertainty evaluation after consideration of changes in the unit of empirical significance and statistical controls over its tangible elements might be of value, provided that the specifics of the changes could be given, but we should not sanctify such a “guesstimate” in an emperor’s cloak of imagined analytic rigor.

... technology testing on artificial or simulated databases tells us only about the performance of a software package on that data. There is nothing in a technology test that can validate the simulated data as a proxy for the “real world”, beyond a comparison to the real world data actually available. In other words, technology testing on simulated data cannot logically serve as a proxy for software performance over large, unseen, operational datasets.

We lack metrics for assessing the expected variability of these quantities between tests and [we lack] models for converting that variability to uncertainty in measurands [the quantities intended here are false positives and negatives, failure to acquire and enrol, and throughput].

... each specific recognition technology (iris, face, voice, fingerprint, hand, etc.) will have specific factors that must be within a state of statistical control. This list of factors is not well understood, although ample work in this area is continuing. For example, recent analysis of iris and face recognition test results shows us that to report false match and false non-match performance metrics for such systems without reporting on the percentage of data subjects wearing contact lenses, the period of time between collection of the compared image sets, the commercial systems used in the collection process, pupil dilation, and lighting direction is to report “nothing at all”. Our reported measurements cannot be expected to be repeatable or reproducible without knowledge and control of these factors.

... the test repeatability and reproducibility observed in technology tests are lost in scenario testing due to the loss of statistical control over a wide range of influence quantities.

... Our inability to apply concepts of statistical control to any or all of these factors will increase the level of uncertainty in our results and translate to loss of both repeatability and reproducibility.

... Test data from scenario evaluations should not be used as input to mathematical models of operational environments that require high levels of certainty for validity.

We can conclude that the three types of tests are measuring incommensurate quantities and therefore [we] should not be at all surprised when the values for the same technologies vary widely and unpredictably over the three types of tests.

from a paper by Jim Wayman, Antonio Possolo and Tony Mansfield delivered at the International Biometrics Conference hosted by the US National Institute of Standards and Technology, 2-4 March 2010

Security fear over airport face scanners
Sources from the UK Border Agency (UKBA) have revealed that the devices are failing to detect when two people pass through them at the same time.

The system, which replaces traditional passport control measures, is undergoing a "live trial" at Manchester Airport, where a UKBA worker said it was suffering almost daily malfunctions.

He said immigration officers had been able to accompany travellers through the scanners without an alarm being triggered, even though the booths are supposed to detect if more than one person enters at a time.

"Immigration officers have been able to tailgate passengers through the machine, without the machine picking it up," he said ...

The source said there were malfunctions taking place almost daily in the pilot project, which is thought to have cost the taxpayer several hundred thousand pounds.

"There are five 'pods' and when one breaks down, they all break down," he said ...

The UKBA source said there were widespread concerns about the facial recognition equipment.

"There is no reliable data on the machine's ability to pick up forgeries and imposters," he said ...

A spokesman for the PCS union, which represents UKBA staff, said: "The notion that you can replace the human intuition of highly trained immigration staff with unproven machines is dangerous.

"The technology is further undermined by staff sitting in front of the monitors for three hours at a time, leading to mental fatigue and a drop-off in concentration. There are major concerns about the reliability and accuracy of facial recognition technology ...

"We have advised our members not to train to use the equipment or to man it" ...

"Up until the point of the official launch, it was rejecting 30 per cent of those who tried to get through it," the UKBA worker said.

"We believe they had to recalibrate it – essentially make it easier to get through the system."

Telegraph, 4 October 2008

ID card 'will drown in a billion mismatches'
The government has underestimated the likely failure rate of the ID card scheme, according to a biometrics expert who reviewed the system.

The ID card scheme will guard against one person having multiple identities by checking the two fingerprints and facial scan held on a chip on the ID card against biometrics in a central database, the National Identity Register.

But academic John Daugman, a former member of the Biometrics Assurance Group (BAG) which reviewed the scheme, says its reliance on fingerprints and facial photos to verify a person's identity will cause the system to collapse under the weight of mismatched identifications ...

Daugman said that even if the error rate was as low as one in a million, the 10 to the power of 15 comparisons needed to verify the IDs of 45 million people would result in one billion false matches.

He told silicon.com: "The use of fingerprints will cause deduplication to drown in false matches.

"The government was badly advised by its internal scientists in the Home Office when it took the decision to base the biometric system on fingerprints instead of iris patterns.

silicon.com, 26 September 2008
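Daugman's arithmetic is straightforward to verify: de-duplication compares every enrolee against every other enrolee, so the number of comparisons grows quadratically with the population. A minimal sketch of the calculation, taking the article's one-in-a-million error rate as an illustrative figure rather than a measured one:

```python
# Back-of-the-envelope check of Daugman's deduplication arithmetic:
# checking 45 million people for multiple identities means comparing
# every enrolee against every other enrolee.
from math import comb

population = 45_000_000   # people on the National Identity Register
fmr = 1e-6                # illustrative false match rate per comparison

comparisons = comb(population, 2)            # pairwise comparisons, ~1.0e15
expected_false_matches = comparisons * fmr   # ~1.0e9, i.e. ~one billion

print(f"{comparisons:.2e} comparisons")
print(f"{expected_false_matches:.2e} expected false matches")
```

The quadratic growth is the crux: doubling the register roughly quadruples the comparisons, so an error rate that looks negligible per comparison still yields the billion mismatches Daugman warned of.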

Passengers test new face scanners
Facial recognition scanners are being trialled at an airport as part of government efforts to improve security and reduce passenger congestion.

The system at Manchester Airport can be used by adult biometric passport holders from the UK and Europe.

It works by scanning passengers' faces and comparing them to the photographs digitally stored on their passports.

The Public and Commercial Services Union (PCS) voiced concerns that the technology was "untried and untested" ...

PCS deputy general-secretary Hugh Lanning said: "This is untried, untested technology and they're going live with it before they've been able to recognise any of the difficulties there might be with the system.

"People are being allowed through on the basis of this technology. It means that 95% of people won't be checked in any way, other than by the machine."

BBC, 19 August 2008

German Federal Police questions reliability of facial recognition
The German Federal Criminal Police Office (BKA) found biometric visual-image search systems not advanced enough to be used by the police to search for persons. BKA presented research results of its visual-image search systems project. Given the present state of the technology the system was unfit to be deployed, they concluded.
 
The system was tested in a rail terminal in the city of Mainz and was finally declared worthless as an investigative tool. It was presented as the first public trial under normal, everyday conditions (rather than conditions manipulated for a good showing), and it achieved a match rate of only 30 per cent. Even when the lighting was modified to be ideal, it reached only 60 per cent.

European Biometrics Portal, 17 July 2007

Unrecognised Iris
... Our first attempt at registering failed, however, because the official in charge of the camera at London's Heathrow airport could not remember the PIN needed to work his machine ... After many failed attempts at aligning our eyes with optical markers, the machine lost patience and told us to leave. An official appeared and said the malfunction might be down to the machine thinking our suitcase was a child being smuggled through ...

New Scientist, 14 April 2007

Identity and Passport Service: Introduction of ePassports
Facial recognition software is not reliable enough to use with large databases
3.4 The ePassports business case notes that the storage of biometric information should help reduce the risk of duplicate passports being issued. We were told by our consultants that the use of current facial recognition technology with two dimensional images of limited resolution (as is the case for ePassports) is not sufficiently reliable to enable fully automated searches even in relatively small databases, and performance is known to decline as database size increases. The Identity and Passport Service database of passport holders is large and still growing, so current facial recognition software cannot be used to check new applications against the entire database of existing ePassport holders.

National Audit Office, 5 February 2007
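The NAO's observation that performance declines as database size increases follows directly from how one-to-many search compounds errors: each new application is compared against every existing record, so even a tiny per-comparison false match rate accumulates. A minimal sketch, using a hypothetical false match rate of 1 in 10,000 purely for illustration:

```python
# One-to-many identification: a probe image is compared against every
# record in the database, so the chance of at least one false match
# compounds with database size N.
fmr = 1e-4  # hypothetical per-comparison false match rate (illustrative)

for n in (1_000, 100_000, 10_000_000):
    p_any_false_match = 1 - (1 - fmr) ** n   # P(at least one false match)
    print(f"database of {n:>10,}: P(false match) = {p_any_false_match:.4f}")
```

This is why one-to-one verification against a single passport chip is a far easier problem than searching a national register of tens of millions: the same software that verifies adequately can still be useless for de-duplication.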

ID technology leaving passengers waiting
Passengers face massive delays at Gatwick Airport because of problems with new iris-recognition equipment, a Tory MP has claimed ... Mr Wallace told BBC Radio 4's Today programme: "The pilot failed half its assessments: it wasn't available when it was needed at the right level; when the system crashed, it took over eight hours to fix.

The Sussex Argus, 10 January 2007

Has he been watching CSI again?
... politicians like all this CSI-stuff ... But does it work? Does the data-crunching surveillance state make us safer?

Ministers will reel off positive stats but that doesn't really answer the question.

What if the money and energy were redirected to old-fashioned, bobby-on-the-beat policing that deters crime, rather than being spent on schemes that make it easier to find criminals after the event? And what about the science behind DNA testing and the like? Is it that reliable? There are a lot of people with a stake in proving it is effective (well, there are Government contracts at stake) and few scientists with the resources to debunk it.

And if all these databases, cameras and new technologies are so great compared to what went before, why are serious crime levels so stubbornly high?

The Times, 23 October 2006

Mythbusters: Beat Fingerprint Security System
Mythbusters, 17 September 2006

Identity Card Technologies: Scientific Advice, Risk and Evidence
81. We also note an apparent discrepancy between the advice offered to us during our visit to the United States in March 2006 and the advice subsequently provided to the identity cards programme team. On 6 March 2006, we met informally a group of senior policy advisers from the Department of Homeland Security to discuss the identity cards programme. When questioned about the maturity of biometric technologies, the advisers agreed that currently the technology was probably not as reliable or as accurate as it might need to be for a national identity card scheme ...

91. ... We are surprised by the Home Office’s unscientific approach and suggest that rather than collating figures merely to provide information regarding performance, the Home Office admits that it cannot release details until it has completed trials. We note the lack of independent evidence relating to the performance of iris scanning and welcome the Home Office’s commitment to undertake a large-scale matching test using pre-recorded biometrics. Given the relative lack of information available publicly regarding the performance of biometrics in a national scheme, we recommend that once the scheme is established the Home Office publishes details of the performance levels of the technology ...

93. We are surprised and concerned that the Home Office has already chosen the biometrics that it intends to use before finishing the process of gathering evidence. Given that the Identity Cards Act does not specify the biometrics to be used, we encourage the Home Office to be flexible about biometrics and to act on evidence rather than preference. We seek assurance that if there is no evidence that any particular biometric technology will enhance the overall performance of the system it will not be used ...

95. We note the lack of explicit commitment from the Home Office to trialling the ICT solution and strongly recommend that it take advice from the ICT Assurance Committee on trialling. We seek an assurance that time pressure and political demands will not make the Home Office forgo a trial period or change the purpose of the scheme.

96. In written evidence the Home Office said it was not necessary to embark on publicly funded scientific research to improve the capabilities of biometrics. This claim was subsequently denied in oral evidence and the identity card team asserted that research was being undertaken into fingerprint biometric performance ... We regret the confusion at the Home Office regarding the research that it is funding and what research it requires ... The Home Office has not provided us with evidence either that they have identified areas where the evidence base is weak nor that they have commissioned research in order to strengthen it. On the basis of the evidence that we have seen, we conclude that the Home Office does not seem to have an effective mechanism for ensuring that the required research and development in the relevant scientific and technological areas is carried out. We recommend that the Home Office identifies the gaps in the evidence base underpinning the identity cards programme, that it commissions research to fill these gaps and that it feeds any new developments into the scheme where appropriate. This process should be overseen by the departmental Chief Scientific Adviser.

House of Commons Science and Technology Committee, 20 July 2006

Statement of Glenn A. Fine Inspector General, U.S. Department of Justice before the Senate Committee on the Judiciary concerning “Oversight of the Federal Bureau of Investigation”
FBI’s Handling of the Brandon Mayfield Matter: ... In March 2006, the OIG released a 273-page report that examined the FBI's handling of the Brandon Mayfield case.

Mayfield, a Portland, Oregon, attorney, was arrested by the FBI in May 2004 as a material witness after FBI Laboratory examiners identified Mayfield’s fingerprint as matching a fingerprint found on a bag of detonators connected to the March 2004 terrorist attack on commuter trains in Madrid, Spain, that killed almost 200 people and injured more than 1,400 others. Mayfield was released 2 weeks later when the Spanish National Police identified an Algerian national as the source of the fingerprint on the bag.

The FBI Laboratory subsequently withdrew its fingerprint identification of Mayfield.

We found several factors that caused the FBI’s fingerprint misidentification. The unusual similarity between Mayfield’s fingerprint and the fingerprint found on the bag confused three experienced FBI examiners and a court-appointed expert. However, we also found that FBI examiners committed errors in the examination procedure, and the misidentification could have been prevented through a more rigorous application of several principles of latent fingerprint identification. For example, the examiners placed excessive reliance on extremely tiny details in the latent fingerprint under circumstances that should have indicated that these features were not a reliable support for the identification.

The examiners also overlooked or rationalized several important differences in appearance between the latent print and Mayfield’s known fingerprint that should have precluded them from declaring an identification.

In addition, we determined that the FBI missed an opportunity to catch its error when the Spanish National Police informed the FBI on April 13, 2004, that it had reached a “negative” conclusion with respect to matching the fingerprint on the bag with Mayfield’s fingerprints.

DNA Reviews: Within the past 2 years, the OIG completed two reviews examining various aspects of DNA issues. In the first review, completed in May 2004, the OIG examined vulnerabilities in the protocols and practices in the FBI’s DNA Laboratory.

This review was initiated after it was discovered that an examiner in a DNA Analysis Unit failed to perform negative contamination tests, and the Laboratory’s protocols had not detected these omissions. The OIG’s review found that certain of the FBI Laboratory’s DNA protocols were vulnerable to undetected, inadvertent, or willful non-compliance by DNA staff, and the OIG report made 35 recommendations to address these vulnerabilities.

The FBI agreed to amend its protocols to address these recommendations and to improve its DNA training program. In addition, the OIG continues to audit laboratories that participate in the FBI’s Combined DNA Index System (CODIS), a national database maintained by the FBI that allows law enforcement agencies to search and exchange DNA information.

The OIG’s CODIS audits identified concerns with some participants’ compliance with quality assurance standards and with their uploading of unallowable and inaccurate DNA profiles to the national level of CODIS.

The OIG currently is analyzing findings from DNA laboratory audits – both OIG-conducted audits and external quality assurance audits – to determine if they reveal global trends and vulnerabilities.

We also are assessing the adequacy of the FBI’s administration of CODIS, including its oversight of the national DNA database, and evaluating its implementation of corrective actions in response to the original report.

US Department of Justice, May 2, 2006

Hi-tech Cassandras foresee trouble with ID cards
... Qinetiq, the defence technology company that advises the government, said a biometric scan in the US had failed because it concluded that a man who later went bald and had a wrinkled forehead had an upside-down face.

The Guardian, 21 October 2005

ID cards scheme dubbed 'a farce'
Plans for a national ID card scheme have been branded "farcical" after suggestions it might misidentify people with brown eyes or men who go bald.

The Home Office's Tony McNulty admitted some technological "difficulties" with some of the biometric checks ...

Mr McNulty said on Sunday: "There are difficulties with the technology, not least in terms of people who have difficulties with their eyes anyway, not least with people with brown eyes rather than other coloured eyes, and all those are being factored into the equation.

"None of these problems are new, but increasingly as biometrics are more and more used... we think the technology can only get better and better and better."

BBC, 17 October 2005

Prisoners unpick hi-tech lock system
PRISON officers have been forced to abandon a new security system and return to the use of keys after the cutting-edge technology repeatedly failed. The system, which is thought to have cost over £3 million, used fingerprint recognition to activate the locking system at the high-security Glenochil Prison near Tullibody, Clackmannanshire. After typing in a PIN code, prison officers had to place their finger on a piece of glass. Once the print was recognised, they could then lock and unlock prison doors. However, problems arose after a prisoner demonstrated to wardens that he could get through the system at will. Other prisoners had been doing the same for some time.

The Scotsman, 20 September 2005

Technical problems for Dutch biometric passport
A study commissioned by the Dutch Ministry of the Interior and Kingdom Relations has raised fresh concerns over a number of technical issues related to the issuance of biometric passports.

According to the study, the results of the first biometric passport trials conducted in 2004-2005 showed that the quality of fingerprint information used in the tests was sometimes poor and that the biometric documents were less robust than the traditional passports.

The quality of digital photographs was also a concern, as unclear backgrounds, insufficient contrast, and other problems such as reflection from spectacle lenses resulted in about 1.6% of photographs being unsuitable for automated biometric matching.

In addition, including fingerprints of young children and the elderly in the future Dutch e-passport may prove more difficult than expected, as people have to hold their fingers still for quite a while for the procedure to be successful, the report says ...

The Dutch government is not the only European government experiencing technical difficulties with the development of its biometric passport programme.

In the UK, the findings of a biometrics enrolment trial, published earlier this year, revealed that biometric technologies were still not foolproof and that large-scale issuance of biometric identity and travel documents would inevitably run into some glitches.

In Germany, serious concerns over the government’s biometric passport programme were voiced by security and privacy experts, parliamentary committees and by the Federal Data Protection Commissioner, who even called for a moratorium on the introduction of biometric passports in light of the still immature state of the technology and of a number of unresolved data protection issues.

According to press reports, technical difficulties have also led the Irish Government to shelve plans to introduce biometric chips into passports for the time being.

IDABC eGovernment Observatory (= Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens), 20 September 2005

No smiles, please - we're British
Saying 'Cheese' could soon mean you'll be taking your summer holidays in Bournemouth rather than Barcelona - as a new rule demanding straight faces only on passport photos comes into force next week.

The idea behind the Home Office restrictions - first announced last year - is to ensure the smooth running of new scanning technology, which apparently has problems recognising gurning and grinning holiday makers.

The rules also specify the mouth should be closed, your piccie should be less than a month old, and only taken against an "off-white, cream or light grey, plain background."

And what if you insist on sending in your most winsome, toothy grin? Smiley faces will lead to applications being refused until officials receive suitable photos, says the Home Office. Which is no laughing matter.

The Guardian, 5 September 2005

ID technology 'must be foolproof'
Technology behind the government's controversial identity card scheme must be "almost foolproof", the UK's most senior police officer has warned.

The cards could tackle terror only if biometric indicators like irises and fingerprints were recognised almost perfectly, Sir Ian Blair said ...

"ID cards can only be the answer if the recognition of them is almost perfect," he said.

"Identity cards are only going to work if we have a biometric answer - that may be iris recognition but it is unlikely to be facial recognition because that changes because of diet and beards and everything else."

BBC, 15 June 2005

UK large-scale biometrics trial reveals technology limits
The findings of a biometrics enrolment trial conducted by Atos Origin on behalf of the UK Passport Service (UKPS) show that biometric technologies are still not foolproof and suggest that large-scale issuance of biometric identity and travel documents would inevitably run into some glitches ...

Facial recognition was the least successful identification technology ...

Among other things, further trials are needed, specifically targeted towards those disabled groups that have experienced enrolment difficulties due to environment design, biometric device design, or to specific group problems – for example, black participants and participants aged over 59 had lower iris enrolment success rates ...

A report released by the European Commission on 30 March 2005 warned that – on the technological side – there is currently a lack of independent empirical data. This means that there is an urgent need to conduct large-scale field trials to ensure the successful deployment of biometric systems.

IDABC (= Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens), 31 May 2005

ID trials reveal scan problems
Some experts argue the technology may never be good enough.


Professor Angela Sasse, a biometrics expert who has advised MPs on the home affairs select committee, said biometric technologies were "a lot less mature" than manufacturers made out.

"To be honest, I think it is a possibility that eventually we will conclude it isn't good enough or that the current systems we're using aren't good enough for a large scale public domain application such as an ID card," she said.

BBC, 25 May 2005

UK Passport Service Biometrics Enrolment Trial

Approximate results, rounded to the nearest integer
(all biometrics software provided by L-1 Identity Solutions, Inc.,
please see Appendix C)

                     Able-bodied          Disabled
                     Success  Failure     Success  Failure
Registration (1.2.1.3, test to check that participants could be enrolled into the National Identity Register using their biometrics)
Facial geometry       100%      0%          98%      2%
Fingercopies          100%      0%          96%      4%
Irisprints             90%     10%          61%     39%
Verification (1.2.1.4, test to check, a few minutes later, that the participants' identity can be verified by comparing their biometrics with the registered template)
Facial geometry        69%     31%          48%     52%
Fingercopies           81%     19%          80%     20%
Irisprints             96%      4%          91%      9%

Atos Origin, 24 May 2005

Crash halts police print checks
Police forces in England and Wales could not access national fingerprint records for up to a week because of a computer failure, it has emerged.

BBC News, 3 December 2004

Biometrics Is that really you?
... No single biometric technology is infallible and different technologies have strengths and weaknesses that make them more or less suitable for certain applications. Increasingly biometrics are used together to provide a stronger authentication and reduce the risk of error ...

Fingerprint recognition can be fooled by calluses, residual prints on the reader, and even hand cream! Face recognition struggles in certain lighting conditions and can be fooled by disguises; and iris recognition can be confused by contact lenses and watery eyes.

Biometric identification also faces stringent opposition from civil liberties groups who believe that it represents a breach of privacy. There is great concern about the storage of the biometric data and who has access to it. The possibility of storage of personal data on a centralized government database causes greatest concern. There is concern that this data may be misused and even that it may be possible for a person’s stored data, or ‘biometric reference template’, to fall into the wrong hands. Even schemes in which data is stored on the card itself have not been immune from criticism.

PA Consulting, 24 November 2004

Doubts over passport face scans
... Professor Angela Sasse of University College London - who has made a study of biometrics, said she was very doubtful whether facial scans were a practical security measure yet. "It will be a huge problem if facial biometrics cannot always correctly identify genuine passport holders," she told the BBC.

BBC News, 21 October 2004

DNA fingerprinting 'no longer foolproof'
The genetic profiles held by police for criminal investigations are not sophisticated enough to prevent false identifications, according to the father of DNA fingerprinting. Professor Sir Alec Jeffreys, a geneticist at Leicester University, said police DNA databases should hold more information to lessen the chances of a false positive.

The Guardian, 9 September 2004

'I've got a biometric ID card'
... No cheesy grins will be allowed, because the machine is scanning the measurements of your face and "doesn't like teeth".

BBC News, 12 August 2004

Long eyelashes thwart ID card iris scans
Long eyelashes and watery eyes are causing technology being used to scan the irises of 10,000 volunteers for the Government's national identity card project to fail.

The shortcomings in the system were exposed as MPs took part in a pilot project at the UK Passport Service HQ in London. The trial is testing technology for the proposed "biometric" identity card.

Eye malfunctions have also been found to cause a problem for the technology, while some experts believe the scanners will not work on people wearing hard contact lenses.

Mr Sables said that there may also be problems when people have faint fingerprints, such as manual labourers who work with concrete, but that the next generation of technology would overcome the problem by reading blood flow beneath the skin.

The Daily Telegraph, 6 May 2004

ID card trials put back by technical glitches

A PILOT scheme for the proposed national identity card system is beset with technical difficulties even before it has started, MPs were told yesterday. Problems with the project forced officials to delay its launch by three months because of difficulties with the hardware and software. As a result, the planned length of time the pilot will operate has been cut from six to three months.

The Times, 5 May 2004

Testing the biometric facts
... "The technologies like iris scanning are accurate enough for the ID cards application but only providing they are implemented properly and one has appropriate fall-back processes to deal with exceptional cases," Dr Tony Mansfield, chief research scientist and biometrics expert at the National Physical Laboratory, told BBC News Online.

Such "exceptional cases" could be someone who simply has very long eyelashes, or it could be something more serious like a disability that affects the collection of biometric data.

BBC News, 26 April 2004

'Earprint' man cleared of killing
A man convicted of murder on the basis of his earprint has been freed after spending seven years in prison.

BBC News, 22 January 2004

Facing a biometric future
... Although "facial recognition biometric data" all sounds very sci-fi, it is in fact the least accurate biometric identifier there is, according to experts ...

"The current encoding of photographs digitally into passport chips is almost entirely for the purpose of ultimate visual comparison by a human," says Professor John Daugman. And although humans are not very good at that, he says, machines have an even harder time ... If a machine were to take over in order to match passport images against a database of pictures, Professor Daugman says the rate of error would still be five to 40%, even with the best algorithms.

"Today's computer algorithms for automatic face recognition have a truly appalling performance, in terms of accuracy," he says. "Even small variations in pose angle, illumination geometry, viewing angle, and facial expression have catastrophic effects on algorithm accuracy," says the Professor ...

The key to the power of biometrics to identify people is the amount of randomness and complexity that the biometric contains, according to Professor Daugman. "Face recognition is inherently unreliable because there isn't nearly enough randomness in the appearance of different faces. Fingerprints are vastly better biometrics than faces," he says, "but better still are iris scans" ... there will be some stumbling blocks and no biometric method offers 100% certainty.

BBC News, 13 January 2004

Technobabble
IF WE can believe the politicians, biometric technology offers the world’s first foolproof identification system. The use of iris patterns or fingerprints on ID cards, says the Home Secretary, “will make identity theft and multiple identity impossible — not nearly impossible, impossible”. This certainty is echoed by the Home Office minister Fiona Mactaggart, a former chairwoman of Liberty who once denounced ID cards as an outrage. Now, she says, the “security and opportunity” of biometrics has convinced her that this “revolutionary” hi-tech solution is the only way to protect personal identity ...

As for David Blunkett’s conviction that a biometric record will guarantee an individual’s identity without any “false positive” readings, the evidence is lacking. Last year, after rigorously testing leading iris-scanning and face-matching products, the US Defence Department reported that they were far less effective than their manufacturers claimed.

Eye-scanning software from Iridian, for instance, claims a 99.5 per cent accuracy rate; the Pentagon found that it worked only 94 per cent of the time. As for Visionics’ “face-recognition” technology, which maps patterns on individuals’ faces, it recognised people in tests barely 51 per cent of the time, rather than the 99.3 per cent claimed.

The Times, 18 November 2003

Feasibility Study on the Use of Biometrics in an Entitlement Scheme for UKPS, DVLA and the Home Office
Biometric methods do not offer 100% certainty of authentication of individuals. (4)

The UKPS/DVLA proposals assume that applications are processed, and biometric images collected at local offices, in a manner similar to the current process of checking passport and driving licence applications by the high street partners of UKPS and DVLA. We assume a similar number of local offices (i.e. approximately 2000). (14)

Even under relatively good conditions, face recognition fails to approach the required performance. (52c)

With the known performance of fingerprint, iris and face biometric systems, this requirement mandates the use of multiple fingers, or irises, and confirms that facial recognition is not a feasible option. (55)

Face recognition is not strong enough to uniquely identify one person in a population of 50m. (57)

Fingercopies and irisprints compared (Appendix B)

Security against fake biometrics
  Fingercopies: Poor; additional liveness tests need to be developed.
  Irisprints: Satisfactory.

Exceptional cases
  Fingercopies: Missing hands and fingers. Difficult to register fingerprints for some sections of the population (women, East Asians, manual labourers, older people).
  Irisprints: Congenital eye conditions (aniridia, coloboma, anophthalmia), eye damage and disease. A partial image of the iris may be all that can be obtained.

Tony Mansfield and Marek Rejman-Greene, 12 November 2003
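The study's conclusion that face recognition cannot uniquely identify one person in a population of 50 million can be made concrete with back-of-envelope arithmetic: de-duplicating an enrolment database means comparing every record against every other, and even a tiny false match rate multiplies across those comparisons. The false match rates in the sketch below are assumed for illustration only, not figures taken from the study.

```python
def expected_false_matches(population, fmr):
    """Expected number of false matches when de-duplicating a database of
    `population` enrolees: every record is compared against every other,
    and each comparison falsely matches with probability `fmr`."""
    comparisons = population * (population - 1) // 2
    return comparisons * fmr

POPULATION = 50_000_000  # the 50m population cited in the study

# Assumed single-comparison false match rates: 1 in 1,000 (plausible for
# face recognition of the period) versus 1 in 1,000,000 (closer to the
# claims made for iris). Neither figure comes from the feasibility study.
for label, fmr in [("face-like FMR 1e-3", 1e-3), ("iris-like FMR 1e-6", 1e-6)]:
    n = expected_false_matches(POPULATION, fmr)
    print(f"{label}: {n:.3g} expected false matches")
```

With 50 million enrolees there are roughly 1.25 × 10^15 cross-comparisons, so even the optimistic assumed rate produces false matches by the billion, which is the arithmetic behind the study's verdict on face recognition.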

How do I know who you are?
... Take the Enhanced Border Security and Visa Entry Reform Act of 2002 for example. This requires biometric identification on the travel documents of everyone entering the US after 26 October 2004, even for visa waiver countries ... The act, if you read it carefully, doesn't require that the system actually works, just for it to be there ...

A woman from Australian customs told me straight: "In Australia we will not give fingerprints to the US for the purpose of visa entry, we absolutely do not give fingerprints." Her response was that there would be no travel from Australia to the US.

Some people say it's like barcodes, which didn't work in the early days. Biometrics will get better, it's true. But it's a bad analogy because barcodes can be controlled in manufacturing. If a checker has to type in the code too many times they make the manufacturer redesign the can. Human beings can't go to God. No one technology is going to provide the magic bullet.

People are different in ways that you could never imagine. They never have what you think they are going to have where you think they are going to have it. It never, ever, occurred to me that people can have polydactylism: one fellow had two right thumbs. I have a friend who has a hard time with facial recognition systems: he is very light-skinned, with very light hair but mostly bald. Against a light background, the computer couldn't find the outline of his face, and it said: "There's nobody here." Another guy I knew didn't have a round pupil because he had damaged his eye. You couldn't use iris recognition on that one eye. And then there are people with one glass eye. Or take privacy advocate Simon Davies, whose irises move constantly. He can't be successfully iris-scanned.

Everybody learns from reading Mark Twain's Pudd'nhead Wilson that fingerprints are unchanged from cradle to grave and that everybody has unique fingerprints. But despite this, there remains a tremendous controversy over the admissibility of fingerprints as evidence. I've been an expert witness on this. Fingerprinting is very defendable, but the government has used some of the most stupid, crazy, spurious and non-scientific scientific arguments to try to defend it. We do lack the scientific basis, and that's what we're trying to make up for now.

DNA is not biometrics, it's not automatic unless you touch a machine and it takes a sample, like in the movie Gattaca. But there are a couple of problems. First, you are invading my privacy by asking me to touch a machine and by removing something from my body. I find that disgusting. Secondly, there may be information in that DNA analysis that tells you something about me as a person. Other biometrics don't give any information about a person at all ...

Face recognition still seems to be the holy grail. Perhaps it's more acceptable to people than being fingerprinted or iris-scanned. And often if we have any information at all on terrorists, the face may be the only thing we have. But there are many problems. Take the London mayor, Ken Livingstone, and his idea that you can point a camera at a car and do facial recognition of the occupants. We did that at a Mexico border crossing in Otay Mesa. The immigration service tried to automate the crossing by installing facial recognition cameras in a system called SENTRI, but the driver had to stop and look into the camera. That was highly problematic because the height of the cars varied, and window frames obscured the faces. The state of this technology is we are still trying to teach the cameras that the two people in each scene are the same person.

You have no clue who I am, and I could give you my fingerprint and you still wouldn't know who I am. That's a fundamental flaw in all the legislation. Biometrics says nothing about whether I'm a terrorist or not ...

We'll never use biometrics to track somebody. I've got a really good idea for tracking people: you ask them to carry radio transmitters... Like my mobile phone? ... So right now the government can track you within metres. That's a much better way to track people ...

Biometric tests are not like tests of computer security because in biometrics you are testing people - and people are extremely expensive to test. We have seen that recently with the results from a facial recognition test sponsored by the US Department of Defense and conducted by the National Institute of Standards and Technology (NIST). Two of the companies involved came forward and said "We've improved our product, those results don't apply to us." How would they know? No one has tested the new product. And tests are so expensive that they can't afford them. We see this in biometrics all the time ...

In hand geometry, you get nine measurements. In facial recognition you get 128. Why don't we just concatenate them? It turns out the mathematics is really, really hard. If you throw cotton balls into a shoebox with no gravity, what is the probability that there will be a collision? The probability of a collision increases as you get more balls, the smaller the box gets or the bigger the cotton balls are. Then suppose we change the dimension: so that they are not cotton balls but the shadows of cotton balls on the floor of the box. The shadows may be colliding while the cotton balls are not colliding. Can we put together a mathematical formula that tells us how increasing the dimensions of the system decreases the probability of collisions? In biometrics, a collision is a false match.

New Scientist, 21 June 2003
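The "cotton balls in a shoebox" intuition in the interview above can be sketched numerically: scatter random points in a unit cube and count how often some pair falls within a fixed match threshold. The point count, threshold, and trial counts below are arbitrary illustrative choices, not parameters of any real biometric system.

```python
# Monte Carlo sketch: collision probability falls as dimensionality rises,
# which is the interviewee's point about projecting balls down to shadows.
import math
import random

def collision_probability(n_points, dim, radius, trials=500, seed=0):
    """Estimate the probability that at least one pair among `n_points`
    random points in the `dim`-dimensional unit cube lies within `radius`
    (a "collision", i.e. a false match)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
        hits += any(
            math.dist(pts[i], pts[j]) < radius
            for i in range(n_points)
            for j in range(i + 1, n_points)
        )
    return hits / trials

# Holding the match threshold fixed, each added measurement (dimension)
# thins out the space, so collisions become rarer.
for dim in (1, 2, 4, 8):
    print(dim, collision_probability(n_points=20, dim=dim, radius=0.2))
```

Running the sketch shows the estimated collision probability dropping steadily as the dimension grows, which is why concatenating hand geometry's nine measurements with face recognition's 128 is attractive in principle, even though, as the interview notes, the exact mathematics is hard.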

Face-off
I CAME here looking for an argument but I can't find one. All round this lofty exhibition hall - billed as the world's biggest market for security equipment - the people selling face-recognition systems are being disarmingly, infuriatingly honest ... I thought they'd at least attempt to defend the technology. When they don't, it's me who's caught off guard. Is it true that the systems can't recognise someone wearing sunglasses? Yes, they say. Is it true that if you turn your head and look to one side of the camera, it can't pick you out? Again, yes. What about if you simply don't keep your head still? They nod.

Maybe nine or ten months ago they would have risen to the bait. In those days the face-recognition industry was on a high. In the wake of 11 September, Visionics, a leading manufacturer, issued a fact sheet explaining how its technology could enhance airport security. They called it "Protecting civilization from the faces of terror". The company's share price skyrocketed, as did the stocks of other face-recognition companies, and airports across the globe began installing the software and running trials. As the results start to come in, however, the gloss is wearing off. No matter what you might have heard about face-recognition software, Big Brother it ain't ...

Image Metrics, a British company that develops image-recognition software, ... warned of the danger of exaggerated claims, saying that "an ineffective or poorly applied security technology is as dangerous as a poorly tested or inappropriately prescribed drug" ... to catch 90 per cent of suspects at an airport, face-recognition software would have to raise a huge number of false alarms. One in three people would end up being dragged out of the line - and that's assuming everyone looks straight at the camera and makes no effort to disguise themselves ...

Palm Beach International Airport in Florida released the initial results of a trial using a Visionics face-recognition system. The airport authorities loaded the system with photographs of 250 people, 15 of whom were airport employees. The idea was that the system would recognise these employees every time they passed in front of a camera. But, the airport authorities admitted, the system only recognised the volunteers 47 per cent of the time while raising two or three false alarms per hour ...

To give themselves the best chance of picking up suspects, operators can set the software so that it doesn't have to make an exact match before it raises the alarm. But there's a price to pay: the more potential suspects you pick up, the more false alarms you get. You have to get the balance just right. Visionics - now called Identix after merging with a fingerprint-scanning company in June - is quick to blame its system's lacklustre performance on operators getting these settings wrong ...

Numerous studies have shown that people are surprisingly bad at matching photos to real faces. A 1997 experiment to investigate the value of photo IDs on credit cards concluded that cashiers were unable to tell whether or not photographs matched the faces of the people holding them. The test, published in Applied Cognitive Psychology (vol 11, p 211), found that around 66 per cent of cashiers wrongly rejected a transaction and more than 50 per cent accepted a transaction they should have turned down. The report concluded that people's ability to match faces to photographs was so poor that introducing photo IDs on credit cards could actually increase fraud.

The way people change as they age could also be a problem. A study by the US National Institute of Standards and Technology investigated what happens when a face-recognition system tries to match up two sets of mugshots taken 18 months apart. It failed dismally, with a success rate of only 57 per cent.

There's another fundamental problem with using face-recognition software to spot terrorists: good pictures of suspects are hard to come by ...

Very few security personnel at American airports have CIA clearance, so they aren't allowed to see the images. "Until they've got cleared personnel in each of those airports they can't stop terrorists getting on planes," says Iain Drummond, chief executive of Imagis Technologies, a biometrics company based in Vancouver, Canada ...

Airport security isn't the only use for face-recognition software: it has been put through its paces in other settings, too. One example is "face in the crowd" on-street surveillance, made notorious by a trial in the London Borough of Newham. Since 1998, some of the borough's CCTV cameras have been feeding images to a face-recognition system supplied by Visionics, and Newham has been cited by the company as a success and a vision of the future of policing. But in June this year, the police admitted to The Guardian newspaper that the Newham system had never even matched the face of a person on the street to a photo in its database of known offenders, let alone led to an arrest.

New Scientist, 7 September 2002
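The trade-off the article describes can be put into numbers. The hit rate and one-in-three false alarm rate are the figures quoted from Image Metrics; the passenger volume and suspect count below are assumptions for illustration only.

```python
# Back-of-envelope arithmetic for watch-list screening at an airport.
HIT_RATE = 0.90           # fraction of genuine suspects flagged (quoted)
FALSE_ALARM_RATE = 1 / 3  # fraction of innocent passengers flagged (quoted)

passengers_per_day = 50_000  # assumed figure for a busy airport
suspects_per_day = 1         # assumed: genuine suspects are vanishingly rare

false_alarms = (passengers_per_day - suspects_per_day) * FALSE_ALARM_RATE
true_alarms = suspects_per_day * HIT_RATE

# Probability that any given alarm is actually a suspect (base-rate effect):
precision = true_alarms / (true_alarms + false_alarms)
print(f"{false_alarms:.0f} false alarms per day; "
      f"P(suspect | alarm) = {precision:.6f}")
```

Under these assumptions the system drags well over ten thousand innocent passengers out of the line each day, and the chance that any individual alarm is a real suspect is a few in a hundred thousand: the base rate, not the matching algorithm, dominates the outcome.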

Doubt cast on fingerprint security
Fake fingers made out of common household ingredients can fool security systems that use fingerprints to identify people ... The artificial fingers and prints were created with gelatine by Japanese researchers who used the digits to trick biometric systems into thinking they were seeing the real thing ... Not only was it possible to fool the security systems with casts of fingers, the researchers found they could make convincing fakes using fingerprints lifted from glass ... Experts say the experiments cast serious doubt on any claims that this type of biometric system can be made fully secure.

BBC, 17 May 2002

Forensic evidence in the dock
... Contrary to what is generally thought, there is little scientific basis for assuming that any two supposedly identical fingerprints unequivocally come from the same person.

Indeed, according to a report published in December, the only major research explicitly commissioned to validate the technique is based on flawed assumptions and an incorrect use of statistics. The research has never been openly peer reviewed. This month, the US government also published a set of funding guidelines that rules out further studies to validate both fingerprint evidence and other existing forensic techniques presented as evidence in court.

In 2003, a proposal by the US National Academies to validate such techniques collapsed after the Department of Defense and Department of Justice demanded control over who should see the results of any investigation.

New Scientist, 28 January 2006


© 2002-2007 Business Consultancy Services Ltd
on behalf of Dematerialised ID Ltd