ABIS: The Spanish Police will use an automatic facial recognition tool
The National Police, the Civil Guard and the regional police forces will soon have a new tool at their disposal to fight crime: an automatic facial recognition system. The ABIS program (an English acronym for automatic biometric identification system), which uses artificial intelligence to identify suspects in a few seconds from any type of image, is currently in the phase of building its database, according to Interior Ministry sources. Pilot tests have already been carried out and, as soon as it is ready, it will be used in police investigations, initially only for serious crimes. Interior insists that it will in no case be used for surveillance work or for live recognition of people in public spaces, although independent analysts consulted by EL PAÍS believe the system does not offer all the required transparency guarantees.
The ABIS algorithm, named Cogent, has been developed by the French military technology company Thales. The system compares the image entered by the agents, extracted for example from a security camera, with the photographs available in the system to search for matches. The database against which images will be compared will consist of some five million facial photographic records of detainees and suspects already on file, according to Interior (other sources speak of 5.6 million images of 3.9 million people arrested). Those files are being harmonized so that the tool can read them.
The photographs of those arrested from the moment the system enters service will be added to this pool. In no case, the National Police point out, can civil databases be used, such as the one containing the photographs on identity documents, to which the police also have access. Interior has been working on the project, which has suffered several delays, for at least three years.
Each person has a unique arrangement of facial features. In a first phase, facial recognition systems extract the face from the image using a technology called computer vision, which locates where in the photograph there is a face. Next, they apply an algorithm to that face to obtain a pattern that represents it and distinguishes it from others. Artificial intelligence systems make it possible to search for this pattern, which is unique to each individual and does not vary over the years, in large image banks and to return the most similar results. Each algorithm (each provider) has its own formula for extracting the patterns and for searching for matches.
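As a rough illustration of that two-step idea (locate the face, then turn it into a comparable pattern), here is a minimal sketch using the open-source face_recognition library. The file names are hypothetical and the embedding it produces has nothing to do with Thales's proprietary Cogent algorithm; it only shows the general detect-then-encode pipeline.

```python
# Minimal sketch of a detect-then-encode pipeline, assuming the open-source
# face_recognition library (dlib-based). Illustrative only; not ABIS/Cogent.
from typing import Optional

import face_recognition
import numpy as np


def face_pattern(image_path: str) -> Optional[np.ndarray]:
    """Return a 128-dimensional pattern for the first face found, or None."""
    image = face_recognition.load_image_file(image_path)

    # Step 1: computer vision locates where in the photograph there is a face.
    locations = face_recognition.face_locations(image)
    if not locations:
        return None

    # Step 2: an algorithm turns that face into a numeric pattern (embedding)
    # that distinguishes it from other faces.
    encodings = face_recognition.face_encodings(image, known_face_locations=locations)
    return encodings[0]


# Two images of the same person should yield a small distance between patterns.
probe = face_pattern("camera_frame.jpg")   # hypothetical file names
reference = face_pattern("file_photo.jpg")
if probe is not None and reference is not None:
    distance = np.linalg.norm(probe - reference)
    print(f"distance = {distance:.3f} (smaller means more similar)")
```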
As EL PAÍS has learned, the Spanish Data Protection Agency (AEPD) is in contact with Interior “to address various projects of the Ministry that could have an impact on data protection,” including ABIS. The agency, which was unaware of the project’s existence until July, must determine whether or not the processing of this type of personal data poses a tolerable risk to the rights and freedoms of citizens. Can the police keep facial data on individuals forever, or should time limits apply? Under what circumstances can the system be used? Who has access to the data? What guarantees are established for the intended use of the tool?
As soon as the database is finished, made up of records provided by the different police forces (such as the Civil Guard and the Mossos d’Esquadra), workstations will be deployed in the central services of the scientific police so that its use can be verified with real cases. Interior does not specify when it will be operational, but according to sources familiar with the process, its development could still take months.
The application of automatic facial recognition systems to police work is making its way in Europe, where several countries, such as France, the Netherlands and Germany, have carried out pilot tests or already have tools in use. At the beginning of next year, this technology will begin to be used at the EU borders to register non-EU citizens entering EU territory. The United Kingdom has gone further: police have placed vans with cameras equipped with these systems in front of London Underground entrances.
In the US, one of the countries where this technology is most widely used, several cities have decided to impose moratoriums on its use after protests by the Black Lives Matter movement, which identifies facial recognition as an instrument of police discrimination. Other countries, such as Russia and China, exploit the surveillance potential of this technology. The large cities of the Asian giant are covered with cameras running live facial recognition systems capable of finding any citizen in a matter of hours.
A revolutionary tool
Inspector Sergio Castro, from the Forensic Anthropology Section of the General Commissariat of Scientific Police in Madrid, leads the team of seven people who will initially be responsible for managing the ABIS tool. “It is likely that if the system is successful, they will reinforce us with more personnel, or decentralization will take place,” he points out. Once it is up and running, decisions will also be made about the operating criteria and the users of the system (i.e., whether or not the different forces will have their own equipment to use ABIS).
Castro does not hide his enthusiasm when he talks about the new tool that has been put in his hands. His department has two main ways of identifying suspects: fingerprint analysis and DNA analysis. Facial recognition will open a third, non-invasive way: unlike the other two, it does not require physical samples from the subject.
Until now, when there was no candidate or suspect, the camera images from a bank where a robbery had taken place were of little use. It was unfeasible to start looking for whoever appears in the footage without some clue to narrow the search. That is where automatic facial recognition tools come in. “When you present an image of a person, the system orders the police file photographs [some five million, according to Interior] from the most similar to the least similar. Then the operator goes through the first positions in search of a match,” says Castro.
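The ranking step Castro describes can be sketched in a few lines: compare the probe pattern against every stored record and return the records ordered from most to least similar, for a human operator to review. The names, distance metric and data layout below are assumptions for illustration; the real ABIS matching formula is proprietary.

```python
# Minimal sketch of candidate ranking over a gallery of stored face patterns.
# Illustrative only: ABIS's real database schema and scoring are not public.
import numpy as np


def rank_candidates(probe: np.ndarray,
                    gallery: np.ndarray,        # shape: (n_records, dim)
                    record_ids: list,
                    top_k: int = 50):
    """Return the top_k record ids ordered by distance to the probe pattern."""
    distances = np.linalg.norm(gallery - probe, axis=1)
    order = np.argsort(distances)[:top_k]
    return [(record_ids[i], float(distances[i])) for i in order]

# The operator then reviews the first positions by eye; the system itself
# never declares a match on its own.
```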
The agent’s work is key: depending on how clear the image is and how much of the face is obscured (glasses, beard, differences in pose, etc.), the correct match may appear in the thirtieth position. It is always a person, not the computer, who determines whether or not there is a resemblance. “If we find a match, then we talk about a potential candidate. An investigation could be launched in search of evidence,” remarks the inspector. That process may or may not end in an arrest, depending on the evidence found.
In a second step, if the candidate for investigation or arrest is to be validated, a forensic study is carried out, just as has been done until now with fingerprints or DNA. “My team would do a one-to-one study of the subject offered by the automated tool. Very high reliability is sought, because our expert report can influence a court ruling, and for that a lot of image quality is needed,” he underlines. In 90% of cases, the identifications they have to make are judicial requests; the rest are ex officio requests from other police departments, which have images of the perpetrator of a crime and need conclusive confirmation of whether or not it is the person they are investigating, in order to subsequently inform the courts.
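Conceptually, that second step is a one-to-one comparison rather than a search: the candidate’s file photo is checked directly against the probe under a strict acceptance criterion. The sketch below illustrates the idea with an arbitrary threshold; in the process described above it is a forensic examiner, not a number, who makes the final call.

```python
# Minimal sketch of one-to-one verification between two face patterns.
# The threshold value is purely illustrative, not ABIS's.
import numpy as np

VERIFICATION_THRESHOLD = 0.45  # illustrative value


def verify(probe: np.ndarray, candidate: np.ndarray,
           threshold: float = VERIFICATION_THRESHOLD) -> bool:
    """Return True if the two patterns are close enough to support a match."""
    return float(np.linalg.norm(probe - candidate)) < threshold
```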
The database that will contain the facial photographic records of all the suspects is the same one in which the fingerprints and DNA samples are already stored. These last two types of personal data are shared with European partners under the Schengen Information System (SIS). Brussels intends to include facial data in the same package in the future. “The Spanish ABIS system can connect with European databases, such as Eurodac, EU-Lisa or VIS, since the corresponding links are designed. It is not an isolated system, but is interconnected with the countries of the European Union,” Thales sources explain.
The risks of biometric technologies
Algorithms fail. And being wrong when recommending a movie is not the same as being wrong when identifying a suspect. The case of the American Robert Williams is the first documented instance of a wrongful arrest caused by a facial recognition system: the tool confused him with someone else and the officers, far from checking whether he resembled the suspect, took him to jail. These systems are mostly trained on data from white people, so they fail much more often with Black and Asian people. US federal government studies have found that this technology is 100 times more likely to misidentify Black individuals than white ones.
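Disparities like the one cited are typically measured by computing the error rate separately for each demographic group over pairs of different people (impostor pairs). The sketch below uses synthetic, purely illustrative distances and an arbitrary threshold to show how such a per-group comparison is made.

```python
# Minimal sketch of measuring a per-group false match rate (FMR).
# All numbers are synthetic and illustrative, not real evaluation data.
import numpy as np


def false_match_rate(impostor_distances: np.ndarray, threshold: float) -> float:
    """Fraction of different-person pairs wrongly accepted as a match."""
    return float(np.mean(impostor_distances < threshold))


rng = np.random.default_rng(0)
threshold = 0.45  # illustrative decision threshold

# Hypothetical impostor-pair distances for two groups (synthetic data).
impostor_distances = {
    "group_a": rng.normal(loc=0.9, scale=0.15, size=10_000),
    "group_b": rng.normal(loc=0.7, scale=0.15, size=10_000),
}

for group, dists in impostor_distances.items():
    print(group, f"FMR = {false_match_rate(dists, threshold):.4f}")
```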
The draft Artificial Intelligence regulation being negotiated in Brussels adopts an approach based on the potential risks that applying these technologies may entail. Facial recognition falls into the “high risk” category, although the door is left open to its use as long as it is for “the purposes of preventing, arresting or investigating serious crimes or terrorism.” “Indiscriminate surveillance” tools are expressly prohibited, so in principle these systems cannot be deployed on the streets to identify passersby. That is by no means Interior’s intention, according to ministry sources.
The application of algorithms to public affairs must be audited and monitored. According to Interior, the system, developed by Thales, has been validated by the Civil Guard and the National Police. “Scientific and forensic specialists from the State security forces have participated in the validation work,” these sources point out. The Cogent algorithm, the heart of the ABIS system, has also passed the vendor evaluation run by NIST, the US National Institute of Standards and Technology. “It guarantees that the evaluated algorithm complies with the standards and requirements demanded for the different use cases,” says Interior. “NIST does not say whether algorithms are good or bad. Moreover, the organization offers several evaluations with different objectives, and we do not know which ones they are referring to,” says Carmela Troncoso, professor at the École Polytechnique Fédérale de Lausanne (Switzerland) and author of the secure protocol used in covid contact-tracing applications.
Gemma Galdon, director of Eticas Consulting, a consultancy specializing in algorithmic audits, does not think that is enough either. “According to European regulations, the proportionality of high-risk technologies must be justified, as well as what they are expected to achieve. It is also necessary to know what precautions have been taken to avoid algorithmic biases: it has been shown that these systems identify white people better than others, so you have to prove that you do not make mistakes with Black people,” she explains.