The urgency of the war has led the Ukrainian government to decide to use the facial recognition system of Clearview AI, an American company that faces lawsuits on both sides of the Atlantic for collecting images of people without their permission. As first reported by Reuters and confirmed by this newspaper, the company has offered Kyiv unlimited access to its tool since last Saturday, along with free training for the staff who will operate it. The system, which can identify a person from an image, whether a photograph or a video frame, could be useful, Clearview AI says, for detecting infiltrated Russian soldiers, identifying dead bodies or helping to reunite refugee families.
The startup declined to tell this newspaper, however, whether it plans to extend the invitation to use its software to EU member states bordering Ukraine, such as Poland or Hungary. Both countries are known for defending police uses of facial recognition, and they are absorbing a large share of the refugee exodus triggered by the invasion.
The controversy surrounding Clearview AI stems from the database it draws on: more than 10 billion photographs of faces scraped from the internet without any permission. The company intends to reach 100 billion images next year, enough to identify almost anyone on the planet. Its services are used by the security forces of several countries, including some European ones: about 2,000 agencies in total, according to the company itself, among them the FBI and the US Department of Homeland Security. Platforms such as Facebook, Google, YouTube, Twitter and LinkedIn have formally demanded that the company stop collecting photos from their services.
In Europe, Clearview AI faces complaints before data protection authorities in six countries: France, Germany, Austria, Italy, Greece and the United Kingdom. Last December, Paris ordered the company to stop collecting photos of French citizens and to delete those it had already stored. Rome, for its part, fined the company 20 million euros last week, the maximum allowed under European law, and also ordered it to delete the data it holds on Italians. “This decision closes the European market to Clearview AI and similar companies,” said Alan Dahi, a lawyer with NOYB, the Austrian privacy advocacy organization. In an October resolution, the European Parliament singled out Clearview AI for operating with a private database when it called for a “permanent ban on automatic recognition of individuals in public spaces” and for tougher rules on police use of the technology.
“Clearview AI is used to solve crimes such as human trafficking or drug trafficking,” says the company’s founder and CEO, Hoan Ton-That, in the letter offering its services to the Ukrainian government, to which EL PAÍS has had access. “Our database contains photos from all over the world, and a large proportion from Russian social networks such as VKontakte,” from which it holds more than 2 billion images, according to the letter.
Ton-That says the tool can identify Russian infiltrators “just by taking a photo with your mobile phone of the person or their identity card.” He also says it can recognize fatalities even when their faces have been damaged, and that it can be used to “identify people who are in refugee camps without documents proving their identity.”
The exceptionality of war
“The database on which Clearview AI is built should never have existed,” says Borja Adsuara, an expert in digital law, summing up the legal problems surrounding the company. In the EU, facial recognition is not prohibited, but it is restricted to certain cases. With few exceptions, it cannot be used in public spaces, and its use must be proportionate (recognizing repeat shoplifters in stores, as the supermarket chain Mercadona sought to do, is not admissible) and limited to cases in which public safety is at stake.
Another key issue is the nature of the databases these systems rely on. Compared with the US and other countries, the EU is especially protective of its citizens’ data. The police can use facial recognition at airports, for example, but only because the databases against which people’s images are matched are assumed to be subject to controls and to contain only terrorists and other criminals. Hence Adsuara’s point: in the case of Clearview AI, the flaw is there from the start, because all the photos it uses to identify people were taken from the internet without any consent.
Then there is the reliability of the tool itself. Although the company puts its accuracy above 90%, a misidentification in a war context can have serious consequences for the person affected. “That is why the use of facial recognition in this context should be prohibited. It is too dangerous to leave it, moreover, to the discretion of a private company,” says Ella Jakubowska, who coordinates work on facial biometrics at EDRi, a Brussels-based NGO that defends human rights in the digital age.
The deeper concern among experts about Clearview AI’s arrival in Ukraine is how its technology may be used in the future. When the US military left Afghanistan last summer, it left behind iris- and face-scanning systems on the ground. The Taliban are suspected of using those devices, together with databases also compiled by the Americans, to identify and hunt down collaborators of the regime they overthrew. “The Clearview AI database can be used against the good guys or the bad guys,” Adsuara says.
“The introduction of their technology into a war context is a huge red flag. There is no way to carry out mass facial surveillance safely,” adds Jakubowska. “Clearview AI may have chosen this moment to align itself with public opinion, which supports Ukraine, but what if next time it sides with a regime that wants to use that technology against its own people?”
Source: elpais.com