A year and a half before the start of the Paris Olympics, the bill on the organization of the event was adopted by the Senate in first reading on January 31. Its Article 7 is at the center of the debate: it provides for "experimenting" with AI-equipped surveillance cameras or drones to detect crowd movements, suspicious packages, or abnormal behavior. Communists and environmentalists voted against, socialists abstained, and right-wingers and centrists overwhelmingly supported the text.
Facial recognition and the use of biometric data (DNA, fingerprints, iris, etc.) are not included, but the Defender of Rights (France's independent rights ombudsman) believes that "bodily data and/or data from biometric systems" aimed at "revealing or deducing emotions, personality traits or intentions" can be considered biometric data. The impact study accompanying the bill asserts that "the multitude of affected locations and the expected level of security make it necessary to optimize the use of internal security forces and civil security forces, as well as real-time processing of collected images."
A "paper experiment" for opponents, one that will surely be made permanent
Opponents criticize the very vague nature of the criteria used: how should the "abnormal phenomena" or "criminal situations" outlined in the bill's impact study be defined? The associations also dispute the very broad scope of the law passed in the upper house. The text could come into effect as soon as the law is finally passed by Parliament and apply until June 30, 2025, well after the end of the Paralympic Games on September 8, 2024.
"A paper experiment," retorts La Quadrature du Net in a report released on January 21: "One could cite many examples of security measures being made permanent, such as the intelligence services' 'black boxes' or state-of-emergency measures: initially temporary and exceptional, then systematically enshrined in common law."
"Red lines" are respected, according to the Cnil
In its January 4 report, the Cnil acknowledges that "the deployment, even experimental, of these devices represents a turning point." "These image analysis tools can lead to massive collection of personal data and provide real-time automated monitoring," it notes.
However, the digital watchdog believes that the bill respects the "red lines" it has set for the legislator: no biometric processing, an experimental deployment limited in time and space, and no automated decision-making, since any algorithm-generated alert will be systematically subject to human review. The senators also stipulated that the Cnil "support" the development of the algorithms and "evaluate the system," notes Le Monde.
The commission is more concerned about the bill's anti-doping article, which provides for the "comparison of genetic fingerprints and the study of genetic characteristics" of athletes. A transposition of the World Anti-Doping Code, certainly, but these are "particularly intrusive" tests that require "new derogations from the Civil Code," the Cnil points out.
Opponents of the bill finally question the very effectiveness of video surveillance, with or without artificial intelligence. "In view of local data, no overall correlation was found between the presence of video-surveillance systems and the level of crime committed on public roads, or even clearance rates," the Court of Accounts pointed out as early as October 2020. Objective data on this question would nonetheless be more than necessary, the Court wrote, given the "magnitude of the amounts allocated over more than ten years."