Paris—French government plans to trial surveillance cameras upgraded with artificial intelligence at the 2024 Olympics have opponents fuming at what they say is unnecessary and dangerous security overreach.
While the government says such systems are needed to manage millions-strong crowds and spot potential dangers, critics see the draft law as a gift to French industry at the cost of vital civil liberties.
Last week, around 40 mostly left-leaning members of the European Parliament warned in an open letter to French lawmakers that the plan “creates a surveillance precedent never before seen in Europe”, daily Le Monde reported.
Debates kicked off late Monday in the National Assembly, France’s lower parliamentary chamber, with discussions to continue Friday.
Even before the debates started, MPs had already filed 770 amendments to the government’s wide-ranging Olympics security bill, many aimed at its Article Seven.
That section provides for video recorded by existing surveillance systems or new ones — including drone-mounted cameras — to be “processed by algorithms”.
Artificial intelligence software would “detect in real time pre-determined events likely to pose or reveal a risk” of “terrorist acts or serious breaches of security”, such as unusual crowd movements or abandoned bags.
Systems would then signal the events to police or other security services, who could decide on a response.
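The pipeline the bill describes, in which an algorithm only flags pre-determined events and leaves any response to people, can be illustrated with a toy sketch. The following Python fragment is purely hypothetical; the names PREDETERMINED_EVENTS, Detection and flag_event are invented for illustration and are not drawn from any system cited in the bill.

from dataclasses import dataclass
from queue import Queue

# Event types the bill says algorithms could be configured to detect.
PREDETERMINED_EVENTS = {"abandoned_bag", "unusual_crowd_movement"}

@dataclass
class Detection:
    camera_id: str
    event_type: str
    confidence: float

def flag_event(detection: Detection, review_queue: Queue) -> None:
    # The software only signals events; police or security staff decide on any response.
    if detection.event_type in PREDETERMINED_EVENTS and detection.confidence >= 0.8:
        review_queue.put(detection)

if __name__ == "__main__":
    queue: Queue = Queue()
    flag_event(Detection("cam-42", "abandoned_bag", 0.93), queue)
    while not queue.empty():
        d = queue.get()
        print(f"For operator review: {d.event_type} on {d.camera_id} ({d.confidence:.0%})")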
Biometric or not?
The government is at pains to stress that the smart-camera tests would not process biometric data and, in particular, would not resort to facial recognition, technologies the French public is wary of seeing applied too broadly.
“The experiment is very precisely limited in time… (and) the algorithm does not substitute for human judgement, which remains decisive,” Sports Minister Amelie Oudea-Castera told MPs.
The interior ministry highlights a February survey for the daily Le Figaro suggesting that large majorities back using the cameras in public spaces and especially in stadiums.
But opponents say the plans overstep the bounds of the French constitution and European law.
Digital rights group La Quadrature du Net (QDN) wrote in a report sent to lawmakers that the systems would in fact handle sensitive “biometric” data under a broad 2022 definition from France’s rights ombudsman.
As biometric data, those characteristics would be shielded by the European Union’s powerful General Data Protection Regulation (GDPR), QDN argues.
An interior ministry spokesman rejected that finding, insisting that the planned processing would use neither biometric data nor facial recognition techniques.
‘State of emergency’
The camera test period is slated by the bill to run to the end of 2024 — well after the end of the games and covering other major events including the Rugby World Cup later this year.
Once the law is passed, public authorities such as the emergency services and the bodies responsible for transport security in the Paris region will be able to request use of the algorithm-equipped cameras.
The interior ministry said the trial “should cover a significant number of large events” to allow for “the most complete and relevant evaluation”.
But QDN activist Naomi Levain told AFP: “It’s classic for the Olympic Games to be used to pass things that wouldn’t pass in normal times”.
“It’s understandable for there to be exceptional measures for an exceptional event, but we’re going beyond a text aimed at securing the Olympic Games,” Socialist MP Roger Vicot told the chamber on Monday.
Elise Martin, an MP following the process for hard-left opposition party France Unbowed (LFI), told AFP that the bill was just the latest in a slew of additional security powers introduced under President Emmanuel Macron since 2017.
“The way this law is thought out is as if we live in a permanent state of emergency,” she said.
‘Favour to industry’
Meanwhile QDN’s Levain highlighted that “many of the leaders in this market are French businesses”, calling the bill’s provisions a “favour to industry”.
The size of the video surveillance market in France alone was estimated at 1.7 billion euros ($1.8 billion) in a 2022 article published by industry body AN2V, with the global business many times larger.
If passed, the law would make the 2024 Olympics “a shop window and a laboratory for security”, handing firms an opportunity to test systems and gather training data for their algorithms, Levain said.
Such data is needed to train computer programmes on what kinds of behaviour to flag as suspect, learning to recognise patterns in moving images — just as text AIs such as ChatGPT are trained on large bodies of writing before they can generate written output of their own.
Some cities in France, such as Mediterranean port Marseille, are already using “augmented” surveillance in what is at present a legal grey area.
But opponents say that there is little or no evidence that augmented surveillance — or even more traditional CCTV systems — can prevent crimes or other incidents around the large sporting and cultural events targeted by the draft law.
Smart cameras “wouldn’t have changed anything at the Stade de France” last year, when huge crowds of Liverpool supporters were crammed into tight spaces as they waited to enter the stadium for the Champions League final, Levain said.
“That was bad human management, there’s know-how involved in managing a crowd, calculations to be made about placing barriers and directing flows… no camera can do that,” she added.