Olympics 2024: France will use AI to monitor the event
The 2024 Olympics will begin on July 26, and France has been preparing for the event in numerous ways. For security, the country said it will use artificial intelligence to identify potential threats and help security forces respond immediately.
However, many fear it is the precursor for intrusive surveillance reminiscent of Big Brother from George Orwell’s iconic novel “1984.”
Olympics 2024 and artificial intelligence
This year’s Olympics will happen in Paris, France, so authorities passed a new law that will secure the event with the latest technologies.
PopSci says the law will allow law enforcement to use “experimental” artificial intelligence (AI) algorithms to monitor public video feeds and provide “real-time crowd analyses.”
The AI detection programs will sift through thousands of CCTV camera feeds, looking for signs of potentially dangerous activity, such as people with weapons, unusually large crowds, unattended luggage, and sudden brawls.
France has partnered with several AI companies, such as Orange Business and Videtics, for these security technologies. French authorities have also tested the system in select subway stations and at events such as a Depeche Mode concert.
Paris Police Chief Laurent Nunez told Reuters the concert trial went “relatively well” and that “all lights are green” for its deployment at the Olympics 2024.
Human law enforcement officers will still decide which potential threats require action. French authorities also claim that the AI analyses will not use facial recognition or collect biometric identifiers.
Still, critics doubt such AI analyses are possible without processing biometric identifiers, which would risk violating Europe’s General Data Protection Regulation (GDPR) and the EU AI Act.
Matt Mahmoudi, Amnesty International’s researcher and advisor on artificial intelligence and human rights, told PopSci that AI models still rely on “reference databases” containing biometrics.
“Monitoring behavior involves the identification and detection of faces, bodies, their gestures, patterns and movements, all different forms of biometric data, which subjects communities to mass surveillance, and often racial discrimination,” Mahmoudi said.
In a letter, Human Rights Watch stated:
“The mere existence of untargeted (often called indiscriminate) algorithmic video surveillance in publicly accessible areas can have a chilling effect on fundamental civic freedom.”