As skiers schussed and swerved in a snow park outside Beijing during the 2022 Winter Olympics, a few may have noticed a string of towers along the way. Did they know that those towers were collecting wavelengths across the spectrum and scouring the data for signs of suspicious movement? Did they care that they were the involuntary subjects of an Internet of Things–based experiment in border surveillance?
This summer, at the Paris Olympic Games, security officials will perform a much bigger experiment in the heart of the City of Light, covering the events, the entire Olympic village, and the connecting roads and rails. It will proceed under a temporary law allowing automated surveillance systems to detect “predetermined events” of the sort that might lead to terrorist attacks.
This time, people care. Well, privacy activists do. “AI-driven mass surveillance is a dangerous political project that could lead to broad violations of human rights. Every action in a public space will get sucked into a dragnet of surveillance infrastructure, undermining fundamental civic freedoms,” said Agnes Callamard, Amnesty International’s secretary general, soon after the law passed.
Yet the wider public seems unconcerned. Indeed, when officials in Seine-Saint-Denis, one of the districts hosting the Olympics, presented information about a preliminary AI-powered video surveillance system that would detect and issue fines for antisocial behavior such as littering, residents raised their hands and asked why it wasn’t yet on their streets.
“Surveillance is not a monolithic concept. Not everyone is against surveillance,” says anthropology graduate student Matheus Viegas Ferrari of the Universidade Federal da Bahia, in Brazil, and the Université Paris 8: Saint-Denis, in Paris, who attended the community meeting in Seine-Saint-Denis and published a study of surveillance at the 2024 Olympics.
Anyone who fumes at neighbors who don’t pick up after their dogs can identify with the surveillance-welcoming residents of Seine-Saint-Denis. If, however, the surveillance system fines one neglectful neighbor more than another because its algorithm favors one skin color or clothing style over another, opinions could change.
Indeed, France and other EU member states are in the midst of hammering out the finer details of the European Union’s AI Act, which seeks to protect citizens’ privacy and rights by regulating government and commercial use of AI. Already, poor implementation of an AI law related to welfare policy has felled one European government.
Countries often treat the Olympics like a security trade fair.
It seems the temporary surveillance law (the video-processing clause of which expires in March 2025) was written to avoid that outcome. It insists that algorithms under its authority “do not process any biometric data and do not implement any facial recognition techniques. They cannot carry out any…
The post “A New Olympics Event: Algorithmic Video Surveillance” by Lucas Laursen was published on 12/27/2023 by spectrum.ieee.org