
Algorithmic Video Surveillance: Will Enhanced Security Compromise Freedom?

In a context where security is a central concern, algorithmic video surveillance (AVS) emerges as a promising technological solution. However, this advancement raises numerous ethical and societal questions. Here we examine its promises and limitations with Adrien Tallent, a Ph.D. candidate in political philosophy.

Enhanced surveillance via artificial intelligence

Authorized experimentally in France to secure major events of the Paris Olympic and Paralympic Games, AVS aims to integrate image analysis algorithms into current video surveillance systems. Its goal is to increase image processing capacity, enabling the monitoring of a growing number of cameras in public spaces.

"The argument in favor of AVS is that it could automatically detect behaviors, suspicious objects, or crowd movements without resorting to biometric surveillance, which relies on sensitive data like facial recognition, currently prohibited by law," explains Adrien Tallent. "However, according to some critics, AVS could be a disguised form of biometric surveillance since a person's gait or clothing might indirectly identify them."

The promises of "objective" security

Another argument supporting AVS is its apparent objectivity. "It is believed that the machine does not discriminate; it simply analyzes images neutrally and flags suspicious behaviors or events," reports Adrien Tallent.

For the researcher, this perspective relies on the myth of technological neutrality: "The common belief is that technology itself is neutral, and only its usage determines its impact. A knife can be used to cut tomatoes or to harm someone. Similarly, video surveillance can strengthen democracy or, conversely, be used to control the population." But, according to the Ph.D. candidate, technology is never truly neutral. It is shaped from its design phase by values, biases, and financial, geopolitical, and economic interests. "Investment decisions influence the direction in which technology evolves, and this has societal repercussions. For example, developing artificial intelligence for military rather than democratic purposes is not a neutral choice."

The hidden biases of algorithms

For Adrien Tallent, this perception of objectivity is also reinforced by our historical trust in mathematical rationality, inherited from the Enlightenment and the scientific revolution. However, this trust can be misleading. Despite their apparent neutrality, AVS algorithms can reproduce and amplify existing biases. "Algorithms need to be trained on large datasets to learn how to detect suspicious behaviors or abnormal objects... The problem is that identifying what is 'abnormal' requires defining a norm. That's where biases begin," explains Adrien Tallent.

These biases can be racial, gender-based, or related to other forms of discrimination present in training data. "Naturally, these biases are embedded in the algorithms. As a result, these machines can disproportionately identify behaviors of minorities as 'abnormal' simply because they deviate from the majority norm," he adds. This can lead to the automation of discrimination.
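The mechanism described above can be made concrete with a deliberately simplified toy model (not an actual AVS system): a detector learns its statistical "norm" from a dataset dominated by a majority group, then flags anything that deviates from that norm. All names and numbers below are hypothetical, chosen only to illustrate the disparity.

```python
import statistics

# Hypothetical toy feature: some measured behavior in a public space.
# The majority group (90 samples) clusters around one value; the
# minority group (10 samples) simply behaves differently, not worse.
majority = [10, 11, 9, 10, 12, 11, 10, 9, 11, 10] * 9
minority = [20, 21, 19, 22, 20, 21, 20, 19, 21, 20]

# The detector learns its "norm" from the whole dataset,
# which the majority group numerically dominates.
training_data = majority + minority
mean = statistics.mean(training_data)
stdev = statistics.pstdev(training_data)

def is_flagged(value, threshold=2.0):
    """Flag a value as 'abnormal' if it lies far from the learned norm."""
    return abs(value - mean) / stdev > threshold

majority_rate = sum(is_flagged(v) for v in majority) / len(majority)
minority_rate = sum(is_flagged(v) for v in minority) / len(minority)

print(f"majority flagged: {majority_rate:.0%}")  # 0% of majority samples
print(f"minority flagged: {minority_rate:.0%}")  # 100% of minority samples
```

In this sketch the detector never sees group labels, yet it flags every minority sample and no majority sample, purely because "normal" was defined by the majority's data. Real systems are far more complex, but the underlying logic of norm-from-data is the same.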

Opacity of surveillance systems and democratic challenges

AVS appears to public decision-makers as a solution for guaranteeing security more discreetly, without mobilizing significant human resources. "It also prevents suspicious events from escaping the attention of human agents who spend hours monitoring screens. But while it can assist surveillance personnel, it also serves to justify the proliferation of cameras without that proliferation ever being questioned," adds the Ph.D. candidate.

This technology also raises the issue of transparency: "These systems are opaque, and we do not always know who produces them, how they are designed, and under what rules," emphasizes the philosopher. This opacity raises crucial questions about the accountability of these systems and our democratic principles. "How can we challenge a decision based on an algorithm whose functioning is obscure? This threatens the very principles of transparency and justice on which democratic societies are founded," recalls Adrien Tallent.

'Soft' surveillance and risks to individual freedoms

AVS also raises concerns about its impact on individual freedoms and privacy. "Being constantly monitored reduces freedom, even when we are not consciously aware of it," he explains. It can lead to a form of self-censorship, subtly and even unconsciously altering our behavior in public spaces.

In this way, AVS fits into a broader trend: 'soft' surveillance. In the digital age, surveillance has become more diffuse and less visible. Online, every click, purchase, and video viewed is recorded to influence what is shown or recommended to us. "This 'soft' surveillance is insidious," warns Adrien Tallent. "It can lead to an invisible but powerful form of control, where our behaviors are influenced without our full awareness."

Towards the normalization of surveillance?

Although presented as temporary, the experimentation with AVS during the Paris Olympic and Paralympic Games could become a permanent feature in France. "It's quite common with emergency laws: a temporary measure ends up being adopted long-term," recalls Adrien Tallent. Once these systems are in place, it becomes politically difficult to remove them, according to the researcher. This trend is accompanied by an acclimatization effect. Citizens, accustomed to the presence of these devices, eventually accept these measures without questioning their impact on individual freedoms. "There is a kind of gradual trivialization where surveillance becomes a normal component of public space, to the point of no longer being perceived as problematic," explains the Ph.D. candidate.

For Adrien Tallent, the main argument used to justify the permanence of these measures is based on the idea that "security is the first of freedoms." This phrase, popularized by right-wing political figures, has, according to him, become a mantra to justify security measures. "This narrative fuels a vicious cycle. By constantly emphasizing the feeling of insecurity, surveillance measures end up seeming normal, even necessary, even if statistics do not always support this perception," he points out.

The "Technopolis"

AVS is also part of a global movement known as "technopolis," which leverages technology to optimize security and city management. This quest for efficiency, often justified by ecological or financial benefits, can also, according to Adrien Tallent, turn into generalized control: "The growing need for data to make automated decisions leads to an ever-increasing collection of information. The more we seek to control and optimize, the more we depend on data. And the more data we collect, the less individuals can escape this surveillance," the researcher observes.

For Adrien Tallent, this increasing automation of decisions through digital technologies raises fundamental questions about the social contract and the role of democracy. "The more we delegate decisions to algorithmic systems, the narrower the space for political debate and democratic decision-making becomes," he explains. In an extreme scenario where data collection covers all individual behaviors, opinions, and thoughts, it would be tempting for an organization to claim it knows the optimal way to govern. "But such centralization of data and decisions endangers free will, democracy, and even ethics because it profoundly limits the space for individual freedom. This is why it is essential for civil society to address these issues," concludes the philosopher.

By Justine Mathie

Adrien Tallent, a Ph.D. Candidate Committed to the Study of Digital Transformations

After completing a dual education in business school and philosophy, Adrien Tallent found his calling in research. Currently pursuing a CIFRE Ph.D., he collaborates with SNCF Réseau to study the impact of digital technologies on society and governance.

His work explores the collection and analysis of data through artificial intelligence, as well as its consequences on democracy and the social contract. Interested in the philosophical history of rationality, he analyzes how digital tools are transforming our ways of governing.
