Tesi etd-03282025-141657
Thesis type
Doctoral
Author
LEVANTINO, FRANCESCO PAOLO
URN
etd-03282025-141657
Title
“Emotional Dominance”: International and European Human Rights Law Perspectives on Emotion Recognition Technology in Law Enforcement
Scientific disciplinary sector
IUS/13
Degree programme
Istituto di Diritto, Politica e Sviluppo - PHD IN HUMAN RIGHTS AND GLOBAL POLITICS: LEGAL, PHILOSOPHICAL, AND ECONOMIC CHALLENGES
Committee
Supervisor: Prof. CAPONE, FRANCESCA
Member: Prof. MOBILIO, GIUSEPPE
President: Prof. BACHMAIER WINTER, LORENA
Member: Prof. MARTINICO, GIUSEPPE
Keywords
- Emotion Recognition
- EU Law
- Human Rights
- Policing
- Privacy
- Security
Defence date
30/05/2025
Availability
Partial
Abstract
This dissertation critically examines the deployment of Emotion Recognition Technology (ERT) by Law Enforcement Agencies (LEAs) for internal security purposes, focusing on its compatibility with international and European human rights law standards. From this perspective, the use of ERT – i.e., Artificial Intelligence (AI) systems designed to detect or infer individuals’ emotions or intentions – raises serious concerns regarding its desirability and lawfulness in particularly sensitive domains, such as crime prevention and detection, the maintenance of public order and security, and national security. In these contexts – especially when ERT is deployed in publicly accessible spaces – the potential for significant interferences with protected fundamental rights and freedoms is particularly high.
In this respect, while the risks that other closely related applications of AI – such as Facial Recognition Technology (FRT) – pose to fundamental human rights, democracy, and the rule of law are increasingly under scrutiny and, at the EU level, subject to stricter regulation, the specific implications of the use of ERT by LEAs for security purposes remain, at this stage, largely neglected.
Building on an interdisciplinary approach primarily grounded in international and European human rights law, this dissertation aims to fill this substantive gap and to advance the academic and policy debate on the advisability of deploying ERT in law enforcement and on the risks it poses. In doing so, this study first situates LEAs’ uses of ERT for security purposes within broader trends, including the internalisation of national security concerns, the resulting “blurring” of boundaries between internal and external security spheres, the shift towards prevention in criminal justice, and the increasing reliance on various forms of surveillance. In this connection, by extending the scope of surveillance from observable behaviours to their association with emotions or intentions, the use of ERT in law enforcement intensifies existing concerns – particularly in relation to the rights to private life and to the protection of personal data – and introduces new risks compared to those associated with other biometric technologies such as FRT.
To capture the paradigmatic shift generated by the introduction of ERT in LEAs’ practices and its implications for fundamental human rights, this dissertation draws on a comparative analysis of current discussions and regulatory frameworks addressing the use of FRT. This approach allows for the conceptualisation of a shift from “identity” to “emotional dominance” – a concept used to identify and discuss the broad range of interferences with protected rights and freedoms arising from the use of emotion recognition in law enforcement.
Taking into account the (disputed) technical and scientific foundations of ERT, this study then examines the regulatory framework governing the use of this technology by LEAs under the EU AI Act, as well as its intersection with EU data protection law. Through a detailed legal analysis, this dissertation highlights definitional ambiguities, regulatory inconsistencies, and the AI Act’s permissive stance on LEAs’ deployments of ERT – especially when compared to the stricter regulation of FRT. To address these gaps, this dissertation argues that LEAs’ uses of ERT in publicly accessible spaces should have been made subject to at least the same level of safeguards as that adopted for analogous uses of other biometric-based technologies, such as FRT.
To stress-test this position, the dissertation presents a fictional yet realistic scenario involving the deployment of ERT under the heightened safeguards required by the AI Act for certain uses of FRT. The results of this exercise suggest that even under a stricter regulatory regime, the risks posed by such deployments of ERT to several fundamental human rights – identified and discussed throughout the dissertation – would remain largely unaddressed.
In light of these findings, the dissertation concludes by highlighting the growing regulatory inadequacy of EU secondary law in addressing the challenges posed by emerging technologies and their uncritical deployment in sensitive domains. In contrast, it reaffirms the enduring relevance and adaptability of regional instruments such as the European Convention on Human Rights (ECHR) – as interpreted by the European Court of Human Rights (ECtHR) – not only in offering essential tools to retrospectively address identified violations, but also in providing anticipatory normative guidance for assessing the legitimacy and desirability of deploying intrusive and questionable technologies in the security sector.
Files
File name | Size |
---|---|
There is 1 file restricted at the author's request. |