Spyware, lawful trojans, proportionality, and the limits of oversight.

Surveillance has long occupied a contested space between security and rights. With the integration of artificial intelligence into monitoring, interception, and investigative technologies, this space has expanded into a governance risk zone. AI-enabled surveillance does not merely enhance existing capabilities; it transforms their scale, persistence, and opacity, often outpacing the legal and institutional safeguards designed to constrain them.

When oversight mechanisms fail to adapt, surveillance evolves from a security instrument into a source of governance failure, undermining proportionality, accountability, and fundamental rights.

The evolution of lawful surveillance

Tools such as spyware and so-called “lawful trojans” were originally framed as targeted, exceptional measures for serious crime and national security investigations. Their legitimacy rested on strict conditions: judicial authorisation, necessity, proportionality, and time limitation.

AI integration alters these premises. Automated targeting, behavioural analytics, and pattern recognition extend surveillance beyond discrete suspects toward continuous monitoring ecosystems. What was once episodic becomes persistent; what was once targeted becomes inferential.

This shift raises a fundamental governance question: can proportionality survive automation?

Proportionality under algorithmic pressure

Proportionality is a legal and ethical cornerstone of surveillance governance. It requires that intrusive measures be limited, justified, and balanced against individual rights.

AI challenges proportionality by introducing:

  • large-scale data correlation beyond initial investigative scope,
  • predictive inferences based on probabilistic models,
  • automated expansion of surveillance targets through network analysis.

In such contexts, intrusion is no longer a discrete act but an ongoing process, making proportionality difficult to assess and even harder to contest. The risk is not only excessive surveillance, but the normalisation of intrusion as a default investigative posture.
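The mechanics of scope expansion can be made concrete. The sketch below is a hypothetical illustration, not a description of any real system: it models "network analysis" as a breadth-first walk over a contact graph and shows how a single authorised target can pull in an entire social network unless an explicit hop limit (a crude proportionality constraint) stops the expansion. All names and the graph are invented for illustration.

```python
from collections import deque

def expand_targets(contact_graph, seed, max_hops):
    """Breadth-first expansion of surveillance targets over a contact
    graph, capped at max_hops from the authorised seed target.
    Without the cap, expansion reaches the whole connected component."""
    visited = {seed: 0}  # person -> distance from the authorised target
    queue = deque([seed])
    while queue:
        person = queue.popleft()
        if visited[person] == max_hops:
            continue  # proportionality limit: do not expand further
        for contact in contact_graph.get(person, ()):
            if contact not in visited:
                visited[contact] = visited[person] + 1
                queue.append(contact)
    return visited

# Hypothetical contact graph: each key maps to that person's contacts.
graph = {
    "suspect": ["a", "b"],
    "a": ["c"],
    "b": ["d"],
    "c": ["e"],
}

print(expand_targets(graph, "suspect", max_hops=1))  # seed plus direct contacts only
print(expand_targets(graph, "suspect", max_hops=3))  # nearly the whole component
```

The point of the toy is governance, not graph theory: the hop limit is a design-time parameter, and whoever sets it (vendor, operator, or nobody) effectively decides the proportionality of every downstream interception.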

Oversight in name, opacity in practice

Formal oversight mechanisms (judicial review, parliamentary committees, independent authorities) often remain anchored to pre-AI assumptions. They authorise tools, not systems; operations, not data flows; warrants, not algorithmic pipelines.

AI-enabled surveillance systems fragment visibility. Decisions emerge from layered architectures involving vendors, algorithms, human operators, and outsourced infrastructure. Oversight bodies may approve surveillance without access to model logic, training data, or automated decision pathways.

This creates an oversight paradox: surveillance appears lawful, yet its operation becomes effectively unreviewable.

Digital forensics and the accountability gap

Digital forensics plays a critical role in exposing governance failures in AI-enabled surveillance. Forensic reconstruction can reveal how data was collected, processed, correlated, and acted upon.

However, forensic access is often restricted or delayed, particularly in national security contexts. Without timely forensic scrutiny, accountability mechanisms become symbolic rather than effective. Rights violations may be acknowledged only after harm has occurred—and without clear attribution.

In governance terms, this represents a shift from ex ante control to ex post damage management, a model incompatible with fundamental rights protection.

AI governance beyond compliance

AI governance in surveillance contexts is frequently reduced to compliance with statutory authorisations or technical safeguards. This approach is insufficient.

Effective governance requires:

  • clear limits on automated expansion of surveillance scope,
  • mandatory explainability of targeting and inference mechanisms,
  • auditable decision logs linking human authorisation to machine action,
  • enforceable sunset clauses and data minimisation practices.
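The third requirement, auditable decision logs, can be sketched in a few lines. The example below is a minimal illustration of one possible design, not a reference to any deployed system: each log record carries the warrant identifier that links a machine action back to its human authorisation, and records are hash-chained so that deleting or altering an entry is detectable on review. Field names and identifiers are invented.

```python
import hashlib
import json
import time

def log_entry(prev_hash, warrant_id, authorised_by, action, target):
    """One tamper-evident audit record: each entry embeds the hash of
    the previous one, so removing or altering a record breaks the chain."""
    record = {
        "prev_hash": prev_hash,
        "warrant_id": warrant_id,       # links machine action to human authorisation
        "authorised_by": authorised_by,
        "action": action,
        "target": target,
        "timestamp": time.time(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

def verify_chain(entries):
    """Recompute each hash and check linkage to detect tampering."""
    prev = "genesis"
    for entry in entries:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

chain = []
first = log_entry("genesis", "W-2024-017", "judge_x", "intercept", "device_a")
chain.append(first)
chain.append(log_entry(first["hash"], "W-2024-017", "judge_x", "correlate", "device_a"))
print(verify_chain(chain))        # intact chain verifies

chain[0]["target"] = "device_b"   # retroactive tampering breaks verification
print(verify_chain(chain))
```

A structure like this does not by itself constrain surveillance; its value is that an oversight body with forensic access can answer, after the fact, which human authorisation (if any) covered each automated action.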

Without these elements, AI-enabled surveillance systems become self-reinforcing, gradually escaping the constraints that justified their deployment.

Rights erosion as institutional risk

The erosion of privacy, due process, and freedom of expression is often framed as an individual harm. In reality, it is also an institutional risk. When surveillance systems operate beyond meaningful oversight, public trust deteriorates, legal challenges proliferate, and democratic legitimacy weakens.

AI does not create this risk, but it accelerates it by embedding surveillance into infrastructure rather than operations.

Conclusion

When surveillance becomes AI-enabled without corresponding governance adaptation, the result is not enhanced security but governance failure. Spyware and lawful trojans, once exceptional tools, risk evolving into persistent systems that undermine proportionality and oversight.

Cybersecurity, digital forensics, and AI governance must therefore converge to ensure that surveillance remains legitimate, contestable, and constrained. Without this convergence, AI transforms surveillance from a means of protection into a structural threat to rights and, ultimately, to the institutions that rely on them.

