The Limits of Artificial Intelligence in Legal Speech and Institutional Authority

Research output: Contribution to conference › Paper › peer-review

Abstract

The rise of artificial intelligence (AI) systems capable of generating legal texts and assisting in decision making raises profound jurisprudential questions: Can machines perform legal acts, or do their outputs remain mere simulations devoid of institutional authority? This article explores these issues through the lens of speech act theory, particularly John L. Austin’s framework of locution, illocution, and perlocution, and its application to law. Legal speech acts, such as judicial rulings, legislative enactments, and contractual agreements, do not merely describe norms; they create legal realities. Their performative power depends on institutional authority, procedural embedding, and collective intentionality, elements that AI systems fundamentally lack. While AI excels at producing fluent and contextually accurate locutionary content and may influence decisions and strategies (perlocutionary effects), it cannot perform law in the illocutionary sense. Legal performativity requires more than linguistic competence; it demands recognition and authority within institutional and social frameworks. This distinction between simulation and genuine legal performance is critical, as the illusion of performative equivalence, where AI-generated outputs are mistaken for binding legal acts, poses risks to legal validity, democratic legitimacy, and public trust.

The article addresses these challenges by proposing a “Human in the Loop” (HITL) model as a normative framework for integrating AI into legal systems. Under this model, AI enhances efficiency, accessibility, and analytical capabilities, but all binding legal acts remain the exclusive domain of authorized human actors. This approach ensures that the performative essence of law is preserved while leveraging the benefits of AI. By grounding its analysis in speech act theory, legal pragmatics, and illustrative case studies, the article contributes to ongoing debates about the role of AI in law. It offers a cautionary yet constructive perspective, reaffirming that while AI can augment legal processes, the creation and performance of law remain uniquely human and institutional endeavors.
Original language: English
Publication status: Published - 24 Apr 2026
Event: American Association for Applied Linguistics (AAAL) 2026 Conference - Online, Chicago, United States
Duration: 21 Mar 2026 – 24 Apr 2026
https://www.aaal.org/aaal-2026-conference

Conference

Conference: American Association for Applied Linguistics (AAAL) 2026 Conference
Abbreviated title: AAAL 2026 Conference
Country/Territory: United States
City: Chicago
Period: 21/03/26 – 24/04/26
Internet address: https://www.aaal.org/aaal-2026-conference

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 16 - Peace, Justice and Strong Institutions

Keywords

  • Artificial Intelligence
  • Legal Speech Acts
  • Institutional Authority
  • Deontic Power
  • Human-in-the-Loop (HITL)
