Hon. Brian MacKenzie (Ret.)
Tuesday, 09 December 2025 / Published in Artificial Intelligence, Law, Problem Solving Courts

AI in Treatment Courts: Keeping Treatment Human

Artificial intelligence is no longer a distant possibility for treatment courts. It is already influencing how participants seek help, how probation interacts with clients, how digital evidence is reviewed, and how courts assess risk and supervision. Treatment courts remain the justice system’s most human-centered institutions, created to engage, support, and rehabilitate people whose underlying needs contribute to criminal behavior. That mission does not change because AI has arrived. Courts must establish clear boundaries now, before the technology sets those boundaries itself.

One concern in particular illustrates this moment. Participants are turning to AI chatbots as a substitute for therapy. Courts are beginning to see individuals who prefer AI because it feels nonjudgmental, always available, and less intimidating than a human clinician. Some courts are uncertain how to respond.

They should not be uncertain.

Participants with a history of suicidal ideation, self-harm, acute trauma, or other clinical instability cannot rely on AI in place of real treatment. Every credible clinical, medical, and judicial ethics body that has reviewed this issue reaches the same conclusion. AI may supplement support, but it cannot conduct risk assessments, deliver therapy, manage crises, or assume responsibility for clinical care. It cannot detect escalating distress, intervene in real time, coordinate safety plans, or replace the judgment and accountability of a licensed therapist.

Treatment courts are built on evidence-based practice. The evidence is clear. AI is not therapy.

Why This Matters for Treatment Courts

Treatment court participants are often clinically fragile. Many face co-occurring disorders, trauma histories, unstable medication regimens, or abrupt emotional shifts caused by stress or withdrawal. These courts depend on transparent communication, strong therapeutic alliances, and reliable human oversight. None of these elements exist when participants rely on AI systems that function as sophisticated text prediction tools.

This is where the danger lies. AI sounds therapeutic. It mimics empathy. It produces reassuring, counselor-like responses. But it does not understand risk, nuance, or context, and it has no duty of care. It may offer comforting statements while missing signs of crisis. It may unintentionally give advice that undermines treatment, supports avoidance, or normalizes harmful behavior.

Courts have already seen what happens when new technologies enter justice environments without clear rules. As the Justice Speakers Institute’s AI series shows, AI tools such as risk assessments and video analysis systems can be useful when properly governed but dangerous when unregulated. The same principle applies here. When courts lack policies, technology fills the vacuum, often at the expense of safety, fairness, and clinical integrity.

AI as a Supplemental Tool, Not a Substitute

Some participants will continue to use AI because it feels supportive or helps them process emotions between sessions. Courts do not need to prohibit that use entirely. AI can play a limited, supplemental role, similar to journaling apps, wellness trackers, or psychoeducation tools, as long as participants discuss that use with their therapists.

The core treatment must always be delivered by a licensed human clinician. That clinician must evaluate risk, track progress, assess suicidality, adjust treatment plans, and maintain responsibility for safety. Clinicians should also inform the treatment court team, in an appropriate manner, about a participant’s reliance on AI tools. Courts must be explicit. No AI system can assume the therapeutic role.

Treatment courts that have already addressed this issue classify AI as strictly secondary. Participants may use it, but never in place of therapy. Individuals with any history of suicidal ideation, self-harm, or severe mental illness should be strongly discouraged from using AI for emotional support. This is not resistance to technology. It is protection of the participant’s life.

What Treatment Courts Need to Do Now

If your treatment court does not have a written policy addressing participant use of AI, now is the time to develop one. A strong policy should include at least four components.

1. Human-Provided Treatment Is Mandatory

AI tools cannot replace individual therapy, group therapy, trauma counseling, medication management, or crisis intervention.

2. Enhanced Protections for High-Risk Participants

Individuals with histories of suicidal ideation, suicide attempts, acute psychiatric symptoms, or active self-harm behaviors should not use AI for emotional support.

3. Transparent Communication With Participants

Courts should explain clearly, both verbally and in writing, that AI is not a clinician, cannot assess safety, cannot intervene, and cannot provide treatment. Participants must understand these limits.

4. Integration Into Supervision and Treatment Plans

Any AI use should be discussed during staffing, documented in treatment plans, and reviewed by clinicians. Increased reliance on AI should be treated as clinically relevant information, not simply as a personal preference.

AI Will Shape the Future of Treatment Courts, but It Must Not Replace Their Core

Treatment courts succeeded because they rejected the assembly line model of justice and built interventions grounded in behavioral science, compassion, and accountability. AI can support that mission by helping staff flag early warning signs, streamline administrative tasks, or expand access to educational materials. That is only possible when courts maintain strict oversight.

AI cannot build trust. It cannot provide empathy. It cannot treat trauma, addiction, or severe mental illness. That work requires people.

The strength of treatment courts has always been personal connection. Judges speak directly to participants. Teams coordinate care. Clinicians guide change. Participants learn that accountability and support can exist together. AI may enhance some parts of that process, but it can never replace it.

This is the moment for treatment courts to adopt clear, written policies that protect participants, preserve clinical standards, and ensure that technology serves the court and not the other way around.

Other Articles On AI in the Courts

Introduction: Artificial Intelligence and the Courts: A Blog Series from Justice Speakers Institute
Part 1: AI in the Courtroom: Opportunities and Risks
Part 2: AI in the Courts: Ethical Challenges
Part 3: AI on Trial – Admissibility of AI-Generated Evidence
Part 4: Judicial Decision-Making: Transparency, Accountability, and the Judicial Role
Part 5: Courts of the Future – Innovation, Access, and Global Trends
Part 6: Judging the Machine – Lessons, Guardrails, and the Path Forward


Tagged under: AI in treatment courts, Behavioral health and technology, Evidence-based justice, Judicial ethics and technology, Treatment court policy

