Justice Speakers Institute

Criminal justice reform consultant
Hon. Brian MacKenzie (Ret.)
Tuesday, 03 February 2026 / Published in Artificial Intelligence, Law

AI Describes Many Technologies and None of Them Are Intelligent


Hardwiring Justice – Part II[1]

The term artificial intelligence (AI) is something of an oxymoron. The technology is not intelligent: it is software following rules written by people, and there is nothing artificial about how it works. The term simply refers to computer programs designed to handle tasks that would otherwise be done by people. These programs do not think or understand. They follow instructions, look for patterns, apply rules, and produce results based on data and programming created by humans.

These tools work in different ways. Some follow fixed rules. Others look at past data to make predictions. Some sort or label images, video, or sound. Others manage schedules, assign resources, or generate text by guessing which words usually come next. Many systems combine several of these functions into a single platform.

No matter the form, these systems operate through automation and pattern matching. They do not have judgment, intent, or awareness. They do not decide what is true, fair, or lawful on their own. Any impact they have comes from how they are built, the data they rely on, where they are used, and how much weight people give to their outputs.

Different systems do very different things. To understand how AI actually affects the criminal justice system, it is necessary to look at the specific types of systems in use and the role each one plays.

1.  Rule-Based Systems

Rule-based systems are best understood as standard computer programs, not AI learning systems.[2] They follow instructions written by people and carry out specific actions when set conditions are met.

These systems do not learn from experience or change over time. They do exactly what they are programmed to do, and their impact depends entirely on the choices made by the people who wrote the rules.

Where Rule-Based Systems Are Used in Criminal Justice

They are commonly used in diversion and treatment eligibility screening, statutory compliance checks, sentencing calculation tools, probation condition enforcement, and threshold determinations tied to mandatory minimum requirements.

Common Examples of Rule-Based Systems

A diversion eligibility tool that excludes individuals with specified prior convictions. A probation system that records a violation when a required appointment is missed. A sentencing calculator that applies statutory ranges based on offense level and criminal history.
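The first of these examples can be sketched in a few lines of Python. The exclusion list and function name here are hypothetical; the point is that the outcome follows entirely from rules a person wrote, with nothing learned and nothing weighed.

```python
# Hypothetical sketch of a rule-based diversion eligibility check.
# The exclusion list is an assumed policy choice, not a real statute.
DISQUALIFYING_PRIORS = {"violent felony", "weapons offense"}

def is_diversion_eligible(prior_convictions):
    """Apply the fixed rule: any listed prior conviction excludes the person."""
    return not any(p in DISQUALIFYING_PRIORS for p in prior_convictions)

print(is_diversion_eligible(["misdemeanor theft"]))   # True
print(is_diversion_eligible(["violent felony"]))      # False
```

Change the exclusion list and the outcomes change with it; the program itself exercises no judgment.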

2.  Statistical Learning Systems

In the criminal justice system, these tools are often called risk assessment or prediction tools.[3] They are usually described as aids to decision-making, not as ordinary computer programs making decisions on their own.

These tools look at what happened in past cases and use those patterns to make guesses about new ones. They do not judge a person’s unique situation or circumstances. Instead, they compare a person to groups of people from earlier cases and estimate how likely certain outcomes might be.

The results are typically shown as a score, a category, or a percentage, based on the data and assumptions built into the system.
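A toy Python sketch of this group-comparison logic, using invented past-case counts, illustrates why the output is a group rate rather than a judgment about the individual.

```python
# Invented past-case counts: (appeared, failed_to_appear) for each group.
past_cases = {
    ("18-25", True):  (40, 60),
    ("18-25", False): (80, 20),
    ("26+", True):    (55, 45),
    ("26+", False):   (90, 10),
}

def estimated_fta_rate(age_band, prior_fta):
    """Return the failure-to-appear rate of the matching group of past cases."""
    appeared, failed = past_cases[(age_band, prior_fta)]
    return failed / (appeared + failed)

# The "prediction" for a new person is just their group's historical rate.
print(f"{estimated_fta_rate('18-25', True):.0%}")   # 60%
```

Real tools use far more variables and more elaborate statistics, but the core move is the same: the person is scored by the history of people who resemble them on the measured factors.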

Where Statistical Learning Systems Are Used

They are widely used in pretrial risk assessment instruments, probation and parole supervision tools, recidivism prediction models, violation forecasting systems, and supervision planning tools.

Common Examples of Statistical Learning Systems

A pretrial assessment tool that estimates the likelihood of failure to appear. A probation model that predicts the risk of technical violations. A parole forecasting system used to estimate return-to-custody rates for planning purposes.


3.  Predictive Scores and Risk Indicators

Predictive scores and risk indicators are simply the results produced by these systems.[4] They are not separate tools. They exist to turn complex calculations into information that is easy for people to read and act on.

These results usually appear as numbers, labels, colors, or alerts. They summarize likelihoods without showing how those estimates were reached. Although the scores are only estimates, their simplicity often leads them to be treated as conclusive.[5]

Where They Are Used in the Criminal Justice System

They appear in pretrial detention and release recommendations, bond determinations, supervision level assignments, child welfare risk flags, probation violation alerts, and prioritization tools used by courts and supervision agencies.

Where Predictive Scores Appear in Practice

A numerical risk score displayed on a pretrial report. A color-coded supervision level assigned at intake. A flag generated when a person reaches a predefined threshold such as repeated missed contacts.
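A short sketch shows how a raw estimate is collapsed into a simplified indicator. The cutoffs below are arbitrary, just as they are policy choices rather than scientific facts in real tools.

```python
def supervision_level(score):
    """Collapse a probability estimate into a color-coded label."""
    if score >= 0.7:
        return "HIGH (red)"
    if score >= 0.4:
        return "MODERATE (yellow)"
    return "LOW (green)"

# Very different estimates can land in the same band, and near-identical
# estimates can land in different bands, once the number becomes a color.
print(supervision_level(0.69), "|", supervision_level(0.41))
```

The label is easier to act on than the underlying estimate, which is exactly why it tends to be read as a conclusion rather than a guess.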

4.  Visual Analysis Systems

Visual analysis systems work with images and video.[6] They look for visual features and patterns, not words or reasoning.

These systems do not understand what they see. They simply sort and label images based on what they were trained to recognize.

Image and Video Processing in Criminal Justice

They are used in body worn camera review, video evidence analysis, license plate readers, facial recognition systems, document scanning, jail security monitoring, and surveillance review.

Common Uses 

A system that scans body camera footage to locate segments involving use of force. License plate readers that log vehicle movements. Facial recognition used to search photo databases. Automated document scanners that classify and route evidence files.

5.  Audio and Speech Processing

Audio and speech systems work with sound.[7] Some turn spoken words into text. Others try to identify voices or flag certain types of sounds. They process audio signals, not meaning.

These systems can have trouble with accents, background noise, people talking over one another, or low-quality recordings.

Where They Are Used in the Criminal Justice System

They are used in court transcription services, jail and probation call monitoring, voice recognition tools, automated report generation, and gunshot detection systems.

Examples

Automated transcription services used for hearings. Jail call monitoring systems that flag keywords. Gunshot detection networks used by law enforcement. Voice recognition systems used for call authentication.


6.  Optimization Tools

In the criminal justice system, optimization tools are usually used to manage schedules, caseloads, and resources.[8] They are meant to help the system run smoothly, not to make policy decisions or judgments about people.

Like rule-based programs, these tools follow rules set by humans. The difference is how the rules are used. Rule-based programs produce fixed outcomes when certain conditions are met. Optimization tools use rules as limits, then weigh multiple needs, such as time, staff, and capacity, to decide when and how work gets done. Their goal is efficiency and coordination, not enforcement.

These tools do not measure risk or predict individual behavior. Instead, they influence how cases and resources move through the system.

Managing Schedules, Caseloads, and Resources

They are used in court scheduling and docket management, jail population management, staffing allocation, probation caseload distribution, transportation logistics, and supervision planning.

Common Uses

A court scheduling system that balances courtroom availability, judge calendars, and statutory deadlines. A jail population management tool that allocates bed space across facilities. A probation system that assigns caseloads based on officer capacity and workload limits.
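The caseload example can be sketched as a simple balancing routine, with the capacity cap acting as a rule-style limit. The greedy strategy and names here are illustrative only; production schedulers use far more sophisticated methods.

```python
def assign_caseloads(cases, officers, capacity):
    """Assign each case to the least-loaded officer still under capacity."""
    loads = {officer: 0 for officer in officers}
    assignments = {}
    for case in cases:
        eligible = [o for o in officers if loads[o] < capacity]
        if not eligible:
            raise RuntimeError("every officer is at capacity")
        chosen = min(eligible, key=lambda o: loads[o])
        assignments[case] = chosen
        loads[chosen] += 1
    return assignments

result = assign_caseloads(["A", "B", "C", "D", "E"], ["Smith", "Jones"], capacity=3)
print(result)
```

Note what the tool does and does not do: it decides who handles which case and when, but it makes no judgment about any person in any case.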

7.  Adaptive Learning Systems

Adaptive systems change how they operate over time based on feedback.[9] Instead of following fixed rules or staying the same, they adjust in response to what happens.

This ability to change can improve results, but it also makes the system’s behavior harder to understand or explain at any given moment.
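A minimal sketch of this feedback loop, with an invented adjustment rule, shows why the system's behavior at any given moment depends on its history rather than on a fixed, inspectable rule.

```python
def adjust_threshold(threshold, was_false_alarm, step=0.05):
    """Raise the alert cutoff after a false alarm; lower it after a missed event."""
    return threshold + step if was_false_alarm else threshold - step

threshold = 0.50
for feedback in [True, True, False]:   # invented feedback history
    threshold = adjust_threshold(threshold, feedback)

# The current cutoff is a product of the whole feedback history.
print(round(threshold, 2))   # 0.55
```

Two copies of the same system, deployed in different places, will drift apart as their feedback diverges, which is what makes adaptive behavior hard to audit.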

Where They Are Used in the Criminal Justice System

They appear in traffic control systems connected to law enforcement activity, monitoring technologies, and emerging public safety infrastructure.

Examples

Traffic signal systems that adjust timing based on incident patterns. Monitoring technologies that adjust alert thresholds based on prior responses.

8.  Large Language Models

Large language models produce text by recognizing patterns in how words are usually arranged and guessing what comes next.[10] They do not observe real-world events, check facts, or weigh evidence.

As a result, they can generate writing that sounds confident and convincing even when it is inaccurate or untrue.
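A toy bigram model illustrates the underlying mechanic of guessing the next word from past patterns. Real large language models are enormously more sophisticated, but the sketch makes the key point visible: nothing in the process checks the output against facts or law.

```python
from collections import Counter, defaultdict

# Tiny invented "training corpus" of word pairs.
corpus = "the court held that the court found that the motion was denied".split()
next_words = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    next_words[a][b] += 1

def guess_next(word):
    """Return the most frequent continuation seen in the corpus."""
    return next_words[word].most_common(1)[0][0]

# "court" is just the statistically likely next word, not a verified fact.
print(guess_next("the"))
```

Scale this up by billions of parameters and the output becomes fluent and authoritative-sounding, but the mechanism remains pattern continuation, not verification.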

Where They Are Used in the Criminal Justice System

In criminal justice settings, they are used to draft reports, summarize case files, assist with discovery review, prepare training materials, and support administrative writing. They are also used for legal case research, including summarizing opinions, suggesting precedent, and drafting research memoranda.

Known Risks in Legal Research and Drafting

When used for legal research, large language models do not check whether a case actually exists, accurately describe what a court decided, or determine whether a decision is still valid law.

This creates a known risk that the system will produce summaries or citations that sound authoritative but are wrong, or entirely made up.

Examples

Case summaries, draft probation reports, discovery summaries, and research memos suggesting case law

9.  Hybrid Systems

When these tools are used together, they form what is often called a combined or hybrid system.[11] This is usually what people have in mind when they talk about “AI.” Not because the system is intelligent, but because several automated tools are linked together in a way that looks coordinated and goal-directed.

A combined system may bring together fixed rules, pattern-based predictions, risk scores, scheduling tools, and sometimes text generation. Within one platform, it might screen cases, estimate likelihoods, display scores, manage timing or priorities, and produce written summaries.

On their own, each of these parts is limited and mechanical. When linked together, they can give the appearance of decision-making. What looks like a single judgment is often the result of multiple automated steps working in sequence.

Where They Are Used in the Criminal Justice System

These systems are used in integrated case management platforms, supervision and electronic monitoring systems, evidence review platforms, and vendor tools that support multiple criminal justice functions. 

Examples 

Supervision platforms that enforce conditions, estimate risk, schedule check-ins, and generate reports; electronic monitoring systems that apply violation rules, escalate alerts, and summarize compliance data; and case management systems that route cases, flag priorities, and produce summaries for judges and court staff.
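A compressed sketch of such a pipeline shows how mechanical steps chained in sequence can read as a single judgment. All rules, numbers, and labels below are invented for illustration.

```python
def screen(case):          # rule-based step: a fixed eligibility rule
    return case["priors"] == 0

def score(case):           # statistical step: an invented group-based estimate
    return 0.25 if case["priors"] == 0 else 0.65

def label(estimate):       # indicator step: collapse the estimate to a label
    return "LOW" if estimate < 0.5 else "HIGH"

def summarize(case):       # text step: fill a template with the results
    action = "release recommended" if screen(case) else "further review"
    return f"risk {label(score(case))}; {action}"

print(summarize({"priors": 0}))   # risk LOW; release recommended
print(summarize({"priors": 3}))   # risk HIGH; further review
```

Each function is trivial on its own; it is only the chaining that produces something resembling a recommendation.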

Conclusion – Hardwiring Clarity

AI in the criminal justice system is not a single tool that thinks. It is a collection of systems that, particularly when combined, perform very different functions and shape justice processes in very different ways.


[1] This article was edited with the assistance of AI in the form of a large language model. It was used solely for grammar, editing, and footnote support. All substantive content and conclusions reflect human authorship.

[2] George Lawton, How to Choose Between a Rules-Based vs. Machine Learning System, TechTarget (Aug. 31, 2022).

[3] John Monahan & Jennifer L. Skeem, Risk Assessment in Criminal Sentencing, 12 Annual Rev. Clinical Psychol. 489 (2016).

[4] Richard Berk, Machine Learning Risk Assessments in Criminal Justice Settings, 19 Criminology & Pub. Pol’y 375, 378–80 (2020).

[5] Partnership on AI, Report on Algorithmic Risk Assessment Tools in the U.S. Criminal Justice System (2021).

[6] IBM Corp., What Is Computer Vision?, IBM (last visited Mar. 8, 2025).

[7] IBM Corp., What Is Speech Recognition?, IBM (last visited Mar. 8, 2025).

[8] Pramit Das, Moulinath Banerjee & Yuekai Sun, Optimal Intervention for Self-Triggering Spatial Networks with Application to Urban Crime Analytics (Dep’t of Stat., Univ. of Mich. 2025).

[9] Acceldata, What Is Adaptive AI? A Complete Guide to Self-Learning Systems, Acceldata Blog (last visited Mar. 8, 2025), https://www.acceldata.io/blog/what-is-adaptive-ai-a-complete-guide-to-self-learning-systems.

[10] IBM Corp., What Are Large Language Models?, IBM (last visited Mar. 8, 2025), https://www.ibm.com/think/topics/large-language-models

[11] Stuart Russell & Peter Norvig, Artificial Intelligence: A Modern Approach, 55–58 (4th ed. 2021).
