
As artificial intelligence continues its advance into regulated industries, the legal world finds itself at a crossroads. Contract review, clause tagging, and legal research are no longer slow, manual tasks; they are increasingly driven by large language models and machine learning.
But as legal automation becomes more sophisticated, a deeper question emerges:
Can AI support decision-making in law without compromising the human principles at its core?
From Legal Tech to Legal Reasoning
Today’s contract lifecycle tools, like AI-powered repositories, can extract metadata, flag anomalies, and summarize key legal points in seconds. They bring structure to complexity and reduce risk in high-volume document flows.
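As a rough illustration of what "extracting metadata" means in practice (this is a toy sketch, not Cequence's actual pipeline — real systems rely on language models to handle the variability of legal prose), even simple pattern matching can pull structured fields out of contract text:

```python
import re

def extract_metadata(contract_text: str) -> dict:
    """Toy extractor: pull a few common metadata fields from raw contract
    text with regular expressions. Field names and patterns are
    illustrative assumptions, not a real product schema."""
    patterns = {
        "effective_date": r"[Ee]ffective [Dd]ate[:\s]+(\d{4}-\d{2}-\d{2})",
        "governing_law": r"governed by the laws of ([A-Z][\w\s]+?)[.,]",
        "term_months": r"term of (\d+) months",
    }
    result = {}
    for field, pattern in patterns.items():
        match = re.search(pattern, contract_text)
        result[field] = match.group(1).strip() if match else None
    return result

sample = (
    "This Agreement has an Effective Date: 2024-01-15 and a term of 24 months. "
    "It shall be governed by the laws of Delaware."
)
print(extract_metadata(sample))
```

Anything a pattern misses comes back as `None` — which is exactly where LLM-based extraction, and ultimately human review, takes over.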
But law is not just a process. It is a practice of judgment, of intent, and often of empathy.
While AI can map past patterns, it cannot yet understand fairness or contextualize a moral dilemma. The language of justice is nuanced, and not every exception fits inside a prompt.
What Empathy Really Means (Even to a Machine)
Empathy, the ability to perceive and respond to another’s emotional state, is foundational to ethical reasoning, which in turn underpins law. But is empathy exclusively human?
According to research from the University of California San Diego, AI-generated responses were rated as more empathic than those from physicians in patient communication contexts (JAMA Network, 2023).

While AI cannot “feel,” its programming allows it to simulate empathic behavior through linguistic calibration, often outperforming humans in consistency and composure. In controversial or emotionally charged cases, AI may provide emotionally neutral, nonjudgmental communication that humans might struggle to deliver.
Moreover, AI systems are built from vast datasets, trained on legal texts, ethical theories, and societal norms. In this sense, AI doesn’t invent morality; it mirrors it. What is considered “right” by most becomes the AI’s default reasoning.
As Peter Banda noted, this also means that AI, while "lobotomized" of extreme views, represents the centerline of societal consensus: a potentially powerful ethical compass in judicial and regulatory contexts.
Why AI Still Needs Guardrails
Yet, as promising as these capabilities are, we must recognize the limits.
- AI is only as strong as its data.
- Context can be misread and edge cases misunderstood.
- Without human oversight, even well-meaning tools can amplify bias.
This is why Cequence believes AI should augment, not override, human decision-making.
Why Empathy Still Matters
Beyond risk mitigation and workflow efficiency lies the intangible:
The tone of a contract. The weight of a precedent. The hesitation before judgment.
In these spaces, empathy cannot be automated, but it must be preserved.
And any AI supporting legal decisions needs to stay in service of human-led processes, not seek to replace them.
“The law is not just a system to be optimized. It’s a reflection of values, society, and people. And AI must learn to operate in that shadow, not cast its own.”
– Rastislav Kovaľ, CEO of Cequence
What We’re Building at Cequence
At Cequence, we’re developing AI-powered tools that make legal work easier, not emptier. Our digital contract repository:
- Helps legal and procurement teams find and understand key clauses instantly
- Extracts structured data from unstructured legal text
- Allows teams to collaborate securely, at scale
- Always operates under the control of human reviewers
We believe that AI can enhance clarity and compliance when used with intention and integrity.
Why It Matters
Justice is not just about regulation; it’s about how laws are interpreted, how people are treated, and how systems build trust.
AI might not feel empathy, but it can represent empathy, learn from consensus, and support ethical outcomes through scale, speed, and precision.
And that’s not only useful; it’s transformative.
Want to see how AI can help you manage contracts responsibly?
👉 Explore our platform or book a quick demo.
📚 Looking for more expert content?
Check out our weekly AI newsletter.