February 6, 2026

Webinar – The Trust Gap: Why Finance Leaders Are Stalling on AI (and the Path Forward)

Why are finance teams stalling on AI? Discover how to overcome the trust gap and confidently scale AI in your finance operations.

Finance teams are pouring their budgets into AI pilot programs, but when it comes to scalability, these initiatives are hitting a brick wall. Why? Because there is a difference between automating repetitive tasks and granting an AI model the autonomy to make decisions.

Finance is an accountability function. If a CFO cannot explain why an AI algorithm made a credit allocation decision or why it flagged a payment, they will reject it. This is the AI Trust Gap in Finance — the hidden barrier stalling the adoption of advanced automation. It’s happening at companies large and small, across every industry.

In a recent panel, IDC Senior Research Director Kevin Permenter joined finance leaders Charles Edwards and Leon Zhang from SRS Distribution (a major U.S. building products distributor) to dissect this exact problem. The panelists revealed the trust gap that’s quietly killing AI in finance, why current approaches to AI are flawed, and what forward-thinking leaders are doing differently to champion innovation at faster speeds.

Watch the Webinar:

Why Do Finance Teams Resist Autonomous AI?

“If no one can explain the decision, no one wants to own it.” This insight from Permenter cuts to the heart of executive hesitation. Leadership cannot simply transfer responsibility to a machine. CFOs and finance leaders are ultimately on the hook for compliance, audits, and financial integrity, explains Permenter. When things go wrong, it is the CFO’s head on the chopping block — sometimes with real legal and financial consequences for corporate mismanagement.

The risk of legal liability creates what Permenter describes as a natural, highly rational resistance to AI that takes a “black-box” approach, hiding its logic deep in software code. It can’t be a mystery that buyers are asked to blindly trust. If AI approves a risky credit limit, flags a legitimate transaction as fraudulent, or alters a payment schedule, the finance leader needs to know why and how it happened. Otherwise, the machine decisioning cannot be corrected and trusted moving forward. As he pointed out, any CFO standing in front of an auditor saying, “I don’t know why AI did that. It just did it,” is in trouble. This is not a defensible position, and this is what keeps CFOs up at night.

What are the Primary Barriers to Scaling AI in Finance?

When examining why AI projects fail to move past the pilot phase, three consistent barriers undermine AI adoption:

  • Loss of Control: Finance teams rely on precise governance tools for risk management. Granting autonomy to an unproven system feels like a direct threat to financial integrity. If a technology appears to weaken accuracy and control, the deployment will face internal resistance, perhaps never even getting off the ground.
  • Digital Transformation Fatigue: Most finance departments have endured a decade of continuous cloud transitions and ERP migrations. Teams often view AI as just another point of disruption to their already strained daily workflows.
  • The Data Reality: In a poll conducted during the panel, 55% of finance professionals identified “data quality and availability” as their biggest barrier to scaling AI. You cannot automate bad data. Fragmented, inconsistent, and siloed data environments only scale operational chaos when AI is introduced.

Bridge the AI Trust Gap in Finance with One Simple Equation

IDC defines the path to AI adoption in finance through a simple formula:

Control + Confidence = Trust

  • Control is built through visibility into processes, override capabilities, and clearly defined governance.
  • Confidence is built through explainable logic, predictable outputs, and measurable accuracy over time.

Finance leaders do not want automation for automation’s sake; they want performance improvement without sacrificing control and financial accuracy. They require what Permenter calls “glass-box AI.”

Transparency in AI Decisioning: The Glass-Box Approach

The following table compares the two primary approaches to financial AI.

| Feature | Black-Box AI (High Risk) | Glass-Box AI (Audit-Ready) |
| --- | --- | --- |
| Visibility | Opaque and hidden | Explainable, transparent decision-making |
| Explainability | Weak; no audit trail | Strong; a full, defensible record of how outcomes were produced |
| Human Control | Locked out of the control room | Thresholds can be set and decisions can be overridden |
| Predictability | Variable outcomes | Consistent, evidence-based, and reliable, getting smarter over time |

The AI Implementation Playbook

Successful implementation requires a fundamental shift in how humans interact with AI and its algorithms. Edwards perfectly illustrated this disconnect with his shoe-tying metaphor.

Stop Telling AI to “Tie Its Shoes”

Imagine artificial intelligence operating inside the human body. You command it to “tie your shoes.” But the AI takes the left shoestring and the right shoestring and ties them together in a knot, because your instructions were not specific enough to arrive at the right outcome. You then provide hyper-detail: take the left lace, put it over the right lace, and pull. The AI executes the knot perfectly, then immediately drops dead. Why? Because you forgot to tell the system to breathe.

Finance teams face this exact challenge when implementing AI. You cannot simply plug in an AI tool and tell it to “collect money.” That command means nothing to an algorithm. Instead, teams must map out the exact sequence:

  1. How does the system identify which customer to call?
  2. Where is the contact information stored?
  3. How is the call note structured?
  4. What is the threshold for escalation?

AI requires absolute, granular, binary logic (if this, then that) to function within a financial environment where workflows determine accuracy. But building an AI model from scratch on your own can be daunting. This is why purchasing AR automation software built on agentic AI can accelerate ROI on innovation projects. IDC found that organizations investing in such software achieve, on average, a 384% ROI and a nine-month payback period.
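The "binary logic" the panel describes can be made concrete. The sketch below encodes a collections workflow like the numbered questions above as explicit, auditable rules. Every field name and threshold here is an illustrative assumption, not an actual Billtrust or SRS Distribution rule.

```python
from dataclasses import dataclass

@dataclass
class Invoice:
    customer: str
    balance: float
    days_past_due: int
    broken_promises: int  # prior payment promises the customer missed

# Assumed thresholds -- in practice these come from your credit policy.
ESCALATION_DAYS = 90      # past-due age that triggers manager escalation
CALL_MIN_BALANCE = 500.0  # balance floor below which we only send reminders

def next_action(inv: Invoice) -> str:
    """Return the explicit, auditable next step for one invoice."""
    if inv.days_past_due <= 0:
        return "no_action"                # not yet due
    if inv.days_past_due > ESCALATION_DAYS or inv.broken_promises >= 2:
        return "escalate_to_manager"      # clearly defined escalation threshold
    if inv.balance >= CALL_MIN_BALANCE:
        return "schedule_call"            # worth a collector's time
    return "send_reminder_email"          # low balance: automate the outreach

print(next_action(Invoice("Acme Roofing", 12_000.0, 95, 0)))  # escalate_to_manager
print(next_action(Invoice("Acme Roofing", 250.0, 30, 0)))     # send_reminder_email
```

Because every branch is named and explicit, a CFO can point to the exact rule behind any action — the opposite of a black-box model.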

Encourage Adoption by Showing AR Practitioners their New Superpowers

Contrary to common fears that AI is coming for finance jobs, adoption in finance increases the value of AR practitioners, particularly as they become experienced automation coaches for AI models. Edwards argues that AI raises the professional bar because successful implementation relies entirely on the nuanced, boots-on-the-ground expertise of finance practitioners. He attested that in his own experience, mapping out his company’s accounts receivable sequence required human experience that algorithms couldn’t always replicate independently.

AR practitioners who understand the intricate, undocumented steps of the daily workflows are the only ones capable of training a model effectively. This experience makes them invaluable.

Practitioners also bear witness to the many ways AI projects improve foundational AR processes. Finance leaders are using AI adoption as a way to pressure-test their own operations. This involves deconstructing every step, decision point, and data pull to ask: Why is the team performing the task this way? Is there a better way?

Make Human Oversight Your AI Risk Mitigation Plan

Human oversight is often viewed as a drag on innovation. In reality, it’s the only way to scale responsibly, according to Permenter. A human-in-the-loop acts as the traffic cop.

The AI does the big data crunching — consolidating vast information, running anomaly detection, identifying payment patterns and duplicates, and grading its own accuracy using a confidence scale. The end result: deep data science bringing forth findings and recommendations capable of improving cash flow management and reducing credit risks.

At critical points of decision, however, a human is still needed.

While autonomous AR is possible, a human is required to get there. AR practitioners must be on the receiving end of AI’s insights, approving AI-generated recommendations and correcting any decisioning mistakes. Human judgments and AI coaching moments are the secrets to success. They’re also the reason why getting started now will determine who arrives at the milestone of full autonomy. The most advanced AI systems have a strong team of human trainers and watchdogs behind them.
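One common way to wire in this human-in-the-loop "traffic cop" is confidence-gated routing: the model's recommendation executes unattended only above a trust threshold, and everything else lands in a practitioner's review queue, with every decision logged for auditability. The cutoff and field names below are assumptions for illustration.

```python
# Hypothetical sketch of confidence-gated human review.
AUTO_APPLY_CONFIDENCE = 0.95  # assumed cutoff for autonomous action

audit_log: list[dict] = []    # audit-ready record of every routing decision

def route(recommendation: str, confidence: float) -> str:
    """Decide whether an AI recommendation runs unattended or goes to a human."""
    decision = "auto_apply" if confidence >= AUTO_APPLY_CONFIDENCE else "human_review"
    audit_log.append({
        "recommendation": recommendation,
        "confidence": confidence,
        "decision": decision,
    })
    return decision

print(route("apply_cash_to_invoice_1042", 0.98))  # auto_apply
print(route("raise_credit_limit_acct_77", 0.71))  # human_review
```

Lowering the threshold over time, as the model's graded accuracy improves, is one practical path toward the "full autonomy" milestone the panel describes.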

Seeing the Big Picture

Deploying human-in-the-loop architecture also helps bridge the trust gap, shifting the narrative from cost savings to workforce elevation. As Zhang noted, when AR teams associate AI with “cost savings,” it often breeds fear of headcount reduction. Instead, Permenter suggests that finance leaders position AI as an engine for capacity expansion that paves the way for an entirely new set of skills and experience.

The goal is not to replace AR practitioners; the goal is to free them from being mired in data aggregation so they can apply critical thinking, creativity, and strategic judgment to the intelligence that AI algorithms uncover. As he put it, you’re giving a great employee a powerful tool to amplify their impact and grow their career as AI experts.

Your 3-Step AI Action Plan

“Trust is built through predictable outcomes, not AI promises,” noted Permenter. To leverage AI effectively in your finance organization, start with operations rather than technology.

A Practical Plan for Getting Started Quickly

  1. Pick One Workflow: Isolate a specific, time-consuming process like prioritizing collections calls, resolving short pays, or identifying credit risk. To win your team over, you need to identify an area where you can prove ROI quickly.
  2. Deconstruct to the Binary Level: Map every step of the workflow in exhaustive detail. Edwards recommends getting granular: “Think about teaching a robot to tie their shoes… You have to really define every step.” What data is required? What judgment calls are we making as humans, and how can we define a clear set of rules to govern machine decisions? Where are the exceptions? When should escalations happen?
  3. Audit Your Data Hygiene: Verify the accuracy of data feeding your chosen workflow. Is it clean? Is it accessible? Permenter stated: “At the end of the day, accounts receivable, AP, tax… most of finance is a data management problem.” If the data is fragmented, fix the data pipeline before introducing automation or risk AI failure.
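Step 3 can start as a simple script. The sketch below counts missing required fields and duplicate invoice IDs in the records feeding the chosen workflow, surfacing the fragmentation Permenter warns about before any AI touches the data. The field names are illustrative assumptions.

```python
# Hypothetical pre-automation data-hygiene check for an AR workflow.
REQUIRED_FIELDS = ("invoice_id", "customer_id", "balance", "due_date")

def audit(records: list[dict]) -> dict:
    """Summarize how clean the workflow's input data actually is."""
    missing = sum(
        1 for r in records
        if any(r.get(f) in (None, "") for f in REQUIRED_FIELDS)
    )
    ids = [r.get("invoice_id") for r in records]
    duplicates = len(ids) - len(set(ids))
    return {"rows": len(records), "missing_required": missing, "duplicate_ids": duplicates}

sample = [
    {"invoice_id": "INV-1", "customer_id": "C-9", "balance": 100.0, "due_date": "2026-01-15"},
    {"invoice_id": "INV-1", "customer_id": "C-9", "balance": 100.0, "due_date": "2026-01-15"},
    {"invoice_id": "INV-2", "customer_id": "", "balance": 250.0, "due_date": "2026-02-01"},
]
print(audit(sample))  # {'rows': 3, 'missing_required': 1, 'duplicate_ids': 1}
```

If counts like these are nonzero, fix the data pipeline first — automating on top of them only scales the chaos.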

Trust in AI is not a “nice to have.” It determines whether AI scales or stalls. By designing transparency, governance, and process clarity into the innovation strategy from day one, finance leaders can close the trust gap. When leadership has both control and confidence, AI stops being a buzzword and starts delivering the type of tangible benefits companies need to outpace the competition.

Get in touch with Billtrust to explore how we can help improve your AR performance.


Frequently asked questions

What is the AI trust gap in finance?

The AI trust gap occurs when finance leadership refuses to scale artificial intelligence because the algorithm’s decision-making lacks transparency and auditability. CFOs need to be able to explain flagged payments or credit decisions, and they cannot do that with opaque “black-box” systems.

How does black-box AI differ from glass-box AI?

Black-box AI hides its decisioning rationale and leaves humans out of the loop, which leads to variable outcomes and indefensible positions. In contrast, glass-box AI provides transparent, explainable logic where humans can set thresholds, override decisions, and maintain an audit-ready record of how outcomes were produced.

What role do humans play in AI risk mitigation?

Autonomy without governance is unmanaged risk in corporate finance. A human-in-the-loop acts as a traffic cop: AI handles data aggregation and anomaly detection, while a human AR professional makes the final strategic judgment calls, accepting or denying AI-generated recommendations.

Why do finance teams resist fully autonomous AI?

Finance teams resist fully autonomous AI when systems are unproven and humans lack complete control over the AI’s decisions. AI that feels like a direct threat to a company’s financial integrity and risk management is rejected.

Browse related content by:

Blog

Resistance to AI in Accounts Receivable: What Finance Needs to Trust Advanced Automation

The AI trust gap is very real. Let’s slow it down and work through what’s underneath it so you can move forward with confidence and control.
Blog

Implementing AI in Accounts Receivable: 6 Trust Requirements Every Finance Leader Must Know

What’s holding you back from truly trusting AI? Let’s get you over the hump with a no-frills breakdown of the controls, guardrails, and design elements that turn chaos into control.
Blog

Breaking Down Silos with AR Automation: Free Your Data. Free Your Cash Flow.

Data drives cash. Until you fix AR silos, both stay stuck. Here’s how to change it.

Learn what Billtrust can do for you

Reduce manual work, get paid faster, and deliver superior customer experiences with Billtrust’s unified AR platform.