Part 1 of a 2-Part Series on AI in Corporate Finance
Note: While “AI” can include a wide range of technologies like machine learning and neural networks, for simplicity, this article will use “AI” to refer specifically to Large Language Models (LLMs) like ChatGPT, unless otherwise stated.
Last fall, I had the privilege of attending the Association of Financial Professionals National Conference in Nashville, TN. For the uninitiated, the AFP conference is arguably one of the biggest in the finance and accounting world. It’s kinda like Comic-Con—but for corporate treasurers and CFOs. And by all accounts, this was their best showing yet, with more than 4,000 attendees.
Although it was my first AFP, I’ve been to my fair share of similar conferences. If ever there was a venue tailor-made for corporate buzzwords, this would be it. About 15 years ago, the words on everyone’s lips were cryptocurrency and blockchain. These were new, exciting, and captured the imagination.
And while maybe someone out there had a crystal ball, most of us in the payments industry were genuinely curious about how these technologies would change how people exchange monetary value. Were we witnessing the end of ACH? Of wires? Of cash as we knew it?
Naturally, the cottage industry of vendors percolating beneath the surface of banking was eager to ride the buzzword wave and appeal to executive sponsors everywhere.
Now, more than a decade later, we have a more grounded view. Blockchain? Still useful. Crypto? Mostly a hustle.
The AI Workflow Question Nobody Can Answer
Before arriving at AFP, I took a little time in my Airbnb to review the session agenda and plan my schedule. I chose a smattering of topics—risk management, liquidity management, technology in finance, and integrated banking solutions.
And in more than one session, I heard the same open-ended question tossed into the room: “How are you using AI in your treasury and finance workflows?”
And let me tell you—the silence was deafening.
Why the Room Went Quiet
When it comes to AI workflow in corporate finance, I think there are two big reasons why that question landed like a lead balloon.
But first, let’s clarify something.
We’re not talking about machine learning here. That’s already the domain of data analysts, scientists, and engineers—people with established roles and well-defined remits. That’s not the point of this conversation.
The real question is this:
Dear corporate treasurers, how are you using ChatGPT to make your daily workflows easier?
(Spoiler: They aren’t. At least, not yet.)
Hallucinations: The Risk No One Wants
Large Language Models like ChatGPT are essentially elaborate versions of your phone’s autocomplete feature. They don’t “think.” They predict.
They’ve been trained on enormous amounts of text scraped from the internet to guess the next word in a sequence—based purely on probability. So when you type a prompt into a chat interface, you’re not talking to something that understands you. You’re reading an impressively polished string of predictions.
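To make the autocomplete analogy concrete, here’s a deliberately tiny sketch (in Python—my choice of illustration, and nothing like a real LLM’s architecture or scale) of next-word prediction driven purely by frequency counts. The training text and function names are invented for the example.

```python
from collections import Counter, defaultdict

# Toy "autocomplete": for each word, count which words follow it
# in the training text, then predict the most frequent follower.
training_text = (
    "the payment was approved the payment was rejected "
    "the payment was approved the wire was approved"
)

followers = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the most frequently observed next word, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("payment"))  # "was" — the only word ever seen after it
print(predict_next("was"))      # "approved" (seen 3 times) beats "rejected" (1)
```

Note what this model does *not* have: any notion of whether a payment was actually approved. It just emits the statistically likeliest continuation—which is exactly why a far larger version of the same idea can produce fluent text that happens to be false.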
But predictions can be wrong.
And when they’re wrong in this context, we call them hallucinations—fabrications that sound true, but aren’t. These are not just minor errors. They’re confident misstatements that can completely derail trust. The AI doesn’t know it’s making things up. It has no concept of truth at all—it simply produces whatever continuation its algorithm scores as most probable.
Unfortunately, those algorithms sometimes draw from insufficient or misleading patterns in the data. And that leads to some wildly inaccurate, sometimes dangerous, outcomes.
These hallucinations can take many forms:
- Citing sources that don’t exist
- Fabricating policy language
- Creating entire fake business cases
- Inserting invented data
… all while being delivered with the confidence of a naïve but precocious 9-year-old.
It’s one thing to hallucinate in a college book report. It’s another thing entirely to hallucinate in a budget forecast or a compliance audit. The risks aren’t just reputational. They’re regulatory.
For finance professionals, the mere prospect of that risk is enough to keep many from experimenting with these tools at all.
The Second Problem: Not Knowing What We Don’t Know
The second reason for all the awkward silence? Most practitioners know there’s value in AI—they just don’t know where or how to start. They know it could help. But the idea of figuring it out feels like too big a lift.
I recently attended a session on using AI in nonprofit grant writing and marketing. The speaker showed some AI-powered tools, sure. But what really clicked with the audience were the small wins—the everyday tricks of the trade that didn’t require a cross-functional working group or a six-month rollout plan.
They were simple, practical, and immediately useful. The kinds of tools that sit just under the radar—below the budget approval line—but that make work faster, easier, and smarter.
And that’s where I think the AI workflow conversation really needs to go.
Coming Up Next: AI Tools That Actually Help
In Part 2 of this series, I’ll share a few practical tools and use cases—real-world ways finance and treasury teams can start integrating AI into their everyday workflows without taking on enormous risk.
No moonshots. No jargon. Just practical wins.