August 15, 2024 - 6 Min Read

How Accountants Can Manage Risks with AI

Concerned about AI implementation on your accounting team? In this article, we provide some key context for understanding AI's risk profile as well as best practices for mitigation.

Nigel Sapp

We can thank the infamous HAL-9000 of 2001: A Space Odyssey for teaching generations of moviegoers about the dangers of blind trust in AI systems. It’s not hard to imagine that HAL’s cold, red stare and sinister betrayal linger in the minds of today’s AI skeptics.

It’s reasonable for accounting teams to have some reservations about increasing their AI usage: the tech is fairly new and needs careful vetting (like all tools). Plus, according to the Connor Group's survey of CFOs, 37% of finance leaders are concerned about cybersecurity risks when implementing AI.

Chart: the challenges accounting teams face with implementing and continuing to use AI. Technical expertise tops the list, followed by cost and resource constraints, staff training and adoption, data quality, integration with existing systems, and cybersecurity risks.
From the Connor Group's AI in Action Report

To directly address those concerns, we've broken down the most common questions that we hear from accounting teams. Ultimately, with a deeper understanding of AI’s risk profile and the proper mitigation strategies in place, teams can balance their excitement about AI’s innovation potential with an appropriate level of caution.

What are common concerns ahead of using AI in accounting?

We typically hear three primary questions on risk from accounting teams: 

  1. Is my data secure? 
  2. Can I trust AI outputs?
  3. What about AI & audits?

Below, we’ll cover the approaches for mitigating concerns tied to each of these questions. 

Concern #1: Is my data secure?

Thinking about data security is second nature by now — and with AI there are a few more elements involved. 

Specifically, AI tools typically take in inputs (like your prompts or the context you provide) and spit out a response or action.

To get smarter and more useful over time, models often train on your feedback and initial questions. That’s how the model improves, but you may not want your prompts, feedback, or files used for training when the information is more sensitive, like financial data.

So here’s what you can do: 

Ask providers how your data will be used

Before signing any contracts or using a tool, simply confirm that software providers won’t be using your data to train their model.

If you’re asking ChatGPT quick questions, like generating a grocery list, it may be completely fine for the model to train on your feedback. But if you’re passing along sensitive financial data or other confidential material, clarifying how your data is used becomes important.

You may also have to go directly into a platform’s settings and turn off model training there.

Otherwise, after clarifying that your data isn’t being used to train models, treat AI tools like a shared file storage system such as Google Drive or Microsoft OneDrive: if you typically loop in IT or infosec for that type of purchase, do the same here.

Use dummy data for code testing 

If you’re using an LLM to generate scripts to manipulate data, there’s no need to feed the model the actual data to test the script. Instead, create some dummy data. If you’re importing a table, it’s perfectly fine to keep the row and column headers but avoid any real numerical inputs.
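To make that concrete, here’s a minimal Python sketch of the idea (the column names, value ranges, and file name are all hypothetical): it builds a table that keeps realistic headers but fills every row with fabricated values, so an AI-generated script can be tested without touching real figures.

```python
import random
import pandas as pd  # assumes pandas is installed

# Keeping realistic column headers is fine; every value below is fabricated.
columns = ["invoice_id", "customer", "amount", "recognized_revenue"]  # hypothetical headers

dummy_rows = [
    {
        "invoice_id": f"INV-{i:04d}",                            # fake identifiers
        "customer": f"Customer {i}",                              # fake names
        "amount": round(random.uniform(100, 10_000), 2),          # fake balances
        "recognized_revenue": round(random.uniform(0, 10_000), 2),
    }
    for i in range(1, 21)
]

dummy_df = pd.DataFrame(dummy_rows, columns=columns)
dummy_df.to_csv("dummy_revenue_data.csv", index=False)  # test the AI-generated script on this file
```

If the generated script runs cleanly against the dummy file, you can then run it locally against the real export without the model ever seeing actual figures.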

Train your team on AI usage  

Sometimes, data risks aren’t external: careless or uninformed employees are equally capable of accidentally releasing confidential information to the wrong source. It’s important to implement some training and oversight to create alignment across the team about AI precautions. 

To create a solid foundation, reach out to other departments that have incorporated AI, like sales & marketing, to see if they have existing policies in place.

Concern #2: Can I trust AI outputs? 

A key distinction: AI provides probabilistic answers, not guarantees. While these answers are still generally well-informed and useful, assuming complete accuracy is irresponsible.

One noted AI phenomenon – hallucination – occurs when a model starts generating false information in the midst of what might otherwise be a suitable response.

Since most models are trained on vast amounts of internet data, they may reference information that was false from the start or struggle to determine the most accurate and relevant information available. Hallucination can occur at any time, but the likelihood increases when a model is overloaded with requests or parameters.

So, what steps should you take to ensure you aren’t passing off faulty information from the model? 

Before anything else, it’s useful to operate from the vantage point that AI is great at accomplishing up to 80% of your desired objective, and that human review is what brings it across the finish line.

Trust but verify 

It’s the classic internet maxim: don’t trust everything you read online. As a general rule, review all AI outputs as you receive them. AI is not an answer machine; it is a highly predictive, algorithmic system that still has a margin for error. For example, we see accounting teams use Numeric's AI auto-drafted flux explanations to get a head start, then drill into the specifics and confirm core details before passing along reporting.

Restrict source material 

Hallucinations increase in likelihood when AI doesn’t have clear parameters for where it should be referencing information. 

Keep AI from getting its wires crossed by providing clear source material for content generation. If you want a strong answer about revenue recognition but don’t want to leave AI completely to its own devices, feed the model a copy of, say, Deloitte’s ASC 606 guide.
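If you’re working through an API rather than a chat window, the same idea applies: pass the source excerpt in with your question and tell the model to answer only from it. Below is a rough sketch assuming the OpenAI Python client; the model name, file name, and prompt wording are illustrative, not a recommendation.

```python
from openai import OpenAI  # assumes the official OpenAI Python client is installed

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

# Hypothetical: an excerpt from your own ASC 606 reference material, saved locally.
with open("asc606_guide_excerpt.txt", "r", encoding="utf-8") as f:
    source_material = f.read()

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[
        {
            "role": "system",
            "content": "Answer only from the provided source material. "
                       "If the answer is not in the material, say so instead of guessing.",
        },
        {
            "role": "user",
            "content": f"Source material:\n{source_material}\n\n"
                       "Question: How should we treat variable consideration under ASC 606?",
        },
    ],
)

print(response.choices[0].message.content)
```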

On this matter, Numeric is already one step ahead: for future accounting questions, you can default to Numeric’s technical accounting AI. Trained on GAAP guidance, public company filings, and more, the tool can provide answers specific to any standard (US GAAP, IFRS, UK GAAP) and to guidance from any Big 4 audit firm.

Numeric's Technical Accounting AI

Add priming to your prompts 

With AI models, you get out what you put in: the more detail you provide up front, the better AI’s output will be. Here are two common approaches to what is known as “priming” an AI system (a combined example appears after the list):

  1. Provide “role” context: Start your prompt by explaining to AI what position or mindset it should adopt. If you begin a prompt with “You’re a seasoned financial controller” and follow with a request, AI will complete that task through a more expert accounting lens than if you hadn’t included the role.
  2. Provide example outputs: If you already have an output that you would like the model to imitate, you can say “Here’s an example of what your response should look like.” You can also specify what format you’d like the response to be – bullet points, table, etc. – and define what tone of voice should come across in the response.
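Here’s a hedged sketch of both techniques combined into a single prompt. The role, request, and output format are made-up examples rather than a template; the point is simply how the pieces fit together before you paste the result into your AI tool or send it through an API call like the one sketched earlier.

```python
# Hypothetical primed prompt combining a role and an example output format.
role_context = "You're a seasoned financial controller at a SaaS company."

request = (
    "Explain the month-over-month increase in deferred revenue "
    "in plain language for the close review meeting."
)

example_output = (
    "Example of the format your response should follow:\n"
    "- Driver: <one-line cause>\n"
    "- Impact: <dollar and % change>\n"
    "- Follow-up: <what to verify before sign-off>"
)

primed_prompt = f"{role_context}\n\n{request}\n\n{example_output}"
print(primed_prompt)
```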

For a more detailed breakdown on priming with tips for creating accounting prompts, check out the accountant’s guide to ChatGPT.


Concern #3: Will AI complicate my audit processes?

When integrating AI into your accounting processes, it’s natural to wonder how it might impact your audits. The primary challenges stem from the opaque nature of many AI systems as well as the potential for AI-generated outputs to be accepted without proper scrutiny. Since auditors prioritize accuracy, transparency, and compliance, AI’s complexities can provide cause for concern. 

Create an AI audit trail 

Accounting teams can stay audit ready in the AI era by cultivating an audit trail of their AI usage.

What does that look like? 

Primarily, teams should keep track of any prompts, responses, scripts, generated code, or other AI outputs that have business impact. Simply stated, if something AI has created is actively helping your work and, by extension, the business, it ought to be documented. As always with audits, the more detailed your documentation, the better.
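There’s no prescribed format for this, but even a simple append-only log goes a long way. The Python sketch below (the file name and field names are hypothetical) records each AI interaction with a timestamp, the tool, the prompt, the response, and a note on how it was used in the business.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("ai_audit_log.csv")  # hypothetical location for the team's AI usage log

def log_ai_usage(tool, prompt, response, business_impact):
    """Append one AI interaction to the audit log, creating the file with headers if needed."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp_utc", "tool", "prompt", "response", "business_impact"])
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            tool,
            prompt,
            response,
            business_impact,
        ])

# Hypothetical example entry
log_ai_usage(
    tool="ChatGPT",
    prompt="Draft a flux explanation for the 12% increase in prepaid expenses.",
    response="Prepaid expenses rose primarily due to the annual renewal of ...",
    business_impact="Used as the starting draft for the March close flux commentary.",
)
```

Whether you use a spreadsheet, a shared doc, or a script like this, the goal is the same: an auditor should be able to trace any AI-assisted output back to the prompt and tool that produced it.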

Collaborate with your auditors

Work closely with your auditors to explain the AI tools you use and how they enhance your processes. Provide them with access to the necessary documentation and support to facilitate their understanding and verification of the AI’s outputs. 

How the Future of AI Influences Risk 

As AI continues to evolve, its potential to transform accounting processes grows, bringing both new opportunities and new risks to consider.

By staying informed about AI advancements and implementing robust risk management strategies, accounting teams can confidently embrace AI's future while mitigating potential challenges. The key lies in balancing innovation with vigilance to ensure AI's benefits are fully realized.
