// INTELLIGENT TECHNOLOGY //
Research reveals mid-market finance teams are racing into AI without adequate safeguards
" People are saying‘ yeah, we ' ve adopted AI’, but what they ' re really doing is working around their finance system rather than within it."
Almost three-fifths (58%) of respondents say they are concerned about the security and compliance risks of adopting AI without clear governance – a figure that rises to 68% for finance leaders in the Software and Technology sector.
According to a survey of 250 UK-based mid-market finance leaders commissioned by cloud accounting provider iplicit, while 83% claim to have adopted AI technology within their finance function, only 53% have a formal policy or framework in place for its safe and compliant use.
The disconnect is particularly stark among those who have partly adopted AI – the largest group, at 48% of respondents. Of these partial adopters, only 45% have a formal policy in place. That means over half of partial adopters are using AI in their finance function without clear governance or guidance.
The research also reveals widespread ‘Shadow AI' usage – finance professionals using external AI tools outside their core finance systems and official processes.
Of those who are only now planning to adopt AI formally – meaning their finance function hasn't officially rolled it out yet – almost half (46%) say they are currently using AI assistants such as ChatGPT, and almost a third (30%) are using AI-powered forecasting and analytics.
Even among those claiming adoption, patterns suggest unofficial usage remains common. Manual data entry is still the most time-consuming task for 42% of finance leaders – rising to 45% among CFOs. Yet those same CFOs report using AI primarily for strategic decision-making (48%) and integration with other functions (55%).
" When I speak to finance professionals across the sector, a common pattern emerges," said Ben Crow, VP of Partnerships at iplicit.
" Many tell me they ' re not actually using AI features built into their finance systems. Instead, they ' re opening tools such as ChatGPT and asking them to help them put something into a board report."
When asked about their concerns regarding AI in finance software, data and privacy risks topped the list at 47% – with CFOs (56%) and finance directors/VPs of finance (62%) most likely to flag this concern. This was also significantly higher for finance leaders in Technology and Software (61%) than in Healthcare, Social Care, Education, Non-profits, Charity and Housing (35%).
Other top concerns include the risk of fraudulent activity going undetected by AI (44%), concerns about the accuracy of AI output (38%) and a lack of training or understanding of AI within the finance function (32%).
" When someone drops a file into a public AI tool, they ' ve effectively shared that document with the world," said Crow. " If that file contains sensitive employee data, customer information or confidential financial details, you ' ve just opened up a serious data privacy risk.
" LLMs are also making assumptions based on their training data, and you ' re then using that to make decisions and present to the board. For a finance function, AI should be native to the data within your system – giving you answers based on your actual financial data." �
Intelligent SME.tech