
Tips for using algorithms: Insights for finance professionals

Sep 16, 2025 · 3 min read · AICPA & CIMA Insights Blog

Internal investment decisions are a core responsibility for management accountants, affecting long-term performance and strategic direction.

Evaluating capital purchases, allocating resources, forecasting performance, and hiring talent are all decisions that carry long-term impact for your organisation. With algorithmic decision support systems, there is an opportunity to make internal investment decisions even more consistent and data-driven.

AI-powered, algorithmic decision support systems can process large amounts of data quickly. But speed is not their only benefit: these systems also bring clarity and consistency to decision-making. They can ensure that your teams rely not on fragmented information but on traceable data, which improves transparency.

Algorithmic decision support systems can, for example, aid in recruitment processes by scanning online platforms like LinkedIn, identifying potential candidates, and assessing likely behavioural patterns related to role suitability and future performance, offering a more impartial and consistent approach than human evaluators.

Despite the potential of algorithmic decision support systems, many professionals hesitate to use these tools, a phenomenon explored in our latest research report, Algorithm Aversion in Internal Investment Decision-Making. Research suggests that algorithms often outperform people in cognitive tasks, yet many employees remain reluctant to rely on them. This hesitation, known as algorithm aversion, is the focus of the report.

Funded through CIMA’s research programme and led by researchers from Ruhr University Bochum, the report explores why algorithm aversion persists and how you can encourage employees to use algorithmic decision support systems across different internal investment decisions.

Use of algorithmic support systems depends on the decision type

To understand how and when employees use algorithmic support systems, the researchers behind the Algorithm Aversion in Internal Investment Decision-Making report conducted a controlled online experiment in which participants acted as managers making investment decisions in a fictional company.

The participants were given three options based on five objective criteria, with two decision categories:

  • Non-human-related decision: Selecting the most suitable machine for procurement.

  • Human-related decision: Choosing a candidate for hire.

Participants could request algorithmic advice at a cost, and some were given explanations about the algorithm’s benefits, while others were not.
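To make the experiment's setup concrete, here is a minimal sketch of the kind of algorithmic advice involved: ranking three options against five objective criteria with a simple weighted score. All option names, criteria, weights, and values below are invented for illustration; the report does not publish the actual algorithm used.

```python
# Hypothetical multi-criteria scoring sketch. Criteria, weights, and
# values are assumptions for illustration, not the study's algorithm.

CRITERIA_WEIGHTS = {        # assumed relative importance of each criterion
    "cost": 0.30,
    "reliability": 0.25,
    "capacity": 0.20,
    "maintenance": 0.15,
    "delivery_time": 0.10,
}

def score(option: dict) -> float:
    """Weighted sum of an option's criterion ratings (each on a 0-10 scale)."""
    return sum(option[c] * w for c, w in CRITERIA_WEIGHTS.items())

def recommend(options: dict) -> str:
    """Return the name of the highest-scoring option."""
    return max(options, key=lambda name: score(options[name]))

# Three fictional machines rated on the five criteria
machines = {
    "Machine A": {"cost": 7, "reliability": 8, "capacity": 6,
                  "maintenance": 5, "delivery_time": 9},
    "Machine B": {"cost": 9, "reliability": 6, "capacity": 7,
                  "maintenance": 8, "delivery_time": 4},
    "Machine C": {"cost": 5, "reliability": 9, "capacity": 8,
                  "maintenance": 7, "delivery_time": 6},
}

print(recommend(machines))  # prints "Machine B"
```

The appeal of such a system is consistency: the same weights are applied to every option, so two analysts given the same data reach the same ranking.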

Findings from the experiment revealed:

  • Participants were more likely to request algorithmic advice for non-human-related decisions, such as procurement, than for hiring decisions.

  • Explanations of the algorithm improved uptake, especially in human-related decisions. When no explanation was provided, fewer participants in the hiring scenario requested algorithmic advice.

  • The availability of an explanation increased algorithm use more for human-related decisions than for non-human-related ones.

  • Participants responded better to simple, benefit-focused explanations rather than to detailed technical ones. Clear and relevant explanations proved more influential than technically focused language.

  • Once employees requested algorithmic advice, they tended to follow it. The challenge is getting them to request it in the first place.

Steps to implementing algorithmic support systems

Overcoming algorithm aversion among your teams requires clear communication from you about how and why to employ algorithmic support systems.

Findings from the report suggest that the use of algorithms in human-related decisions can be increased by explaining to your team what the algorithm does and how it can help them make stronger decisions. Communication remains critical for team member buy-in.

Considerations to support your employees include:

  1. Tailor communication to the decision context. People are more receptive to algorithmic support in technical or procurement decisions. For human-related choices, transparency and reassurance are essential.

  2. Provide clear, straightforward explanations. Focus on what the algorithm does well (e.g., reducing bias, analysing large datasets quickly, and offering consistent evaluations) rather than overwhelming your team members with technical detail.

  3. Maintain user control. Employees are more likely to engage with algorithmic tools when they know the final decision remains theirs. Make it clear that algorithmic tools are a support mechanism, not a replacement for their professional, human judgment.

  4. Build an organisational culture that supports enquiry and exploration. Encourage teams to explore and experiment with algorithmic tools. Familiarity builds confidence, lessening uncertainty and algorithm aversion.

Algorithm aversion can undermine business outcomes

Organisations investing in algorithmic decision support systems must understand how employees interact with these systems to inform decisions around forecasting, procurement, hiring, and other areas.

Algorithmic tools don’t replace human judgment; they enhance it. By understanding how and when employees engage with these systems, finance leaders can design decision processes that are both data-driven and human-centred.

Read the full report to explore the findings in detail and consider how they apply to your team’s internal investment decisions.

Mari Sagedal, M.A.

Mari Sagedal is a senior content writer at AICPA & CIMA, together as the Association of International Certified Professional Accountants.
