Artificial intelligence (AI) could elevate board effectiveness, but by increasing the volume and relevance of data available to boardrooms it could also raise questions about directors’ liabilities.
Paul Johnston, a consultant with One Advisory and researcher at the Centre for AI in Board Effectiveness, raised the issue at a special breakfast event organised by Board Agenda and Diligent exploring issues associated with AI as a governance tool.
Johnston’s warning is that AI could place directors under pressure to act more often on its information to avert danger.
Directors under pressure
Johnston said: “There is a risk… Directors have a duty to act in the best interests of shareholders and the best interests of stakeholders.
“If there’s more and more information available that is more and more real-time [information], there’s potentially questions about directors’ liabilities that are really difficult.
“If you have the potential to have access to information in real time, and there’s integrity to that data, the question could be raised: why did the board—why did the directors—not pick this up?”
Johnston’s remarks came as part of a special panel discussion exploring the role of AI, and in particular generative AI (genAI), in boardroom decision making.
AI has been the subject of intense focus in business since OpenAI launched its ChatGPT model at the end of 2022.
Attention has mostly focused on the use of AI as a driver of business efficiencies. But as AI tools to summarise board meetings become increasingly widespread, there is debate about the way AI could be applied in the boardroom and its impact on the way governance works.
Humans required
Richard Anderson, a board chair, fintech expert and currently working on an AI start-up, added his concerns to the question of AI involvement in decisions by describing how he often discovers “anomalies and problems and issues and opportunities” by reading board papers to connect previously unconnected points.
“An AI can do some of that, but it won’t do it the same way because each of us sitting on the board has a unique way of doing it, a unique set of experiences. We don’t want to drive toward a homogeneity that could be implied by the ‘average of average of averages’ coming from AI.”
Leanne Allen, head of AI at KPMG, warned boards and governance specialists about the risk of “bias” in AI.
She noted that bias can appear in the data sets used by AI, the algorithms applied in the large language models used and in the interpretation of AI outputs.
“You’ve got to think about bias,” said Allen, “where it goes right from the source all the way through the process to the decision that’s being made. It will go all along that journey.”
Michael Borrelli, a director and AI compliance expert with AI & Partners, said the ethics of AI could be maintained by following the “highest regulatory standards”, in particular the EU’s AI Act, adding that it offered “global reach”.
He said: “Being ethical is also being aware of what you’re doing and why you’re doing it. Generally, common sense should prevail and not getting caught up on the hysteria.”
The panel also discussed the vast quantities of data AI would make available to boards. Quality would potentially be improved, but panellists said care should be taken in the face of so much available data.
Borrelli said: “There’s a question of paralysis by analysis with data,” noting that not everything that should be measured can be measured, while not everything that is measured is useful.
That places an onus on boards to pick their data carefully despite its proliferation.
Choosing the right data may go beyond boardroom concerns, raising big questions about which information to pick. Leanne Allen said the answer will rest on thinking through what “shareholders care about”.
Richard Anderson was more blunt. “I have a degree of scepticism that we want to drown in yet another flood of data.” He suggested management levels below the boardroom could act as an important filter.
Read Diligent and Board Agenda’s report: Demystifying AI: What does it really mean for the boardroom?