After Deloitte had to reissue a $440,000 report to the Australian government because of serious errors produced by artificial intelligence (AI), rival consulting firms including EY, KPMG, PwC and Boston Consulting Group (BCG) quickly spoke up to affirm that they have strict control processes in place to avoid similar mistakes.

Deloitte is now facing intense pressure after the original report contained three fictitious references and a fabricated quote from a Federal Court ruling. The revised version removes more than a dozen erroneous quotes and corrects the bibliography, although the recommendations remain the same.

The case has attracted international attention, becoming a textbook example of “hallucination” – when AI fabricates information and presents it as fact. The public and experts are demanding that Prime Minister Anthony Albanese’s government require consultants to disclose when they use AI in government contracts.

Big 4 race to prove they "use AI responsibly"

EY insists that all AI-generated results must be thoroughly reviewed by employees before being used. KPMG says it has an “AI Trust Framework” and a public “AI Registry” of tools it uses in its services, including with government clients.

BCG stressed that all consulting products undergo “rigorous leadership vetting,” while PwC said it adheres to the “humans in the loop” principle: humans are always ultimately responsible for any product involving AI.

Deloitte and McKinsey, two firms that often tout their AI capabilities, declined to comment. Deloitte has agreed to refund part of the fee to Australia's Department of Employment and Workplace Relations.

Deloitte caught up in a scandal over AI "fabricating information" when writing reports. Photo: AFR

At a Senate hearing, Gordon de Brouwer, head of the Australian Public Service, stressed: “AI is not responsible for your work. You are responsible.”

Greens senator Barbara Pocock called on the government to force contractors to publicly disclose their use of AI and verify all AI-generated content. “This is not just artificial intelligence, it is deliberate negligence… Deloitte’s work is not up to the standard of a first-year university exam… If contractors hired to do government work are going to continue to outsource AI work, there need to be checks. That’s not too much to ask,” she said.

Ms Pocock has introduced legislation to ban contractors found to have engaged in unethical behaviour from government contracts for five years, and has called for Deloitte to repay the full amount.

AI supports, not replaces, humans

Smaller consulting firms were also quick to reassure clients. David Gumley, CEO of ADAPTOVATE, said they were using AI to improve efficiency, but it would never replace human expertise.

Another consultant, Lisa Carlin, who runs the one-person firm The Turbochargers, said consulting firms need a public policy on AI use that explains how the technology is applied in their work, though it need not disclose the details of the technology used.

“AI is now part of standard business tools, like Excel, but the company still has full responsibility for managing risk,” said Carlin. “Transparency should be at the policy level, not in every report, because what customers need is confidence that the company has strong controls, not a warning line on every page.”

Carlin advises corporate clients on AI strategy and implementation, and she emphasizes that every detail in a report, regardless of how it was created, should be carefully vetted before it is sent to a client.

“Companies need to have their own quality assurance processes for AI and a clear risk assessment,” she added. “‘The AI generated it’ is no excuse. Consultants need to be accountable for their work, just as they would be for an intern’s.”

(According to AFR)

Source: https://vietnamnet.vn/4-ong-lon-kiem-toan-len-tieng-sau-su-co-ai-nghiem-trong-tai-deloitte-2451266.html