JAM | Oct 16, 2025

Deloitte fiasco illustrates the dangers of AI-generated professional work

Al Edwards / Our Today

The Government of Australia hired Big Four accounting firm Deloitte to write a report on the use of automated penalties in Australia’s welfare system. The 237-page report cost the government 440,000 Australian dollars (US$290,000).

It turns out the report was AI-generated and riddled with errors, which were uncovered by Sydney University academic Chris Rudge.

This is most worrisome and dangerous, and a portent of what is to come if it is allowed to go unchecked, as social media has over the last 25 years.

We are already seeing sloppy and unprofessional work, with many people submitting AI-generated reports. This level of blatant disregard for oversight, lack of courtesy, and accountability is most disturbing.

It is already occurring in the media, with reporters submitting and uploading AI-generated content. A cursory read of the opening three sentences tells an editor with a seasoned eye that the copy is not from the writer but generated by a non-human entity, yet it is offered up as if it comes from the human soul, with both intelligence and professionalism.

Most people defer to brand-name professional outfits and don’t question them—after all, they have a well-cultivated reputation. But as they seek more fees and greater revenues, many want to expend less time on the actual work and see AI as a godsend (pardon the pun…that’s not AI-generated), allowing them to feed the scope of work into a computer and, voila, out comes the report.

Let’s be clear here: This is a breach of integrity and a breach of trust. Deloitte used Azure OpenAI GPT-4 to generate this error-laden report, and then charged the Australian government big money.

It would come as little surprise if something similar were occurring in Jamaica and other Caribbean countries. Why work when AI can do it for you and you can still get to ride around in a BMW or Benz and quaff Chardonnay? AI is going to enable plenty of charlatans.

AI is not merely the disruptor it is said to be; left unmonitored, it poses an existential threat to life as we know it. AI should be a tool used by the initiated, and it should be made known when a product or service is AI-generated.

“This clearly is an unacceptable act from a consultancy firm, and this case highlights that we need to be always ensuring that departmental processes deal with this emerging technology,” said Australian Government Minister Murray Watt.

The Government of Australia was ripped off by Deloitte here, even though the firm was later asked to resubmit the report.

“Deloitte misused AI and employed it inappropriately, misquoted a judge and cited references that were non-existent. I mean, the kinds of thing that a first-year university student would be in deep trouble for,” said Senator Barbara Pocock, the Australian Greens spokesperson on the Public Sector.

The Deloitte report was first published on the Australian government’s Department of Employment and Workplace Relations website in July of this year. Deloitte did not even bother to check whether its contents were correct. This is major disrespect.

Last month, Deloitte announced that it would invest US$3 billion in AI development through fiscal year 2030. This AI-generated Deloitte report illustrates the necessity of putting proper safeguards in place. Don’t accept such documents at face value, regardless of how reputable the professional organisation producing them is.

“AI isn’t a truth-teller, it’s a tool meant to provide answers that fit your questions,” said Bryan Lapidus, FP&A Practice Director for the Association for Financial Professionals.

Nikki MacKenzie, an assistant professor in the Georgia Institute of Technology’s Scheller College of Business, added: “We’re constantly hearing about how ‘intelligent’ AI has become, and that can lull people into trusting it too much. Whether consciously or not, we start to over-rely on it.”

Jamaican and Caribbean companies must guard against this. Senior attorneys have recounted how junior counsel in Jamaica are doing sloppy work, using ChatGPT to create documents and build case files. Pretty soon, finance houses will go this route too, if allowed to go unchecked, not holding their professionals accountable for the work they submit.

It’s becoming a case of trusting those who have spent considerable time in their field and have a reputation for being honourable and doing stellar work. You can’t take a chance on some of these young folks with degrees from Mickey Mouse universities who are untested. Their first instinct is to give you AI-generated work because they simply don’t care…they are all about the money, not value for money.

Nearly six out of 10 employees admitted to making serious mistakes in their work as a result of AI errors, according to a KPMG study released in April of this year.

Earlier this year, Apple was forced to suspend an AI feature designed to summarise news alerts after complaints that readers were being inundated with false and inaccurate AI-generated information.

That’s how bad it has become. 

What is most disturbing about the Deloitte situation is that it made no attempt to fully refund what it was paid for the report, opting instead for a partial refund and a terse statement: “The matter has been resolved directly with the client.”

Really? 

This year, Deloitte Australia reported revenues of $2.5 billion and won 48 government contracts worth $58 million.

In June, the UK Financial Reporting Council declared that the Big Four were failing to keep an eye on how AI was being used and how automated technologies affected the quality of their audits.

“I instantaneously knew it was either hallucinated by AI or the world’s best-kept secret because I’d never heard of that book (referenced in the Deloitte report), and it sounded preposterous.

“They’ve totally misquoted a court case, then made up a quotation from a judge, and I thought, well, hang on: that’s actually a bit bigger than academics’ egos. That’s about misstating the law to the Australian government in a report that they rely on. So I thought it was important to stand up for diligence,” said Chris Rudge, speaking with Thomson Reuters.
