ST. JOHN'S — The federal government is defending its decision to award a contract worth up to $1.1 million to Deloitte Inc. for advice on deploying artificial intelligence, after the global consulting firm acknowledged multiple AI-related inaccuracies in reports prepared for other governments.
Employment and Social Development Canada (ESDC) said there will be consequences if Deloitte fails to meet the contract's conditions. The contract was awarded in September, shortly before Deloitte admitted it had relied on AI for some research citations in a report for the government of Newfoundland and Labrador and would need to correct the erroneous references.
In October, Deloitte Australia also acknowledged it had submitted a flawed report to the Australian government containing citation errors linked to the use of AI. The false citations have prompted critics to call for closer oversight of public contracts involving consulting firms and their use of AI, and experts are questioning why the federal government is continuing to work with Deloitte on AI matters.
Robert Shepherd, a Carleton University professor who specializes in Canadian public management, raised concerns about ESDC proceeding with the contract. He underscored that government employees who submitted work based on fabricated evidence would face serious repercussions, and said the situation undermines public trust in government operations.
Last month, an online news outlet in Newfoundland and Labrador revealed numerous non-existent citations in a human-resources plan Deloitte drafted for the province's Health Department. Martha MacLeod, a nursing professor, confirmed that an article she was cited as co-authoring does not exist. The provincial government paid approximately $1.6 million for the report, according to documents released by blogger Matt Barter.
Deloitte Canada said it stands by the report's recommendations for the Newfoundland and Labrador government, adding that the citations were being corrected and that the errors do not affect the overall findings. The firm said AI was not used to write the report but was used selectively to assist with a small number of research citations.
Public Services and Procurement Canada said it is aware of the erroneous citations in Deloitte's past reports and noted that suppliers must now disclose any use of AI up front, a requirement intended to ensure quality of work and value for money. Experts, however, argue that the onus is on the federal government to update its oversight of public contracts so that consulting firms are paid only for reliable research.
Given the growing reliance on consulting firms and the substantial public funds allocated to them, there is a pressing need for the government to establish new protocols to vet and fact-check consultants' reports. Shepherd suggested governments should require these companies to show they conducted the research themselves, while Ebrahim Bagheri of the University of Toronto proposed that governments take a more active role in the research and reporting process through more frequent check-ins with consulting firms.
Bagheri stressed that consulting firms are hired to provide specialized expertise not readily available within the public service. But he cautioned that AI-generated recommendations may lack originality, producing results that public servants could generate themselves using large language models.
The situation underscores a broader debate about balancing the use of advanced technologies such as AI with accountability and integrity in government processes and outcomes.