Hallucination traps and knowledge gaps in policy

10th October 2025

By: Saliem Fakir


Policy-wonking can at times be a crude endeavour: reality is not reached directly but through untested internal beliefs. Without direct personal experience, one worldview tries to grasp the lived experience and worldview of others, and in between, hallucinations can creep into the mix.

Then something is written and presented as a kind of knowing without really knowing. There is always epistemic treachery when you have not done the hard yards of working at something and gained tacit knowledge – prior experience is gold. Even where some modicum of authenticity is recovered, policy elites can take leaps of faith: a view expressed is treated as universally shared because someone with purported influence has said so.

We should not lose sight of a further dimension of the word ‘evidence’: it suggests an objective exercise, yet it can serve to mask political affiliations and belief systems that have percolated, knowingly and unknowingly, through the process of evidence gathering. The sway of belief systems over evidence should not be underestimated; they lend a hand to political choices that are preset, locked in and immovable.

And let us not fall into the trap of the ‘halo effect’: exceptional accomplishment in one domain of knowledge may make someone profound, even celebrity-like, in the policy world, but it does not always make them the best equipped to provide solutions in other domains.

Here, listening is the key art, and then wisdom, well placed, can be exercised.

Policy advocacy can live in its own cave, watching shadows, mistaking them for real knowledge.

What lessons can we draw from the problem of hallucinations in another domain of knowledge?

Large language models (LLMs) are known to suffer from hallucinations – they can create the illusion of a reasonable-sounding string of words whose content is entirely false and inaccurate.

Some recently written papers show that, given the weighting and ranking built into the algorithms of AI machines such as ChatGPT, Grok and others, hallucinations cannot be removed – a difficulty one paper described as ‘perplexingness’, defined as “the degree to which new knowledge conflicts with an LLM’s learned conceptual hierarchies and categorical relationships”.
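To make the point concrete, a few lines of Python sketch why weighted next-word selection produces fluent sentences without any check against facts. This is an illustrative toy only, not the actual code of any of these systems, and the small word table is invented for the example: the machine picks whatever continuation carries the most weight, not whatever is true.

import random

# A hand-built table of next-word weights, standing in for the learned
# probabilities inside a real LLM. The weights reward fluency, not truth.
NEXT_WORD_WEIGHTS = {
    "The policy": {"was": 5, "will": 3},
    "was": {"highly": 4, "never": 1},
    "highly": {"successful": 6, "contested": 2},
}

def continue_text(prompt, steps=3):
    # Weighted continuation: always plausible-sounding, never verified.
    words = [prompt]
    key = prompt
    for _ in range(steps):
        options = NEXT_WORD_WEIGHTS.get(key)
        if not options:
            break
        # Choose in proportion to weight - the 'ranking' referred to above.
        key = random.choices(list(options), weights=list(options.values()))[0]
        words.append(key)
    return " ".join(words)

print(continue_text("The policy"))
# Typically prints something like "The policy was highly successful" -
# confident and coherent, yet the table holds weights, not evidence.

The sketch also hints at why editing is so hard: any bias does not live in a single entry but in the relative weights spread across the whole table.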

AI machines can have programmed flaws or develop bias, as recent news reports have highlighted, and these later prove hard to correct through editing once the algorithm has been trained on data through reinforcement and other learning mechanisms.

The point about AI hallucinations and perplexingness is the problem of bias inherent in the categorisation and hierarchy by which an algorithm evaluates data.

In any case, it is still early days with AI and sceptics remain: Emily Bender (FT interview, 20 June 2025) has referred to LLMs as large plagiarising machines, or “stochastic parrots”. This is not to be dismissed, nor confined to AI machines alone; Bender’s is a serious point – AI machines are changing the human condition, turning what we may call authentic intellectualism into synthetic laziness.

Curiously, to address the problem of algorithmic bias in shaping public discourse and the information that informs political choices, some AI companies have come up with a solution: the “Habermas Machine”, inspired by the philosopher Jurgen Habermas’s theory of communicative action, in which a free society is one of reason, tolerance and intellectual maturity, and the process of deliberation leads to understanding and consensus.

DeepMind, the inventor of this Habermas Machine, is giving its LLM a go at mediating political debates and disputes, to help achieve the Habermasian ideal of communicative action. As far as culture wars and partisan debates go, it does not seem that machines are anywhere near solving the deep fractures and polarisation hammering away at our civilisation.

These concerns should not be reserved for young pupils or university graduates; they apply equally to the very business of policy research and knowledge. How much real research, as opposed to consulting the “stochastic machines”, now goes into policy answers is an open question.

While LLMs speed up the production of policy knowledge and answers, they also risk turning policy-wonking into something inauthentic and in danger of automated plagiarism if no ethical guardrails are applied.

In the allegory of the cave that Plato introduces in the Republic, people who have lived all their lives in a cave have become accustomed to seeing only one thing – the shadows of objects cast on the walls of the cave. That is, until one person takes the brave step of venturing out of the cave and discovers that the shadows are produced by actual objects that exist outside it.

The story goes further: if the brave person who ventured beyond the cave came back to tell the other cave dwellers that everything they saw was an illusion, they would not believe him.

There is a lot in LLM hallucinations, AI bias and the story of Plato’s cave that is relevant to policy-wonking. We all suffer from one or another form of hallucination if we do not take corrective measures – and hopefully we do not find ourselves in the position some AI programmers discovered: that once a hierarchical bias sets in, no amount of editing can change the weighting of the bias.




