
This is an opinion piece. The views expressed are not those of Daily Maverick.

Algorithmocracy: How hidden codes are shaping lives and deepening inequalities in South Africa

These coded instructions have become gatekeepers, deciding who and what gains visibility, often favouring corporate profit over public interest. Nowhere is this more apparent than in how they shape the online experiences of South Africa’s youth, especially young women.

What happens when the power to shape society no longer resides in institutions or individuals but in invisible lines of code? We are living through the emergence of an “algorithmocracy” where algorithms — those silent architects of digital life — govern our information, interactions, and identities.

In South Africa, this algorithmic rule is deepening existing inequalities and redefining agency, often to the detriment of the most vulnerable.

While algorithms have revolutionised efficiency, their unchecked dominance poses serious challenges, particularly in contexts where low data literacy and weak regulatory oversight prevail. These coded instructions have grown into gatekeepers, deciding who and what gains visibility, often favouring corporate profit over public interest.

Nowhere is this dynamic more apparent than in how they shape the online experiences of South Africa’s youth, especially young women.

The invisible gatekeepers of visibility

In a world increasingly driven by algorithms, platforms like Instagram and YouTube serve as arbiters of relevance. Their algorithms amplify polarising or sensational content to keep users engaged — a phenomenon that Tarleton Gillespie, a senior principal researcher at Microsoft and co-editor of Media Technologies: Essays on Communication, Materiality, and Society, ties to profit-driven design.

For young South African women, this often means an endless stream of hyper-sexualised and consumer-driven narratives that align with corporate goals rather than personal or community empowerment.

With limited data literacy, many young users are unequipped to question these curated feeds. This lack of critical engagement allows algorithms to erode individual agency, subtly but powerfully reshaping self-esteem and digital identity. Instead of fostering empowerment, these platforms perpetuate a profit-driven narrative that sidelines personal growth in favour of corporate gain.

Profit over people: the market logic of algorithms

Algorithms are not neutral; they are the products of economic agendas. Platforms like Meta and Google thrive on “behavioural surplus” extraction — a term coined by Shoshana Zuboff, Professor Emerita at Harvard Business School and author of The Age of Surveillance Capitalism.

This model is particularly exploitative in a country like South Africa, where youth unemployment is high and social media often presents itself as a ticket to economic opportunity.

For young creators, this translates into a “labour of authenticity” where they craft commodified digital identities designed to appeal to algorithmic preferences. Brooke Erin Duffy, Associate Professor at Cornell University, and Emily Hund, a research affiliate at the University of Pennsylvania, argue that these personas serve corporate interests at the expense of personal aspirations.

The result? A digital culture where the pursuit of visibility forces conformity to marketable ideals, undermining individual autonomy.

The regulatory void and the imperative for data literacy

South Africa’s regulatory framework lags behind the rapid evolution of algorithmic systems. While the Protection of Personal Information Act (Popia) addresses data privacy, it falls short of tackling the broader societal impact of algorithmic governance. This absence of oversight enables tech companies to prioritise profit without accountability, leaving users — especially marginalised groups — exposed to harm.

Building a future-proof society requires integrating data literacy into education. Scholars Nick Couldry and Ulises A Mejias, authors of The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism, emphasise the concept of “data justice”, urging individuals to understand how algorithms influence their lives.

Equipping young South Africans with these skills could shift them from passive consumers to active participants, capable of engaging critically with the digital systems shaping their futures.

Reclaiming the digital narrative

If we allow algorithms to dictate our identities and values, we risk creating a homogenised society driven by corporate interests rather than personal authenticity. As Julie E Cohen, a professor at Georgetown University and author of Between Truth and Power: The Legal Constructions of Informational Capitalism, warns, the “platformisation of society” reduces self-worth to likes and shares, with especially harmful consequences for young South African women.

Visibility metrics have become modern markers of value, pressuring users to conform to ideals that erode individuality and cultural richness.

To counter this, South Africa needs to adopt a comprehensive strategy that combines regulatory reform, corporate accountability and education. Policymakers should mandate algorithmic transparency and conduct regular audits to ensure platforms prioritise public interest.

Age-appropriate algorithm design must also be enforced to protect young users from harmful content.

Towards an ethical digital future

The unchecked power of algorithms is reshaping South African society, often in ways that exacerbate inequality and stifle individuality. Addressing this requires bold action: a commitment to ethical digital cultures, robust regulatory frameworks, and widespread data literacy. By confronting the challenges of algorithmocracy, we can foster a society where digital spaces are inclusive, empowering, and reflective of our diverse identities.

The future of South Africa’s digital landscape depends on our collective ability to question, resist, and reshape the systems governing us. Algorithms may wield power, but it is up to us to decide who — and what — they serve. DM
