
Decolonising AI and Why Underreported Stories Should be in Focus

Updated: Jun 24



Last week, I had the opportunity to attend two major events hosted by One World Media: their inaugural Media Freedom Lab on June 18th, followed by the 36th annual One World Media Awards ceremony on June 19th.


One World Media is a London-based charity that has championed quality reporting from the global south* for over 30 years. Founded by journalists who sought to break down stereotypes and improve international coverage of important stories beyond Europe, they support filmmakers and journalists globally through grants, mentorship, and training. Their annual awards ceremony also serves as a fantastic opportunity to shine a spotlight on those who might otherwise go unseen and unheard.


As a media lawyer, I spend a lot of time helping storytellers and journalists get their messages across. This might simply involve me reviewing a contract to ensure the correct permissions are in place before something goes to press or on air. But more broadly speaking, I'm passionate about storytelling - and One World Media's commitment to elevating voices from underrepresented communities is incredibly worthwhile.


* The global south refers to regions of Latin America, Asia, Africa, and Oceania that are often characterised by lower levels of industrialisation, income, and development compared to the global north.


Several of the films and stories that I found particularly captivating (a collage of film stills, posters, and images of journalists)

Media Freedom Lab: AI Bias & Decolonisation


One of the most thought-provoking aspects of the Media Freedom Lab (kindly hosted by OWM sponsors, the European Bank for Reconstruction & Development) was the exploration of bias and the importance of decolonising AI.


While there's no singular definition, AI bias refers to the tendency of algorithms to produce skewed results due to inherent prejudices in training data or design. Acknowledging and mitigating this sort of bias is crucial, because biased algorithms can perpetuate and amplify existing inequalities, leading to unfair and discriminatory outcomes - especially in news coverage and content distribution.
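To make that a little more concrete, here's a minimal sketch in Python of how prejudice baked into historical decisions resurfaces in a model trained on them. Everything here is synthetic and invented purely for illustration (and it assumes the NumPy and scikit-learn libraries are available):

```python
# A minimal, hypothetical sketch: bias in historical training data
# resurfaces in a model's predictions. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Two groups (0 and 1) with identical underlying "skill" distributions...
group = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)

# ...but historical decisions favoured group 0: group 1 needed a much
# higher score to be approved. That prejudice is baked into the labels.
hired = (skill > np.where(group == 0, 0.0, 0.8)).astype(int)

# Train a simple classifier on the biased history.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)
pred = model.predict(X)

# The model faithfully learns - and reproduces - the historical disparity:
for g in (0, 1):
    print(f"group {g}: selection rate {pred[group == g].mean():.2f}")
# Roughly 0.50 for group 0 versus 0.21 for group 1: equally skilled,
# unequally selected.
```

Nothing in the code is malicious; the skew comes entirely from the labels the model was asked to imitate.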


So, how does AI bias happen? In reality, many factors throughout the data lifecycle and development process can introduce bias. For example, historical data containing societal prejudices or harmful stereotypes might be used to train an algorithm, causing the AI to perpetuate these antiquated views. Alternatively, the datasets used for training might fail to represent the overall population, leading to oversimplified generalisations. We can't forget the possibility of human bias in the algorithm's development either, which stems from the subjective decisions made by programmers during data preprocessing, feature selection, and model tuning. Making the situation even worse is the phenomenon of feedback loops, which occur when the output generated by an AI system influences its future inputs. As computer scientist Emilio Ferrara has explained, this reinforces and magnifies biases over time, resulting "in a self-perpetuating cycle of unfair outcomes that disproportionately impact certain groups."
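Here's a toy simulation of the feedback loop Ferrara describes - the numbers and the squaring rule are purely illustrative assumptions, but they show how a system that only collects data on what it already promotes can turn a small initial skew into near-total imbalance:

```python
# A toy simulation of a feedback loop: the system's outputs determine
# what data it collects next, so a small initial skew snowballs.
# All numbers are illustrative assumptions, not real-world measurements.

true_interest = {"A": 0.5, "B": 0.5}  # audiences care equally about both regions
belief = {"A": 0.55, "B": 0.45}       # the model starts slightly skewed toward A

for step in range(10):
    # The ranker over-promotes whatever it already scores highly
    # (modelled here by squaring - a stand-in for top-of-feed placement).
    weights = {r: belief[r] ** 2 for r in belief}
    exposure = {r: w / sum(weights.values()) for r, w in weights.items()}

    # Clicks require exposure: stories never shown are never clicked,
    # so the collected data reflects the skew, not true interest.
    clicks = {r: exposure[r] * true_interest[r] for r in belief}

    # "Retraining" on the click log alone updates the model's belief.
    belief = {r: c / sum(clicks.values()) for r, c in clicks.items()}

print(belief)  # roughly {'A': 1.00, 'B': 0.00}: equal underlying interest,
               # but coverage of region B has all but vanished.
```

After ten rounds, region B has effectively disappeared from the feed - not because audiences stopped caring, but because the system never gave itself the chance to learn otherwise.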



digital collage by Kelsey Farish x Midjourney

AI bias links closely to the concept of decolonising AI, as both address the importance of recognising and hopefully rectifying the various power imbalances already deeply embedded in AI systems.


"Decolonising AI" might mean different things to different people, but for me it refers to the practical process of challenging - and changing - the ways in which algorithms and AI platforms are developed, deployed, and understood. Put simply, the goal of decolonisation is to avoid perpetuating existing biases, and inequalities rooted in historical colonial practices. If we can dismantle the dominance of Western-centric views and practices, the technology can serve a broader and more global community, which can only be a good thing.


Now obviously, this is a complex and systemic issue spanning time and space, and there is no one single solution. But a good place to start is critically examining and restructuring how AI technologies are developed and deployed, to ensure they are inclusive and representative of diverse cultures, perspectives, and contexts. And I absolutely love how this goes hand-in-hand with One World Media’s broader mission to amplify underreported stories and foster diverse narratives.



Awards Ceremony: Underreported Stories in Focus


The following day, I glammed myself up a bit for the One World Media Awards Ceremony at the Curzon in Soho. As I mingled with nominees and winners at the reception, I was struck by the diversity and depth of their work - ranging from joyful explorations of how real-world Quidditch matches are changing lives in Uganda (The Ugandan Quidditch Movement) to a heartbreaking exposé of Russian women desperately seeking information about their sons and husbands lost to Putin's war machine in Ukraine (Hunting Russia's Lost Sons).


This year, One World Media received more than 500 entries from 117 countries, for awards such as "innovative storytelling", "environmental reporting", "freelance journalist of the year", "refugee reporting", "press freedom", and "women's solutions reporting".


The winner in the Short Documentary category was Can I Hug You?, directed by Elahe Esmaili. From the Sheffield Documentary Festival: "In this astoundingly intimate film, Hossein brings his parents together to confront a difficult secret from his past. In the religious Iranian city of Qom, there are many restrictions imposed on women in the name of ‘sexual safety’ – from the mandatory hijab to forced gender separation. Hossein grew up in this misogynistic and patriarchal context, but as a young boy he found himself subject to sexual assault by older men. Keeping a secret that no one should have to keep, this ordeal has followed him throughout his adult life. Now, with the help of his wife Elahe, he is confronting his trauma. In Elahe’s courageous and profoundly intimate film, a boxed mosquito net on the roof of the family home becomes a confessional space for conversations that have been waiting a lifetime to take place."

Many of these productions were gorgeously shot and edited, and full of poignant beauty. But not all. Some were raw and visceral, and quite frankly difficult to watch. Herein lies the most important takeaway, for me at least: there is so much thought-provoking complexity worth exploring beyond the "developed" Western world.


And at the same time, so much pain, systemic oppression, loss of agency, and exploitation. 


The stories highlighted by One World Media challenge us to see beyond our own borders. Perhaps more importantly, they serve as a call to action to engage with and support the people represented in these incredible works of journalism. In the coming weeks, I look forward to reading and watching the work of the nominees and winners, and I encourage others to do the same.



A photo of Nabilah Said (L), whose organisation won the Innovative Storytelling Award for "A Woman’s World: Creating spaces for joy, leisure, and resistance in South and Southeast Asia", together with me (!), and my fellow Reviewed & Cleared lawyer Natalie McEvoy (R).




