Why ‘All Eyes On Rafah’ went viral – and why you need to be careful when sharing AI images

What do supermodel Bella Hadid, actor Mark Ruffalo, racing driver Lewis Hamilton and approximately 47 million people have in common? They all reposted the viral “All Eyes on Rafah” picture on their social media platforms.

The image, generated by artificial intelligence (AI), depicted tent camps for displaced Palestinians stretching out into the horizon, overlaid with the phrase “All Eyes on Rafah”. The phrase itself comes from World Health Organization representative Dr Richard “Rik” Peeperkorn, who said that “all eyes” were on Rafah when speaking to journalists back in February.

The graphic was created by a Malaysian Instagram user (known by his Instagram handle, @shahv4012). He has since posted to Instagram Stories: “There are people who are not satisfied with the picture and template, I apologize if I have made a mistake on all of you.”

This was in response to many calling out the image for “sanitising” what the death and destruction in Gaza looks like, when there is endless real content from Gazans and journalists risking their lives to show the reality on the ground.

Actress Rachel Zegler, for example, recently said she found it “disturbing” that it took an AI-generated image, one that does not reflect the actual horrors faced by the Palestinian people, for people to support the cause.

There have been plenty of shocking images, shareable graphics and heartbreaking stories of the conflict shared to social media. So why has this particular image, created with AI rather than depicting the reality on the ground in Gaza, gone viral?

The raised fist became symbolic of the Black Lives Matter movement. trevorwk/Shutterstock

In the age of social media, certain images have the power to transcend geographical and cultural boundaries, at times becoming symbols of social movements. For example, the Black Lives Matter raised fist quickly became a powerful symbol of anti-racist struggles. The symbol highlighted racial injustice and police brutality beyond the US, resonating with people from different backgrounds.

Such images and symbols encapsulate powerful, often heartrending narratives in a single frame, evoking empathy, outrage or solidarity. When people encounter such content, they feel compelled to share it – not just as an act of dissemination, but as a form of participation in a broader conversation or movement.

Beyond their visual impact, these images carry wider social and political relevance. They incorporate symbols which rise above specific demographics and resonate with universal human experiences and values. This broad appeal makes them powerful tools for uniting diverse groups around a common cause.

The Rafah image uses strong symbols that suggest displaced families and desolation. Through the countless tents stretching to the horizon, it depicts the harsh conditions of people fleeing brutality, with nothing but the barest of shelters in a barren land.

AI-generated images exploit what are called “pre-attentive processing” features, which refers to the way our brains process sensory information before our conscious minds start to pay attention. Humans are hardwired to notice certain visual elements, including contrast, colour and unusual composition, almost instantaneously.

AI-generated images, in particular, excel at this. By design, they capture our attention, making them exceptionally effective at standing out as we scroll rapidly through our digital feeds.

Images produced by generative AI (an emerging branch of AI with a capacity for creativity and innovation) often introduce new symbols or reframe existing ones in ways that capture sociocultural moods, beliefs or trends.

Human-created images might unconsciously reflect the creator’s perspective. AI, however, can draw from a vast and varied pool of inputs, potentially offering a more balanced and varied view.

But AI still lacks nuance and can easily be misconstrued. The portrayal of perfectly aligned shelters implies a level of organisation and management that does not reflect the reality of the Rafah refugee camp. It’s one of many reasons the image has been criticised.

The reluctance of people to share disturbing images on social media seems to encourage the creation of more passive representations. Censorship of, and sensitivity towards, graphic content on these platforms may also lead creators to opt for less explicit imagery to avoid being filtered out.

Generative AI can bypass censorship, not only by avoiding flagged keywords but also in its visual content, which aids the widespread dissemination of such images.

Real photos of victims in Gaza and Rafah are often considered too horrific and graphic to share on social media platforms. This means that more sanitised, AI-generated images spread further and faster.

Social media users are increasingly adept at using the capabilities of generative AI to create more effective and censorship-resistant imagery. This approach ensures that their messages reach a wider audience while working within the constraints of content moderation policies.

What social media users should be aware of

The rise of AI-generated images also calls for a heightened sense of responsibility among social media users.

An image such as “All Eyes on Rafah”, despite its inaccurate representation, can still serve a purpose by tapping into our emotions and resonating widely across social and political landscapes. AI-generated images add an extra layer of creativity, making them prone to wider dissemination. Yet with this power comes the responsibility to share wisely, ensuring that the images we amplify contribute positively to the global dialogue.

Before sharing an AI-generated image on social media, reflect on whether it is truly representative of reality. Like a tiny bushfire that can quickly become a raging wildfire, content posted online can go viral in a matter of moments.

The post “Why ‘All Eyes On Rafah’ went viral – and why you need to be careful when sharing AI images” by Kamran Mahroof, Associate Professor, Supply Chain Analytics, University of Bradford was published on 06/03/2024 by theconversation.com