In contemporary digital society, remembering is automated. Social media platforms and smartphones often offer features like iPhone’s and Facebook’s “Memories” that resurface users’ past posts and photographs.
For many people, these reminders of the past are a source of joyful reminiscence. For others — like survivors of gender-based violence (GBV) — they can be harmful.
These nostalgia-driven Memories features enact what I call “platform violence”: unintended but harmful consequences of automated features designed to profit tech companies without adequately considering users’ well-being.
Algorithmic recall
Algorithms select and retrieve images from users’ digital archives, with the supposed goal of reminding users of happy moments. Introduced in 2018, Memories was promoted by Facebook’s product manager, Oren Hod, as a tool for improving mood and connection with others.
Yet these algorithms can get it wrong by bringing up painful, or even traumatic, memories instead. Writing about the feature in Forbes, Amit Chowdhry acknowledges that “memories … are not all positive.”
While Facebook’s algorithm attempts to filter out negative memories using keywords and feedback from users’ reactions, these safeguards are often inadequate. As my research has found, resurfaced photos of abusers can trigger emotional, psychological and even physiological distress for survivors of GBV.
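A keyword filter of the kind described above can be sketched as follows. This is a hypothetical illustration, not Facebook’s actual implementation: the function name and keyword list are assumptions, chosen to show why such filters miss traumatic content. A photo of an abuser often carries a caption that reads as entirely positive text, so nothing in the words signals harm.

```python
# Hypothetical sketch of keyword-based memory filtering (an assumption,
# not any platform's real code), illustrating why it can miss traumatic content.

NEGATIVE_KEYWORDS = {"funeral", "hospital", "accident", "rip", "goodbye"}

def passes_keyword_filter(post_text: str) -> bool:
    """Return True if the post contains none of the flagged keywords."""
    words = {w.strip(".,!?").lower() for w in post_text.split()}
    return words.isdisjoint(NEGATIVE_KEYWORDS)

# A caption from an abusive relationship often reads as happy text,
# so the filter lets it through:
caption = "Anniversary dinner with my love!"
print(passes_keyword_filter(caption))  # True: the filter sees nothing negative
```

The filter only knows the words in the post, not the relationship between the user and the people pictured, which is precisely the information that determines whether the memory is safe to resurface.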
When iPhone Memories draws images from a user’s Photos cache to create slideshows, smartphone users can be similarly triggered. The fact that these slideshows are set to cheerful music is something survivors find particularly “creepy,” as images of abusive exes scroll by.
Familiar faces
GBV encompasses a spectrum of abusive behaviours, ranging from catcalling and rape jokes to sexual assault and femicide. In Canada, a woman dies every other day due to GBV, with intimate partner violence claiming a life every sixth day. One in four women reports experiencing GBV in her lifetime, although the actual number is likely higher, as fear of not being believed and of stigmatization lead to underreporting.
Particularly relevant to my research, in at least 80 per cent of cases, the perpetrator is someone the survivor knows, such as a partner, friend or family member. This makes it likely that survivors once shared social media connections or posted images with their abuser, increasing the risk these photos will resurface as a memory.
For survivors, encountering a photo of their abuser can be as traumatic as seeing them in person. In interviews with 15 survivors, all reported intense emotional reactions including panic, upset and physical symptoms like nausea and a racing heart. Those with post-traumatic stress disorder (PTSD) were particularly vulnerable to being triggered.
For instance, one participant, Nyla (names have been changed), described experiencing “full panic mode” and emotional shutdown for days after seeing a photo of her abusive ex-partner. Kelly, another participant, felt her “heart race” and avoided her smartphone and social media altogether. Other participants’ responses included feelings of social disconnection, fearfulness when out in public and mistrust of their own judgment of others. This presented barriers to forming new, healthy relationships.
Nancy, a survivor of an abusive relationship, recalled photos from the period when she was planning her escape.
“I look into my eyes in those photos and know I was secretly planning on leaving my partner,” she said. The resurfaced images were a “surreal” reminder of the facade she maintained during the final years of her marriage.
Inclusive, safe design
Survivors often lack the familiarity with platforms’ settings to pre-emptively block or delete potentially triggering content. Even when settings exist, they are often buried in menus, hard to navigate or require survivors to manually confront and delete painful memories or photographs.
Once triggered, survivors often no longer have the emotional capacity to take the steps needed to delete or remove the upsetting memory.
Recommendations like telling survivors to leave their devices at home or deactivate their social media accounts place responsibility for addressing abuse on survivors, rather than perpetrators. Mobile phones and social media are essential to daily life, including for work, social interaction and access to safety-related services. Advising survivors to simply log off also distracts from the underlying issues: society’s high rates of GBV and the need for safer, more inclusive design.
And inclusive design is needed: nostalgia-producing algorithms, as they currently function, disproportionately harm communities exposed to higher rates of violence, including women and LGBTQ+ and BIPOC individuals.
Opt-in rather than out
Interview participants suggested that platforms require users to opt in if they wish to have their past resurfaced, rather than forcing them to opt out, often only after they have already been triggered.
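The opt-in default the participants describe can be sketched in a few lines. This is a minimal illustration under assumptions of my own: the field and function names are hypothetical, not any platform’s real settings API. The point is simply that resurfacing stays off until the user explicitly enables it, and that blocking a person suppresses any memory they appear in.

```python
# Minimal sketch of an opt-in default for a Memories-style feature.
# Field names are illustrative assumptions, not a real platform's settings.
from dataclasses import dataclass, field

@dataclass
class MemorySettings:
    # Resurfacing is OFF until the user explicitly enables it (opt-in),
    # rather than ON until they manage to find and disable it (opt-out).
    resurface_memories: bool = False
    blocked_people: set = field(default_factory=set)

def should_resurface(settings: MemorySettings, tagged_people: set) -> bool:
    """Show a memory only if the user opted in and no blocked person appears."""
    return settings.resurface_memories and settings.blocked_people.isdisjoint(tagged_people)

s = MemorySettings()  # a new user: nothing resurfaces by default
print(should_resurface(s, {"ex-partner"}))  # False
```

The design choice is that the safe state requires no action from the survivor; the burden of configuration falls on users who want the feature, not on those it might harm.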
Tech developers, often from privileged backgrounds, fail to account for marginalized users’ experiences when designing features.
Platforms must prioritize user safety by making it easier to control and customize the memories that resurface. Settings for managing features like Memories should be accessible, easy to use and sensitive to the needs of those who have experienced trauma.
By recognizing the unintended consequences of algorithmically driven nostalgia, tech companies can take steps toward creating platforms that empower all users.
The post “Features like iPhone’s and Facebook’s ‘Memories’ can retraumatize survivors of abuse” by Nicolette Little, Assistant lecturer, Media and Technology Studies, University of Alberta was published on 02/04/2025 by theconversation.com