Features of Apple's iPhone operating system such as Memories and AirDrop can harm vulnerable women by unexpectedly resurfacing photos of past abusers or by allowing perpetrators to send unsolicited sexual images, according to new research.
Nicolette Little, a professor in the University of Alberta's Media and Technology Studies program, and Tom Divon, a researcher in digital culture at the Hebrew University of Jerusalem, found that Apple iOS features and their algorithms can "inadvertently perpetuate hostile digital environments for users."
Without any prompting from the user, the iPhone's Memories feature creates personalized slideshows of photos and videos from a person's past, set to music.
"It's a really awful experience for people when a picture of your abusive ex keeps coming up, because the algorithm thinks you want to see them," says Little. "Some people described getting 10 images, almost in a row, of a previous partner, so when it gets it wrong, it gets it really badly wrong. Survivors of abuse who are faced with these images in a slideshow find the whole experience creepy."
The researchers decided to team up after realizing they shared concerns about Apple iOS features and their unintended consequences for marginalized users. For this study, they conducted a series of semi-structured, trauma-informed interviews, attentive to ethical protocols.
The research follows a study Little published in 2022 on a similar feature, also called Memories, that Facebook launched in 2015 and that can likewise trigger survivors of gender-based violence.
As Little pointed out in her previous study, 80 to 90 per cent of survivors of gender-based violence know their abuser, often a former acquaintance, family member or intimate partner. Even if the survivor unfriends this person, the abuser can still show up in resurfaced Facebook photos.
"Rates of gender-based violence are extraordinarily high in Canada, especially since the pandemic," she says. "It's now referred to as an epidemic in its own right."
Some of those Little interviewed would leave their phones at home, or hide them away on anniversary dates or on Valentine's Day, dreading what might unexpectedly turn up. In Little's own walk-through of Facebook Memories, she found that the process from initial internet search to accessing the feature's settings took a "time-consuming seven steps" — too long for a survivor who may be triggered by seeing images of an abusive ex.
In the current study of Apple iOS Memories, Little explains that, because survivors usually know and were quite close with their abuser, "past photos with the perpetrator likely populate the survivor's iPhotos cache. This not only gives the algorithms plenty of now-triggering material to draw upon, but gives them the wrong message that this is someone you enjoy spending time with."
AirDrop, another popular iOS feature, lets users wirelessly transfer photos and other files to nearby devices, a capability predators exploit. Women and girls in public spaces, such as on transit, frequently report receiving unsolicited sexual content, such as "dick pics," via AirDrop. Divon, who spearheaded a study of AirDrop misuse on public transit, refers to this as the "AirDrop trap."
Unlike the Facebook and iPhone Memories features, which Little sees as inflicting an automated form of violence, or "platform violence," embedded in the systems' algorithms, the AirDrop trap is deliberately exploited by abusers. Divon explains that, although the mechanism of abuse is different, AirDrop's design likewise fails to support victims' ability to address it.
Divon interviewed 16 women who received unsolicited sexual content from a stranger while in public spaces such as trains and buses. He explains that, although "users receive a preview and prompt when content is shared via AirDrop, reducing the risk of harassment … the presence of sexual content in previews can adversely impact victims." Also, problematic senders can "persistently resend content, which recipients have to continually engage with to dismiss."
Such AirDrop incidents erode users' sense of safety in public. And because the "dropper" is an unknown stranger, the recipient cannot address the matter, leaving them feeling frustrated and helpless, Divon says.
Little and Divon note that, with both Memories and AirDrop, default settings are at play in perpetuating harm. Memories is preset to "on," and AirDrop's receiving settings come configured to "everyone."
"The easiest solution would be for designers to select what receiving settings they want on AirDrop before they are subjected to unexpectedly sent sexual content," Divon says.
As for Memories, "We're faced with a situation where features are set to 'on' in a way that assumes memories are positive for everyone, that nostalgia is something we all thrive on," Little adds. "It's just not setting up people who've had bad experiences for well-being or success."
Little and Divon agree that the issue is not that these technologies are built maliciously. Instead, people find ways of using them to unfortunate ends, or developers don't adequately consider the fallout of the features they build.
While acknowledging that technology companies have made some changes, the authors agree they have not responded fast enough, adding that the aggressive assault on DEI policies in the United States will only make reform more challenging.
Nonetheless, Little is advocating for more effective, trauma-informed approaches to training and design in the digital sector, "recognizing that people from different groups and identities have vastly different experiences and are exposed to things like gender-based violence at vastly different rates."
The research is part of Little's ongoing exploration of how digital technologies are used and experienced by survivors of sexual abuse and other groups who face violence.
"It's about getting Silicon Valley to give people agency to make decisions before they're harmed in the first place."