I’ve recently published two journal articles on different topics related to software design and violence. These articles extend my interest in exploring how software matters as social infrastructure. What role does software play in enacting culture and regulating our lives? What are the potential recursive effects of programming practices on society? I am interested in examining the conditions that are produced by software, the relations structured by software, and how software is programmed to enable/disable/encourage/reward/obscure. Ultimately, I want to make visible the durability of systemic oppressions.
The first is called “Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers.” This was a collaboration with Oliver L. Haimson (UC Irvine), and the article is open-access in Social Media + Society.
- For this project, we looked at the top ten social media platforms to explore (1) how gender is made durable through social media design, and (2) the shifting composition of the category of gender within the social media ecosystem more broadly.
- Using the walkthrough method, we investigated these platforms from two different subject positions: (1) a new user registering an account; and (2) a new advertiser creating an ad.
- While some user front-ends appeared to be genderless (YouTube, Twitter, LinkedIn), all of the ad portals offered gender-based targeting. This data was (1) collected directly through gender fields programmed into sign-up and/or profile pages; (2) collected through strategic alliances with other platforms; or (3) algorithmically inferred from other user data and actions (on Twitter and LinkedIn, users are assigned a gender by the system).
- We show that custom gender (offering options beyond the binary of male and female) is trending as a programming practice, but it remains limited and dominated by a normalizing binary logic. Only some platforms offer categories beyond the binary to advertisers, which is notable given that the ad industry plays a major role in calcifying gender as both a binary and a segmenting device that naturalizes essentialist stereotypes. Yet with increasing granularity of identity categories comes increased surveillance of marginalized users.
- Overall, we are concerned with the degree to which social media platforms are entrusted to control how categorization systems are defined and deployed. These categories have the potential to shape the perceived needs and desires of users, advertising clients, and even researchers.
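As a toy illustration of how a binary can be “baked in” at the level of code, consider two hypothetical sign-up validators. This is a minimal sketch of the design pattern, not code from any actual platform; all names and options here are my own assumptions.

```python
from typing import Optional

# Hypothetical option sets; no real platform's schema is reproduced here.
BINARY_OPTIONS = {"male", "female"}            # a binary-only design
CUSTOM_OPTIONS = BINARY_OPTIONS | {"custom"}   # a design with a custom option


def validate_gender_binary(value: str) -> str:
    """A binary-only field: anything outside the binary is rejected outright,
    so identities beyond the binary simply cannot be recorded."""
    if value not in BINARY_OPTIONS:
        raise ValueError(f"unsupported gender: {value!r}")
    return value


def validate_gender_custom(value: str, custom_text: Optional[str] = None) -> str:
    """A 'custom gender' field: a third option accepts free text, yet the
    schema still privileges the binary as the default, named categories."""
    if value in BINARY_OPTIONS:
        return value
    if value == "custom" and custom_text:
        return custom_text.strip()
    raise ValueError(f"unsupported gender: {value!r}")
```

In the first validator the binary is not a user choice but a structural constraint; in the second, expanding the options also expands what the platform can collect, echoing the surveillance concern above.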
The second article is called “Rape: is there an app for that? An empirical analysis of the features of anti-rape apps.” This is another collaboration, this time with Amy A. Hasinoff (UC Denver), and it is published in Information, Communication & Society.
- This project focused on mobile phone apps designed to prevent sexual violence. Our investigation of 215 apps quickly turned into an analysis of all features programmed into these apps (n = 807).
- We were particularly interested in looking at all of these apps as a field of design. For instance, what have app developers been working on to intervene in the problem of sexual violence? Who are their intended users? What rape prevention strategies do they embed into their designs? What ideas about the nature, cause, and prevention of this social problem do the 807 features – as a whole – reflect?
- Despite good intentions, these tech-focused ‘solutions’ can inadvertently exacerbate the problems they seek to resolve. Overall, these apps reproduced myths that sexual violence experts have been debunking for decades.
- For instance, perpetrators are usually known to their victims; they are rarely strangers. Yet the vast majority of apps are designed to intervene during an incident in which a stranger suddenly attacks, or to switch off once the user enters a ‘safe’ space like home or work.
- Sexual violence experts also stress that potential victims should not be the main target of prevention strategies; persuading perpetrators to stop assaulting could be more effective. Yet only 2% of the apps in our sample targeted potential perpetrators. Other apps perpetuate racialized discourses masked by ‘safety audits’ and facilitated by gated community networks.
- Overall, apps were largely designed to increase the vigilance of potential victims, often encouraging acceptance of intrusive surveillance measures to facilitate a prevention strategy that amounts to avoidance tactics. Giving potential victims new tools to prevent specific incidents may be valuable to them, but the notion that this is a meaningful way to end rape implies that rape, as a broad social problem, cannot actually be prevented, only avoided by vigilant and responsible individuals.