Stop the Nonconsensual Use of Nude Images in Research (Published at NeurIPS 2025 - Oral)
About
To train, test, and evaluate nudity detection models, machine learning researchers typically rely on nude images scraped from the Internet. Our research finds that this content is collected and, in some cases, subsequently distributed by researchers without consent, leading to potential misuse and exacerbating harm against the subjects depicted. We argue that the distribution of nonconsensually collected nude images by researchers perpetuates image-based sexual abuse and that the machine learning community should stop the nonconsensual use of nude images in research. To characterize the scope and nature of this problem, we conducted a systematic review of papers published in computing venues that collect and use nude images. Our results paint a grim reality: norms around the use of nude images are sparse, leading to a litany of problematic practices such as distributing and publishing nude images with uncensored faces, and intentionally collecting and sharing abusive content. We conclude with a call to action for publishing venues and a vision for research in nudity detection that balances user agency with concrete research objectives. You can check out the paper here: openreview.net/pdf?id=Ev5xwr3vWh
Speaker

Princessa Cintaqia
Princessa Cintaqia is a PhD student at Boston University's Faculty of Computing and Data Sciences, working with Allison McDonald. Previously, she earned her bachelor's degree from the University of Indonesia in her beautiful home country of Indonesia. She is interested in socially aware computer security, especially in the context of sexual privacy and human-centered cryptography.