Today, our “data doubles”—algorithmically generated, computation-friendly versions of our identities produced by and through digital data—present new challenges for social justice. Among these challenges is the problem of promoting dignity and self-respect: values that give individuals and groups a sense that their identities and experiences are valuable and their goals worth pursuing. For vulnerable or marginalized populations, dignity and self-respect can be undermined by violent actions, symbols, or cultural ideas promoted through mass media, law and policy, or—increasingly—the design of data-intensive systems that sort, evaluate, and rank people according to opaque or biased criteria.
In this talk, I position “data doubles” as sites of potential violence—especially when they conflict with our own moral self-perceptions, ideas, and beliefs in ways that implicate our dignity. Through an examination of (1) historical human rights abuses perpetrated through population data, (2) current critical discussions of surveillance, algorithms, and data ethics, and (3) the experiences of transgender women navigating systems that fail to account for their particular identities and bodies, I show how institutionalized and other biases work in and through data-driven systems to deprive certain people of what the philosopher John Rawls called “the social bases of self-respect.”