Putting an Ethics of Care into Operation — Notes from Anais Nony’s talk, “‘Data-Mining the Body’: Racialized Bodies, Data-Mining, and Technics of Control”

In her talk titled “‘Data-Mining the Body’: Racialized Bodies, Data-Mining, and Technics of Control,” Anaïs Nony, post-doctoral fellow in French and Francophone Studies at Florida State University, stressed the ways in which the “technical is political” and that “no technique is apolitical.” Nony explained that the technical always includes the relation of an object to a living thing; the political is therefore inescapable. Additionally, Nony argued for the term “digital studies” over “digital humanities”: as digital humanists, we should be working with scientists and with science, and a division between the two is counterproductive. Her talk supported the notion that such a division may also be harmful. To view the objective as divided from humanism is to ignore a political reality, and to view humanism as uninfluenced by objective science allows for a clandestine creeping of colonialism, an “unintentional” imposition of epistemologies and ideologies upon the frameworks of our knowledge making. Examples of such “unintentional” programming include English as the default language in MS Word, binary male and female options within surveys, or, as Duarte and Belarde-Lewis make clear in “Imagining: Creating Spaces for Indigenous Ontologies,” archival naming practices that are self-referential to the archivist.

The tension between objective science and humanism is not new, but Nony drew attention to the ways in which the digital is changing that dynamic tension between the technical and our humanity. She pointed to two distinct changes: first, the synchronization of information, as worldwide data sets are aligned through the power of the internet and cloud computing; and second, the ability to shape time and space through technology inscribed in the network. We are the receptors of that communication, but unlike with previous text technologies, most of us do not have the ability to shape the technology. We are fed by data but lack access to the platform.

It is attention to this gap, the gap between platform and use, between the system and its content, that forms a “digital ethics of care.” A digital ethics of care would involve awareness of modalities and their relation to human bodies, and awareness of the ways in which the digital can create toxic environments or be used for toxic purposes. Nony argues that it is our responsibility, as digital humanists and, by extension, as citizens, to develop remedies for “our own toxicity.” This is where nootechnics comes in, which I translate loosely as “intelligent craft”: a technics that is aware of how it operates and of how it affects the beings involved. Examples of the absence of care can be found in ICE agents’ use of the Palantir database against undocumented immigrants in recent years, and, early in the twentieth century, in South African diamond mine owners’ use of X-ray technology to scan the bodies of mine laborers for smuggled diamonds. In each case the tool is used to legitimate colonial forces and to automate acts of racism, but the tool itself is not questioned.

So what does a “digital ethics of care” look like, and how can we work toward viewing the digital in non-neutral ways? In response to the issues Nony brought to light, participants in the discussion expressed a desire for pedagogical action: how can we teach this? Three notions were proposed: awareness of technical modality, awareness of scholarship, and acknowledgment of the ways in which resistance exists and is possible. A central tension that surfaced in response to the question of action was where the burden of a digital ethics of care resides: in the individual or in the collective? While in some sense the answer is both, sites of apathy seemed relevant in both individual and collective contexts. Individuals can resist the addiction of the data feed, and collective transparency can be questioned. Big data may anticipate the moves of the body, but the automation of data is “implemented” at some point; it is not automatic in its genesis. An ethics of care can be put into motion and operation.

Nony, Anaïs (2017). “Nootechnics of the Digital.” Parallax, 23:2, 129-146.

Nony, Anaïs (2017). “From Dividual Power to the Ethics of Renewal in the Anthropocene.” Azimuth: International Journal of Philosophy, 9, 31-41.

Risam, Roopika (2015). “Beyond the Margins: Intersectionality and the Digital Humanities.” Digital Humanities Quarterly, 9:2.

Duarte, Marisa Elena, and Miranda Belarde-Lewis (2015). “Imagining: Creating Spaces for Indigenous Ontologies.” Cataloging and Classification Quarterly.

