Complicate

There is a paradox. DH wants to be accessible, yet DH wants to address complex, underlying issues in society. DH wants to “unmask racialized systems of power,” yet DH is based in technology that “developed out of a racialized system” (Gallon). DH wants to create a grassroots recovery, yet DH starts not from the community but from academia. DH states it wants to move “beyond normative ideas of who is a digital humanities scholar” (Terman), yet DH is almost completely based in higher education. Where do we start to address these contradictions?

Are we, as digital humanists, exclusionist in the language we use to describe the systemic issues that exclude people? At the same time, when discussing topics of race and gender, simplifying an issue can create misinformation and misunderstanding. We all have complex and rich histories; we just need a starting place to begin exploring them. These histories are all about perspective, which means we need more people with diverse backgrounds to engage these topics and to have their voices heard. If we reached more people, would this help DH achieve its goal of impacting complex, underlying issues?

My basic question is: where is the entry point to DH? Earhart and Taylor base their paper on the idea of creating a grassroots recovery of lost humanity. Yet I question how they use the term grassroots. For them, it means starting with entry-level technologies and broad partnerships; their DH point of entry is simplified only in its technology choices. Gallon, however, suggests this is not a good thing: digital humanists should not compromise the technology they use, as it is the basis for their work. Gallon references Johanna Drucker’s statement that we should “use and build digital infrastructure and tools steeped in humanistic theory so that they function in ways that reflect the core values of the humanities” (Gallon). While using technologies steeped in humanistic theory is more theoretically sound for humanities research, these technologies tend to have a higher technological entry point. There is, therefore, a tension in choosing what technology to use. Should professors adopt a simple, accessible tool that many can use, or a tool that recognizes the nuanced way the humanities work as a discipline? Entry-level technologies may be a good starting point for bringing more undergraduates into digital humanities, but they simplify, and perhaps undermine, the identification of “racial dynamics in digital spaces” (Terman).

Is it beneficial for digital humanists to create an entry point for those interested in developing and sharing their knowledge? A simplified entry point might not be the right answer, since simplifying has its own problems, but where do we start teaching more people about DH? Is it more important to bring in more people with more perspectives, or to first address the issues DH has within itself? Finally, would opening DH to more people create a dialogue that brings richer complexity and depth?

 



Exclusionary Logics in the Digital Humanities

In preparation for our upcoming discussion around Mercado’s Black at Bryn Mawr, we explored the aforementioned project alongside a number of similar efforts, all of which strive toward the preservation, publication, and promotion of suppressed narratives. While many hail the rise of DH as an emancipatory force that facilitates these sorts of cross-cultural collaborations, others contend that the institutional parameters of the field (digital though they may be) still exist along exclusionary lines. As this week’s assigned readings suggest, only through the acknowledgement and deconstruction of gendered, racialized logics can we hope to truly achieve the diversity and inclusivity long touted by proponents of our discipline.

Although few would disagree that certain internalized biases remain embedded within prevailing infrastructures, there seems to be some debate over the exact nature and scope of the central problem at hand. In “Black Studies and Digital Humanities: Perils and Promise,” Terman posits that the primary challenge facing scholars within Black Studies is determining “how to produce quality content that is centralized enough to provide a cumulative critical apparatus.” Preoccupied less with content and more with material conditions surrounding its production, Earhart and Taylor suggest that attention be turned to “inequitable distributions of digital humanities resources and labor.” For Wernimont, however, attempts to impose a strictly economic or consumptive framework upon DH undermine its fundamentally qualitative concerns: “a celebration of plentitude reproduces certain commercial metrics — notably production as value and information as capital — of which there is significant feminist critique.” Finally, Gallon sidesteps the structural conversation around academic racialization, focusing instead on the perpetuative source of these hegemonic formations, namely, our collective refusal to engage them.

As you might expect, proposed responses to these issues vary just as widely as articulations of the issues themselves. While Terman advocates the celebration and centralization of scholarly work occurring at the intersection of DH and Black Studies, Earhart and Taylor promote a model of dispersal, “one designed to decenter traditional power structures by shifting power centers, eliminating funding needs, and reducing the necessity for advanced technical knowledge.” Working toward a feminist critique of the digital archive specifically (and DH more broadly), Wernimont calls for an indexical, rather than enumerative, approach to the “proliferation of projects” that Earhart and Taylor envision. As for Gallon, she believes that the answer lies not in the projection of a digital future but in recovery of a human past: “‘How can digital tools and processes such as text mining and distant reading be justified when there is so much to do in reconstructing what it means to be human?’”

Demonstrating a potential synthesis of digital functionality and humanistic restoration, Black at Bryn Mawr is an extensive research project that “explores the experiences of Black students, faculty, and staff at the College,” thereby raising awareness of “racial power dynamics inside and outside of the classroom” (“About the Project”). The result of student collaboration under the guidance of Monica Mercado and Sharon Ullman, Black at Bryn Mawr encompasses a series of blog posts, walking campus tours, and an interactive digital map, all of which support the project’s ultimate objective: “to build institutional memory of the College’s engagement with race and racism, enabling future students to hold both themselves and the College community to higher standards of…accountability” (“About the Project”). The Amistad Digital Resource for Teaching African American History and the Florida Memory Black History photograph exhibit are two related projects, similarly awareness-driven. Looking forward, these efforts can serve as inspiration—if not definitive models—for future attempts to break down exclusionary logics, using our digital tools not to suppress, but rather to promote difference within DH.


“Black at Bryn Mawr” and Technologies of Recovery

Thursday, November 9, 3:30-4:45 pm
Williams (WMS) 415 [turn L off elevators, then R]

Being “Black at Bryn Mawr”: Past as Legacy and Project

For our third meeting this term, the Digital Scholars group will peruse some recent legacy projects and engage in conversation about technologies of recovery. Central to our discussion will be “Black at Bryn Mawr,” a collaborative project begun by Emma Kioko and Grace Pusey in the fall of 2014 under the guidance of Monica Mercado and Sharon Ullman. Initially conceived as a cross-disciplinary attempt to re/build institutional memory of the College’s “engagement with race and racism,” BBM represents a growing number of legacy projects that hope to re-situate institutions’ relationships to their past and present communities. While the digitization project is ongoing, during AY 2017-2018 Bryn Mawr has also begun discussions about installing other physical projects and/or naming physical landmarks on campus to highlight some of the content amplified by this work. We may take up the following questions:

  • How might projects like these satiate or provoke ongoing concerns about the “whiteness” of Digital Humanities?
  • Is “legacy” an appropriate term for data-oriented projects driven by models of data-gathering that may flatten what they record?
  • Since Digital Scholars first raised this question in 2011, how far have we come in considering how a “critical code studies” might inform (or transform) this work?
  • Assuming their interest in the material and cultural implications of technologies of recovery, what seems an appropriate set of questions for digital humanists to ask, or with which to build such projects?
  • What stands in the way of authentically anti-racist dialogues surrounding technology within DH?
  • How is DH complicit in barring such dialogues from occurring?

Participants will be encouraged to share their perspectives on and experiences with other inclusion projects, and all are invited to read and view the following in advance:

All are welcome! We hope you can join us,

-TSG

Putting an Ethics of Care into Operation — Notes from Anais Nony’s talk, “‘Data-Mining the Body’: Racialized Bodies, Data-Mining, and Technics of Control”

In her talk titled “‘Data-Mining the Body’: Racialized Bodies, Data-Mining, and Technics of Control,” Anaïs Nony, post-doctoral fellow in French and Francophone Studies at Florida State University, stressed the ways in which the “technical is political” and that “no technique is apolitical.” Nony explained that the technical always involves the relation of an object to a living thing; the political is therefore inescapable. Additionally, Nony argued for the term “digital studies” over “digital humanities,” because as digital humanists we should be working with scientists and with science; a division between the two is counterproductive. Her talk supported the notion that such a division may also be harmful: to view the objective as divided from humanism is to ignore a political reality, and to view humanism as uninfluenced by objective science allows a clandestine creep of colonialism, an “unintentional” imposition of epistemologies and ideologies upon the frameworks of our knowledge making. Examples of such “unintentional” programming include English as the default in MS Word, binary male and female options within surveys, and, as Duarte and Belarde-Lewis make clear in “Imagining: Creating Spaces for Indigenous Ontologies,” archival naming practices that are self-referential to the archivist.

The tension between objective science and humanism is nothing new, but Nony drew attention to the ways in which the digital is changing the dynamic between the technical and our humanity. She pointed to two distinct changes: first, the synchronization of information, as worldwide data sets are aligned through the power of the internet and cloud computing; and second, the ability to shape time and space through technology inscribed in the network. We are the receptors of that communication, but unlike with previous text technologies, most of us do not have the ability to shape the technology. We are fed by data but do not have access to the platform.

Attention to this gap, the gap between platform and use, between the system and its content, is what forms a “digital ethics of care.” A digital ethics of care would involve awareness of modalities and their relation to human bodies, and awareness of the ways in which the digital can create toxic environments or be used for toxic purposes. Nony argues that it is our responsibility, as digital humanists and by extension as citizens, to develop remedies for “our own toxicity.” This is where nootechnics comes in, which I translate loosely as “intelligent craft”: a technics that is aware of how it operates and how it affects the beings involved. Examples of the absence of care can be found in ICE agents’ use of the Palantir database against undocumented immigrants in recent years, and, early in the twentieth century, in South African diamond mine owners’ use of X-ray technology to scan the bodies of laborers for smuggled diamonds. The tool is used to legitimate colonial forces and to automate acts of racism, but the tool itself is not questioned.

So what does a “digital ethics of care” look like, and how can we work toward viewing the digital in non-neutral ways? In response to the issues Nony brought to light, participants in the discussion expressed a desire for pedagogical action: how can we teach this? Three notions were proposed: awareness of technical modality, awareness of relevant scholarship, and acknowledgement of the ways in which resistance exists and is possible. A central tension that surfaced in response to the question of action was where the burden of a digital ethics of care resides: in the individual or in the collective? While in some sense the answer is of course both/and, located sites of apathy seemed relevant in both individual and collective contexts. Individuals can resist the addiction of the data feed, and collective transparency can be questioned. Big data may anticipate the moves of the body, but the automation of data is “implemented” at some point and is not automatic in its genesis. An ethics of care can be put into motion and operation.

Nony, Anaïs. (2017) “Nootechnics of the Digital,” Parallax, 23:2, 129-146 [FSU access]

Nony, Anaïs. (2017) “From Dividual Power to the Ethics of Renewal in the Anthropocene,” Azimuth: International Journal of Philosophy, 9, 31-41

Risam, Roopika. (2015) “Beyond the Margins: Intersectionality and the Digital Humanities,” Digital Humanities Quarterly, 9:2

Duarte, Marisa Elena, and Miranda Belarde-Lewis. (2015) “Imagining: Creating Spaces for Indigenous Ontologies,” Cataloging and Classification Quarterly [FSU access]

Collective Consciousness for Rewriting History

This week, in preparation for Dr. Anaïs Nony’s visit, our Digital Scholars reading group engaged with two of her works. Nony’s texts challenge us as digital humanists to work toward a rewriting of history so that we may, as a society, collectively address our current geo-political condition. In “From Dividual Power to the Ethics of Renewal in the Anthropocene,” Nony states that “the battlefield of the Anthropocene is one that demands action” (2017, p. 31). Nony argues that we need to address the urgency of our anthropocenic condition and take ownership of our geo-political situation.

“To rewrite history,” she argues, “is to heal the festering wounds that thwart the possibility of becoming otherwise in the world. This becoming other than what we are is the promise launched by collective action, by the processes of collective emancipation” (2017, p. 31). Therefore, we cannot hope to address our anthropocenic condition if we do not first acknowledge the role that we as a collective society play. In the challenge towards becoming something other than who we have historically been, we must collectively act if we are to address the urgency of our condition.

Viewing our anthropocenic condition as a battlefield allows us to question relations of power. Power, according to Nony, “is both an ontological and epistemological problem that develops into reflections on philosophies of being and natures of knowledge” (2017, p. 33). Nony goes on to argue that in our current situation power is dividual, used to divide rather than unite. This segregated setting produces the possibility of watching from a distance, from a remote place of privilege and comfort, where the actions deployed in front of one’s eyes can be fictionalized to produce feelings of pity and fear (2017, p. 36).

This division of power allows some to claim that they have not contributed to the Anthropocene. If we are to overcome our condition, however, we need to unite and work together. To move forward and address this condition, we need to collectively embrace an ethics of care. Care in this situation is taken to mean an “investment in the future of a living relation, be it with a (deceased) person, a plant, an animal, an object, or a space. Caring is cultivating a relation by investing in it” (2017, p. 39). If we are to work toward becoming, then we need to invest in changing our current condition not just for the moment but for the future.

In the second of Nony’s articles that we were asked to read this week, “Nootechnics of the Digital,” Nony discusses the role of technics in our digital spaces. Nony argues that nootechnics “offers a mode for thinking about the genesis of both noos (intuition, intelligence, flair, intention) and techné (technique, craft, art) as the condition and the consequence of our cultural condition, of our ability to mediate and negotiate different realms of reality” (2017, p. 130). Thus, nootechnics can allow researchers a way not only to mediate and negotiate multiple realities but also to create change. This brings me to one of the additional readings our group was asked to engage with this week.

In Marisa Elena Duarte and Miranda Belarde-Lewis’s article “Imagining: Creating Spaces for Indigenous Ontologies,” the authors explore the ways that Indigenous communities use online spaces to share cultural artifacts and create a kind of cultural archive. In order to work with Indigenous communities toward decolonization practices, we must first step back from normative expectations (2015, p. 678). Doing so allows scholars to be open to the ways Indigenous communities make meaning. Duarte and Belarde-Lewis note that “Indigenous epistemic partners will want to step outside their comfort zone, sensitize themselves to Indigenous histories and political realities, learn to listen in new ways, and position themselves as followers in collaborative projects with Indigenous specialists leading the way” (2015, p. 697). What stood out for me was the way the authors challenge those of us who are not from Indigenous communities to listen and be open to hearing.

To allow Indigenous voices to create their own narratives, scholars who work alongside Indigenous people need to be open to new ways of knowing. We need to recognize that what makes

Western text-based systems so visible and, therefore, apparently superior to oral, kinesthetic, aesthetic, and communal Indigenous ways of knowing—quipu, ceremonies, dances, songs, oral histories, oratory, stories, hunting and growing practices, healing arts, weaving, painting, pottery, carving, dreaming, and vision work—are the institutions through which Western text-based systems are legitimated (2015, p. 683).

As scholars, we need to be open to these new meaning making moments if we are to work towards decolonization and empowerment of indigenous communities.

If nootechnics allows scholars a way to negotiate between different realities, then we can potentially apply this technique to the ways we work with Indigenous communities to give voice to multiple ways of knowing. If we are to value multiple ways of knowing, then we need to find ways to highlight the work of these communities. Many Indigenous communities value oral traditions and storytelling, so a nootechnics approach would seek out ways for technology to grant this tradition the legitimacy it deserves.

Nony, Anaïs. (2017) “From Dividual Power to the Ethics of Renewal in the Anthropocene,” Azimuth: International Journal of Philosophy, 9, 31-41

Finding the Individual within the Digital

In “Nootechnics of the Digital,” Anaïs Nony focuses on a divide between the digital (notably Big Data) and the technical (what I assume to be the human work: the analytical, the critical, the close readings, the social periphery — I struggled with the term technical because Nony never defines it). Nony states that “While the technical pushes us to face the past while backing up into the future, the digital is rushing up on us from behind, reminding us that we are late in our own present” (129). I take this to mean that our work in the humanities asks us to look back to the past to understand the future and where we’re going, to understand more about who we are and how we function.

Meanwhile, the digital creates a sense of urgency to push forward, to explore the untouched terrain of technology and see where it leads us. This reminds me of John Seely Brown and Paul Duguid’s critique of tunnel design, which “takes aim at the surface of life” and strives toward new technologies without paying attention to the social periphery, “the communities, organizations, and institutions that frame human activities” (5). They claim that, in a tunnel designed world, “we are expected to live on a strict information-only diet. It’s a world that addresses worries about information by simply offering more” (3). Thus, perhaps it’s the case that the technical aligns with the social periphery surrounding technologies, while the digital aligns with the tunnel design mentality of creating more technology and adding more information.

A similar concern for the periphery around information can be found in Katherine Bode’s “The Equivalence of ‘Close’ and ‘Distant’ Reading: Or, Towards a New Object for Data-Rich Literary History,” from our first week’s discussion on taxonomies. She claims that “the methodological achievement does not translate into historical insight because the study considers only an abstract amalgam of literary works” (9). In other words, information stripped of context does not produce meaningful knowledge. Thus, these information databases do not “offer any alternative influences, nor comment on the extent to which gender, nationality, and chronology shape literary history” (9). Here Bode is critiquing the way that two distant reading scholars have flattened the job of scholars and scholarship on their way to claiming they have turned all of the world’s texts into a searchable database of sorts. In legitimizing their database, they critique microanalysis and argue that any form of interpretation is defective: “interpretation is fueled by observation, and as a method of evidence gathering, observation is flawed” (Jockers 31, qtd. in Bode 4). Bode emphasizes that distant reading maintains the “interpretative acts” found in all scholarship through “constructing literary data, organizing it, and ascribing an historical explanation to the results” (3).

While Bode emphasizes the role of humans in distant reading, regardless of what some of its advocates claim, Nony suggests that information is the agent constantly pushing us forward and defining us as individuals: “The digital has recently surpassed the technical realm due to the economy’s use of highly addictive, yet intuitive, relations to digital platforms that are designed to function without the mastery of any user skills” (1, emph. added). I don’t think Nony and Bode are making conflicting claims here; while Bode points out the role of human agents within Big Data and distant reading technologies, Nony points out that whatever system we as humans construct begins to take on a life of its own. Central to Nony’s claim is that digital platforms are intuitive and thus do not require mastery. User “jimcross42” writes about the programmer becoming a gatekeeper to memory, and asks: “has this not happened throughout history? When writing was invented, did not the storyteller lose his place as gatekeeper? When the printing press started printing in the vernacular, did not the monastery lose its role as gatekeeper? Why does ‘get off my lawn’ come to mind?” (Digital Scholars Blog). However, I think the key difference here is the tension that Brown and Duguid locate between tunnel design and the social periphery. When the writer took the role of storyteller, human beings were not lost; their status was amplified. Instead of one storyteller, there were many. And while the printing press signals a shift toward mass production, human beings were still at the center of this shift: writing and printing and buying and reading. Even if readers did not see it happening, there was a social periphery going on around the printing; humans did the work. As we’ve become further attached to technological tools that do the tedious work for us, we’ve become further removed from the social periphery.
This is the concern, as I understand it, that digital scholars have with the programmer being the gatekeeper to memory. The gatekeeper’s values and expertise do not naturally or necessarily align with the values and expertise of the humanities. So how do we locate the individual, and furthermore the cultural, within the digital?

Nony emphasizes that the individual does not need to be lost within this new age: “the digital offers the opportunity for a temporal revolution in the way we cultivate information in both space and time” (130). She explains that as individuals within a culture we have “cultural agency”: “the ability to cultivate singularized forms of instrumental mediation, which are needed to foster individualizing relations within a milieu” (130). Because a great majority of us cannot code, our relationship to the digital is always mediated. The closer we can get to understanding this mediation, the closer our relationship to these technologies can become. Information moves at a dizzying rate, but if we do not grab hold of the digital and write ourselves into it, it will not be a tool embedded with our values. She notes that “a myriad of unpredicted approaches to the digital have already been developed by imaginative users, thus demonstrating the openness of digital objects in adopting new purposes outside the use value directed by the market” (130). While Brown and Duguid emphasize that individuals must consider the social periphery that goes into new technologies and new information when they themselves consume them (for example, the news doesn’t just occur, it is curated, distributed, negotiated, presented, etc.),  here we see Nony emphasizing that we consider the social periphery in terms of how humans write themselves into technologies and in so doing subvert their “value directed by the market.” In paying attention to the ways that individuals use technologies–not those who consume passively, but those who grasp the tools and use them to their own ends (the individual’s ends, not the technology’s)–we can foster a relationship between the individual, the cultural, and the technology. In doing so, we can better understand the “being” within the triad of being, milieu, and objects.

-Joel Bergholtz

WORKS CITED

Bode, Katherine. “The Equivalence of ‘Close’ and ‘Distant’ Reading: Or, Towards a New Object for Data-Rich Literary History.” Draft. Final version forthcoming in Modern Language Quarterly (December 2017).

Brown, John Seely, and Paul Duguid. The Social Life of Information. Harvard Business School Press, 2017.

Nony, Anaïs. (2017) “Nootechnics of the Digital,” Parallax, 23:2, 129-146, DOI: 10.1080/13534645.2017.1299293

Conversations: Interdisciplinary – Style

Digital Humanities cuts across a range of fields, and as such it draws its lexicon from all of these disciplines.  As I continue to read DH articles and attend DH discussions, I am assailed by this concept of a shared vocabulary.  Although the individual words may be the same, each discipline has added its own nuance to the concepts behind the words.  This week’s articles are no exception.  Nony “asks us to face a decisive rite of passage” in our battle of the Anthropocene and goes further by telling us that “the liminal is simultaneously here and there.”  I had to pause as the anthropologist in me winced at this.  This is where the lexicon breaks down.

Anthropologically speaking, a rite of passage has three stages: first, separation; second, liminality; and finally, reaggregation. The liminal entity is in a transformative state, neither here nor there, between cultural statuses. To complete the rite of passage, the entity must reincorporate into society once the culture accepts its new status. Even if we assumed that the culture itself was the liminal entity, it would have nothing to reaggregate with, because the totality is in the liminal state.

Instead of a rite of passage, and in keeping with the battle theme stated at the beginning of the article, perhaps we should fight and assimilate to a politics of care. Am I missing the argument while getting caught up in semantics? Possibly, but as we all know, the devil is in the details. Yes, we should just assimilate, as resistance is futile.

In Nony’s other article, the section about digital storage and memory formation was intriguing.  I could not help the evil giggle that escaped me when I read that the programmer is the gatekeeper to memory.  Although, has this not happened throughout history?  When writing was invented, did not the storyteller lose his place as gatekeeper?  When the printing press started printing in the vernacular, did not the monastery lose its role as gatekeeper?  Why does “get off my lawn” come to mind?

The fight for diversity in Digital Humanities is crucial.  The consequences of losing are dire.  We not only lose history and culture, but we lose the future possibilities that these groups bring to the table.  The question is: should we win or should the battle for improvement itself be the goal?  If we win, is there a loser and do they then get pushed to the margins?  Does the issue stay in active thought, if we win, or will it recede to memory?

Race, Data-Mining, Control

Thursday, October 12, 12:30-1:45 pm
Williams 415

“Data-Mining the Body”: Racialized Bodies, Data-Mining, and Technics of Control

Digital Scholars is pleased to welcome Anaïs Nony, post-doctoral fellow in French and Francophone Studies at Florida State University, for our second discussion of what can occur at the methodological intersections of DH, race, and alterity. Nony will introduce the “nootechnics of the digital” — the psycho-cultural practices of care and empowerment (2017, p. 130) — asking us to consider the ways in which we thoughtfully and thoughtlessly use medical devices as modes of control that extract data from bodies to assess medical conditions and influence transnational flows of migration. Drawing primarily from a chapter in her book and secondarily from ongoing work in nootechnics, Nony will examine various technologies of control in the context of nationalist discourses on security. Participants are invited to read the following in advance of our meeting:

All are welcome. We hope you can join us,

-TSG

Ontologizing the Taxonomy

During her talk on Friday, FSU Digital Humanities Librarian Sarah Stanley put forward several challenging questions about (and responses to) the ontological assumptions and infrastructural formations that underlie taxonomic methodology. As humanists tasked with the preservation, publication, and evaluation of artifacts in an increasingly heterogeneous environment, can we—in good conscience—continue to work within traditional academic architectures? Are these systems capable of repair or must they be rebuilt entirely? Joining other thinkers and practitioners in the field, Stanley proposes renewed interrogation of prevailing methods with an eye toward new and more complex descriptive practice.

Early in her talk, Stanley acknowledged that any attempt at analysis necessarily entails a degree of reduction, particularly when large amounts of data are involved. This “flattening” can be (and has been) productively employed toward the recovery of erased narratives and the filling of archival silences. However, some have asserted that the taxonomic impulse is fundamentally colonial, tainting even the most well-intentioned of projects. By this logic, many of the current channels of academic communication—libraries, conferences, journals—are compromised.

A core principle of many data-driven efforts in DH is scalability: the capacity for stable interchange between units of analysis across a given research frame. In theory, a scalable system can accommodate indefinite growth without loss of semantic functionality. While this sort of framework can be useful for macroscopic research, its lack of granularity poses serious problems for qualitative, humanistic endeavors. According to Anna Lowenhaupt Tsing, professor of anthropology at the University of California, Santa Cruz, scalability “disguises such divisions by blocking our ability to notice the heterogeneity of the world; by its design, scalability allows us to see only uniform blocks, ready for further expansion.”

Similarly, data cleaning—a common practice in the “hard” sciences—facilitates manipulation of information on a large scale. But once again, important differences are lost in the process of sanitizing datasets, differences which distinguish, and perhaps even define, our discipline: “the humanities cannot import paradigms and practices whole from other fields, whether from ‘technoscience’ or the nearer ‘social’ sciences, without risking the foreclosure of specific and valuable humanistic modes of producing knowledge” (Rawson and Muñoz).

Beneath these various critiques of the taxonomic impulse lies a recognition of subjectivity as essential to and inherent in all methodology. Subjectivity is not simply, according to Drucker, “individual inflection or mere idiosyncrasy,” but rather “codependent relations of observer and phenomena,” relations which shape and inform even our descriptions thereof. The taxonomy, then, is not a neutral apparatus; we must “stop acting as though the data models for identity are containers to be filled in order to produce meaning, and understand that these structures themselves constitute data” (Posner).

Charting a path forward, Stanley pointed to three DH projects—TaDiRAH, the Orlando Project, and the Early Caribbean Digital Archive—that have successfully resisted (or at least attempted to resist) the allure of hierarchical models in favor of data ontologies, which are better suited to the representation of “awkward, fuzzy translations and disjunctures” (Tsing). Several practices have been proposed and applied in the effort to “ontologize” digital taxonomies. In place of scalable nesting, for instance, we can embrace the principles of nonscalability theory. Rather than cleaning data, we might use indexing as an alternate information structure “designed to serve as a system of pointers between two bodies of information” (Rawson and Munoz). It will take time to imagine and implement a robust, ontology-driven infrastructure, but through continued critique of old knowledge architectures and iterative engagement with new methodologies, we can hope to improve our collective effectiveness as humanists in a digital age.
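To make the indexing alternative concrete, here is a minimal Python sketch; all of the records and groupings below are invented for illustration, not drawn from any actual project. Rather than overwriting variant strings with a single “clean” form, an index points between the original records and an analyst’s chosen groupings, leaving the source data intact:

```python
# A sketch of indexing as an alternative to cleaning (after Rawson and Muñoz):
# a system of pointers between two bodies of information. All data is invented.

from collections import defaultdict

# Body 1: records exactly as they appear in the source, left untouched.
records = [
    {"id": 1, "dish": "Potatoes au gratin"},
    {"id": 2, "dish": "Potatoes, au gratin"},
    {"id": 3, "dish": "Au gratin potatoes"},
]

# Body 2: an analytical grouping a researcher has chosen to impose.
# The grouping is a separate, explicit layer—not a replacement of Body 1.
groupings = {
    "Potatoes au gratin": "potatoes au gratin",
    "Potatoes, au gratin": "potatoes au gratin",
    "Au gratin potatoes": "potatoes au gratin",
}

# The index points from each grouping back to the original record ids.
index = defaultdict(list)
for record in records:
    index[groupings[record["dish"]]].append(record["id"])

# The originals survive; the grouping is inspectable and revisable.
print(dict(index))  # {'potatoes au gratin': [1, 2, 3]}
```

Because the grouping lives in its own structure, a later researcher can contest or rebuild it without ever touching the source records—the reductive act is made visible rather than silently destructive.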

Works Cited

  • Drucker, Johanna. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5.1 (2011).
  • Posner, Miriam. “What’s Next: The Radical, Unrealized Potential of Digital Humanities.” Miriam Posner’s Blog, 27 July 2015.
  • Rawson, Katie, and Trevor Muñoz. “Against Cleaning.” Curating Menus, 6 July 2016.
  • Tsing, Anna Lowenhaupt. “On Nonscalability: The Living World Is Not Amenable to Precision-Nested Scales.” Common Knowledge 18.3 (2012): 505–24.

The Privileges of Data in DH

Last semester, I complained (lightly, in self-deprecation) about the nebulousness of the common threads among our Digital Humanities readings, without yet fully appreciating how diverse writing about the Digital Humanities is as a body of work, or how difficult a meaningful inspection of intersectionality can be. I was wrong in both the substance of the complaint and the choice to voice it in the face of nuanced critical analysis; I will not make that mistake this week. Still, my first thought upon reading these selections echoed my reaction last semester: what a diverse selection!

Filtering the readings through McPherson’s text offers the best lens for approaching a unifying theme here: how are we roadblocking alterity?

As this week’s selections teach us, attempting to narrow our perspective (as I do with filtering these readings through the lens of alterity in the Digital Humanities) also demonstrates our privilege.

Through Tsing’s discussion of scalability, we see the common hegemonic practice of simplifying for the sake of clarity in expression; that is, much gets ignored once inconvenient data is swept aside, never considered, or misplaced for the sake of convenience. As Tsing notes, when we contextualize data, we apply a very narrow filter to what we can understand and compare. Data that falls outside our view is diminished in importance or ignored entirely. We process information systematically, in ways we have been taught or shown; often this happens without regard for how those larger frameworks compartmentalize and exclude.

As my ignorance last semester demonstrated, attending only to those familiar knowledge systems falsely reinforces our (mis)understandings of data. As Rawson and Muñoz succinctly note, “this reductiveness can feel intellectually impoverishing to scholars who have spent their careers working through particular kinds of historical and cultural complexity.” Though it is not always apparent within our own work, we should be aware of how our perspectives can act as intellectual dampeners while reinforcing our privileges; what we sometimes see as clarity also creates adjacent distortions (as with the ULAN database’s failure to recognize gender as a spectrum, or with shorthand representations of visual spaces). As developing scholars, we must actively practice awareness of how these unseen and non-representative knowledges sustain our ignorance.

In discussing UNIX last semester—a personal scholarly interest—I used McPherson as a springboard to link the ideology of UNIX’s original community to that of access and representation. Until now, I never considered what this mythologized narrative flattens: What non-Western approaches to digital access were steamrolled by the language or system barriers established by UNIX-running systems? Whose work disappeared or went uncredited in establishing open-source databases? What aspects of UNIX coding fit into frameworks that favor masculine input and privilege hegemonic processes?

Even in projects with altruistic intentions, the majority of now-recognized pre-Silicon Valley programmers were white males.

Some questions to consider ahead of Stanley’s presentation:

  1. Thinking about our specific fields, interests, and research, what are the frameworks and taxonomies we work within but rarely consider alternatives to? What do they downplay, hide, or misrepresent? What knowledges do they frustrate? More importantly, how can we respond to this?
  2. Even within these articles (and this post), binary framings serve as centerpoints (e.g., voices of alterity v. hegemonic ones; flat v. widened views; close v. distant readings; inclusive databases v. exclusive ones). What are these articles also missing in their representations, and how can we respond to what they do not discuss?