Stanley Fish versus the Digital Humanities

Wednesday, February 22, 12:00-1:00 pm
Williams Building, Skybox conference room (fourth floor)

In recent years, January has become debate season for the digital humanities. Whatever it comprises, DH has become increasingly conspicuous at national disciplinary conventions, including the Modern Language Association (MLA) and the American Historical Association (AHA), which meet around the new year. As a result, journalists and scholars have used these occasions to scrutinize the problems and possibilities of DH. This year, the literary theorist and legal scholar Stanley Fish turned his New York Times blog into a three-part series on the digital humanities, provoking a great deal of public comment and published response.

For the next meeting of the Digital Scholars reading and discussion group, we will consider the three essays by Fish and some of the commentary they generated.

Responses to these columns abound, and many are worth pursuing. We will concentrate on a particular response from Mark Liberman, a distinguished professor of linguistics and computer science at the University of Pennsylvania, because his computational analysis engages explicitly with both the methods and the claims that Fish disputes.

As always, anyone is welcome to join the discussion.


7 thoughts on “Stanley Fish versus the Digital Humanities”

  1. For extra credit, someone might want to dip into some of the essays from Fish’s 1980 collection, Is There a Text in This Class? In his seminal essay “Literature in the Reader,” for example, Fish made the general point that the pattern of expectations, some satisfied and some not, which is set up in the process of reading literary texts is essential to the meaning of those texts. Hence any adequate analytic method must describe that essentially temporal pattern. In the course of developing his argument, Fish asserted that “What is required, then, is a method, a machine if you will, which in its operation makes observable, or at least accessible, what goes on below the level of self-conscious response.”

    Someone reading that phrase “a method, a machine if you will” in a digital humanities environment, or a cognitive science environment, might think, ah, yes, computer simulation. Let’s simulate the process of reading and then examine what the computer does as it moves through a text. But that’s not where Fish went and there’s no reason to think he even considered such a thing. And yet computers and computation were certainly in the air even then; in other essays in that collection he looks at some work in computational stylistics.

    Why didn’t Fish even consider such a possibility? While I don’t see any immediate or even predictable prospect of robust simulation of reading, it’s still something worth thinking about, if only to sharpen one’s ideas about computing, processes, and texts.

  2. Fantastic question. Why didn’t affective stylistics become algorithms? Or how might Fish’s earlier claims be compatible (or not) with the algorithmic criticism that S. Ramsay has argued for much more recently? The “machine” metaphor from Is There a Text… can also be found, as Bill points out below, in an earlier essay available through JSTOR: “Literature in the Reader: Affective Stylistics,” New Literary History 2.1, A Symposium on Literary History (Autumn 1970): 123-162.

  3. Note, the passage I quoted is from the essay you cite, which is reprinted in the 1980 collection, which also has other essays using the language of mechanism. Given the general tenor of the times, with generative grammar all new and exciting and cognitive science happening all around (the term was coined in ’73), it’s hard not to think that Fish absorbed that language from computing, either directly or indirectly.

    In 1976 I published a “hand simulation” of the semantics of Shakespeare’s Sonnet 129 in the Centennial issue of MLN (see abstract below). That same year David Hays and I published a review of the computational linguistics literature in which we, in effect, proposed an algorithmic criticism, which we called Prospero. What we proposed wasn’t possible at that time, which we acknowledged, and still isn’t. But maybe something’s to be gained by reopening that conversation.

    William Benzon. Cognitive Networks and Literary Semantics. MLN 91: 952-982, 1976.

    Abstract: A cognitive network is a type of semantic model developed for simulating natural language on digital computers. A concept is a node in the network, while connections between nodes represent relations between concepts. One generates a text by tracing a path through the network and rendering the successive concepts and relations into language according to the appropriate conventions. Elementary concepts are grounded in sensorimotor schemas, while abstract concepts are grounded in patterns of network relationship. The semantic structure for Shakespeare’s “Th’ expense of spirit” (Sonnet 129) is given by an abstract pattern for the Fortunate Fall, which is linked to a pattern specifying a fragment of the conceptual basis for faculty psychology.

    William Benzon and David G. Hays. Computational Linguistics and the Humanist. Computers and the Humanities 10: 265 – 274, 1976.
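    The abstract’s core mechanism, concepts as nodes, relations as labeled edges, and text generated by tracing a path through the network, can be illustrated with a toy sketch. To be clear, this is not Benzon’s actual model: the network contents, relation names, and sentence templates below are all invented for illustration.

```python
# Toy cognitive network: (concept, relation) pairs map to target concepts,
# and each relation has a sentence template for rendering it into language.
# All names here are invented; the real 1976 model is far richer.
network = {
    ("spirit", "expended-in"): "shame",
    ("shame", "leads-to"): "remorse",
}

templates = {
    "expended-in": "{a} is expended in {b}",
    "leads-to": "{a} leads to {b}",
}

def render_path(path):
    """Render a list of (concept, relation) steps into joined clauses."""
    clauses = []
    for concept, relation in path:
        target = network[(concept, relation)]  # follow the edge
        clauses.append(templates[relation].format(a=concept, b=target))
    return "; ".join(clauses)

print(render_path([("spirit", "expended-in"), ("shame", "leads-to")]))
# spirit is expended in shame; shame leads to remorse
```

    Even this crude version shows why “generation as path-tracing” invites simulation: the interpretive claim is encoded as a traversable structure rather than a paraphrase.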

  4. Pingback: Big Data (Digital Scholars) | jewish philosophy place

  5. Let me remind everyone that iconoclasm was a big agenda in the Duke English department in the late 1980s. Radical formulations and loose associations with all sorts of novelties, even with “machines,” were in service of taking pot-shots at opponents: George Steiner and others. The list was long. There was even enthusiasm for cold fusion!? I would incline toward a reading that he is not happy about digitization, since it is done by many who have no history in 17c. English lit. crit. He is Prof. Fish and we are not. There is no take-away for me from his blogs. Digital humanities, in labs around the world, is not a messianic movement. We are trying to organize cultural artifacts, including the texts of Prof. Fish. I plan to treat this with humor on my blog, as an opportunity to poke fun. Let us not err on the side of reasonableness. No mea maxima culpa for doing important work.

  6. Has Prof. Fish responded to Liberman’s challenge? It seems to me that he owes Liberman a response, for the article’s data appear to refute decisively Fish’s offhand and intuitive claim that the passage on which he focused is unique in the density of its p and b sounds. Liberman’s article presents a perfect case study of how new computer techniques and traditional humanities scholarship cannot just coexist happily but can be symbiotic, each helping the other.

    Computers can show us patterns in the text that no individual reader could (probably) discover, like those other areas of heightened p and b sounds in Areopagitica in the wonderful chart Liberman presents. But only good old textual-analysis techniques, as far as I can see, can give us good working hypotheses about what those patterns might mean: hypotheses that we can then test by turning back to the text as readers, not machines. Fish’s interpretive techniques may show us WHY Milton would get so exercised, sonically and thematically, over p’s and priests and presbyters, and how this relates to Areopagitica’s rhetorical strategies as a whole.

    But with Liberman’s help we can now also see that we have to be careful about making too broad a claim about this passage’s uniqueness in the text, as it appears Fish does. A good interpreter trying to link sound and sense in this text will want to explore other passages where similar consonant patterns seem to dominate, passages just such as those that Liberman’s computer has singled out, including the part of Areopagitica (how could Fish ignore the irony here?) where Milton is interested in the revolutionary role that a new medium, the printing press, could play in cultural reform. How can we not want to jump back into the text just at the point the computer highlighted and begin a new close reading of it?

    In short, Fish needs to engage with Liberman’s evidence; that’s what a secure, rather than insecure, humanist would do (and be grateful for the chance). I’ll end by saying that a little humility is needed all around, and that the folks at MITH, rather than being secret evangelists, seem to me to balance curiosity about the new digital tools very nicely with admirably un-doctrinaire theories about how they may prove useful. Not to mention healthy skepticism. We had better be humble: we are still at a very early stage in learning, through trial and error, how to use these new digital reading techniques and mesh them with earlier interpretive techniques that seem tried and true now but (remember?) were threatening in their own day.
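    Liberman’s actual tools and parameters aren’t reproduced in the thread, but the kind of analysis described above can be sketched in a few lines: slide a fixed window over the letters of a text and measure the proportion of p’s and b’s in each window, so that peaks stand out against the rest. The window size and sample text below are invented for illustration, not taken from Liberman’s study.

```python
def pb_density(text, window=500):
    """Return (start_index, density) pairs of p/b frequency per letter window."""
    # Keep only letters so punctuation and spacing don't dilute the counts.
    letters = [c for c in text.lower() if c.isalpha()]
    densities = []
    for start in range(0, max(1, len(letters) - window + 1), window):
        chunk = letters[start:start + window]
        hits = sum(1 for c in chunk if c in "pb")
        densities.append((start, hits / len(chunk)))
    return densities

# Invented sample text, standing in for a passage of Areopagitica.
sample = "Bishops and presbyters are to us both name and thing. " * 20
peaks = pb_density(sample, window=200)
print(max(peaks, key=lambda pair: pair[1]))  # window with the highest density
```

    A real study would map windows back to positions in the full text and compare the densities against baseline consonant frequencies; this sketch shows only the counting step that makes such a chart possible.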

  7. Pingback: The Dark Side of DH | FSU Digital Scholars
