Reading with the Swarm

Statistical machine reading is often implicitly pitted against human reading. My paper complicates this opposition by combining the swarmic potential of crowdsourced literature with the interpretive power of machine reading.

"Swarm Reading" describes a set of interpretive tactics that combine crowdsourced and algorithmic readings in order to facilitate the interpretation not of a vast collection of texts ("distant reading") but of a single text. Swarm Reading consists of two steps. First, I crowdsource a large number of textual "deformances": in the simplest version, I enlist a small swarm of as many as 300 Amazon Mechanical Turk crowdworkers to fill in one or more missing words from a poem in whatever way best completes the text according to their judgment. Second, I submit these deformances to modes of machine reading developed to statistically analyze and represent the rewritings. One technique transforms a collection of deformances into an interactive network visualization, linking together those guesses that are semantically similar according to a word2vec language model; this gives an overall sense of the swarm's semantic tendencies. Another produces a data animation in which the poem "degrades" from the swarmic consensus to the semantic outliers.
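The network-visualization step can be sketched minimally as follows. This is an illustrative reconstruction, not the paper's actual pipeline: the toy vectors stand in for word2vec embeddings (which in practice would come from a trained model such as gensim's `KeyedVectors`), and the similarity threshold is an assumed parameter.

```python
from itertools import combinations
import math

# Toy stand-ins for word2vec embeddings of crowdworkers' guesses.
# In the actual pipeline these vectors would come from a trained
# word2vec model; the values below are illustrative only.
guess_vectors = {
    "dark":   [1.0, 0.1],
    "night":  [0.9, 0.2],
    "bright": [0.1, 1.0],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def similarity_edges(vectors, threshold=0.8):
    """Link every pair of guesses whose cosine similarity exceeds the threshold.

    The resulting edge set defines the network to be visualized:
    semantically close guesses cluster together.
    """
    return {
        tuple(sorted((w1, w2)))
        for w1, w2 in combinations(vectors, 2)
        if cosine(vectors[w1], vectors[w2]) > threshold
    }

edges = similarity_edges(guess_vectors)
# "dark" and "night" point in nearly the same direction, so they are linked;
# "bright" remains an isolated node (a semantic outlier).
```

The edge set would then be handed to a graph-drawing library for the interactive visualization; the threshold controls how tightly the swarm's clusters cohere.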

In my paper I consider how such paratexts can facilitate a new kind of hermeneutics---one that is statistical but not positivist. An individual reader may rewrite a poem herself before quite literally measuring her version against and within a swarm of possibilities. I theorize this mode of reading in light of Bernard Stiegler's concept of "transindividuation." The paper concludes by exploring how Swarm Reading can serve not just a hermeneutic function but also an editorial one, organizing and combining the poetic labor of crowdsourced poets and forming new texts according to different "objective functions" qua reading strategies.
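The editorial function described above, recomposing a text from the swarm's guesses according to different "objective functions," can be sketched as follows. Both function names and the selection rules are assumptions for illustration; the paper does not specify particular strategies.

```python
from collections import Counter

# Hypothetical example: crowdworkers' guesses for a single missing word.
guesses = ["dark", "dark", "dark", "night", "bright"]

def consensus(guesses):
    """Objective function 1 (assumed): pick the swarm's most common guess."""
    return Counter(guesses).most_common(1)[0][0]

def outlier(guesses):
    """Objective function 2 (assumed): pick a rarest guess,
    a reading strategy that favors the swarm's margins."""
    return min(Counter(guesses).items(), key=lambda kv: kv[1])[0]
```

Each objective function yields a different "edition" of the poem, so the choice of function is itself a reading strategy.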
Postdoctoral Fellow, Neukom Institute, Dartmouth College
