In my opinion, data encompasses anything that can be considered information. Information varies in type and scope, and by looking at it from different angles we can reach different conclusions about its nature. Distant reading, or “macroanalysis,” moves beyond the minutiae of individual works toward a more general understanding of a larger class, such as a genre or time period [1]. Just as close, detailed reading has its merits in understanding the implications of a particular work, macroanalysis can illuminate that work’s context. Take, for example, the use of macroanalysis to identify J.K. Rowling as the author of the crime novel “The Cuckoo’s Calling” [2]. The book was released under a pseudonym, but by comparing it to Rowling’s other books using macroanalysis techniques, such as comparing word lengths and word adjacency, analysts identified her as the likely author.

Projects like Google N-Grams and roadtrip maps are useful because they give visual context to a large amount of data. As a result, we can see relationships that would not be so easily spotted in close reading. In the n-gram project, we can see how the use of words varies across time periods in literature. We can draw conclusions from the rise and fall of a word over time, like the increased use of the word “cities” during the industrial period. Projects like these augment scholarship in terms of scope: they allow us to step further back and approach genres rather than particular pieces of literature. I don’t think they necessarily augment reality, but they provide a new way of visualizing it.
[1] http://www.matthewjockers.net/2011/07/01/on-distant-reading-and-macroanalysis/
[2] http://www.telegraph.co.uk/culture/books/10178344/JK-Rowling-unmasked-as-author-of-acclaimed-detective-novel.html