The most vivid image I keep from our last seminar session is that of the transforming shape-sorting cube: the analogy of how a search engine creates a result (could we say knowledge?) from our search questions, much as a toddler's cube toy might change the shape of a 2D hole in its surface to fit exactly the 3D object we try to push inside. I am curious about this idea and would like to know more about how the shapes change.
One question stuck in my mind (perhaps I have not found or created the right-shaped hole to insert it in): how is the original format in which a text was created and published taken into consideration in quantitative literary studies? As part of Jim's question about what quantitative literary studies are, Emma mentioned the need to remove all the "bookness" of a text to conduct quantitative analysis. But I wonder whether that is completely possible to achieve. When a text is created, the format in which it is planned to be distributed must somehow affect its structure, choice of words, and meaning, even if only through the limitations of length. For example, Charles Dickens's serialized writing in periodicals necessarily conditioned the way he conceived his work and shaped the final result.
Many aspects of Bode's article seemed unfamiliar to me, but one I could relate to more was the reference to histories of transmission and the "infrastructure of knowledge-making," seen as a "process in which meaning is inevitably transformed, if not lost entirely." Transmitting static knowledge seems difficult to imagine, and I am even skeptical that meaning can ever be completely defined. Still, I find myself aspiring to uncover the real version of a past event, or its purest evidence. If this is not possible, what should we aim for instead?