{"title":"The Logic of the Big Data Turn in Digital Literary Studies","authors":"J. Ganascia","doi":"10.3389/fdigh.2015.00007","DOIUrl":null,"url":null,"abstract":"The Digital Humanities, and especially the literary side of the Digital Humanities, i.e., Digital Literary Studies, propose systematic and technologically equipped methodologies in activities where, for centuries, intuition and intelligent handling had played a predominant role. The recent “big data” turn in the natural and social sciences has been particularly revealing of how these new approaches can be applied to traditional scholarly disciplines, such as literary studies. In so doing, big data can renew, with the use of computers, the Humanities, i.e., the disciplines rationally studying humanworks and cultural production. Digital Literary Studies are emblematic of these new approaches, certainly because they constitute the oldest subfield of the Digital Humanities, as some early projects like the Trésor de la Langue Française attest but also because they are the domain in which the intellectual stakes of mass digitization has already been extensively used and debated as demonstrated by Franco Moretti’s Graphs, Maps, Trees (Moretti, 2005), for instance. Some view this evolution enthusiastically as a shift toward the “hard” sciences. This is the case of Matthew Jockers who affirms in the chapter entitled “Revolution” of his book Macroanalysis (Jockers, 2013) that: “Now, slowly and surely, the same elements that have had such an impact on the sciences are revolutionizing the way that research in the humanities get done” (p. 10). Further on, he declares that literary methodology is “in essence no different from the scientific one” (p. 13). Others assert that some questions cannot be dealt with using the same methods in the humanities and the natural sciences, like physics or biology. That is the case of Stephen Ramsay, who, in Reading Machines (Ramsay, 2011), assures us that, even if some problems in the Humanities, like authorship identification, can clearly find comfort with themethods developed by the natural sciences, for most literary critical endeavors, such as characterizing the subjectivity of Virginia Wolf in her novel The Waves, for instance, it is not possible to clearly identify a set of “falsifiable” facts. Between these two extremes, many scholars provide convincing illustrations of what digitization allows and then discuss the nature and current evolution of the Humanities in general, and literary studies in particular. TheCompanion toDigital Humanities (Schreibman et al., 2004), theCompanion to Digital Literary Studies (Siemens and Schreibman, 2008), and more recently an excellent online MLACommons anthology dedicated to Literary Studies in the Digital Age (Price and Siemens, 2013) all provide various and enriching views on these topics. We attempt here to conciliate the two above-mentioned and apparently antagonistic views with the help of a philosophical approach. More precisely, our Grand Challenge is in the service of establishing solid epistemological foundations for the Digital Humanities, which is necessitated by the increasingly important role attributed to digital tools in humanistic research. 
We also claim that employing a conceptual apparatus originally built by German neo-Kantian philosophers at the beginning of the twentieth century, in particular by Heinrich Rickert and Ernst Cassirer, seems particularly relevant today with the emergence of “big data,” primarily because the logical nature of the possible inferences drawn from this sort of data needs to be clarified.","PeriodicalId":227954,"journal":{"name":"Frontiers Digit. Humanit.","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers Digit. Humanit.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/fdigh.2015.00007","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The Digital Humanities, and especially the literary side of the Digital Humanities, i.e., Digital Literary Studies, propose systematic and technologically equipped methodologies in activities where, for centuries, intuition and intelligent handling had played a predominant role. The recent “big data” turn in the natural and social sciences has been particularly revealing of how these new approaches can be applied to traditional scholarly disciplines, such as literary studies. In so doing, big data can renew, with the use of computers, the Humanities, i.e., the disciplines that rationally study human works and cultural production. Digital Literary Studies are emblematic of these new approaches, certainly because they constitute the oldest subfield of the Digital Humanities, as early projects like the Trésor de la Langue Française attest, but also because they are the domain in which the intellectual stakes of mass digitization have already been extensively explored and debated, as demonstrated by Franco Moretti’s Graphs, Maps, Trees (Moretti, 2005), for instance. Some view this evolution enthusiastically as a shift toward the “hard” sciences. This is the case of Matthew Jockers, who affirms in the chapter entitled “Revolution” of his book Macroanalysis (Jockers, 2013) that “Now, slowly and surely, the same elements that have had such an impact on the sciences are revolutionizing the way that research in the humanities gets done” (p. 10). Further on, he declares that literary methodology is “in essence no different from the scientific one” (p. 13). Others assert that some questions cannot be dealt with using the same methods in the humanities as in the natural sciences, such as physics or biology. That is the case of Stephen Ramsay, who, in Reading Machines (Ramsay, 2011), assures us that, even if some problems in the Humanities, like authorship identification, can comfortably be addressed with the methods developed by the natural sciences, for most literary-critical endeavors, such as characterizing the subjectivity of Virginia Woolf in her novel The Waves, it is not possible to clearly identify a set of “falsifiable” facts. Between these two extremes, many scholars provide convincing illustrations of what digitization allows and then discuss the nature and current evolution of the Humanities in general, and of literary studies in particular. The Companion to Digital Humanities (Schreibman et al., 2004), the Companion to Digital Literary Studies (Siemens and Schreibman, 2008), and, more recently, an excellent online MLA Commons anthology dedicated to Literary Studies in the Digital Age (Price and Siemens, 2013) all provide varied and enriching views on these topics. We attempt here to reconcile the two above-mentioned and apparently antagonistic views with the help of a philosophical approach. More precisely, our Grand Challenge is to establish solid epistemological foundations for the Digital Humanities, a task made necessary by the increasingly important role attributed to digital tools in humanistic research. We also claim that employing a conceptual apparatus originally built by German neo-Kantian philosophers at the beginning of the twentieth century, in particular by Heinrich Rickert and Ernst Cassirer, seems particularly relevant today with the emergence of “big data,” primarily because the logical nature of the inferences that can be drawn from this sort of data needs to be clarified.