Chapter 6. On Liquid Books and Fluid Humanities

6.1 From Orality to Fixity?

In line with the general discourse surrounding the history of the book I discussed previously, the main debate concerning the development of fixity focuses on whether a book can ever be defined as a stable text; and, if so, whether this quality of stability and fixity is an intrinsic element of print—or, to a lesser extent, of manuscripts—or whether it is something that has been imposed on the printed object by historical actors.

As I established earlier, Eisenstein is a proponent of the former view. She sees standardisation and uniformity as properties of print culture, properties that were usually absent in a predominantly scribal environment (1979: 16). Where Eisenstein emphasises the fixity brought about by printing in comparison to the scribal culture that preceded it, Ong focuses more on the relationship between orality and literacy, specifically on the differences in mentality between oral and writing cultures. The shift from orality to writing, he argues, is essentially a shift from sound to visual space, with print chiefly affecting the use of the latter. Writing locks words into a visual field—as opposed to orality, where language is much more flexible (Ong 1982: 11). In oral culture, language is fluid and stories are adapted according to the situation and the specific audience, knowledge being stored in mnemonic formulas of repetition and cliché (Ong 1982: 59). With writing these elaborate techniques were no longer necessary, freeing the mind for more abstract and original thinking (Ong 1982: 24). For Ong, it is thus writing and literacy that are inherently connected to fixity and stability: he argues, for instance, that scientific thinking is also a result of writing.

Eisenstein, however, emphasises that fixity could only really come about with the development of print. Hand copying of manuscripts was based on luck or chance, as the survival of a book or text depended on the shifting demand for copies by local elites, on copies being made by interested scholars, and on the availability and skills of scribes. Copies were also not always ‘identical’ or identically multiplied, as hand-copying often led to variants in the text copied (Eisenstein 1979: 46). No manuscript at that time could thus be preserved without undergoing corruption by copyists. The long-term preservation of these unique objects also left a lot to be desired: the use of manuscripts led to wear and tear, while moisture, vermin, theft and fire all meant that ‘their ultimate dispersal and loss was inevitable’ (Eisenstein 1979: 114). Although printing required the use of paper, which is much less durable than either parchment or vellum, the preservative powers of print lay mainly in its strategy of conservation by duplication and making public: printing a lot of books and spreading them widely proved a viable preservation strategy.

In The Printing Press as an Agent of Change, Eisenstein analyses how print influenced many aspects of scholarship and science. Print influenced the dissemination, standardisation, and organisation of research results, but it also impacted upon data collection and the preservation, amplification and reinforcement of science (Eisenstein 1979: 71). Books became much cheaper and a more varied selection was available, to the benefit of scholars. Print encouraged the transition from the wandering to the sedentary scholar and stimulated the cross-referencing of books. Increasingly, printers also began standardising the design of books. They started by experimenting with the readability and classification of data in books, introducing title pages, indexes, running heads, footnotes, and cross-references (Eisenstein 1979: 52, Ong 1982: 121–123). Nonetheless, as McLuhan, Eisenstein and Ong among others have made clear, scholars benefitted most from the standardisation of printed images, maps, charts, and diagrams, which had previously proven very difficult to multiply identically by hand. This was essential for the development of modern science (McLuhan 1962: 78, Ong 1982: 124). As McLuhan argues, print enhanced visuality over audile-tactile culture, creating a predominantly visual world and promoting homogeneity, uniformity and repeatability (1962: 24).

McLuhan speaks in this respect of the frontier of two cultures and of conflicting technologies, which have led to the typographic and electronic revolutions, as he calls them. Eisenstein similarly points out that printing, through its powers of precise reproduction, helped spread a number of cultural revolutions (i.e. the Renaissance, the Reformation and the Scientific Revolution); revolutions that were, as Eisenstein claims, essential in the shaping of the modern mind (1979: 170–172). Febvre and Martin also explore the influence of the book on the Renaissance and the Reformation, analysing print’s causes and effects as part of a socio-economic history of book production and consumption over a long period of time. Being slightly more cautious, they wonder how successful the book has been as an agent for the propagation of new ideas (Febvre and Martin 1997: 9). They see preservation through duplication and (typographic) fixity as basic prerequisites for the advancement of learning, agreeing that it was print that gave the book a permanent and unchanging text (Febvre and Martin 1997: 320). However, printing for them is just part of a set of innovations. The printing press is only one of a number of actors in the general social and political history they try to reconstruct.

Although Eisenstein acknowledges this plurality of actors, in her view print was the main agent of change behind the revolutionary developments detailed above. Print of course built on previous achievements; its preservative powers, however, were more permanent than those of the systems that preceded it. As Eisenstein emphasises, print revolutionised these earlier systems. Even though the early modern hand press did not of course meet modern standards of duplication, its development still meant that early printed books were more fixed and standardised than hand-copied manuscripts (Eisenstein 1979: 345–346). Where scribal copying ultimately led to more mistakes and corruption of the text, successive print editions allowed for corrections and improvements to be made, so with fixity came ‘cumulative cognitive advance’ (Eisenstein 1979: 432). Even if the printing press also multiplied and accelerated errors and variants—and many errata had to be issued—the fact was that errata could now be issued. Print thus made corruption more visible at the same time (Eisenstein 1979: 80). Texts were now sufficiently alike for scholars in different regions to correspond with each other about what was, to all intents and purposes, a uniform text. Networks of correspondents were created, which in turn led to new forms of feedback that had not been possible in the age of scribes. This again was an influence on the scientific method, and on the modern idea of scientific cooperation. Print, however, went further than merely encouraging popularisation and propaganda and the spreading of new ideas (Eisenstein 1979: 454). It was the availability of, and access to, diverse materials that was really revolutionary.

Permanence was also able to bring about progressive change, where ‘the preservation of the old (…) launched a tradition of the new’ (Eisenstein 1979: 124). From valuing the ancients, the emphasis increasingly came to be placed on admiring the new. Classical texts were recovered through print, which offered adequate equipment to systematically explore and classify antiquity. According to Eisenstein, the communications revolution created a ‘fixed distance in time’, influencing the development of a modern historical consciousness. McLuhan similarly claims that with print a fixed point of view became possible, where print fosters the separation of functions and a specialist outlook (1962: 175). Eisenstein confesses that it is hard to establish how exactly printed materials affected human behaviour; nonetheless, we have to understand how greater access to a greater abundance of records, and the standardisation brought about by printing, influenced the literate elite (1979: 8). Printing standardised vernacular languages and led to the nationalisation of politics (political documents increasingly being written in the vernacular) and the fragmentation of Latin. Drawing further on McLuhan, Eisenstein also shows how the thoughts of readers are guided by the way the contents of books are arranged and presented. Basic changes in book format thus lead to changes in thought patterns. Standardisation helped to reorder the thought of all readers, and a new ‘esprit de système’ was developed (including systematic cataloguing and indexing) which proved of the utmost importance for the commercial book trade. Booksellers’ lists were created to promote works and attract customers, for instance. Eisenstein also makes a clear claim for the importance of print to the development of the Reformation. The press was the ultimate propaganda machine. However, Eisenstein points out that print not only diffused Reformation views but also shaped them. Where print stabilised ‘the bible’ (and scholars were being provided with Greek and Hebrew texts), its availability in vernacular languages changed who read the bible and how they read it (Eisenstein 1979: 326).

As we have established previously, in opposition to Eisenstein’s arguments for the agency of print, Adrian Johns emphasises that it is not printing per se that possesses preservative power, but the way printing is put to use in particular ways. He states that knowledge as we understand it today has come to depend on stability; however, such stability has not always been prevalent. It is not easy for us to imagine a realm in which printed records were not necessarily authorised or faithful, Johns remarks. What could one know in such a realm, and how could one know it? (Johns 1998: 5). If we reassess the way print has been ‘constructed’, we can contribute to our historical understanding of the conditions of knowledge itself and how knowledge emerged (Johns 1998: 6). Printed books do not in themselves contain attributes of credibility and fixity; these are features that take much work to maintain. According to Johns, it was the social system then in place, not the technology, which needed to change first in order for the printing revolution or print culture to gain ground.

Johns brings the cultural and the social to the centre of our attention through his interest in the roles of historical figures (i.e. readers, authors and publishers) in bringing about fixity (1998: 19–20). He argues that Eisenstein neglects the labours through which fixity was achieved, to the extent that she describes what Johns sees as the results of those labours as powers or agency intrinsic to texts instead (Johns 1998: 19). For Johns, then, fixity is not an inherent quality but a transitive one; fixity exists only inasmuch as it is recognised and acted upon by people—and not otherwise. In this sense, fixity is the result of manifold representations, practices and, most importantly, conflicts and struggles that arise out of the establishment of different print cultures.

Chartier similarly argues against the direct influence of print on readers’ consciousness. Chartier is interested in the effects of meaning that books as material forms produce, forms that in his view do not impose, but command uses and appropriations (1994: viii–ix). This means that works have no stable, universal, or fixed meaning as they are invested with plural and mobile significations that are constructed in the encounter between a proposal and a reception. Chartier sees it as part of his work as a historian to reconstruct the variations in what he calls the ‘espaces lisibles’, the texts in their discursive and material forms, and the variations that govern their effectuation. According to Chartier, books aim at installing an order during their whole production process: there is the order of the author’s intentions, of the institution or authority which sponsored or allowed the book, and there is the order that is imposed by the materiality or the physical form of the book, via its diverse modalities. Chartier’s route map to a history of reading is based on the paradox of the freedom of the reader versus the order of the book. How is the order of the book constructed and how is it subverted through reading? Reception and decipherment of material forms again take place according to the mental and affective schemes that make up the culture of communities of readers. In this respect Chartier is interested in the relationship between the text, the book, and the reader (1994: 10).

Although Johns acknowledges that print to some extent led to the stabilisation of texts, he questions ‘the character of the link between the two’ (1998: 36). For him, printed texts were not intrinsically trustworthy, nor were they seen as self-evidently creditable in early modern times, when piracy, plagiarism and other forms of ‘impropriety’ were widespread. This meant that the focus was not so much on ‘assumptions of fixity’, as Johns calls it, but on ‘questions of credit’ and on the importance of trust in the making of knowledge (1998: 31). Print culture came about through changes in the conventions of civility and in the practice of investing credit in materials (i.e. through the historical labours of publishers, authors and readers) as much as through changes in technology (Johns 1998: 35–36). Johns is therefore interested in how knowledge was made (where knowledge is seen as contingent). How did readers decide what to believe?

Reading practices were crucial when it came to the appraisal of books. Especially with respect to the issue of piracy, the credibility of print became a significant issue, one with both economic and epistemic implications (Johns 1998: 32). Charges of piracy could lead to allegations of plagiarism (as Johns notes, ‘they were seldom just claims of piracy’), which meant that such charges had direct implications for the reputation of authors as well as threatening the credibility attributed to their ideas. Piracy was always in some way accompanied by accusations of appropriation and (textual) corruption—that is, of the violation of virtues and propriety—which would put at risk a scholar’s authorship, knowledge, and livelihood, as well as those of a publisher or bookseller (Johns 1998: 460). Piracy thus affected both ‘the structure and content of knowledge’ (Johns 1998: 33).

As discussed in previous chapters, the character of a printer or Stationer was very influential in the establishment of trust or credit. This trust was related to a respect for the principle of copy, meaning the recognition of another (printer’s) prior claim to the printing of a work, based on a repudiation of piracy. As Johns shows, the name of the Stationer on a book’s title page could tell a prospective reader as much about the contents as could that of the author (1998: 147). The character of booksellers mattered, too, as they determined what appeared in print and what could be bought, sold, borrowed, and read. Readers thus assessed printed books according to the places, personnel, and practices of their production and distribution. To contemporaries, the link between print and stable or fixed knowledge seemed far less secure, not least because a certain amount of creativity (i.e. textual adaptation) was essential to the Stationer’s craft. Piracy was not unfamiliar either: it was far more common than certainty or uniform editions were. Furthermore, pirates were not a distinguishable social group, existing as they did at all ranks of the Stationers’ community, and at times they were among its most prominent and ‘proper’ members, Johns explains (1998: 167). It is important in this respect to realise that piracy was not attached to an object; it was used as a category or a label to cope with print, as a tactic to construct and maintain truth-claims.

The reliability of printed books thus depended in large part on representations of the larger Stationers’ community as proper and well ordered (Johns 1998: 624). This clashed with the characteristic feature of the Stationers’ Commonwealth, namely uncertainty, where print culture was characterised by endemic distrust, conspiracies and ‘counterfeits’. The concept of piracy was used as a representation of these cultural conditions and practices as they prevailed in the domain of print. With this uncertainty it became clear that the achievement of print-based knowledge, as well as of authorship, was transient (Johns 1998: 187). Yet readers did come to trust and use print, as books were of course produced, sold, read, and put to use, meaning that the epistemological problems of reading them were, in practice, overcome. Trust could become possible, Johns argues, because of a disciplining regime—including elaborate mechanisms to deal with all the problems of piracy—brought about by publishers, booksellers, authors and the wider realm of institutions and governments, as exemplified for Johns by the Stationers’ Company. Licensing, patenting and copyright were similarly machineries for producing credit. But the register set up by the Royal Society—which became one of the defining symbols of experimental propriety in the Society itself—and the Philosophical Transactions, which came to function as its brand abroad, were likewise achievements that required strenuous efforts to discipline the processes of printing and reading (Johns 1998: 623). With this regime in place, Johns claims, trust in printed books could become a routine possibility (1998: 188). As he explains, however, struggles over power arose regarding who got to decide on or govern these social mechanisms for generating and protecting credit in printed books, displaying the complex interactions of piracy, propriety, political power, and knowledge. Conflicts arose over the implementation of patents and/or copyright, and over the different consequences that would follow from a print culture governed by one specific entity (e.g. the Stationers or the crown, for Johns). These conflicts held, according to Johns, ‘the potential for a fundamental reconsideration of the nature, order, and consequences of printing in early modern society’ (1998: 258–259).

6.2 Fluid Publishing

As becomes clear from the discourse sketched above, a combination of technological, formal, and cultural factors (as well as discursive, practical and institutional ones) has brought about a certain semblance of fixity, trust and endurance, together with a number of conventions related to the preservation of the printed book. It is these conventions, or the disciplining regime Johns talks about, that have privileged certain cuts in intra-action with the book’s material becoming. With the growing use and importance of the digital medium in scholarship, one could argue that the book’s material becoming has altered. However, it is in interaction with the established disciplining regime that its development has been structured. An increasing interest in the communication and publishing of humanities research in what can be seen as a less fixed and more open way has nonetheless challenged the integrity of the book—an integrity that the system surrounding the book has tried so hard to develop and maintain.

Why are this disciplining regime, and the specific print-based stabilisations it promotes, being interrogated at this particular point in time? First of all, and as was made clear by the history provided above, in order to answer this question we need to keep in mind that this regime has seen a continuing power struggle over its upkeep and constituency, and as such has always been disputed. Nonetheless, changes in technology, and in particular the development of digital media, have acted as a disruptive force, especially since much of the discourse surrounding digital media, culture and technology tends to promote a narrative of openness, fluidity and change. This specific moment of disruption and remediation therefore brings with it an increased awareness that the semblances of fixity created and upheld in, and by, the printed medium are a construct, maintained to sustain certain established institutional, economic and political structures (Johns 1998). This has led to a growing awareness that these structures are formations we can rethink and perform otherwise. All of which may explain why there is currently a heightened interest in how we can intra-act with the digital medium in such a way as to explore potential alternative forms of fixity and fluidity, from blogs to multimodal publications.

The construction of what we perceive as stable knowledge objects serves certain goals, mostly to do with the establishment of authority, preservation (archiving), reputation building (stability as threshold) and commercialisation (the stable object as a reproducible product). In Writing Space: Computers, Hypertext, and the Remediation of Print (2001), Bolter conceptualises stability (as well as authority) as a value under negotiation, and as the product of a certain writing technology: ‘it is important to remember, however, that the values of stability, monumentality and authority, are themselves not entirely stable: they have always been interpreted in terms of the contemporary technology of handwriting or printing’ (2001: 16). This acknowledgment of the relative and constructed nature of stability, and of the way we presently cut with and through media, encourages a closer analysis of the structures underlying our knowledge and communication system and of how they are set up at present: who is involved in creating a consensus on fixity and stability, and what is valued and what is not in this process?

It could therefore be argued that it is the specific cuts, or forms of fixing and cutting down, of scholarship that are being critiqued at the moment, while the potential of more processual research is being explored at the same time: for example, via the publication of work in progress on blogs or personal websites. The ease with which continual updates can be made has brought into question not only the stability of documents but also the need for such stable objects. Wikipedia is one of the most frequently cited examples of how the speed with which factual errors can be corrected, and the efficiency of real-time updating in a collaborative setting, can win out over the perceived benefits of stable material knowledge objects. There has perhaps been a shift away from the need for fixity in scholarly research and communication towards the importance of other values such as collaboration, quality, speed and efficiency, combined with a desire for more autonomous forms of publishing. Scholars are using digital media to explore the possibilities for publishing research in more direct ways, often cutting out the traditional middlemen (publishers and libraries) that have become part of the print disciplining regime they aim to critique. Accordingly, they are raising the question: do these middlemen still serve the needs of their users, of scholars as authors and readers? For example, the desire for flexibility, speed, autonomy and so on has caused new genres of formal and informal scholarly communication to arise; a focus on openness and fluidity is seen as having the potential to expand academic scholarship to new audiences; digital forms of publishing have the potential to include informal and multimodal scholarship that has not been communicated particularly extensively before; and new experimental publishing practices are assisting scholars in sharing research results and forms of publication that cannot exist in print, because of their scale, their multimodality, or even their genre. Making the processual aspect of scholarship more visible—which includes the way we collaborate, informally communicate, review, and publish our research—and highlighting not only the successes but also the failures that come with that, has the potential to demystify the way scholarship is produced.

From blogging software and social media, to mailing lists and institutional repositories, scholars have thus increasingly moved to digital media and the Internet to publish both their informal and formal research in what they perceive as a more straightforward, direct and open way. This includes the mechanisms developed for the more formal publication of research I discussed in the previous chapter, via either green (archiving) or gold (journal publishing) open access. Nonetheless, the question remains whether these specific open forms of publishing have really produced a fundamental shift away from fixity. In this section I therefore want to draw attention to a specific feature of openness—a feature that can in many ways be seen as one of its most contested aspects (Adema 2010: 60)—namely, the possibility to reuse, adapt, modify and remix material.[1] It is this part of the ethos or definition of openness (libre more than gratis)[2] that can be said to most actively challenge the concepts of stability, fixity, trust and authority that have accompanied the rhetoric of printed publications for so long (Johns 1998). Where more stripped-down versions of openness focus on achieving access, and on doing so in a way that need not threaten the stability of a text or product (indeed, the open and online distribution of books might even promote their fixity and durability, owing to the enlarged availability of digital copies in multiple places), libre openness directly challenges the integrity of a work by enabling different versions of a work to exist simultaneously. At the same time, libre forms of openness also problematise such integrity by offering readers the opportunity to remix and re-use (parts of) the content in different settings and contexts, from publications and learning materials, to translations and data mining. Within academia this creates not only practical problems (which version to cite and preserve, who is the original author, who is responsible for the text), it creates theoretical problems too (what is an author, in what ways are texts ever stable, where does the authority of a text lie?). Fitzpatrick discusses the ‘repurposing’ of academic content in this regard:

What digital publishing facilitates, however, is a kind of repurposing of published material that extends beyond mere reprinting. The ability of an author to return to previously published work, to rework it, to think through it anew, is one of the gifts of digital text’s malleability—but our ability to accept and make good use of such a gift will require us to shake many of the preconceptions that we carry over from print. (2011: 2)

The ability to expand and build upon, to make modifications and create derivative works, to appropriate, change and update content within a digital environment, also has the potential to shift the focus in scholarly communication away from the product of our publishing and onto the process of researching. It is a shift that, as I discussed previously in this section, may make us more aware of the contingency of our research and of the cuts and boundaries we enact, and that are enacted for us, when we communicate and disseminate our findings. It is this shift away from models of print stability and towards process and fluidity (including the necessary cuts) that I want to focus on here, in order to explore some of the ways in which both the practical and theoretical problems posed by this development are being dealt with at this moment in time, and whether they should or can be approached differently.

To investigate these potential features of openness, the following section on Remixing Knowledge will analyse a variety of theoretical and practical explorations of fluidity, liquidity and remix, focusing specifically on scholarly research in a digital context. The aim is to examine some of the ways in which scholars within the humanities are dealing with these issues of fluidity and versioning, especially where they concern the scholarly book. This section therefore looks at theories and performative practices that have tried to problematise ideas such as authorship and stability by critically exploring concepts of the archive, selection and agency. At the same time it will offer a critique of these theories and practices and of the way they still mostly adhere to fixtures and boundaries—such as authorship and copyright—that were created within the print paradigm, thus maintaining established institutions and practices. My aim in offering such a critique is to push forward our thinking on the different kinds of cuts and stabilisations that are possible within humanities research, its institutions and practices; interruptions that are perhaps both more ethical and open to difference, and which are critical of both the print paradigm and of the promises of the digital.[3] How might these alternative and affirmative cuts enable us to conceive a concept of the book built upon openness, and with that, a concept of the humanities built upon fluidity?

6.2.1 Remixing Knowledge

The ability to reuse and remix data and research to create derivative works is a practice that challenges the stability of a text and puts into question its perceived boundaries.[4] Within a scholarly context the concept of derivative works also offers the potential to challenge the idea of authorship or, again, the authority of a certain text. The founding act of a work, that specific function of authorship described by Foucault in his seminal article ‘What is an Author?’ (1977), can be seen as becoming less important for both the interpretation and the development of a text once it goes through processes of adaptation and reinterpretation and the meaning given as part of the author function becomes dispersed. In this section I therefore want to focus on three alternatives to authorship, authority and stability as put forward in discussions on remix; alternatives that, I will argue, are important for knowledge production in the humanities. I will first briefly discuss the concept of modularity, before proceeding to the concept of the fluid text and, related to that, the agency of the selector or moderator; and finally to the concept of the (networked) archive, by looking at the work of remix theorists Lev Manovich and Eduardo Navas, among others, as well as the writing of the textual critic John Bryant.

6.2.1.1 Modularity

Media theorist Lev Manovich discusses the concept of modularity extensively in his research on remix. He explores how, with the coming of software, a shift has taken place in the nature of what constitutes a cultural object, where cultural content no longer has finite boundaries. Content is no longer received by the user, in Manovich’s vision, but is traversed, constructed and managed. With the shift away from stable environments in a digital online environment, he argues that there are no longer senders and receivers of information in the classical sense; there are only temporary reception points in information’s path through remix. Culture for Manovich is therefore a product that is constructed by maker and consumer alike, and that is actively being modularised by users to make it more adaptive (2005). In other words, culture is not modular; it is (increasingly) made modular in digital environments. However, the real remix revolution lies not in this kind of agency provoked by the possession of production tools. According to Manovich it lies in the possibility this generates to exchange information between media—what in Software Takes Command he calls ‘deep remixability’. Here, Manovich describes a situation in which modularity is increasingly being extended to media themselves. The remixing of various media has now become possible in a common software-based environment, along with a remixing of the methodologies of these media, offering the possibility of mash-ups of text with audio and visual content and expanding the range of cultural and scholarly communication (Manovich 2008).

In his writings on remix, Manovich thus sketches a rather utopian future (one that does not take into account present copyright regimes, for instance) in which cultural forms will be deliberately made from Lego-like modular building blocks, designed to be easily copied and pasted into new objects and projects. For Manovich, these forms of standardisation function as a strategy to make culture freer and more shareable, with the aim of creating an ecology in which remix and modularity are a reality. In this respect, ‘helping bits move around more easily’ is for Manovich a method for devising new ways of performing cultural analysis (2005). These concepts of modularisation and of recombinable data-sets offer a way of looking beyond static knowledge objects, presenting an alternative view of how we structure and control culture and data, as well as of how we can analyse our ever-expanding information flows. With the help of his software-based concepts, he thus examines how remix can be an active stance through which people will be able to shape culture in the future and deal with knowledge objects in a digital context.

Within scholarly communication the concept of modularity has already proved popular when it comes to making research more efficient and coping with information overload: from triplets[5] and nano-publications[6] to forms of modular publishing, these kinds of software-inspired concepts have mostly found their way into scientific publishing. Instead of structuring scholarly research according to linear articles, for instance, Joost Kircz argues that we should have a coherent set of ‘well-defined, cognitive, textual modules’ (1998). Similarly, Jan Velterop and Barend Mons suggest moving towards nano-publications to deal with information overload, which can be seen as a move in the direction of both more modularity and the standardisation of research outcomes (Groth et al. 2010).
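To give a concrete sense of what such a triple-based nano-publication looks like, the following is a minimal sketch using the Python rdflib library. The claim, the author name and the example.org URIs are invented for illustration; real nano-publications follow a considerably richer formal model, with separate named graphs for assertion, provenance and publication information.

```python
# A minimal sketch of a nano-publication-style assertion using rdflib.
# All URIs and values below are invented for illustration.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")  # hypothetical namespace

g = Graph()

# The assertion itself: a single subject-predicate-object triple,
# the minimal modular unit of a nano-publication.
g.add((EX["malaria"], EX["isTransmittedBy"], EX["mosquitoes"]))

# Minimal provenance, attached here to a hypothetical claim identifier.
claim = EX["claim-001"]
g.add((claim, EX["assertedBy"], Literal("A. Researcher")))
g.add((claim, EX["assertedOn"], Literal("2010-01-01")))

print(g.serialize(format="turtle"))
```

The point of the sketch is simply that the unit of publication here is no longer a linear article but a machine-readable statement plus its provenance: modularity and standardisation taken to their logical extreme.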

There are, however, problems with applying this modular database logic to cultural objects. Of course, when culture is already structured and modular, this makes reuse and repurposing much easier. However, cultural objects differ, and it is not necessarily possible or appropriate to modularise or cut up a scholarly or fictional work. Nor are all cultural objects translatable into digital media objects. Hence, too strict a focus on modularity might be detrimental to our ideas of cultural difference. Tara McPherson formulates an important critique of modularity to this end. She is mostly interested in how the digital, privileging as it does a logic of modularity and seriality, became such a dominant paradigm in contemporary culture.[7] How did these discourses from coding culture translate into the wider social world? What is the specific relationship between code and context in this historical moment? How have code and culture become so intermingled? As McPherson argues, in the mid-20th century modular thinking took hold in a period that also saw the rise of identity politics and racial formations in the US, hyper-specialisation and the niched production of knowledge in the university, and forms of Fordist capitalism in economic systems—all of which represent a move toward modular knowledges. However, modular thinking, she points out, tends to obscure the political, cultural and social context from which it emerged. McPherson emphasises that we need to understand the discourses and peculiar histories that have created these forms of the digital and of digital culture, which encourage forms of partitioning. We also need to be more aware that cultural and computational operating systems mutually infect one another. In this respect, McPherson wonders: ‘how has computation pushed modularity in new directions, directions in dialogue with other cultural shifts and ruptures? Why does modularity emerge in our systems with such a vengeance across the 1960s?’ (2012). She argues that these forms of modular thinking, which function via a lenticular logic, offer ‘a logic of the fragment or the chunk, a way of seeing the world as discrete modules or nodes, a mode that suppresses relation and context. As such, the lenticular also manages and controls complexity’ (McPherson 2012: 25). We therefore need to be wary of this ‘bracketing of identity’ in computational culture, McPherson warns, where it holds back complexity and difference. She favours the application of Barad’s concept of the agential cut in these contexts, using it to replace bracketing strategies (which bring modularity back). For, as McPherson states, the cut as a methodological paradigm is more fluid and mobile (2014).

The concept of modularity as described by Manovich (where culture is made modular) does not seem able to guarantee these more fluid movements of culture and knowledge. The kind of modularity he is suggesting does not so much challenge object- and commodity-thinking as apply the same logic of stable and standardised cultural objects or works on another scale. Indeed, Manovich defines his modular Lego-blocks as ‘any well-defined part of any finished cultural object’ (2005). There is thus still the idea of a finished and bound entity (the module) at work here; it is only smaller, compartmentalised.

6.2.1.2 Fluid Environments and Liquid Publications

Where Manovich’s concept of modularity mostly criticises stability and fixity from a spatial perspective (dividing objects into smaller, recombinable blocks), within a web environment forms of temporal instability—where over time cultural objects change, adapt, get added to, are re-envisioned, enhanced and so on—are also increasingly being introduced. In this respect, experiments with liquid texts and fluid books not only stress the benefits and potential of processual scholarship, of capturing research developments over time and so forth; they also challenge the essentialist notions that underlie the perceived stability of scholarly works.

Textual scholar John Bryant theorises the concept of fluidity extensively in his book The Fluid Text: A Theory of Revision and Editing for Book and Screen (2002). Bryant’s main argument is that stability is a myth and that all works are fluid texts. As he explains, this is because fluidity is an inherent phenomenon of writing itself, where we keep on revising our words to approach our thoughts more closely, with our thoughts changing again in this process of revision. In The Fluid Text, Bryant displays (and puts into practice) a way of editing and doing textual scholarship that is based not on a final authoritative text, but on revisions. He argues that for many readers, critics and scholars, the idea of textual scholarship is designed to do away with the otherness that surrounds a work and to establish an authoritative or definitive text. This urge for stability is part of a desire for what Bryant calls ‘authenticity, authority, exactitude, singularity, fixity in the midst of the inherent indeterminacy of language’ (2002: 2). By contrast, Bryant calls for the recognition of a multiplicity of texts, or rather ‘the fluid text’. Texts are fluid in his view because the versions flow from one to another. For this he uses the metaphor of a work as energy that flows from version to version.

In Bryant’s vision this idea of a multiplicity of texts extends from the different material manifestations (drafts, proofs, editions) of a certain work to an extension of the social text (translations and adaptations). Logically this also leads to a vision of multiple authorship, where Bryant wants to give a place to what he calls ‘the collaborators’ of or on a text, so as to include those readers who also materially alter texts. For Bryant, with his emphasis on the revisions of a text and the differences between versions, it is essential to focus on the different intentionalities of both authors and collaborators. The digital medium offers the perfect opportunity to achieve this and to create a fluid-text edition. Bryant established such an edition—in both a print and an online version—for Melville’s Typee, showing how a combination of book format and screen can be used to effectively present a fluid textual work.[8]

For Bryant, this specific choice of a textual presentation focusing on revision is at the same time a moral choice. This is because, for him, understanding the fluidity of language enables us to better understand social change. Furthermore, constructionist intentions to pin a text down fail to acknowledge that, as Bryant puts it, ‘the past, too, is a fluid text that we revise as we desire’ (2002: 174). Finally, he argues that the idea of a fluid text encourages a new kind of critical thinking, one that is based on difference, otherness, variation and change. This is where the fixation on the idea of having a stable text to achieve easy retrieval and unified reading experiences loses out to a discourse that focuses on the energies that drive text from version to version. In Bryant’s words, ‘by masking the energies of revision, it reduces our ability to historicize our reading, and, in turn, disempowers the citizen reader from gaining a fuller experience of the necessary elements of change that drive a democratic culture’ (2002: 113).

Alongside Bryant’s edition of Melville’s Typee, another practical experiment focusing on the benefits of fluidity specifically for scholarly communication is the Liquid Publications (or LiquidPub) project.[9] As described by Casati, Giunchiglia, and Marchese (2007), this is a project that tries to put into practice the idea of modularity described previously. Focusing mainly on textbooks in the sciences, the aim of the project is to enable teachers to compose a customised and evolving book out of modular, pre-composed content. This book is then a ‘multi-author’ collection of materials on a given topic that can include different types of documents.

The LiquidPub project tries to cope with issues of authority and authorship in a liquid environment by making a distinction between versions and editions. Editions are solidifications of the liquid book, with stable and fixed content, which can be referred to, preserved, and made commercially available. Furthermore, the project creates different roles for authors, from editors to collaborators, accompanied by an elaborate rights structure, with the possibility for authors to give away certain rights to their modular pieces whilst holding on to others. As a result, the LiquidPub project is very pragmatic, catering to the needs and demands of authors (mainly for the recognition of their moral rights), while at the same time trying to benefit from, and create efficiencies and modularity within, a fluid environment. In this way it offers authors a choice of different ways to distribute content, from completely open, to partially open, to completely closed books.
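The version/edition distinction can be illustrated schematically. The following is a minimal sketch in Python; the names (LiquidBook, Edition, solidify) are my own invention, not the LiquidPub project’s actual software, and the sketch simply models a liquid book as mutable content from which fixed, citable snapshots can be extracted:

```python
# Illustrative sketch (not the LiquidPub project's actual code) of the
# distinction between a mutable liquid book and its fixed editions.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass(frozen=True)
class Edition:
    """A solidification of the liquid book: fixed, citable content."""
    number: int
    sections: Tuple[str, ...]  # immutable snapshot of the modules


@dataclass
class LiquidBook:
    title: str
    sections: List[str] = field(default_factory=list)  # mutable modules
    editions: List[Edition] = field(default_factory=list)

    def revise(self, section: str) -> None:
        """The liquid state: content can be added or reworked at any time."""
        self.sections.append(section)

    def solidify(self) -> Edition:
        """Freeze the current state into a stable, referenceable edition."""
        edition = Edition(len(self.editions) + 1, tuple(self.sections))
        self.editions.append(edition)
        return edition


book = LiquidBook("A Liquid Textbook")
book.revise("Chapter on modularity")
first = book.solidify()          # edition 1: fixed, preservable, sellable
book.revise("Updated chapter")   # the liquid book flows on regardless
```

Note how, even in this toy model, fluidity is only ever provisional: what gets cited, preserved and sold is the frozen edition, which is precisely the point of the critique that follows.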

Introducing gradations of authorship such as editors and collaborators, as proposed in the work of Bryant and in the LiquidPub project, is one way to deal with multiple authorship, or with authorship in collaborative research or writing environments. However, as I showed in chapter 3, it does not address the problem of how to establish authority in an environment where the contributions of a single author are difficult to trace; or where content is created by anonymous users or by avatars; or in situations where there is no human author, but where the content is machine-generated. What becomes of the role of the editor or the selector as an authoritative figure when selections can be made redundant and choices altered and undone by mass-collaborative, multi-user remixes and mash-ups? The projects mentioned above are therefore not so much posing a challenge to authorship, or questioning the authorship function as it is currently established, as they are merely applying this established function to smaller compartments of text and dividing them up accordingly.

Furthermore, the concept of fluidity as described by Bryant, together with the notion of liquidity as used in the LiquidPub project, does not significantly disturb the idea of object-like thinking or stability within scholarly communication either. For Bryant, a fluid book edition is still made up of separate, different versions, while in the LiquidPub project, which focuses mostly on an ethos of speed and efficiency, a liquid book is a customised combination of different recombinable documents. In this sense both projects adhere quite closely to the concept of modularity as described by Manovich (where culture is made modular), and therefore do not reach a fluid or liquid state in which the stability and fixity of a text is fundamentally reconsidered in a continual or processual manner. There is still the idea of the object (the module); it is only smaller, compartmentalised. Witness the way both projects still hinge on the idea of extracted objects: editions and versions. Bryant’s analysis, for example, is focused not so much on creating fluidity or a fluid text—however impossible this might be—but on creating a network between more or less stable versions, whilst showcasing their revision history. He thus still makes a distinction between works and versions, neither seeing them as part of one extended work, nor giving them the status of separate works. In this way he keeps hierarchical thinking alive: ‘a version can never be revised into a different work because by its nature, revision begins with an original to which it cannot be unlinked unless through some form of amnesia we forget the continuities that link it to its parent. Put another way, a descendant is always a descendant, and no amount of material erasure can remove the chromosomal link’ (Bryant 2002: 85). Texts here are not fluid, at least not in the sense of being process-oriented; they are networked at most. McKenzie Wark’s terminology for his book Gamer Theory—which Wark distinctively calls a ‘networked book’—might therefore be more fitting and applicable in such cases, where a networked book, at least in its wording, positions itself as located more in between the ideal types of stability and fluidity.[10]
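Bryant’s networked-version model—discrete versions that remain ‘chromosomally’ linked to a parent original—can itself be captured in a small sketch. This is again a hypothetical illustration (the class and field names are mine, not Bryant’s or Wark’s), and it shows what the critique above points at: such a model stores stable objects connected by links, rather than anything processually fluid.

```python
# A hypothetical sketch of a fluid text modelled, as Bryant does, as a
# network of discrete versions, each linked back to its parent.
from __future__ import annotations

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Version:
    """A stable version that keeps its 'chromosomal link' to a parent."""
    text: str
    parent: Optional[Version] = None

    def lineage(self) -> List[str]:
        """Trace a version back to its origin: a descendant is always
        a descendant."""
        chain: List[str] = []
        node: Optional[Version] = self
        while node is not None:
            chain.append(node.text)
            node = node.parent
        return chain


original = Version("Typee, first edition text")
revised = Version("Typee, revised text", parent=original)
adapted = Version("Typee, adaptation", parent=revised)

print(adapted.lineage())  # a network of stable versions, not a fluid flow
```

What the sketch makes visible is that nothing here ever flows: each node is a fixed object, and ‘fluidity’ is reduced to the traversal of links between them.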

A final remark concerning the way in which these two projects theorise, and put into practice, the fluid or liquid book: in both projects, texts are actively made modular or fluid by outside agents, by authors and editors. There is not a lot of consideration here of the inherent fluidity or liquidity that exists as part of the text or book’s emergent materiality, in intra-action with the elements of what theorists such as Jerome McGann and D.F. McKenzie have called ‘the social text’—which, in an extended version, is what underlies Bryant’s concept of the fluid text. In the social text, human agents create fluidity through the creation of various instantiations of a text post-production. As McKenzie has put it: ‘a book is never simply a remarkable object. Like every other technology, it is invariably the product of human agency in complex and highly volatile contexts’ (1999). McKenzie, in his exploration of the social text, sought to highlight the importance of a wide variety of actors—from printers to typesetters—in a text’s emergence and meaning-making. He did so in order to argue against a narrow focus on a text’s materiality or an author’s intention. However, there is a lack of acknowledgement here of how the processual nature of the book comes about through an interplay of agential processes of both a human and non-human nature.

Something similar can be seen in the work of Bryant, in that for him a fluid text is foremost fluid because it consists of various versions. Bryant wants to showcase material revision here, by authors, editors, or readers, among others. But this is a very specific—and humanist—understanding of the fluid text. For revision is, arguably, only one major source of textual variation or fluidity. In this sense, to provide some alternative examples, it is not the inherently emergent discursive materiality of a text, nor the plurality of material (human or machinic) reading paths through a text, that makes a text always already unstable for Bryant. What does make a text fluid for him is the existence of multiple versions brought into play by human and authorial agents of some sort. This is related to his insistence on a hermeneutic context in which fluid texts are representations of extended and distributed forms of intentionality. As I will ask in what follows, would it not be more interesting to conceive of fluidity, or the fluid text, as a process that comes about out of the entanglement and performance of a plurality of agentic processes: material, discursive, technological, medial, human and non-human, intentional and non-intentional? From this position, a focus on how cuts and boundaries are enacted within processual texts and books, in an inherently emergent and ongoing manner, might offer a more inclusive strategy for dealing with the complexity of a book’s fluidity. This idea will be explored in more depth toward the end of this chapter, when I take a closer look at Jerome McGann’s theories of textual criticism.

6.2.1.3 The Archive

As discussed in chapter 3, remix as a practice has the potential to raise questions for the idea of authorship as well as for the related concepts of authority and legitimacy. For example, do the moral and ownership rights of an author extend to derivative works? And who can be held responsible for the creation of a work when authorship is increasingly difficult to establish, as in music mash-ups or in data feeds, where users receive updated information from a large variety of sources? As I touched upon previously, one suggestion made in discussions of remix for coping with the problem of authorship in a digital context has involved shifting the focus from the author to the selector, moderator or curator. Similarly, in cases where authorship is hard to establish or even absent, the archive could potentially establish authority. Navas has examined both of these notions as potential alternatives to established forms of authority in an environment that relies on continual updates and where process is preferred to product. Navas stresses, however, that keeping a critical distance from the text is necessary to make knowledge possible and to establish authority. As authorship has been replaced by sampling—and ‘sampling allows for the death of the author’, according to Navas, as the origin of a tiny fragment of a musical composition becomes hard to trace—he argues that the critical position in remix is occupied by the person who selects the sources to be remixed. In mash-ups, however, this critical distance becomes increasingly difficult to uphold. As Navas puts it, ‘this shift is beyond anyone’s control, because the flow of information demands that individuals embed themselves within the actual space of critique, and use constant updating as a critical tool’ (2010).

To deal with the constantly changing present, Navas turns to history as a source of authority: the archive can retrospectively give legitimacy to fluidity. The ability to search the archive gives the remix both its reliability and its market value, Navas points out. Once recorded, information becomes meta-information: static, available when needed, and always in the same form. Retrospectively, this recorded state, this staticity of information, is what makes theory and philosophical thinking possible. As Navas claims, ‘the archive, then, legitimates constant updates allegorically. The database becomes a delivery device of authority in potentia: when needed, call upon it to verify the reliability of accessed material; but until that time, all that is needed is to know that such archives exist’ (2010).

At the same time, Navas is ambivalent about the archive as a search engine. He argues that in many ways it is a truly egalitarian space—able to answer ‘all queries possible’—but one that is easily commercialised too. What does it mean when Google harvests the data we collect, and when our databases are predominantly built upon social media sites? In this respect we are also witnessing a rise in the control of information flows (Navas 2010).

The importance of Navas’ theorising in this context lies in the possibilities his thinking offers for the book and the knowledge system we have created around it. First of all, he explores the archive as a way both of stabilising flow and of creating a form of authority out of fluidity and the continual updating of information. Additionally, he proposes the role of the one who selects, curates or moderates as an alternative to that of the author. In a way, one can argue that this model of agency is already quite akin to that found in scholarly communication, where the selection of resources and referral to other sources, alongside collection building, are part of the research and writing process of most academics. Manovich argues for a similar potential, namely the potential of knowledge producers to modularise data and make it adaptable within multiple media and various platforms, mirroring scientific achievements with standardised metadata and the semantic web.

These are all interesting steps towards thinking beyond the status quo of the book, challenging scholarly thinking to experiment with notions of process and sharing, and to let go of idealised ideas of authorship. Nonetheless, the archive as a tool poses some serious problems with respect to legitimating fluidity retrospectively and providing the necessary critical distance, as Navas positions it. For the archive as such does not provide any legitimation, but is built upon the authority and the commands that constitute it. This is what Derrida calls ‘the politics of the archive’ (1996). What is kept and preserved is connected to power structures: to the interests of those who decide what to collect (and on what grounds), and to the capacity to interpret the archive and its content when it is called upon for legitimation claims later on. The question of authority does not so much lie with the archive as with who has access to the archive and who gets to constitute it. At the same time, although it has no real power of its own to legitimise fluidity, the archive is used as an objectified extension of the power structures that control it. Furthermore, as Derrida shows, archiving is an act of externalisation, of trying to create stable abstracts (1996: 12). A still further critique of the archive is that, rather than functioning as a legitimising device, its focus is first and foremost on objectification, commercialisation and consumption. In the archive, knowledge streams are turned into knowledge objects when we order our research into consumable bits of data. As Navas has shown, the search engine, based on the growing digital archive we are collectively building, is Google’s bread and butter. By initiating large projects like Google Books, for instance, Google aims to make the world’s archive digitally available, or to digitise the ‘world’s knowledge’—or at least that part of it that Google finds appropriate to digitise (i.e. mostly works in American and British libraries, and thus mostly English-language works). In Google’s terms, this means making the information they deem most relevant—based on the specific programming of their algorithms—freely searchable, and Google partners with many libraries worldwide to make this service available. However, most of the time only snippets of poorly digitised information are freely available; for full-text functionality, or more contextualised information, books must be acquired via Google Play Books (formerly Google Editions), the company’s ebook store. This makes clear how fully search is embedded within a commercial framework in this environment.

The interpretation of the archive is therefore a fluctuating one, and the stability it seems to offer is, arguably, relatively selective and limited. As Derrida shows, the digital offers new and different ways of archiving, and thus also provides a different vision of what the archive constitutes and archives (from the perspective of the producer as well as that of the consumer) (1996: 17). Furthermore, the possibilities of archiving also determine the structure of the content that will be archived as it is becoming. The archive thus produces the event just as much as it records it. In this respect the archive is highly performative: it produces information, creates knowledge, and decides how we determine what knowledge will be. And the way the archive is constructed is very much shaped by institutional and practical constraints. For example, what made the Library of Congress decide to preserve and archive all public Twitter feeds from the platform’s inception in 2006, and why only Twitter and not other, similar social media platforms? The relationship of the archive to scholarship is a mutual one, as they determine one another. A new scholarly paradigm therefore also asks for, and creates, a new vision of the archive. This is why, as Derrida states, ‘the archive is never closed. It opens out of the future’ (1996: 45).[11] The archive therefore does not stabilise or guarantee any concept.

Foucault acknowledges this fluidity of the archive, seeing it as a general system of both the formation and transformation of statements. However, the archive also structures our way of perceiving the world, as we operate and see the world from within the archive. As Foucault states: ‘it is from within these rules that we speak’ (1969: 146). The archive can thus be seen as governing us, and this again directly opposes the idea of critical distance that Navas wants to achieve with his notion of the archive, as we can never be outside of it. Matthew Kirschenbaum argues along similar lines when he discusses the preservation of digital objects, pointing out that their preservation is ‘logically inseparable from the act of their creation’ (2013, emphasis in original). He explains this as follows:

The lag between creation and preservation collapses completely, since a digital object may only ever be said to be preserved if it is accessible, and each individual access creates the object anew. One can, in a very literal sense, never access the “same” electronic file twice, since each and every access constitutes a distinct instance of the file that will be addressed and stored in a unique location in computer memory. (Kirschenbaum 2013)

This means that every time we access a digital object, we duplicate it, we copy it. In our strategies of conservation, then, every access also (re)creates the object anew. Critical distance becomes impossible here, as we are actively involved in the archive’s functioning. As Kirschenbaum states, ‘the act of retrieval precipitates the temporary reassembling of 0’s and 1’s into a meaningful sequence that can be decoded by software and hardware’ (2013). Here the agency of the archive, of the software and hardware, also becomes apparent. Kirschenbaum refers to Wolfgang Ernst’s notion of archaeography, which denotes forms of machinic or medial writing, or as Ernst puts it, ‘expressions of the machines themselves, functions of their very mediatic logic’ (2011: 242). At this point archives become ‘active “archaeologists” of knowledge’ (Ernst 2011: 239), or as Kirschenbaum puts it, ‘the archive writes itself’ (2013).
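Kirschenbaum’s point can be made concrete with a minimal sketch (my illustration, not his). In Python, reading the ‘same’ file twice yields two distinct instances of it in memory; the file name below is a hypothetical stand-in for any archived digital object.

```python
# A small stand-in for an archived digital object.
with open("archive.txt", "wb") as f:
    f.write(b"the archive writes itself")

# Two successive accesses to the "same" file.
with open("archive.txt", "rb") as f:
    first_access = f.read()
with open("archive.txt", "rb") as f:
    second_access = f.read()

# The bit sequences are identical...
assert first_access == second_access
# ...but each access has produced a distinct object, stored at its own
# location in memory (the ids differ for any non-empty file).
print(id(first_access) == id(second_access))  # False
```

Each retrieval here is, in Kirschenbaum’s sense, an act of (re)creation: the ‘file’ we read is a new instance assembled for the occasion of that access.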

Let me reiterate that the above critique is not focused on doing away with either the archive or the creation of (open access) archives: archives play an essential role in making scholarly research accessible, preserving it, adding metadata and making it harvestable. However, I do want scholars to be aware of the structures at play behind the archive, and I want to place question marks against both its perceived stability and its (objective) authority and legitimacy.

6.2.2 The Limits of Fluidity and Stability

The theories and experiments described above in relation to modularity, fluid and liquid publications, new forms of authorship and the archive, offer valuable insights into some of the important problems, as well as some of the possibilities, of knowledge production in a digital context. I will argue, however, that most of the solutions presented above when it comes to engaging with fluidity in online environments still rely on print-based answers (favouring established forms of fixity and stability). The concepts and projects I have described have not actively explored the potential of networked forms of communication to truly disrupt or rethink our conventional understandings of the autonomous human subject, the author, the text, and fixity in relation to the printed book. Although they take on the challenge of finding alternative ways of establishing authority and authorship in order to cope with an increasingly fluid environment, they still very much rely on the print-based concept of stability and on the knowledge and power systems built around it. In many ways they thus remain bound to the essentialisms of this object-oriented scholarly communication system. The concepts of the archive, of the selector or moderator, of modularity, and of fluidity and liquidity neither fundamentally challenge nor form a real critical alternative to our established notions of authorship, authority and stability in a digital context.

As I said before, my critique of these notions is not intended as a condemnation of their experimental potential. On the contrary, I support these explorations of fluidity strongly, for all the reasons I have outlined here. However, instead of focussing on reproducing print-based forms of fixity and stability in a digital context, as the concepts and projects mentioned above still end up doing, I want to examine these practices of stabilising, and the value systems on which they are based. Books, in this sense, are an emergent property of these practices. Instead of trying to cope with the fluidity offered by the digital medium by using the same disciplinary regime we are used to from a print context, to fix and cut down the digital medium, I want to argue that we should direct our attention more toward the cuts we make in, and as part of, our research, and toward the reasons why we make them (both in a print and digital context) as part of our intra-active becoming with the book.

As I made clear in my introduction to this section, instead of emphasising the dualities of fixity/fluidity, closed/open, bound/unbound, and print/digital, I want to shift attention to the issue of the cut; or, better said, to the performative agential processes of cutting. How can we, through the cut, take responsibility for the boundaries we enact and that are being enacted? How can we do this whilst simultaneously enabling responsiveness by promoting forms and practices of cutting that allow the book to remain emergent and processual (i.e. that do not tie it down or bind it to fixed and determined meanings, practices and institutions), and that also examine and disturb the humanist and print-based notions that continue to accompany the book?

Rather than seeing the book as either a stable or a processual entity, a focus on the agential processes that bring about book objects, on the constructions and value systems we adhere to as part of our daily scholarly practices, might be key in understanding the performative nature of the book as an on-going effect of these agential cuts. In the next section I therefore want to return to remix theory, this time exploring it from the perspective of the cut. I want to analyse the potential of remix as part of a discourse of critical resistance against essentialism to question humanist notions such as quality, fixity and authorship/authority; notions which continue to structure humanities scholarship, and on which a great deal of the print-based academic institution continues to rest. I will argue that within a posthumanist performative framework remix can be a means to intervene in and rethink humanities knowledge production, specifically with respect to the political-economy of book publishing and the commodification of scholarship into knowledge objects. I will illustrate this at the end of the next section with an analysis of two book publishing projects that have experimented with remix and reuse.

6.3 Remix and the Cut: Cutting Scholarship Together/Apart

Cutting can be understood as an essential aspect of the way reality at large is structured and provided with meaning. I want to focus on how remix specifically, as a form of ‘differential cutting’, can be a means of intervening in and rethinking humanities knowledge production—in particular with respect to the political-economy of book publishing and the commodification of scholarship into knowledge objects—thus opening up and enabling a potential alternative open-ended politics of the book.

In this section I will analyse a tendency within remix studies to theorise the cut and the practice of cutting from within a representationalist framework. At the same time, my analysis will be juxtaposed and entangled with a diffractive[12] reading of a selection of critical theory, feminist new materialist and media studies texts that specifically focus on the act of cutting from a performative perspective, to explore what forms a posthumanist vision of remix and the cut might take. I will then explore how the potential of the cut and, related to that, the politics inherent in the act of cutting can be applied to scholarly book publishing in an affirmative way. How can we account for our own ethical entanglements as scholars in the becoming of the book?[13] Based on Foucault’s concept of ‘the apparatus’, as well as on Barad’s posthumanist expansion of this concept,[14] I will argue that the scholarly book currently functions as an apparatus that cuts the processes of scholarly creation and becoming into authors, scholarly objects and an observed world separate from these and us. Drawing attention to the processual and unstable nature of the book instead, I will focus on the book’s critical and political potential to question these cuts and to disturb these existing scholarly practices and institutions.

After analysing how the book functions as an apparatus, a material-discursive formation or assemblage which enacts cuts, I will explore two book publishing projects—Open Humanities Press’s Living Books about Life and Mark Amerika’s remixthebook—that have tried to re-think and re-perform this apparatus by specifically taking responsibility for the cuts they make in an effort to ‘cut-well’ (Kember and Zylinska 2012). I will end this chapter by exploring how these projects have established an alternative politics and ethics of the cut that is open to change, whilst simultaneously analysing what some of their potential shortcomings are.

6.3.1 The Material-Discursive Cut within a Performative Framework

As I have shown above, Navas has written extensively about cut/copy paste as a practice and concept within remixed music and art. For Navas, remix as a process is deeply embedded in a cultural and linguistic framework; he sees it as a form of discourse at play across culture (2012: 3). This focus on remix as a cultural variable or as a form of cultural representation seems to be one of the dominant modes of analysis within remix studies as a field.[15] Based on his discursive framework of remix as representation and repetition (following Jacques Attali), Navas makes a distinction between copying and cutting. He sees cutting (into something physical) as materially altering the world, while copying, as a specific form of cutting, keeps the integrity of the original intact. Navas explores how the concept of sampling was altered under the influence of changes in mechanical reproduction, where sampling as a term started to take on the meaning of copying as the act of taking, not from the world, but from an archive of representations of the world. Sampling thus came to be understood culturally as a meta-activity (Navas 2012: 12). In this sense Navas distinguishes between material sampling from the world (which disturbs the world) and sampling from representations (which is a form of meta-representation that keeps the original intact). The latter is a form of cultural citation—where one cites in terms of discourse—and this citation is strictly conceptual (Navas 2012: 11–16).

It is beneficial here to apply the insights of new materialist theorists, and to explore what their ‘material-discursive’ and performative visions of cutting and the cut can contribute to the idea of remix as a critical, affirmative doing. Here I want to extend remix beyond a cultural logic operating at the level of representations, by seeing it as an always already material practice that disturbs and intervenes in the world. As Barad states, for instance: ‘the move toward performative alternatives to representationalism shifts the focus from questions of correspondence between descriptions and reality (e.g. do they mirror nature or culture?) to matters of practices/doings/actions’ (2003: 802). Here remixes as representations are not just mirrors or allegories of the world, but direct interventions in the world. Therefore, both copying and cutting are performative, in the sense that they change the world; they alter and disturb it.[16] Following this reasoning, copying is not ontologically distinct from cutting, as there is no distinction between discourse and the real world: language and matter are entangled, matter being always already discursive and vice versa.[17]

As was explored in more depth in my first chapter, Barad’s material-discursive vision of the cut focuses on the complex relationship between the social and the non-social, moving beyond the binary distinction between reality and representation by replacing representationalism with a theory of posthumanist performativity. Her form of realism is not about representing an independent reality outside of us, but about performatively intervening, intra-acting with and as part of the world (Barad 2007: 37). For Barad, intentions are attributable to complex networks of agencies, both human and non-human, functioning within a certain context of material conditions (2007: 23). Where in reality agencies and differences are entangled phenomena, what Barad calls agential cuts cleave things together and apart, creating subjects and objects by enacting determinate boundaries, properties, and meanings. These separations that we create also enact specific inclusions and exclusions, insides and outsides. Barad argues that it is important to take responsibility for the incisions that we make: being accountable for the entanglements of self and other that we weave also means taking responsibility for the exclusions we create (2007: 393). Although these cuts are not enacted directly by us, but rather by the larger material arrangement of which we are a part (cuts are made from the inside), we are still accountable for the cuts we help to enact: there are new possibilities and ethical obligations to act (cut) at every moment (Barad 2007: 178–179). In this sense, ‘cuts do violence but also open up and rework the agential conditions of possibility’ (Barad et al. 2012). It matters which incisions are enacted: different cuts enact different materialised becomings. As Barad states: ‘It’s all a matter of where we place the cut. (…) what is at stake is accountability to marks on bodies in their specificity by attending to how different cuts produce differences that matter’ (2007: 348).

6.3.1.1 Cutting Well

Kember and Zylinska explore the notion of the cut as an inevitable conceptual and material interruption in the process of mediation, focusing specifically on where to cut in so far as it relates to how to cut well. They point out that the cut is both a technique and an ethical imperative, in which cutting is an act necessary to create meaning, to be able to say something about things (Kember and Zylinska 2012: 27). On a more ontological level they argue that ‘cutting is fundamental to our emergence in the world, as well as our differentiation from it’ (Kember and Zylinska 2012: 168). Here they see a similarity with Derrida’s notion of ‘différance’, a term that functions as an incision, stabilising the flow of mediation into things, objects, and subjects (Kember and Zylinska 2012: xvi).[18] Through the act of cutting we shape our temporally stabilised selves (we become individuated), as well as actively forming the world we are part of and the matter surrounding us (Kember and Zylinska 2012: 168). Kember and Zylinska are specifically interested in the ethics of the cut. If we inevitably have to intervene in the process of becoming (to shape it and give it meaning), how can we cut well? How can we engage with a process of differential cutting, as they call it, enabling space for the vitality of becoming? To enable a ‘productive engagement with the cut’, Kember and Zylinska are interested in performative and affirmative acts of cutting. They use the example of photography to explore ‘this imperative [which] entails a call to make cuts where necessary, while not forgoing the duration of things’ (Kember and Zylinska 2012: 81). Cutting becomes a technique, not of rendering or representing the world, but of managing it, of ordering and creating it, of giving it meaning. The act of cutting is crucial, as Kember and Zylinska put it, to our ‘becoming-with and becoming-different from the world’, by shaping the universe and shaping ourselves in it (2012: 75). Through cutting we enact both separation and relationality, where an ‘incision’ becomes an ethical imperative, a ‘decision’, one which is not made by a humanist, liberal subject but by agentic processes. For Kember and Zylinska, a vitalist and affirmative way of ‘cutting well’ thus leaves space for duration; it does not close down creativity or ‘foreclose on the creative possibility of life’ (2012: 82).

6.3.2 The Affirmative Cut in Remix

To explore further the imperative to cut well, I want to return to remix theory and practice, where the potential of the cut and of remix as subversion and affirmative logic, and of appropriation as a political tool and a form of critical production, has been explored extensively. In particular, I want to examine what forms a more performative vision of remix might take, and how this might help us in reconstructing an alternative politics of the book. In what sense do remix theory and practice also function, in the words of Barad, as ‘specific agential practices/intra-actions/performances through which specific exclusionary boundaries are enacted’ (2003: 816)? Navas, for instance, conceptualises remix as a vitalism: as a formless force, capable of taking on any form and medium. In this vitalism lies the power of remix to create something new out of something already existing, by reconfiguring it. In this sense, as Navas states, ‘to remix is to compose’. Remix, through these reconfiguring and juxtaposing gestures, also has the potential to question and critique, becoming an act that interrogates ‘authorship, creativity, originality, and the economics that supported the discourse behind these terms as stable cultural forms’ (Navas 2012: 61). However, Navas warns of the potential of remix to be both what he calls ‘regressive and reflexive’: the openness of its politics means that it can also easily be co-opted, where ‘sampling and principles of Remix … have been turned into the preferred tools for consumer culture’ (2012: 160). A regressive remix, then, is a re-combination of something that is already familiar and has proved to be successful for the commercial market. A reflexive remix, on the other hand, is re-generative, as it allows for constant change (Navas 2012: 92–93). Here we can find the potential seeds of resistance in remix: as a type of intervention, Navas states, it has the potential to question conventions, ‘to rupture the norm in order to open spaces of expression for marginalized communities’, and, if implemented well, can become a tool of autonomy (2012: 109).

One of the realms of remix practice in which an affirmative position of critique and politics has been explored in depth, whilst taking clear responsibility for the material-discursive entanglements it enacts, is feminist remix culture, most specifically vidding and political remix video. Francesca Coppa defines vidding as ‘a grassroots art form in which fans re-edit television or film into music videos called “vids” or “fanvids”’ (2011: 123). By cutting and selecting certain bits of videos and juxtaposing them with others, the practice of vidding, beyond or as part of a celebratory fan work, has the potential to become a critical textual engagement as well as a re-cutting and recomposing (cutting-together) of the world differently. As Kristina Busse and Alexis Lothian state, vidding practically takes apart ‘the ideological frameworks of film and TV by unmaking those frameworks technologically’ (2011: 141). Coppa sees vidding as an act of both bringing together and taking apart: ‘what a vidder cuts out can be just as important as what she chooses to include’ (2011: 124). In Coppa’s vision the act of cutting empowers vidders: ‘she who cuts’ is better than ‘she who is cut into pieces’ (2011: 128).

Video artist Elisa Kreisinger, who makes queer video remixes of TV series such as Sex and the City and Mad Men, states that political remix videos carry a stronger element of critique, seeking to correct certain elements (such as gender norms) in media works, without necessarily being fan works themselves. As Kreisinger argues, ‘I see remixing as the rebuilding and reclaiming of once-oppressive images into a positive vision of just society’ (2010). Africana studies scholar Renee Slajda sees Kreisinger’s remix videos as part of a feminist move beyond criticism, and is interested in how remix artists turn critical consciousness into a creative practice aiming to ‘reshape the media—and the world—as they would like to see it’ (2013). For Kreisinger, too, political remix video is not only about creating ‘more diverse and affirming narratives of representation’ (2011). It also has the potential to effect actual change (although, like Navas, she is aware that remix is also often co-opted by corporations to reinforce stereotypes). Remix challenges dominant notions of ownership and copyright as well as the author/reader and owner/user binaries that support these notions. By challenging these notions and binaries, remix videos also challenge the production and political economy of media (Kreisinger 2011). As video artist Martin Leduc argues, ‘we may find that remix can offer a means not only of responding to the commercial media industry, but of replacing it’ (2011).

6.3.3 The Agentic Cut in Remix

As well as providing valuable affirmative contributions to the imperative to cut well, and to the reconfiguring of boundaries, remix has also been important in rethinking and re-performing agency and authorship in art and academia. In this context it critiques the liberal humanist subject that underpins most academic performances of the author, whilst exploring more posthumanist and entangled notions of agency in the form of agentic processes in which agency is more distributed. Paul Miller writes about flows and cuts in his artist’s book Rhythm Science. For Miller, sampling is a doing, a creating with found objects, but this also means that we need to take responsibility for its genealogy, for those ‘who speak through you’ (2004: 037). Miller’s practical and critical engagement with remix and the cut is especially interesting when it comes to his conceptualising of identity, where—as in the new materialist thinking of Barad—he does not presuppose a pre-given identity or self, but states that our identity comes about through our incisions, the act of cutting shaping and creating our selves. The collage becomes my identity, he states (Miller 2004: 024). For Miller, agency is thus not related to our identity as creators or artists, but to the flow or becoming, which always comes first. We are so immersed in and defined by the data that surrounds us on a daily basis that ‘we are entering an era of multiplex consciousness’, Miller argues (2004: 061).

Where Miller talks about creating different personae as shareware, Amerika is interested in the concept of performing theory and critiquing individuality and the self through notions such as ‘flux personae’, establishing the self as an ‘artist-medium’ and a ‘post-production medium’ (2011: 26). Amerika sees performing theory as a creative process, in which pluralities of conceptual personae are created that explore their becoming. Through these various personae, Amerika wants to challenge the ‘unity of the self’ (2011: 28). In this vision the artist becomes a medium through which language, in the form of prior inhabited data, flows. When artists write, their words feel not like their own but like a ‘compilation of sampled artefacts’ drawn from the artist’s co-creators and collaborators. By becoming an artist-medium, Amerika argues, ‘the self per se disappears in a sea of source material’ (2011: 47). By exploring this idea of the networked author concept or of the writer as an artist-medium, Amerika contemplates what could be a new (posthuman) author function for the digital age, with the artist as a post-production medium ‘becoming instrument’ and ‘becoming electronics’ (2011: 58).

6.4 Re-Cutting the Scholarly Apparatus

What can we take away from this transversal reading of feminist new materialism, critical and media theory, and remix studies, with respect to cutting as an affirmative material-discursive practice—especially where this reading concerns how remix and the cut can performatively critique established humanist notions such as authorship, authority, quality and fixity underlying scholarly book publishing? How can this reading trigger alternatives to the political economy of book publishing, with the latter’s current focus on ownership and copyright and the book as a consumer object? This (re-)reading of remix might pose potential problems for our idea of critique and ethics, as notions of stability, objectivity and distance tend to disappear. The question, then, is: how can we make ethical, critical cuts in our scholarship whilst at the same time promoting a politics of the book that is open and responsible to change, difference and the inevitable exclusions that result?

To explore this, we need to analyse the way the book functions as an apparatus. The concept of the ‘dispositif’ or ‘apparatus’ originates from Foucault’s later work.[19] As a concept it went beyond that of the ‘discursive formation’, connecting discourse more closely with material practices (Foucault 1980: 194–195). For Foucault the apparatus is the system of relations that can be established between a heterogeneous ensemble of elements: discourses, institutions, architectural forms, laws, administrative measures and scientific statements. However, an apparatus for Foucault is not a stable and solid ‘thing’ but a shifting set of relations inscribed in a play of power, one that is strategic and responds to an ‘urgent need’, a need to control (1980: 196).[20] Deleuze’s more fluid outlook sees the apparatus as an assemblage capable of escaping attempts at subversion and control. He is interested in the variable creativity that arises out of dispositifs (in their actuality), or in the ability of the apparatus to transform itself; we as human beings belong to dispositifs and act within them (Deleuze 1992). Barad, meanwhile, connects the notion of the cut to her posthuman Bohrian concept of the apparatus. As part of our intra-actions, apparatuses, in the form of certain material arrangements or practices, effect an agential cut between subject and object, which are not separate but come into being through these intra-actions (Barad 2007: 141–142). Apparatuses, for Barad, are therefore open-ended and dynamic material-discursive practices, articulating concepts and things (2007: 334).

How has the apparatus of the book—consisting of an entanglement of relationships between, among other things, authors, books, the outside world, readers, the material production and political economy of book publishing and the discursive formation of scholarship—executed its power relations through cutting? In the present scholarly book publishing constellation, it has mostly operated via a logic of incision: one that favours neat separations between books, authors (as human creators) and readers; that cuts out fixed scholarly book objects of an established quality and originality; and that simultaneously pastes this system together via a system of strict ownership and copyright rules. The manner in which the apparatus of the book cuts at the present moment does not take into full consideration the processual aspects of the book, research and authorship. Neither does the current print-based apparatus explore in depth the possibilities of re-cutting our research results in such a way as to experiment with collaboration, updates, versionings and multimedia enhancements in a digital context. The dominant book-apparatus instead enforces a political economy that keeps books and scholarship closed off from the majority of the world’s potential readers, functioning in an increasingly commercial environment (albeit one fuelled by public money), which makes it very difficult to publish specialised scholarship lacking marketable promise. The dominant book-apparatus thus does not take into consideration how the humanist discourse on authorship, quality and originality that continues to underlie the humanities perpetuates this publishing system in a material sense. Nor does it analyse how the specific print-based materiality of the book and the publishing institutions that have grown around it have likewise been instrumental in shaping the discursive formation of the humanities and scholarship as a whole.

Following this chapter’s diffractively collected insights on remix and the cut, I want to again underscore the need to see and understand the book as a process of becoming, as an entanglement of plural (human and non-human) agencies. The separations or cuts that have been forced out of these entanglements by specific material-discursive practices have created inclusions and exclusions, book objects and author subjects, both of them controlling positions.[21] Books as apparatuses are thus performative; they are reality shaping. Not enough responsibility is taken—not by us as scholars, nor by publishers, nor by the academic system as a whole—for the cuts that are enacted with and through the book as an apparatus. We need to acknowledge the roles we all play and the responsibility we have in shaping the way we publish research, given that most humanities research now ends up as a conventional, bound, printed (or increasingly hybrid) and single-authored book, published by an established publisher and disseminated mainly to university libraries. However, we also need to take into consideration that our approved, dominant scholarly practices—which include the (printed) book—are simultaneously affecting us as scholars and the way we act in and describe the world and/or our object of study, including, as Hayles has argued, the way we are ‘conceptualizing projects, implementing research programs, designing curricula, and educating students’ (2012: 1). It is important to acknowledge our entangled nature in all this: scholars need to take more responsibility for the practices they enact and enforce, and for the cuts that they make, especially in their own book publishing practices.

6.4.1 Open-Ended Scholarly Re-Cutting

However, following the insights of Foucault, Deleuze and Barad as discussed above, the book-apparatus, of which we are a part, also offers new ‘lines of flight’ or opportunities to re-cut and (re)perform the book and scholarship, as well as ourselves, differently. Living Books about Life and remixthebook are two book-publishing projects that have explored the potential of the cut and remix for an affirmative politics of publishing, challenging our object-oriented and modular systems. In the analysis of these projects that follows, I want to explore in what sense they have been able to promote, through their specific cuts, an open-ended politics of the book that enables duration and difference.[22]

6.4.1.1 Remixthebook

At the beginning of August 2011, Mark Amerika launched remixthebook.com, a website designed to serve as an online companion to his new print volume, remixthebook (2011). Amerika is a multi-disciplinary artist, theorist and writer, whose various personas[23] allow him to experiment with hypertext fiction and net.art as well as with more academic forms of theory and artists’ writings, and to do so from a plurality of perspectives.[24]

Remixthebook is a collection of multimedia writings that explore the remix as a cultural phenomenon by themselves referencing and mashing up curated selections of earlier theory, avant-garde and art writings on remix, collage and sampling. It consists of a printed book and an accompanying website that functions as a platform for a collaboration between artists and theorists exploring practice-based research (Amerika 2011: xiv–xv). The platform features multimedia remixes from over 25 international artists and theorists who were invited to contribute a remix to the project site based on selected sample material from the printed book. Amerika questions the bound nature of the printed book and its fixity and authority by bringing together this community of diverse practitioners, performing and discussing the theories and texts presented in the book via video, audio and text-based remixes published on the website—opening the book and its source material up for continuous multimedia re-cutting. Amerika also challenges dominant ideas of authorship by playing with personas and by drawing from a variety of remixed source material in his book, as well as by directly involving his remix community as collaborators on the project.

For Amerika, then, the remixthebook project is not a traditional form of scholarship. Indeed, it is not even a book in the first instance. As he states in the book’s introduction, it should rather be seen as ‘a hybridized publication and performance art project that appears in both print and digital forms’ (Amerika 2011: xi). Amerika applies a form of patch or collage writing[25] in the 12 essays that make up remixthebook. He also endeavours to develop a new form of new media writing, one that constitutes a crossover between the scholarly and the artistic, and between theory and poetry, mixing these different modalities. For all that, Amerika’s project has the potential to change scholarly communication in a manner that goes beyond merely promoting a more fluid form of new media writing. What is particularly interesting about his hybrid project, both from the print book side and from the platform network performance angle, is the explicit connections Amerika makes through the format of the remix to previous theories, and to those artists/theorists who are currently working in and theorising the realm of digital art, humanities and remix. At the same time, the remixthebook website functions as a powerful platform for collaboration between artists and theorists who are exploring the same realm, celebrating the kind of practice-based research Amerika applauds (Amerika 2011: xiv–xv). By creating and performing remixes of Amerika’s source material, which is itself based on a mash-up of other sources, a collaborative interweaving of different texts, thinkers and artists emerges, one that celebrates and highlights the communal aspect of creativity in both art and academia.

However, a discrepancy remains visible between Amerika’s aim of creating a commons of renewable source material, along with a platform on which everyone (amateurs and experts alike) can remix his and others’ source material, and the specific choices he makes and the outlets he chooses to fulfil this aim. For instance, remixthebook is published as a traditional printed book (in paperback and hardcover); more importantly, it is not published on an open access basis, a publishing model which would make it far easier to remix and reuse Amerika’s material, by copying and pasting directly from the web or a PDF, for instance.

Amerika in many ways tries to evade the bounded nature of the printed edition by creating this community of people remixing the theories and texts presented in the book. He does so not only via the remixes that are published on the accompanying website, but also via the platform’s blog and the remixthebook Twitter feed, to which new artists and thinkers are asked to contribute on a weekly basis. However, here again, the website is not openly available for everyone to contribute to. The remixes have been selected or curated by Amerika along with his fellow artist and co-curator Rick Silva, and the artists and theorists contributing to the blog and Twitter as an extension of the project have also been selected by Amerika’s editorial team. Even though people are invited to contribute to the project and platform, then, it is not openly accessible to everyone. Furthermore, although the remixes and blogposts are available and accessible on the website, they are themselves not available to remix, as they all fall under the website’s traditional ‘all rights reserved’ copyright. Given all the possibilities such a digital platform could potentially offer, the question remains as to how much Amerika has really put the source material ‘out there’ to create a ‘commons of renewable source material’ for others to ‘remix the book’ (Amerika 2011: xv).

Notwithstanding the fact that remixthebook is based on selections of manipulated and mashed-up source material from all kinds of disparate backgrounds, and to that extent challenges the idea of individual creativity, originality and authorship, this project, for all its experimental potentiality, also draws on some quite conventional notions of authorship. Theoretically, Amerika challenges such ideas by playing with different personas and by drawing on a variety of source material, which he proceeds to remix in his book. Practically, however, Amerika is still acting very much as a traditional humanist author of his book, of his curated collection of material. Amerika takes responsibility for the project when he signs his name on the cover of the book.[26] He is the book’s originator in the sense that he has created an authentic product by selecting and re-writing the material. Moreover, he seeks attribution for this endeavour (the book is copyrighted ‘all rights reserved © Mark Amerika’), and wants to receive the necessary credit for this work—a monograph published by an established university press (University of Minnesota Press)—in the context of the artistic and scholarly reputation economies. Amerika and co-curator Rick Silva are also the authors or curators of the accompanying website of remixes—similarly copyrighted with a traditional licence—as they commissioned the remixes. Furthermore, all the remixes, which are again based on a variety of remixed (and often unattributed) source material, are attributed to the participating remixers (who thus perform the function of quite traditional authors), complete with their bios and artist’s statements. In spite of its experimental aims related to new forms of authorship, remix and openness, it seems that, in practice, the cuts that have been enacted and performed as part of the remixthebook project still adhere in large part to our established humanist and print-based scholarly practices and institutions.

6.4.1.2 Living Books about Life

In 2011 the media and cultural theorists Clare Birchall, Gary Hall and Joanna Zylinska initiated Living Books about Life, a series of open access books about life published by Open Humanities Press and designed to provide a bridge between the humanities and the sciences. All the books in this series repackage existing open access science-related research, supplementing this with an original editorial essay to tie the collection together. They also provide additional multimedia material, from videos to podcasts to whole books. The books have been published online on an open source wiki platform, meaning they are themselves ‘living’ or ‘open on a read/write basis for users to help compose, edit, annotate, translate and remix’ (Hall 2012). Interested potential contributors can also contact the series editors to contribute a new living book. These living books can then collectively or individually be used and/or adapted for scholarly and educational contexts as an interdisciplinary resource bridging the sciences and humanities.

As Hall has argued, this project is designed to, among other things, challenge the physical and conceptual limitations of the traditional codex by including multimedia material and even whole books in its living books, but also by emphasising its duration by publishing on a wiki platform and thus ‘rethinking “the book” itself as a living, collaborative endeavour’ (2012). However, the MediaWiki software employed by the Living Books about Life project, in common with a lot of wiki software, keeps accurate track of which ‘user’ is making what changes. This offers the possibility that other users can monitor recent changes to pages, explore a page’s revision history, and examine all the contributions of a specific user. The software thus already has mechanisms written into it to ‘manage’ or fix the text and its authors, by keeping a track record or archive of all the changes that are made. But the project also continues to enforce stability and fixity (both of the text and of its users) on the front-end side: by clearly mentioning the specific editor’s name underneath the title of each collection, as well as on the book’s title page; by adding a fixed and frozen version of the text in PDF format, preserving the collection as it was originally created by the editors; but also by binding the book together by adding a cover page, and by following a rather conventional book structure (complete with an editorial introduction followed by thematic sections of curated materials). Mirroring the physical materiality of the book (in its design, layout, and structuring) in such a way also reproduces ‘the aura’ of the book, including the discourse of scholarship (as stable and fixed, with clear authority) this brings with it. This might explain why user interaction with the books in the series has been limited in comparison to some other wikis, which are perhaps more clearly perceived as multi-authoring environments. Here the choice to re-cut the collected information as a book, with clear authors and editors, whilst and as part of re-thinking and re-performing the book as concept and form, might paradoxically have been responsible for both the success and the failure of the project.
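To make this tracking mechanism concrete, the following is a minimal sketch (my illustration) of how MediaWiki’s standard Action API exposes a page’s revision history, recording which user made which change and when. The endpoint and page title are hypothetical placeholders, not the actual addresses of the Living Books about Life platform.

```python
import requests

# Hypothetical MediaWiki endpoint and page title, standing in for the
# Living Books about Life wiki.
API_ENDPOINT = "https://example-livingbooks.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Main_Page",
    "rvprop": "user|timestamp|comment",  # who changed what, and when
    "rvlimit": 10,                       # the ten most recent revisions
    "format": "json",
}

response = requests.get(API_ENDPOINT, params=params).json()

# Every revision is archived together with the 'user' who made it: the
# software fixes both text and authors by design.
for page in response["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"], rev.get("comment", ""))
```

It is precisely this built-in revision archive that allows changes to be monitored, attributed and, if need be, rolled back.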

What both the Living Books about Life and OHP’s earlier Liquid Books project share, however, is a continued theoretical reflection on issues of fixity, authorship and authority, both by their editors and by their contributors in the various spaces connected to the projects. This comes to the fore in the many presentations and papers the series editors and authors have delivered on these projects, engaging people with their practical and theoretical issues. These discussions have also taken place on the blog[27] that has accompanied the Living Books about Life series, and in Hall and Birchall’s multimodal text and video-based introduction to the Liquid Books series, to give just some examples. It is in these connected spaces that discussions continue about copyright, ownership, authority, the book, editing, openness, fluidity and fixity, the benefits and drawbacks of wikis, quality and peer review, and so on. I would like to argue that it is here, on this discursive level, that the aliveness of these living books is perhaps most ensured. These books live on in continued discussion about where we should cut them, when, and who should be making the incisions, taking into consideration the strategic compromises—which might indeed include a frozen version, a book cover, and clearly identifiable editors—we might have to make due to our current entanglements with certain practices, institutions and pieces of software, all with their own specific power structures and affordances.

In ‘Future books: a Wikipedia model?’, an introduction to one of the books in the Liquid Books series—namely Technology and Cultural Form: A Liquid Reader, collaboratively edited and written by Joanna Zylinska and her MA students (together forming a ‘liquid author’)—the various decisions and discussions we could make and have concerning liquid, living and wiki books are considered in depth: ‘It seems from the above that a completely open liquid book can never be achieved, and that some limitations, decisions, interventions and cuts have to be made to its “openness”. The following question then presents itself: how do we ensure that we do not foreclose on this openness too early and too quickly? Perhaps liquid editing is also a question of time, then; of managing time responsibly and prudently’ (2010). Looking at it from this angle, these discussions trigger critical questions from a user (writer/reader) perspective, in their entanglements and negotiations with the institutions, practices and technologies of scholarly communication. Within a wiki setting, questions concerning what new kinds of boundaries are being set up are important: who moderates decisions over what is included or excluded (what about spam, for instance)? Is it the editors? The software? The press? Our notions of scholarly quality and authority? What is kept and preserved, and what new forms of closure and inclusion are being created in this process? How is the book disturbed and at the same time re-cut? It is our continued critical engagement with these kinds of questions, both theoretically and practically, in an affirmative manner, that will keep these books open and alive.

6.5 Conclusion

To conclude this chapter, I would like to return briefly to textual studies or textual criticism, a field that has always actively engaged with issues concerning the fixity and fluidity of texts. This is embodied mainly in the search for the ideal text or archetype, but also in the continued confrontation with a text’s pluralities of meaning and intentionality, alongside issues of interpretation and materiality. In this respect critical editing, as a means of stabilising a text, has always revolved around an awareness of the cuts that are made to a text in the creation of scholarly editions. It can therefore be stated that, as Bryant has argued, the task of a textual scholar is to ‘manage textual fluidity’ (2002: 26).

Another strength of textual criticism is an awareness on the part of many of the scholars in the field that their own practical and theoretical decisions or cuts influence the interpretation of a text. They can therefore be seen to be mindful of their entanglement with its becoming. As Bryant has put it, ‘editors’ choices inevitably constitute yet another version of the fluid text they are editing. Thus critical editing perpetuates textual fluidity’ (Bryant 2002: 26). These specific cuts, or ‘historical write-ups’, that textual scholars create as part of their work with critical editions not only construct the past from a vision of the present; they also say something about the future. As textual scholar Jerome McGann has pointed out:

All poems and cultural products are included in history—including the producers and the reproducers of such works, the poet and their readers and interpreters … To the historicist imagination, history is the past, or perhaps the past as seen in and through the present; and the historical task is to attempt a reconstruction of the past, including, perhaps, the present of that past. But the Cantos reminds us that history includes the future, and that the historical task involves as well the construction of what shall be possible. (1988)

It is this awareness that a critical edition is the product of editorial intervention (which creates a material-discursive framework that influences future texts’ becoming) that I am interested in here, especially in relation to McGann’s work on the performativity of texts. For McGann every text is a social text, created under specific socio-historical conditions; he theorises texts not as things or objects, but as events. He argues, therefore, that texts are not representations of intentions but processual events in themselves. Every version or reading of a text is thus a performative (as well as a deformative) act (McGann 2004: 225). In this sense, McGann makes the move in textual criticism from a focus on authorial intention and hermeneutics or representation, to seeing a text as a performative event and critical editions as performative acts.

McGann therefore argues for a different, dynamic engagement with texts, not focused on discovering what a text ‘is’, but on an ‘analysis [that] must be applied to the text as it is performative’ (2004: 206). This includes taking into consideration the specific material iteration of the text one is studying (and how this functions, as Hayles has argued, as a technotext, i.e. how its specific material apparatus produces the work as a physical artefact (Hayles 2002)), as well as an awareness of how the scholar’s textual analysis is itself part of the iteration and ‘othering’ of the text (McGann 2004: 206). And connected to this, as Barad has argued, we have to be aware of how the text’s performativity shapes us in our entanglement with it.

The question then is: why can’t we be more like critical textual editors (in the style of Jerome McGann) ourselves when it comes to our own scholarly works, taking into consideration the various cuts we make and that are made for us as part of the processes of knowledge production? Assuming responsibility as textual critics for the incisions we make in our own work, and exploring the poetics or poethics of scholarship in this respect, should involve: taking responsibility for our entanglement in the production, dissemination and consumption of the book; engaging with the material-discursive institutional and cultural aspects of the book and book publishing; and experimenting with an open-ended and radical politics of the book (which includes exploring the processual nature of the book, whilst taking responsibility for the need to cut). This would also involve experimenting with alternative ways of cutting our bookish scholarship together-apart: with different forms of authorship, both human and non-human; with the materialities and modalities of the book, exploring multimodal and emergent genres, whilst continuously rethinking and performing the fixity of the book itself; and with the publishing process, examining ways to disturb the current political economy of the book and the objectification of the book within publishing and research. From my perspective, this would mean we continue our experimentations with remixed and living books, with versionings, and with radical forms of openness, while at the same time remaining critical of the alternative incisions we make as part of these projects, and of the new forms of binding they might weave. This also involves being aware of the potential strategic decisions we make to keep some iterative bindings intact (for reasons of authority and reputation, for instance) and why we choose to do so. We should therefore engage with this experimenting not from the angle of the fixed or fluid book, but from the perspective of the cut that cuts-together-apart the emergent book and, when done well, enables its ongoing becoming.

This text, like the projects mentioned above, has attempted to start the process of rethinking (through its diffractive methodology) how we might start to cut differently when it comes to our research and publication practices. Cutting and stabilising still need to be done, but they might be accomplished in different ways, at different stages of the research process, and for different reasons than at present. What I want to emphasise here is that we can start to rethink and re-perform the way we publish our research if we start to pay closer attention to the specific cuts we make (and that are made for us) as part of our publishing practices. The politics of the book itself can be helpful in this respect where, as Gary Hall and I have argued elsewhere, ‘if it is to continue to be able to serve “new ends” as a medium through which politics itself can be rethought (…) then the material and cultural constitution of the book needs to be continually reviewed, re-evaluated and reconceived’ (2013: 138). The book itself can thus be a medium with the critical and political potential to question specific cuts and to disturb existing scholarly practices and institutions. Books are always a process of becoming (albeit one that is continuously interrupted and disturbed). Books are entanglements of different agencies that cannot be discerned beforehand. In the cuts that we make to untangle them we create specific material book objects. In these incisions the book has always already redeveloped and remixed itself; it has mutated and moved on. The book is thus a processual, ephemeral and contextualised entity, which we can use as a means to critique our established practices and institutions, both through its forms (and the cuts we make to create these forms) and its metaphors, and through the practices that accompany it.


[1] I am here invoking what Lawrence Lessig refers to as a Read/Write (RW) culture, as opposed to a Read/Only (RO) culture (2008: 28–29).

[2] Where open access (in its weak version) can be seen to focus mainly on accessibility (and in many cases wants to preserve the integrity of the work), open content specifically includes the right to modify. The problem is that when it comes to open access definitions and providers, some permit derivative works and some do not. The open knowledge definition encompasses both, as does the BBB definition of open access.

[3] More ethical interventions in scholarly communication might start with—but are not limited to—a critical involvement with the various relationships in academic publishing by, for example: exercising an ethics of care with respect to the various (human and non-human) agencies involved in the publication process; a focus on free labour and a concern with power and difference in academic life; experimenting with alternatives, such as new economic models and fair pricing policies, to counter exploitative forms of publishing; exploring how we can open up the conventions of scholarly research (from formats to editing, reviewing, and revising); critically reflecting on the new potential closures we enact (McHardy et al. 2013, Danyi 2014, Kember, 2014a).

[4] In the United States, the Copyright Act defines “derivative work” in 17 U.S.C. § 101:

a “derivative work” is a work based upon one or more pre-existing works, such as a translation, musical arrangement, dramatization, fictionalization, motion picture version, sound recording, art reproduction, abridgment, condensation, or any other form in which a work may be recast, transformed, or adapted. A work consisting of editorial revisions, annotations, elaborations, or other modifications which, as a whole, represent an original work of authorship, is a “derivative work”. See: http://www.copyright.gov/title17/92chap1.html#101

[5] A triplet or assertion is the shortest meaningful sentence or statement: a combination of subject, predicate and object. See: http://nanopub.org/wordpress/?page_id=65

[6] A nano-publication is the smallest unit of publishable information: an assertion about anything that can be uniquely identified and attributed to its author. See: http://nanopub.org/wordpress/?page_id=65
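By way of illustration (my example, using hypothetical identifiers, rather than anything prescribed by nanopub.org), such an assertion can be sketched as a single subject-predicate-object triple, here expressed with the Python rdflib library:

```python
from rdflib import Graph, Namespace

# Hypothetical namespace and identifiers for a single assertion.
EX = Namespace("http://example.org/")

g = Graph()
# 'Malaria is transmitted by mosquitoes' as one subject-predicate-object
# statement: the smallest unit of publishable, attributable information.
g.add((EX.malaria, EX.isTransmittedBy, EX.mosquito))

print(g.serialize(format="turtle"))
```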

[7] McPherson argues that we can see this focus on the discrete in, among other things, digital technologies, in UNIX and in languages like C and C++.

[8] For the fluid text edition of Melville’s Typee, see: http://rotunda.upress.virginia.edu/melville/

[9] See: http://liquidpub.org/

[10] See: http://www.futureofthebook.org/gamertheory2.0/?page_id=2. This refers mostly to GAM3R 7H30RY 1.1, which can be seen, as stated on the website, as ‘a first stab at a new sort of “networked book,” a book that actually contains the conversation it engenders, and which, in turn, engenders it’ (Wark 2007).

[11] Derrida gives the example of Freud’s archive and how, with the coming of digital media, a new vision on what constitutes an archive comes into being, which in turn will create a new vision of psychoanalysis.

[12] See chapter 2 for a detailed discussion of diffraction as a methodology.

[13] In engaging in a diffractive reading, this text is itself performative too. This means that it is not only a piece of writing on the topic of remix and on ‘cutting things together and apart’, but that through its methodology it also affirmatively ‘remixes’ a variety of theories from seemingly disparate fields, locations, times and contexts. This might enable us to understand both the practice and concept of the cut and the entangled theories themselves better. This is akin to what the net artist Mark Amerika calls ‘performing theory’. As a ‘remixologist’, Amerika sees data as a renewable energy source, where ideas, theories and samples become his source material. By creating and performing remixes of this source material, which is again based on a mash-up of other source material, a collaborative interweaving of different texts, thinkers and artists emerges, one that celebrates and highlights the communal aspect of creativity in both art and academia (Amerika 2011).

[14] In which apparatuses are conceptualised as specific material configurations that effect an agential cut between, and hence produce, subject and object (Barad 2007: 148).

[15] For example, Henry Jenkins and Owen Gallagher talk about remix cultures and Lessig refers to remix as an R/W (Read/Write) culture, although they all see these cultures as embedded in technology and encapsulated by powers of material economic production (Lessig 2008, Jenkins 2013, Jenkins and Gallagher 2008). An exception is Elisabeth Nesheim, who in her talk Remixed Culture/Nature argues for a different conception of remix, one that goes beyond seeing it as a cultural concept and explores principles of remix in nature. Although still starting from a position of human agency, she talks about bio-engineering as a form of genetic remixing, and about bio-artists who remix nature/culture as a form of critique and reflection (Nesheim 2009).

[16] See also Matthew Kirschenbaum’s arguments on how digital copying = preservation = creation, as discussed in the previous section.

[17] I am talking here about the fact that there is no onto-epistemological distinction between cutting and copying. From an ethical perspective, however, one might argue, as Navas has done extensively, that making a distinction between referencing ideas in conceptual and in material form might aid our efforts towards copyright reform (2011).

[18] Akin to what the sociologist and feminist theorist Vicki Kirby calls ‘the cut of difference’ (2011: 101).

[19] First appearing as a concept in Foucault’s History of Sexuality (1976).

[20] In Agamben’s vision the apparatus is an all-oppressive formation, one that human beings stand outside of. Agamben thus creates new binaries between inside/outside and material/discursive that might not be helpful for the posthuman vision of the apparatus I want to explore here (2009: 14).

[21] See, for example, the way the PhD student as a discoursing subject is being (re)produced by the dissertation and by the dominant discourses and practices accompanying it (Adema 2013).

[22] I have contributed texts/books/remixes to both projects and my analysis below is thus partially written from a participant’s perspective.

[23] For instance, as remix artist and author, and as professor of Art and Art History at the University of Colorado, Boulder.

[24] Amerika wrote the hypertext trilogy GRAMMATRON, PHON:E:ME and FILMTEXT and founded one of the oldest online net.art networks, Alt-X, in 1992.

[25] Patch or collage writing, consisting of disconnected bits of writing pasted together in one work or collage, is relatively common in works of remix and appropriation art and theory, and is explored in Jonathan Lethem’s essay ‘The Ecstasy of Influence’ (2007), David Shields’s Reality Hunger (2011), and Paul D. Miller’s Rhythm Science (2004). It is a practice that can be traced at least as far back as the cut-up methods applied by William Burroughs and the Dadaists.

[26] Derrida remarks in his discussion of the significance of the signature that, although we cannot perceive it as a literal stand-in for an authentic, and with that authoritative, source, it nonetheless functions as, and implies, both the presence and the non-presence of the signing subject. Derrida argues for a non-essentialist notion of the signature in which the singularity of the event of signing (and with it the presence of the subject) is maintained in what he calls a past and a future now. Through the signature as a performative act, the singularity of the original signing event is thus forever maintained and becomes iterable in every copy (Derrida 1985).

[27] See: http://www.livingbooksaboutlife.org/blog/