What I’d like to do in this second essay is pick up on the idea that small-gauge scholarship is emerging under specifically “digital” conditions of possibility, and in turn, think through what sort of urban scholarship this could make possible.
Kevin Allen makes the excellent point in his own second essay that while an overly dichotomous account of digital versus analog media “can be binary and tedious,” it can sometimes be pretty productive. In one of my own modules, Media Technology and Culture, I somewhat paradoxically put a fair bit of effort into temporarily encouraging students to think reductively, by exploring media technology via a series of binaries: media technics and media cultures, old and new media, analog and digital, and so on. These, of course, are all false binaries. But my aim in working through them is to encourage students to realize for themselves that any singular medium is always in an irreducible relationship with its social or cultural contexts.
One of the nice things about taking this approach is that I get to justify placing an awful lot of emphasis on media history as a precursor to thinking about our contemporary digital and networked situation. And year after year, I have been pleasantly surprised that students – who I might have assumed would be more interested in just talking about topics such as social media or on-demand video – become fascinated and even a little nostalgic about the interesting, peculiar histories of media technologies and their associated practices. What’s important here for students is the realization that, while we sometimes tend to think about digital as an enhancement, when you look closely at media history, it’s clear that there are lost specificities to analog. As Kevin Allen rightly points out, there were and are some very real benefits to analog media being difficult or requiring effort.
He also identifies another big question for small-gauge scholarship: whether digital media might “emulate” these indelible qualities of analog formats. This resonates with one of the classic ways that media theorists have tried to make sense of proliferating digital interfaces: via the concept of “remediation,” as coined by Jay Bolter and Richard Grusin.1 Though Bolter and Grusin offer a more detailed analysis than I can present here, the basic premise of remediation is that new media always seek to adapt, reference, incorporate, and refashion prior media. Early cinema arguably remediated existing theatrical conventions; many computer games clearly reference cinema; web sites readapt aspects of the magazine; email references postal infrastructures and letter-writing practices (mailbox, inbox, send, folders, “carbon copying,” addresses); and 20th–21st century music could be seen as a complex chain of remediations, from live performance to LP to 8-track to cassette to CD to mp3 to streaming.
Lev Manovich’s recent book Software Takes Command2 picks up on this notion of remediation, both to highlight its centrality to understanding digital media today and to question whether this characteristic (i.e. simulation) exhausts the capacities of computational technologies. Manovich explores in particular some of the prototypes developed in the 1970s by Alan C. Kay and colleagues at the Xerox Palo Alto Research Center (PARC), which aimed effectively to re-invent the computer “as ‘metamedium’ whose content is ‘a wide range of already-existing and not-yet-invented media.’”3 Kay and others at Xerox PARC were trying to transform what was then rather clunky computing technology into a new kind of easily usable and universal media device that could be put to creative and educational ends. The culmination would have been the Dynabook, a sort of proto-tablet that only reached an early prototype phase.
Manovich suggests that Xerox PARC’s projects paved the way for later digital devices and software to be primarily produced and understood as “remediating machines”: as essentially simulating non-computational media such as typewriters, painting, filming, editing, storage, and so on. So, turning briefly to small-gauge scholarship via the digital (i.e. software, platforms, devices, environments, etc.), we might note that we have seen a kind of convergence in which various small-scale interventions, across a wide range of traditional-yet-digitized media formats, can be translated onto a single metamedial platform, such as Mediapolis. To pick up on a point made by Erica Stein, this could be especially well-suited to interdisciplinary interventions around the intersection of cities, media, and culture, a topic arguably best explored via small-gauge scholarly practices and formats. So, perhaps, through small-gauge scholarship, enabled via digital and networked media, we might realize new ways to re-envision and re-present the small-scale or microcosmic fragments of mediated urban living and its environments.
However, a point I think Manovich makes well, via his discussion of Xerox PARC’s projects, is that even if computational technology, and more specifically software, might involve the simulation of prior media, it does not amount to a transparent simulation. Rather, it is a simulation that also adds new properties and capacities, while taking away others. And these new properties and capacities are very often not specific to the simulated media, but shared amongst and between a range of software applications (just think, for example, about how different software applications, remediating different media, still share functions such as cut and paste, saving files, view control, etc). What Manovich tries to point out is that this not only adds new interoperability between different software applications; in the longer term, it suggests we might see computational media more generally move on from remediation or simulation, and begin to develop its own distinctive and heretofore unimagined logics, semantics and syntax.
In other words, computation and software code not only allow for the translation of the urban into digital spaces; as Rob Kitchin argues,4 they also open up processes of transduction in which code and computation can reshape the city, and our perceptions of the city. So I suppose I would double down on the (perhaps slightly implicit) point I was trying to make in my first essay: that we should not simply see the digital environments of small-gauge scholarship as neutral. What this might mean is that, even if we agree that small-gauge scholarship might be an interesting way to follow through on a De Certeau-like orientation to the small, microcosmic, street-level practices of urban living, it will predominantly do so through a specific and complex set of digitized environments. And even if that allows us as individual academics, artists, and creators to move away from what Erica Stein – drawing on De Certeau – calls the “elevated, masterful view” which reduces the city “to spectacle and thought,” perhaps we might consider whether the dynamics of the networked media systems we inhabit might semi-automate precisely such a view. For this reason, I’m slightly ambivalent about whether digital or networked environments allow any sort of escape from the imperatives of capital which some might say are embedded in the urban environment. Most of the digitized environments scholars inhabit today are proprietary, commercialized ecosystems, much like the city (from which they are not separate, in any event).
Of course, in doubling down on this emphasis on “forms,” I risk slipping into a crude technological determinism. Indeed, Alexander Galloway’s recent book The Interface Effect5 provides a helpful and sympathetic critique of Lev Manovich’s work. For Galloway, Manovich’s conceptualization of software (what he calls “new media objects” in his earlier work6) is far too focused on “the formal essence of the medium… the techniques and characteristics of the technology.”7 In other words, overemphasizing the technical aspects of small-gauge scholarship would mean we drift into an overly “media-centric” theory of media. This is a view of media which Galloway labels “conservative” because it focuses on media as such, rather than practices of mediation. It’s worth quoting Galloway at length here:
The main difficulty is the simple premise … that new media may be defined via reference to a foundational set of formal qualities, and that these qualities form a coherent language that may be identified across all sorts of new media objects, and above all that the qualities may be read, and may be interpreted. This is what was called, many years ago, structuralism. … This is the crux of the matter: they contain no injunction. They talk more about objects and operations than practices and effects.8
We should be alert to the formal qualities of the mediums, platforms, and environments of an emergent (and largely digitized) small-gauge scholarship. But at the same time, we must also be critically self-reflective about our own media practices, and not lose sight of questions of ethics and politics. These, too, are central conditions of possibility for how scholars inhabit, work with, and work through the emergent environments of small-gauge scholarship.
Photo by J Brew / Cropped from the original
Scott Rodgers is Audio Editor for Mediapolis. He also holds the post of Reader in Media and Geography in the Department of Film, Media and Cultural Studies at Birkbeck, University of London. His research specialises in the relationships of media and cities and the geographies of communication. Scott also has broad interests in media production practices, digital and networked technologies, urban politics and ethnographic methodologies. His publications have appeared in journals such as Media, Culture and Society, Society and Space, City and Community, International Journal of Cultural Studies, International Journal of Urban and Regional Research, Space and Culture and Journalism: Theory Practice and Criticism. With Tim Markham, he is co-editor of Conditions of Mediation: Phenomenological Perspectives on Media (Peter Lang, 2017).
1. Jay Bolter and Richard Grusin, Remediation: Understanding New Media (Cambridge, MA: MIT Press, 2000).
2. Lev Manovich, Software Takes Command (London: Bloomsbury, 2013).
3. Manovich, Software Takes Command, 83, quoting Kay and Goldberg.
4. Rob Kitchin, “The Programmable City,” Environment and Planning B: Planning and Design 38:6 (2011): 945-951. See also: Rob Kitchin and Martin Dodge, Code/Space: Software and Everyday Life (Cambridge, MA: The MIT Press, 2011), 3-22.
5. Alexander Galloway, The Interface Effect (Oxford: Polity, 2012).
6. Galloway’s critique centers on Manovich’s seminal book on new media: Lev Manovich, The Language of New Media (Cambridge, MA: The MIT Press, 2001).
7. Galloway, The Interface Effect, 3.
8. Galloway, The Interface Effect, 23-24.