Innovation and Preservation

Innovation can mean many different things depending on the topic being discussed. In Racing the Beam, Nick Montfort and Ian Bogost write that "Technical innovations are often understood as the creation of new technology—new materials, new chip designs, new algorithms. But technical innovation can also mean using existing technical constraints in new ways, something that produces interesting results when combined with creative goals" (p. 53). The idea that things already in existence can be used in new or different ways is one factor that led computers and games to evolve into what we know today.

In Racing the Beam, we learn about the history of the Atari VCS, later renamed the Atari 2600, and its impact on the development of gaming. In spite of what we now consider limited technology, it changed how people relaxed and paved the way for future computer and gaming systems. Programmers at the time of the Atari developed computer opponents to play against, so a game no longer required two people, and they stored the games on interchangeable cartridges, so new hardware no longer had to be sold for each game, lowering the price for consumers.

What interested me most was that Atari decided to use less than the full computing power available at the time in order to make the system affordable and attractive to the public. The programmers worked within those self-imposed limitations to create what they needed and wanted the games to do. One of the cost-saving measures was to include very little RAM (just 128 bytes) and no frame buffer, forcing the programmers to find creative solutions to complex problems while 'racing the beam': using only the time it takes for one line to be drawn on the TV screen to compute the next one to be displayed.
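To make that constraint concrete, here is a minimal sketch in Python; it is only an illustration of the timing budget, since real VCS games were written in 6502 assembly, and the numbers (192 visible scanlines, 76 CPU cycles per line on NTSC) are the commonly cited figures for the hardware:

```python
# Illustration of "racing the beam": with no frame buffer, each scanline's
# graphics registers must be set within a fixed cycle budget before the
# television's electron beam draws that line.
SCANLINES = 192          # visible picture lines on an NTSC frame
CYCLES_PER_LINE = 76     # 6502 cycles available while one line is drawn

def draw_frame(line_kernel):
    """line_kernel(line) is a hypothetical per-line routine; it returns
    how many CPU cycles its work cost."""
    for line in range(SCANLINES):
        cycles_used = line_kernel(line)
        if cycles_used > CYCLES_PER_LINE:
            raise RuntimeError(f"missed the beam on line {line}!")

draw_frame(lambda line: 40)  # a cheap kernel that fits the budget
```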

While the system was affordable for the average person, it placed a lot of strain on the programmers developing the games. They needed to be creative so that their games would be playable on the limited hardware. This creativity led to more of a game's graphics being stored in ROM, a hardware component, rather than computed in software as is done now. It seems that preserving these old games and programs would be difficult without the original hardware, even if we knew how the games were created, short of completely re-creating the games with new technology. Everyone has played Pong or Space Invaders at some point in their lives, but few have used an Atari. Can we say that it is really the original game being played, and not just something created to look and feel like it?

Currently, innovation in games and much other digital media has very little to do with hardware, as software can accomplish the same tasks and is easier to program. However, the problem becomes one of differing file formats. As new formats are created and others become obsolete, preserving the original file becomes challenging. For instance, an audio file could exist in any of numerous formats, but only a few are preservation quality. Should all digital audio in a library or archive be converted to the same format for preservation? Or should the original file be preserved for as long as possible? Text, on the other hand, is usually preserved as a PDF, but there are many different subtypes that all share the .pdf extension, and only the PDF/A subtype is meant for preservation. How do you know what the subtype is? Or even whether the format has subtypes?
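As a small illustration of the subtype problem, a PDF/A file declares its conformance level in its XMP metadata using the pdfaid namespace, so a naive sniff can look for that marker. This sketch only detects the claim; actually validating conformance requires a dedicated tool such as veraPDF or JHOVE, and the file name here is a placeholder:

```python
import re

def sniff_pdfa(path):
    """Naive check for a PDF/A identifier in a PDF's XMP metadata.

    A file can carry the pdfaid marker and still violate the PDF/A
    specification, so this is a sniff, not validation.
    """
    data = open(path, "rb").read()
    part = re.search(rb"pdfaid:part[>=\"']+(\d)", data)
    conf = re.search(rb"pdfaid:conformance[>=\"']+(\w)", data)
    if part and conf:
        return f"PDF/A-{part.group(1).decode()}{conf.group(1).decode().lower()}"
    return None

print(sniff_pdfa("report.pdf") or "no PDF/A identifier found")
```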

Caroline Arms and Carl Fleischhauer, in their article "Digital Formats: Factors for Sustainability, Functionality, and Quality," point out that "Preservation of content in a given format is not feasible without an understanding of how the information is encoded as bits and bytes in digital files" (p. 3). However, I would argue that expecting every information professional to understand how all digital files work is more than is really needed to preserve the information. A basic knowledge that some formats are better than others can be all that is required in a professional environment, as the details of file formats can be looked up when needed.

Emulation as Resistance and Social Memory


Racing the Beam: The Atari Video Computer System by Ian Bogost and Nick Montfort offers a detailed look at the Atari VCS, known to many as the Atari 2600. The book focuses on the period when the system dominated the market, 1979-1983, and discusses its eventual role in the video game crash of 1983. In particular, the authors examine a number of game cartridges to reveal the affordances and resistances created by the video game platform and its limits on "computational expression" in the creation of these games. At first glance, this book may seem largely tangential to preserving digital art, but I think there are many commonalities that Bogost and Montfort illustrate quite well and which we can learn from.

The Resistance in the Materials

Early Atari enthusiast, William Morris

As William Morris once stated, "you can't have art without resistance in the materials." The Atari VCS could be defined by its resistance. Coders had to deal with the physical and technical limitations of the platform, such as the speed of the beam writing across the screen one line at a time, the limited ROM, and the limited graphical elements, to produce games people actually wanted to play. Given what they had to work with, the coders' results were often innovative and astounding.

All the while, the concurrent political, social, and economic challenges such as creating games for the home (rather than arcade) environment, deadlines for game release based on market forces, and questions over ownership also affected game creation. In these ways, Montfort and Bogost are connecting what is on the screen and the cultural context with the forensic materiality of the hardware that Kirschenbaum describes in Mechanisms. Understanding the entire process of game creation and the limitations of the platform gives a better understanding of all the factors that need to be preserved.

Porting and Licensed Adaptations

The theme of resistance continued in the challenge of taking other arcade video games and licensed works from their original medium and adapting or “porting” them to the Atari 2600. To me, this process raised some interesting connections with ideas of social memory, digital preservation, and significant properties. Specifically, the quote below really got me thinking about these issues:

Along with the lack of true originality in most VCS games—that is, the basis of many VCS games in arcade games or licensed properties—another closely related theme runs throughout the history of the Atari VCS, that of the transformative port or adaptation. When an earlier game is the basis for a VCS game, it can almost never be reproduced on the VCS platform with perfect fidelity. (Racing the Beam, p. 23)

We see that these games lacked true originality in the sense that they were attempting to copy other works, but they were original in their transformative adaptation to a system loath to provide the elements needed to reproduce their inspiration exactly. Montfort and Bogost go on to say that, technical limitations notwithstanding, it is still impossible to replicate the physical environment, interface, or economic context to create a true copy of the game experience for the player; but the game can be transformed into something close enough to make the memory live on.

Social Memory

Porting and adapting have many parallels to the production of informal social memory through recreation and variation. Exactness is not key in this approach; instead, adapting or porting seems more like a performance based on a "score," the instructions of the original artwork, similar to the process that Rinehart and Ippolito discuss in Re-Collection.

Pac-Man for Arcade vs. Pac-Man for Atari. From page 69 of Racing the Beam.

Or seen in a different way, the retelling of a game on a different platform can perhaps be compared to the retelling and memory sharing process of oral history. In these ways, the idiosyncrasies of each repetition can be forgiven, assuming the significant features remain, allowing it to be the same “work” on a more conceptual level. With the Atari VCS, the resistance in the materials forced game creators to focus on the most significant elements in order to create something that resembled the look and feel of the original, while still acknowledging the variation of its underlying medium and its context.

Emulation

As both professionals and amateurs try to preserve these games through emulation, or even by rewriting them for new devices that lack the same resistances, they encounter new and unique resistances. The problem remains that these items cannot be reproduced with the exact fidelity of the original. Newer technology can overcompensate for the original quirks: newer processors can make a game run faster than it ever could, LCD and HD displays do not blur pixels together the way CRT televisions did, and the interface is often not the original joystick. Compensating so that these items run like the originals, and feel suitably less advanced, is its own form of resistance.
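One concrete form this compensation takes is frame pacing: the emulated machine finishes its work far faster than the original hardware did, so the emulator must deliberately wait. A minimal sketch, assuming hypothetical step_frame() and render() hooks:

```python
import time

FRAME_TIME = 1.0 / 60.0  # NTSC consoles delivered roughly 60 frames per second

def run_throttled(step_frame, render, frames=600):
    """Pace an emulation loop to the original console's timing.

    step_frame() advances the emulated machine by one video frame and
    render() draws it; both are placeholder hooks. Without the sleep,
    a modern CPU would run the game absurdly fast.
    """
    deadline = time.monotonic()
    for _ in range(frames):
        step_frame()
        render()
        deadline += FRAME_TIME
        delay = deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # burn the time the original hardware needed
```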


As discussed above, these characteristics can be forgiven if we view emulation as an attempt to implement the "score" of the original game and its most significant elements. These efforts can still ensure that the social memory of these cultural artifacts survives. And to what extent is perfection necessary? Perfection, in fact, can be problematic in communicating authenticity. Sound recordings do not perfectly reproduce the natural world; the sounds of particular elements are amplified to be heard better. Furthermore, some consider vinyl recordings to have a warmer, more authentic sound than their high-quality digital versions. In these examples, the consumer expects imperfection. Additionally, glitches (often the result of a system running incorrectly) can hint at the materiality of a computer system and provide a certain authenticity of their own.

If there are always errors or differences, how do we determine an acceptable tolerance for them in the preservation of objects and their social memory? I think there can be multiple tolerances; Rinehart and Ippolito describe a "both/and" approach to preservation that allows for multiple representations of the same work. So if a range is okay, where do we stop? Overall, the platform study process Montfort and Bogost undertake seems like an essential framework for understanding enough context to decide what level of preservation is good enough. But how do we make it scalable?

Rembrandts and Floppy Disks

One of the themes that I took away from the readings this week (and continuing from last week) is that what we see on our computer screens is a performance: a kind of theatrical performance in which what we view on the screen seems magical and perfect, while all the while a lot of chaotic movement and hard grunt work transpires behind the curtain.


Kirschenbaum asserts that digital objects can be defined in three ways: as physical (inscription), logical (the code that is interpreted by software), and conceptual (what is presented to us). While we deal on a daily basis with the conceptual aspects, and many of us are at least familiar with the logical ones, there has not been much literature focusing on the physical aspects of digital objects, and this is what Kirschenbaum wants to highlight. We remain in the grip of a "medial ideology, with many of the plain truths about the fundamental nature of electronic writing apparently unknown […] or overlooked" (45). He delves into how this developed, how the fundamentals of digital inscription removed digital objects from human intervention, and how the hard drive was locked away in a box, turning into a "book [that] can be read without being opened" (81). To look at the forensic materiality of digital objects is to understand what makes the magic happen, and it allows us a more complete examination of the digital object itself.


And what do we find when we look at the forensic materiality of digital objects? We find magnetic reversals on a large smooth disk, accompanied by a read-and-write head that acts as a signal processor, converting the analog magnetic reversals on the disk into formal digital binary material, and vice versa. This system creates a nonvolatile but variable environment for inscription: the information stored on it persists without power, yet it can be erased and rewritten. And the question that arises when it comes to digital media is this: can we use the archival materials and bibliographic data provided by their forensic qualities to reveal a larger critical and interpretive program, or, in other words, to reveal the artists' original motives, thoughts, and milieu? Can we look at an artist's hard drive and floppy disks to understand what she was thinking, and what the cultural norms and practices were at the time of inscription, in the same way that we can look at a Rembrandt, at the materials he decided to use and the strokes he left on the canvas, and understand more about what was really going on at the exact moment Rembrandt started to paint?
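As a toy illustration of that conversion (real drives use more elaborate run-length-limited codes), one simple scheme reads a reversal of magnetic polarity between bit windows as a 1 and no reversal as a 0:

```python
def decode_reversals(flux):
    """Toy decode of magnetization samples into bits.

    flux is a list of +1/-1 polarity samples, one per bit window; a
    transition between adjacent windows reads as 1, no transition as 0.
    """
    bits = []
    prev = flux[0]
    for sample in flux[1:]:
        bits.append(1 if sample != prev else 0)
        prev = sample
    return bits

# reversals occur at windows 1, 3, and 4
print(decode_reversals([+1, -1, -1, +1, -1]))  # -> [1, 0, 1, 1]
```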


Kirschenbaum argues yes, and after looking at all of the examples given in this week's readings, I can readily agree with him. By looking at Warhol's Amiga files, for example, we can determine from the time stamps left on the files that he completed those drawings within a short time of one another, even if Arcangel suspects the time stamps are incorrect. The disk image of Mystery House revealed fragments of two other games, Dung Beetles and Blitzkrieg. This gives us an insider view into the cultural and user context; if anything, it tells us that Mystery House was more important to that user than the other two games.


The problem with analyzing this bibliographic data, however, is that because the computer is so successful at hiding its forensic qualities from human eyes, we very easily overlook them. Out of sight, out of mind. I had no clue about many of these technical features of hard drives, CDs, and floppy disks before reading this book, and I'll admit most of the technical descriptions still went over my head. But I take that to be Kirschenbaum's point: we've separated ourselves from the physical materiality of digital processes for so long that we've forgotten how they work. And we need to relearn this information so that we can in turn save digital objects in a more complete form, including their formal and forensic qualities.


The perfect example of this is the Warhol Amiga files. If Cory Arcangel hadn't shown a strong interest in recovering those files, and if the Computer Club at Carnegie Mellon hadn't had the technical expertise required to reverse engineer the software, would Andy Warhol's floppy disks have stayed in their boxes at the Warhol Museum forever, serving no purpose other than adding to the collection catalog? Another example is Jonathan Larson's Word files. If Doug Reside hadn't decided to migrate all 180 of Jonathan Larson's floppy disks, the "fast saves" versions of his lyrics for RENT would be frozen forever on those disks, and the glimpse into Larson's creative process would remain tucked away and unknown.


This is the note Kirschenbaum ends on: we've already lost too much, and we need to start acting now. Ensuring that archivists and preservationists know exactly what they're dealing with when it comes to digital media is the first step that needs to be taken.

Minding Individuality

What comes to mind when you hear the term born-digital artwork? To me, it is an image of a website on my Mac laptop. It is not hard to see how limited and historically situated my imagination is. Though such a first impression may seem trivial, the unquestioned assumptions that surround born-digital artworks inform the way we approach preserving them for future generations.

Two of the born-digital artworks Matt Kirschenbaum introduces in his Mechanisms: New Media and the Forensic Imagination (2008) suggest how our assumptions about digital art need to be challenged. One such example is Agrippa, a work by William Gibson originally published in 1992. The text was encrypted with what was then a state-of-the-art method in order to enforce a single, roughly 20-minute reading experience. Contrary to the notion of digital artwork being fluid, Agrippa's electronic text becomes inaccessible after that short period of time. The book in which the disk was embedded, too, was designed to fade with exposure to light. Such material characteristics of Agrippa suggest a marriage between the form and the theme of Gibson's work: fading autobiographical recollection. Suffice it to say that Agrippa is a nicely executed artist's book project. Within a day of release, however, Agrippa started to venture into a new realm, challenging the notion of fixed art: its text, which was said to be unhackable, was reproduced and posted online. More interestingly, this text, as Gibson himself acknowledges, has kept changing over the years. These textual reproductions, and a few remnants of Agrippa's original media, are now the only access points that allow us to learn about the work. This complicates the assumption that digital artworks are ephemeral.

The second example Kirschenbaum provides is Mystery House, a game written by Roberta and Ken Williams in 1980. As Kirschenbaum offers a tour of its disk image, a 140-kilobyte electronic copy of the original floppy disk, it becomes apparent that the construction of the game itself is the main attraction. Recalling his childhood, Kirschenbaum writes: "normative play is perhaps the least interesting level on which to engage [Mystery House]" (129). The disk image, as Kirschenbaum walks us through it, shows how machine-level instructions and screen-level text are at work simultaneously, blurring the line between what is stored and what we see on the screen, a distinction Kirschenbaum captures with the terms "forensic materiality" and "formal materiality." In addition, Kirschenbaum sheds light on how a storage system like this complicates the idea of digital files' fungibility. According to Kirschenbaum, the disk image retains traces of past activities. For instance, an action such as "deletion" does not remove the data but merely marks it to be overwritten, should that happen in the future. While each disk image carries the traces of its unique activities, little attention has been paid to this idiosyncrasy, writes Kirschenbaum. He speculates that this oversight has to do with "screen essentialism" (27). In other words, we tend to emphasize the look of any Mystery House (just about what an emulator aims to achieve) rather than attend to the unique constitution of a particular Mystery House. This difference in emphasis can be described with the terms "allographic" and "autographic."
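A minimal sketch of that point about deletion, using a made-up toy file system rather than Apple DOS's actual catalog format: "deleting" merely flips an allocation flag, and the sector's bytes remain readable until something overwrites them.

```python
# Toy file system: a catalog of entries plus raw 256-byte sectors.
SECTOR_SIZE = 256
sectors = [bytearray(SECTOR_SIZE) for _ in range(16)]
catalog = {"LETTER.TXT": {"sector": 3, "in_use": True}}
sectors[3][:13] = b"Dear Roberta,"

def delete(name):
    """'Delete' a file by freeing its catalog entry; the bytes stay put."""
    catalog[name]["in_use"] = False

delete("LETTER.TXT")
# The catalog says the file is gone, but a forensic read still finds it:
print(bytes(sectors[3][:13]))  # b'Dear Roberta,'
```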

Throughout the book, Kirschenbaum illustrates how particular digital artworks can be. It seems to me that being conscious of digital artworks' individuality would better equip us when thinking about what to preserve, how to preserve it, and why. The anecdote Cory Arcangel offers in "The Warhol Files" demonstrates how assumptions may hinder sound preservation practice. The Andy Warhol Museum's acquisition and subsequent assessment of the artist's Amiga computer, Arcangel tells us, were based on the assumption that the machine should retain files "presumably labeled along the lines of ANDY's STUFF, ANDY'S DRAWINGS, etc." Only, those files were not to be found. It later turned out, according to Arcangel, that Warhol had used an application called GraphiCraft to produce his bitmap drawings, and that this software did not allow files to be saved anywhere other than on GraphiCraft disks. Had curators not known the idiosyncratic conditions under which Warhol labored, those commercial disks, and the drawings stored on them, might have been overlooked and lost to history.

Doug Reside, in his "'No Day But Today': A Look at Jonathan Larson's Word Files," also describes how all software is historically situated. This is easier said than done, I must add, especially when the software in question is as familiar as Microsoft Word. Concerning the textual variants among the digital records of Larson's musical RENT, Reside entertains several possible interpretations of the variation. It turns out, according to Reside, that Microsoft Word 5.1 (the version Larson used to compose) had a function called "fast save." This feature, unfamiliar to contemporary ears, does something as strange as appending revisions to the end of a file instead of overwriting the text in place. Reside concludes that such records would "provide scholars and artists a fascinating glimpse into [Larson's] creative process." Such an inquiry is only possible when we learn to read such records within the historical context of the medium and its affordances, among other things. Needless to say, such historical awareness is crucial to archival practice.
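A toy model of the idea (not Word 5.1's actual file format): picture a fast save as an append-only log of edits, where the current text is reconstructed by replaying the log while superseded drafts linger inside the file.

```python
log = []  # each fast save appends an edit instead of rewriting the file

def fast_save(offset, text):
    log.append((offset, text))

def current_text():
    """Replay the log to reconstruct the latest state of the document."""
    doc = ""
    for offset, text in log:
        doc = doc[:offset] + text + doc[offset + len(text):]
    return doc

fast_save(0, "seasons of love")
fast_save(0, "Seasons")   # a later revision of the first word
print(current_text())     # 'Seasons of love'
print(log)                # the discarded draft is still visible in the raw log
```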

Granted, we need to take into consideration the particular native environment within which a digital artwork was, and continues to be, shaped, in order to preserve the work's significance as comprehensively as possible for future generations. I can imagine subject specialists would have an important role to play in paying due respect to the individuality of digital artworks. Kirschenbaum's walk-throughs of digital artworks, for instance, are undoubtedly invaluable contextual records. But I wonder how we can make this a feasible practice. Kirschenbaum's description of Agrippa concerns its changing environment, including the significance of a link that now returns "404 File Not Found" and the typeface variants rendering its ASCII transcriptions. How much attention would do justice to a digital artwork's individuality?

It’s Not Always What it Seems

Week 3: Digital Forensics, Materiality, Fixity & is-ness

The inner workings of a computer have always been a bit of a mystery to me. I grew up when most people didn't have a computer in the house, let alone several, and the Internet had never been heard of. While in school, I actually remember taking a computer class in which we were taught how Apple's graphical user interface worked. Over the years, thankfully, I progressed beyond operating the mouse and learned how to use a variety of software applications. But the inner workings of the computer were still a bit vague: just visions of bits adding up to bytes and kilobytes and megabytes, and so on.


Kirschenbaum, in his book Mechanisms: New Media and the Forensic Imagination, recounts a story from his youth, when he stopped saving files to a 5 ¼-inch disk and began saving them to the computer itself. The storage was hidden away behind the hard plastic case of the computer. He explains that architecturally the personal computer didn't really change, but that the "psychological impact" of saving information to the computer, instead of to a removable floppy disk, cannot be ignored. No longer labeling a disk, or having one to carry home after class, just felt different. The physical aspect of the digital work was taken out of his hands and sat concealed inside the computer. In other words, what happens in the storage device stays in the storage device; and if you're like me, the details of it all weren't something you necessarily needed to know.


According to Kirschenbaum, the storage mechanism of a computer is both a product and a process, and despite the work being created behind the scenes and hidden away as 1s and 0s, it is indeed physical in nature. It has a materiality to it. He goes on to describe in great detail the history of computing, disk arrays, tracks, sectors, read/write heads and hashing.


All of these hidden storage and transmission actions point to a very structured process, one that must consist of rules and standards in order for it all to work. However, Kirschenbaum refers to a forensic materiality, which "... rests upon the principle of individualization, the idea that no two things in the physical world are ever exactly alike" (p. 10). He explains further that a computer is able to create an illusion of immaterial behavior, in which data can be identified without ambiguity, sent without anything being lost, and copied with no variance. This illusion hides the fact that errors can occur anywhere along the line.
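Digital preservation pushes back against that illusion with fixity checks. A minimal sketch using Python's standard hashlib, with placeholder file names: if even one bit differs between the original and the copy, the digests will not match.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large disk images never sit whole in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

original = sha256_of("master/mystery_house.dsk")
copied = sha256_of("backup/mystery_house.dsk")
print("fixity verified" if original == copied else "copy differs from original!")
```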


These errors, whether a function of copying over old data, as in the Jonathan Larson collection described by Reside, or of intentional tampering, as with the Mystery_House.dsk image in Kirschenbaum's book, could pass us by completely unnoticed. But through the use of a hex editor, these hidden artifacts come to light and provide us with additional forensic evidence and new insights. Reside's article, for instance, points out that a hex editor can surface deleted text and, with it, a view of Larson's creative process.
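There is nothing exotic about the hex editor's core trick; a few lines of Python can produce the classic offset/hex/ASCII view of any file (the file name here is a placeholder), which is where latent strings like deleted lyrics turn up:

```python
def hexdump(path, width=16, limit=64):
    """Print a hex editor's view of a file: offset, hex bytes, ASCII gloss."""
    with open(path, "rb") as f:
        data = f.read(limit)
    for offset in range(0, len(data), width):
        row = data[offset:offset + width]
        hexes = " ".join(f"{b:02x}" for b in row)
        text = "".join(chr(b) if 32 <= b < 127 else "." for b in row)
        print(f"{offset:08x}  {hexes:<{width * 3}} {text}")

hexdump("Mystery_House.dsk")
```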


These pieces of forensic evidence that get tucked away should make us question what in fact we might be copying: it's not always what it seems. So, as a digital archivist, you have to ask: which versions do you keep? Do you save all of them? Which version is the "authentic" or "authoritative" one? Or is that an impossible choice to make?