significance shifts

As we explore the more granular planning involved in digital art curation, we repeatedly encounter the idea that significance shifts. Whether it’s evolving re-interpretations of artworks in Re-Collection, the strange history of a video game platform in Racing the Beam, or the fluid readability and scope of Agrippa (as detailed in Mechanisms), it’s becoming clear that preservation over time involves multiple solutions in response to multiple meanings, use cases, and instances of any given artwork.


Why is who saving what, and how?

It seems that when it comes to preserving born-digital works, certain questions need to be raised. In fact, a lot of questions need to be raised, since there is no established consensus on which formal framework to use. There’s the question of “who,” involving the roles different people play in the lifetime of a work: the artist, the curator, the preservationist, and the consumer/audience. Next there’s the “why”: what makes this work worth saving, and why do we choose certain components of the work to save? Then comes the “what”: what exactly do these groups decide to save, and what is it about the work that we are actually preserving? And finally there’s the “how”: putting a preservation plan into action.

The “who”: Creators, Curators, Conservators, and Consumers

First comes the artist, who creates the work.  The artist makes the initial creative decisions that make his/her work unique, whether intentionally or incidentally. Next comes the curator, who decides that the work is worth collecting and exhibiting and defends the work’s significance.  After that is the preservationist or conservator, who determines what to preserve and how.  Finally there is the audience/consumer and their role in supporting the work.

What makes born-digital works so complex is that the roles of these various groups often bleed into one another: the artist creates an interactive work that allows the consumer to feel a sense of authorship in making unique decisions that affect the work; conservators now ask artists for statements of intent to hear what they consider significant about the work; and fans of a work can prove crucial in providing the emulation software necessary for preserving it.

Furthermore, as Dappert and Farquhar insist, different stakeholders place their own constraints on a work. For instance, Chelcie Rowell discusses how Australian artist Norie Neumark built her 1997 work Shock in the Ear in a specific piece of software, Macromedia Director. The audience who experienced it originally had to load a CD-ROM into their computer, which could have been a Mac or a Windows machine. The preservationists chose emulation as the best method for saving works like this one, and those emulators were created by nostalgic enthusiasts. So each of the people involved placed constraints on the original work, in terms of hardware, software, and usage, and those constraints changed between the work’s creation and its preservation. Dianne Dietrich concludes with this in regard to digital preservation:

“As more people get involved in this space, there’s a greater awareness of not only the technical, but social and historical implications for this kind of work. Ultimately, there’s so much potential for synergy here. It’s a really great time to be working in this space.”

For this reason, it is becoming more important than ever to document who is doing what with the work, increasing accountability and responsibility. Which leads to…

The “why”: Preservation Intent Statements

As Webb, Pearson, and Koerbin explain, before we make any attempt to preserve a work we need to answer the “why.” Their decision to write Preservation Intent Statements is a means of accomplishing this. For, as Webb et al. say, “[w]ithout it, we are left floundering between assumptions that every characteristic of every digital item has to be maintained forever.”

And nobody has the time or resources to save every characteristic of every digital item. At least I don’t. Trying to do so would be impossible, and even undesirable for certain works, where the original hardware and software become too costly to maintain.

This leads to a discussion of authenticity. As Espenschied points out in regard to preserving GeoCities, with increased authenticity comes a lower level of access, while a low barrier to access comes with a lower level of authenticity and a higher degree of lossiness. In the case of GeoCities, Espenschied says,

“While restoration work must be done on the right end of the scale to provide a very authentic re-creation of the web’s past, it is just as important to work on every point of the scale in between to allow the broadest possible audience to experience the most authentic re-enactment of Geocities that is comfortable for consumption on many levels of expertise and interest.”

And that gets at the heart of why we should bother to create Preservation Intent Statements before implementing any actual preservation actions.  We need to establish the “bigger picture,” the long-term vision of a particular work’s value.  Rowell also points out that there are different kinds of authenticity: forensic, archival, and cultural.  Forensic and archival authenticity deal with ensuring the object preserved is what it claims to be (if you’ve read Matt Kirschenbaum’s book Mechanisms, you know that this can be harder than you think to achieve).  Cultural authenticity, however, becomes a much more complex issue, and explores how to give respect to the original context of the work while still ensuring a wide level of access.

And once we have decided on the best strategy, we then get into…

The “what” and the “how”: Significant Properties vs. Significant Characteristics

Now that we’ve established the “bigger picture,” we get into the details of exactly how to capture the work for preservation. This is where Dappert and Farquhar come back in; they get quite technical about the difference between “significant properties” and “significant characteristics.” Their definition of significant characteristics goes like this:

“Requirements in a specific context, represented as constraints, expressing a combination of characteristics of preservation objects or environments that must be preserved or attained in order to ensure the continued accessibility, usability, and meaning of preservation objects, and their capacity to be accepted as evidence of what they purport to record.”

Sounds confusing, right? The way I understood it is that properties can be thought of like properties in code. In coding, properties are simply a way of using a formal language to define certain attributes of the website, game, or whatever we are building. Similarly, for a digital work, the property itself is abstract, like “fileSize” or “isVirusScanned.” We aren’t trying to preserve those properties; rather, it is the pairing of a property with its value (like “fileSize=1MB”) that we want to capture, and that pairing is a characteristic of the work. You wouldn’t save a property without its value, nor would you save the value without attaching it to a property. Significant characteristics go beyond the basic forensic/archival description of the object by capturing the context surrounding it. Thus, significant characteristics can evolve beyond the original work as the preservation environment changes and as different courses of action are taken, and all of these changes should be documented along the way through the significant characteristics themselves, prioritized and listed in order of importance.
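To make that distinction concrete, here is a minimal sketch in Python (my own illustration, not Dappert and Farquhar’s notation; the property names and constraint syntax are invented): a property is just a named attribute, a characteristic pairs that property with a value, and a significant characteristic adds a constraint that must keep holding as preservation actions are taken.

```python
# Illustrative sketch only: the property names, the constraint syntax, and the
# check() helper are my own invention, not Dappert and Farquhar's formal model.

# A "property" is an abstract, named attribute of a preservation object.
PROPERTIES = {"fileSize", "isVirusScanned", "colorDepth"}

# A "characteristic" pairs a property with a value; a significant
# characteristic adds a constraint that must continue to hold through
# migration, emulation, or other preservation actions.
significant_characteristics = [
    {"property": "colorDepth", "constraint": lambda v: v is not None and v >= 8,
     "note": "the work's palette must survive format migration"},
    {"property": "isVirusScanned", "constraint": lambda v: v is True,
     "note": "required by the archive's ingest policy"},
]

def check(preservation_object: dict) -> list[str]:
    """Return descriptions of any violated significant characteristics."""
    violations = []
    for sc in significant_characteristics:
        value = preservation_object.get(sc["property"])
        if not sc["constraint"](value):
            violations.append(f"{sc['property']}={value!r}: {sc['note']}")
    return violations

# A migrated copy of a work, described as property=value pairs.
migrated_copy = {"fileSize": 1_048_576, "isVirusScanned": True, "colorDepth": 4}
print(check(migrated_copy))  # colorDepth=4 violates its constraint
```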

The last question that remains is… is anyone else’s mind boggled by all this?

The Death of the Significant Property: A Tragedy in Three Acts

Act I: What’s this?

In a defiant act of rebranding, Dappert and Farquhar shed the idea of “significant properties” entirely. Focusing scientifically on the critical and definable elements of each preservation object, they aim to “focus their attention on preserving the most significant characteristics of the content, even at the cost of sacrificing less important ones.”

They enable this scientific precision by introducing a standardized language and a workflow model, which allows anyone working in the field to relate to the process.

By focusing on the “significant characteristics” of each “entity,” Dappert and Farquhar tie their assessment more closely to what each individual item is, its format and platform, and then to balancing the actions required to preserve it. Their preservation model accounts for changes in the optimal preservation format by building in feedback loops and a circular flow of preservation information.
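Here is a toy sketch of what such a feedback loop might look like (my own paraphrase of the idea, not their published model; the function names and constraints are hypothetical): the object is assessed against its significant characteristics, acted upon if any are violated, and then fed back into assessment, so preservation information flows in a circle rather than a straight line.

```python
# Hypothetical sketch of a preservation feedback loop; the functions assess()
# and migrate() and the sample constraints are illustrative only.

def assess(obj: dict, constraints: dict) -> list[str]:
    """Return the significant characteristics the object currently violates."""
    return [name for name, ok in constraints.items() if not ok(obj)]

def migrate(obj: dict) -> dict:
    """Stand-in for a preservation action, e.g. converting to a newer format."""
    return {**obj, "format": "current", "migrations": obj.get("migrations", 0) + 1}

constraints = {
    "renderable on current platforms": lambda o: o.get("format") == "current",
    "content unchanged": lambda o: o.get("checksum_ok", False),
}

obj = {"format": "obsolete", "checksum_ok": True}

# Circular flow: act, then reassess, until no constraints are violated.
while (violated := assess(obj, constraints)):
    print("violated:", violated)
    obj = migrate(obj)

print("object now satisfies its significant characteristics:", obj)
```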

In this way, the effervescent significant properties are discarded for characteristics and relationships that determine the most valuable aspects of the preservation object and the environment required to understand it.

Act II: Why this?

Coming at the matter from a different direction, the National Library of Australia chose instead to focus on the community it serves and that community’s evolving nature and needs, dedicating its efforts to ensuring access to the chosen materials in perpetuity while taking those evolving needs into account. As Webb, Pearson, and Koerbin describe it,

“Like most things to do with managing digital collections, effective ways of making preservation decisions are evolving. … we (the digital preservation community) have no settled, agreed procedures for the full range of digital preservation challenges, nor even tentative plans we confidently expect to ensure adequate preservation for the next 100, 200 or 500 years.”

Because file types and platforms are such nebulous commodities, Webb, Pearson, and Koerbin explain that their institution’s previous efforts at one-size-fits-all definitions of significant properties and policies fell short of a feasible working strategy.

Webb et al. go on to explain how the National Library of Australia redefined its work strategy, as described above, by focusing on why something is important, why a user would want it. This re-envisioning led to redefining “significant properties” not as the building blocks of the preservation itself, but as the tent poles of the document that justifies the need for preservation.

Act III: Lazarus?

Despite the effort to rid the profession of a seemingly outdated idea, I would argue that the work of the NLA and that of Dappert and Farquhar fit together magnificently, in that order. It is enough to simplify the definition of the “significant property” as the aspect of the entity that makes it worthy of preservation and that initiates the statement of intent to preserve; then, once the priorities of the designated community have been established, the model developed by Dappert and Farquhar comes into play. Consider the Preservation & Access Framework for Digital Art Objects (PAFDAO) and its preservation of Shock in the Ear (1997) and other works developed in the same software, Macromedia Director. Preserving the art depends on knowing the ins and outs of the program in which it was developed, a process much easier when conducted at scale. By this I mean that when a collection such as PAFDAO’s holds dozens of early works of multimedia digital art all built from the same software, understanding those ins and outs becomes easier, because the preservationists develop a sense of what is a quirk of the software, what is a facet of the art, and what is a fatal flaw in a particular file. These actions are the work of Dappert and Farquhar’s model; moving backwards, Webb et al. supply the drive to determine why these works are valuable to the community; and even further back lies the underlying definition of the most significant of properties: that this is art, and every distinction going forward will be informed by that attribution.

Introduction

Hi, I am James H., a graduate MLS student at the University of Maryland. I am currently in my last semester and expect to graduate in the spring. I joined the class a little late due to a scheduling error, so I am submitting this a bit later than everyone else. I am interested in this course because I believe that digital technology is the future of information access and management, and that understanding digital curation will be very important for future archivists.

platforms and constraints

In Racing the Beam: The Atari Video Computer System (The MIT Press, 2009), Ian Bogost and Nick Montfort introduce platform studies as an approach to studying games and other digital media, tracing the history of the Atari VCS home video game console as a case study. Here’s how they define platform:

“Whatever the programmer takes for granted when developing, and whatever, from another side, the user is required to have working in order to use particular software, is the platform” (p. 2-3).

Platforms shape the actions of their users, which can cut two ways. The Atari VCS’ many limitations sparked creativity in game design, but the assumptions hidden in other platforms could have malign consequences. When preserving platforms and platform-dependent art, we’ll need to consider how best to make these influences explicit.

creativity from constraints

In order to preserve executable and/or reusable versions of software and digital artworks, we’ll need to document how constraints in platforms shape creative decisions. In Racing the Beam, a member of the design team for the Atari game Star Wars: The Empire Strikes Back recalls, “We prioritized the game elements we wanted to include and watched the list diminish as the available cartridge space was used up” (p. 128). This is one of many instances in which designers and gamers maximized what they could do within the Atari VCS’s limitations.

Bogost and Montfort write, “Technical innovations are often understood as the creation of new technology–new materials, new chip designs, new algorithms. But technical innovation can also mean using existing technical constraints in new ways, something that produces interesting results when combined with creative goals” (p. 53). The limits of preservation, such as our inability to completely document or perfectly save an old piece of software, offer their own set of restrictions. Preservation-related constraints can be detrimental to faithful reproduction, but they also free artists and curators to reinterpret the works, with “interesting results.”

Specific documentation and interoperable data might be the dream combination enabling Ippolito and Rinehart’s gross but effective concept of a “mother copy” (Re-Collection: Art, New Media, and Social Memory, Cambridge, MA: The MIT Press, 2014, p. 24). But where description and documentation inevitably fail could be where reuse really takes off.

hidden assumptions

While the restrictiveness of platforms can be good for creativity, hidden constraints and assumptions aren’t always beneficial. Game designers working with the Atari VCS seemed extremely knowledgeable about the limits governing their work, but that might not be true for artists working with software and hardware today. Platform studies suggests that we should continually interrogate the tools and systems we use, even as we build upon them.

I’m reminded of an article about “library as infrastructure” in which Shannon Mattern highlights problems with a popular “library as open platform” metaphor. Infrastructure is embedded, complicated, old and dirty; it comes freighted with history. Open platforms are ostensibly about encouraging anyone to remix library collections and metadata (for example), but they can obscure the values on which the platforms run. While Mattern argues that infrastructure is closer to the reality of libraries than the open platform, her “infrastructure” is akin to “platforms” as framed in Racing the Beam.

As a complement to Bogost and Montfort’s observations about technological innovation, Mark Matienzo’s keynote for the 2015 LITA Forum wraps up a lot of key issues in building new technology upon old platforms. He questions how innovative or revolutionary a technology — such as linked data created from old classification systems — can actually be, so long as participation and “the power to name” are distributed as before. My first reaction to the talk was, “Read this if you are human and work with information.” But these concerns are especially important for us since preserving platforms and their products means documenting creative cultures and relying upon members of those cultures in the documentation process.

Matienzo might find common ground with glitch artist Scott Fitzgerald, who says:

In glitch art, more so than a lot of other art forms, I am a really big proponent of the idea that the process is more important. Part of the process is empowering people to understand the tools, understand the underlying structures, like what’s going on inside of a computer. So, as soon as you understand a system enough to know why you’re breaking it, then you have a better understanding of what the tool is built for.

hidden histories go on and on

As Matt Kirschenbaum, Mattern and Matienzo (and many, many others) suggest, we can push, break apart, and interrogate platforms by delving into the social and political histories of the hardware and software. We’ll probably find that the hidden histories just keep going.

For example, Bogost and Montfort mention that the Atari VCS used a processor and other hardware manufactured by Fairchild Semiconductor. I happen to have read a paper by digital media scholar Lisa Nakamura tracing the history of Navajo women’s involvement in the manufacture of these parts at Fairchild’s plant in Shiprock, N.M. “Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture” focuses on convoluted representations of these workers in Fairchild’s corporate archives, and the conspicuous absence of their actual testimonies in the archival record. This chance connection is based on my having just one other context for the name Fairchild Semiconductor, but it reinforces that platform studies are inseparable from studies of gender, labor, race, and class.

from platform studies to preservation strategy

All of this suggests that preserving digital art is a continuous process of investigation forward and backward in time from the moment of a digital object’s creation — if a singular moment can be identified at all.

Arms and Fleischhauer (2005) make two especially helpful contributions to how we might translate platform studies into preservation strategy. First, they conceive of digital formats as tied to the stages of a digital project’s life cycle (creation, preservation, and use). They call for archivists to investigate the full range of formats used and the relationships between content stored in each. Second, they enumerate specific sustainability and quality/functionality factors for promoting the longevity of digital formats. Each factor could serve as a way for archivists to enter into conversation with creators and users of digital media platforms, from whom we seek help.
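As a rough illustration of how those factors might be recorded for a single format, here is a sketch that uses Arms and Fleischhauer’s seven sustainability factors as field names; the example format, the notes, and the idea of keeping them in a simple record are my own.

```python
# A minimal sketch of a format-assessment record. The seven sustainability
# factors are Arms and Fleischhauer's; the notes and the dict layout are my
# own illustration, not a prescribed schema.

format_assessment = {
    "format": "WAVE (PCM audio)",
    "lifecycle_stage": "preservation",   # creation / preservation / use
    "sustainability": {
        "disclosure": "full public specification available",
        "adoption": "widely used in archives and production",
        "transparency": "uncompressed PCM is simple to inspect",
        "self_documentation": "limited embedded metadata (BWF extends this)",
        "external_dependencies": "none beyond standard playback software",
        "impact_of_patents": "no known patent encumbrance",
        "technical_protection_mechanisms": "none",
    },
}

# Such a record could anchor conversations with creators and users about
# which formats to accept at ingest and which to target for migration.
for factor, note in format_assessment["sustainability"].items():
    print(f"{factor}: {note}")
```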

Innovation and Preservation

Innovation can mean many different things depending on the topic being discussed. In Nick Montfort and Ian Bogost’s book Racing the Beam, they write that “Technical innovations are often understood as the creation of new technology—new materials, new chip designs, new algorithms. But technical innovation can also mean using existing technical constraints in new ways, something that produces interesting results when combined with creative goals” (p. 53). The idea that things already in existence can be used in new or different ways is one factor that led computers and games to evolve into what we know today.

In Racing the Beam, we learn about the history of the Atari VCS, later renamed the Atari 2600, and its impact on the development of gaming. In spite of what we now consider limited technology, it changed how people relaxed and pioneered the way for future computer and gaming systems. Those programming at the time of the Atari developed a computer opponent to play against, no longer requiring two people to play a game, and created interchangeable cartridges on which to store the games, no longer requiring new hardware to be sold for each game and lowering the price for the consumers.

What interested me most was that they decided to use less than the full computing power of the time in order to make it affordable and attractive to the public. The programmers worked within those self-imposed limitations to create what they needed and wanted the games to do. One of the cost-saving methods was to not include much RAM in the system, forcing the programmers to find creative solutions to complex problems while ‘racing the beam’, or using only the time it takes for a line to be drawn on the TV screen to compute the next one to be displayed. 
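For readers who want a feel for that constraint, here is a toy model (purely illustrative; real VCS games were written in 6502 assembly and counted actual machine cycles, and the numbers below are only approximations): each scanline gives the program a fixed budget of work, and the next line’s graphics must be ready before the beam reaches it.

```python
# Toy illustration of "racing the beam": each scanline allows a fixed budget
# of processor work, and the next line's graphics must be prepared before the
# beam starts drawing it. Numbers are approximations, not exact VCS specs.

CYCLES_PER_SCANLINE = 76      # roughly the CPU cycles available per line
VISIBLE_SCANLINES = 192

def prepare_next_line(line: int) -> int:
    """Pretend to update sprite and playfield state; return cycles spent."""
    work = 5 + (line % 3) * 10   # some lines need more updates than others
    return work

for line in range(VISIBLE_SCANLINES):
    cycles_used = prepare_next_line(line)
    if cycles_used > CYCLES_PER_SCANLINE:
        raise RuntimeError(f"line {line}: missed the beam ({cycles_used} cycles)")
    # Any cycles left over are simply burned waiting for the next line.
    idle = CYCLES_PER_SCANLINE - cycles_used

print("frame drawn without missing the beam")
```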

While the system was affordable for the average person, it placed a lot of strain on the programmers developing the games. They needed to be creative so that their games would be playable on the limited hardware. This creativity led to more of a game’s graphics being stored in ROM, a hardware component, rather than computed in software as is done now. It seems that preserving these old games and programs would be difficult without the original hardware, even if we knew how the games were created, short of completely re-creating them with new technology. Everyone has played Pong or Space Invaders at some point in their lives, but few have used an Atari. Can we say that it is really the original game being played, and not just something created to look and feel like it?

Currently, innovation in games and many other digital media has very little to do with hardware, as software can accomplish the same tasks and is easier to program. However, the problem becomes one of differing file formats. As new formats are created and others become obsolete, preserving the original file becomes challenging. For instance, audio files come in numerous formats, but only a few are preservation quality. Should all digital audio in a library or archive be converted to the same format for preservation? Or should the original file be preserved for as long as possible? Text, on the other hand, is usually preserved as a PDF file, but there are many different subtypes that all share the .pdf extension, and only the PDF/A subtype is meant for preservation. How do you know what the subtype is? Or even whether the format has subtypes?
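One practical, if crude, way to start answering that question: PDF/A files are supposed to declare themselves in their embedded XMP metadata through a pdfaid:part entry, so a quick scan of the raw bytes can at least flag candidates. The sketch below is only a heuristic, not a substitute for a real validator such as veraPDF.

```python
# Rough heuristic for spotting PDF/A files: look for the PDF/A identification
# entry (pdfaid:part) in the file's embedded XMP metadata. A real workflow
# would use a dedicated validator (e.g. veraPDF); this only flags candidates.

import re
import sys

def pdfa_part(path: str) -> str | None:
    """Return the declared PDF/A part (e.g. '1', '2') or None if not found."""
    with open(path, "rb") as f:
        data = f.read()
    match = re.search(rb'pdfaid:part(?:="|>)\s*(\d)', data)
    return match.group(1).decode() if match else None

if __name__ == "__main__":
    for path in sys.argv[1:]:
        part = pdfa_part(path)
        label = f"declares PDF/A-{part}" if part else "no PDF/A declaration found"
        print(f"{path}: {label}")
```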

Caroline Arms and Carl Fleischhauer in their article “Digital Formats: Factors for Sustainability, Functionality, and Quality” pointed out that “Preservation of content in a given format is not feasible without an understanding of how the information is encoded as bits and bytes in digital files.” (p. 3). However, I would argue that for every information professional to understand how all digital files work is more than is really needed to preserve the information. A basic knowledge that some formats are better than others can be all that is needed in a professional environment, as the details of file formats can be found online if they are required.

Emulation as Resistance and Social Memory


Racing the Beam: The Atari Video Computer System by Ian Bogost and Nick Montfort offers a detailed look at the Atari VCS, known to many as the Atari 2600. The book focuses on the years when the system dominated the market, from 1979 to 1983, and discusses its eventual role in the video game crash of 1983. In particular, the authors examine a number of game cartridges to reveal the affordances and resistance created by the video game platform and its limiting of “computational expression” in the creation of these games. At first glance, this book may seem largely tangential to preserving digital art, but I think there are many commonalities that Bogost and Montfort illustrate quite well and which we can learn from.

The Resistance in the Materials

Early Atari enthusiast, William Morris

As William Morris once stated, “you can’t have art without resistance in the materials.” The Atari VCS could be defined by its resistance. Coders had to deal with the physical and technical limitations of the platform such as the speed of the beam writing across the screen one line at a time, the limited ROM, and the limited graphical elements, to produce games people actually wanted to play. For what they had to work with, the coders’ results were often innovative and astounding.

All the while, the concurrent political, social, and economic challenges such as creating games for the home (rather than arcade) environment, deadlines for game release based on market forces, and questions over ownership also affected game creation. In these ways, Montfort and Bogost are connecting what is on the screen and the cultural context with the forensic materiality of the hardware that Kirschenbaum describes in Mechanisms. Understanding the entire process of game creation and the limitations of the platform gives a better understanding of all the factors that need to be preserved.

Porting and Licensed Adaptations

The theme of resistance continued in the challenge of taking other arcade video games and licensed works from their original medium and adapting or “porting” them to the Atari 2600. To me, this process raised some interesting connections with ideas of social memory, digital preservation, and significant properties. Specifically, the quote below really got me thinking about these issues:

Along with the lack of true originality in most VCS games—that is, the basis of many VCS games in arcade games or licensed properties—another closely related theme runs throughout the history of the Atari VCS, that of the transformative port or adaptation. When an earlier game is the basis for a VCS game, it can almost never be reproduced on the VCS platform with perfect fidelity. – page 23, Racing the Beam

We see that these games lacked true originality in the sense that they were attempting to copy other works but were original in their transformative adaptation to a system loath to provide the elements needed to reproduce their inspiration exactly.  Montfort and Bogost go on to say that, technical limitations notwithstanding, it is still impossible to replicate the physical environment, interface, or economic context to create a true copy of the game experience for the player, but it can be transformed to get something close enough to make the memory live on.

Social Memory

Porting and adapting have many parallels to the production of informal social memory through recreation and variation. Exactness is not key in this approach; instead, adapting or porting seems more like a performance based on a “score,” or the instructions of the original artwork, similar to the process that Rinehart and Ippolito discuss in Re-Collection.

Pac-Man for Arcade vs. Pac-Man for Atari. From page 69 of Racing the Beam.

Or seen in a different way, the retelling of a game on a different platform can perhaps be compared to the retelling and memory sharing process of oral history. In these ways, the idiosyncrasies of each repetition can be forgiven, assuming the significant features remain, allowing it to be the same “work” on a more conceptual level. With the Atari VCS, the resistance in the materials forced game creators to focus on the most significant elements in order to create something that resembled the look and feel of the original, while still acknowledging the variation of its underlying medium and its context.

Emulation

As both professionals and amateurs try to preserve these games through emulation, or even by rewriting them to work on new devices that lack the same resistances, they encounter new and unique resistances. The problem remains that these items cannot be reproduced with the exact fidelity of the original. Newer technology can overcompensate for the original quirks: newer processors can make a game run faster and smoother than it ever could, LCDs and HDTVs do not display the games the way CRT televisions blurred pixels together, and the interface is often not the original joystick. Compensating to make these items run like the originals, and to feel less advanced, is its own form of resistance.


But as discussed above, these characteristics can be forgiven if we view emulation as an attempt to implement the “score” of the original game and its most significant elements. These efforts can still ensure that the social memory of these cultural artifacts survives. And to what extent is perfection necessary? Perfection, in fact, can be problematic in communicating authenticity. Sound recordings do not perfectly reproduce the natural world; the sounds of particular elements are amplified so they can be heard better. Furthermore, some consider vinyl recordings to have a warmer, more authentic sound than their high-quality digital versions. In these examples, the consumer expects imperfection. Additionally, glitches (often the result of a system running incorrectly) can hint at the materiality of the system and also provide a certain authenticity.

If there are always errors or differences, how do we determine an acceptable level of tolerance for them in the preservation of objects and their social memory? I think there can be multiple tolerances, as in the “both/and” approach Rinehart and Ippolito describe, which allows for multiple representations of the same work. So if a range is okay, where do we stop? Overall, the platform-study process Montfort and Bogost undertake seems like an essential framework for gaining sufficient context to decide what level of preservation is good enough. But how do we make it scalable?

Rembrandts and Floppy Disks

One of the themes that I took away from the readings this week (and continuing from last week) is that what we see on our computer screens is a performance. A kind of theatrical performance, where what we view on the screen seems magical and perfect, but all the while a lot of chaotic movement and hard grunt work is transpiring behind the curtain.


Kirschenbaum asserts that digital objects can be defined in three ways: as physical (inscription), logical (the code that is interpreted by software), and conceptual (what is presented to us). While we deal on a daily basis with the conceptual aspects, and many of us are at least familiar with the logical ones, there has not been much literature focusing on the physical aspects of digital objects, and this is what Kirschenbaum wants to highlight. We remain in the grip of a “medial ideology, with many of the plain truths about the fundamental nature of electronic writing apparently unknown […] or overlooked” (45). He delves deep into how this developed, how the fundamentals of digital inscription removed digital objects from human intervention, and how the hard drive was locked away in a box, turning into a “book [that] can be read without being opened” (81). To look at the forensic materiality of digital objects is to understand what makes the magic happen, and it allows us a more complete examination of the digital object itself.


And what do we find when we look at the forensic materiality of digital objects? We find magnetic reversals on a large smooth disk accompanied by a read-and-write head, which acts as a signal processor, converting the analog magnetic reversals on the disk into formal digital binary material, and vice versa. This system creates a nonvolatile but variable environment for inscription: the medium can be reused, yet what is stored on it remains accessible. And the question that arises when it comes to digital media is this: can we use the archival materials and bibliographic data provided by their forensic qualities to reveal a larger critical and interpretive program, or, in other words, to reveal the artists’ original motives, thoughts, and milieu? Can we look at an artist’s hard drive and floppy disks to understand what she was thinking and what the cultural norms and practices were at the time of inscription, in the same way that we can look at a Rembrandt, look at the materials he decided to use and the strokes he left on the canvas, and understand more about what was really going on at the exact moment Rembrandt started to paint it?


Kirschenbaum argues yes to this question, and after looking at all of the examples given in this week’s readings, I can readily agree with him. By looking at Warhol’s Amiga files, for example, we can determine from the time stamps left on the files that he completed those drawings within a short span of time of one another, even if Arcangel thinks the time stamps are incorrect. The disk image of Mystery House revealed fragments of two other games, Dung Beetles and Blitzkrieg. This gives us an insider view into the cultural and user context; if anything, it tells us that Mystery House was more important to that user than the other two games.


The problem with analyzing this bibliographic data, however, is that because the computer is so successful at hiding its forensic qualities from the human eyes, we very easily overlook these qualities. Out of sight, out of mind. I had no clue about many of these technical features of hard drives, CDs, and floppy disks before reading this book; and I’ll admit most of the technical descriptions still went over my head. But I take it that this is Kirschenbaum’s point – we’ve separated ourselves from the physical materiality of digital processes for so long that we’ve forgotten how they worked. And we need to relearn this information so that we can in turn save digital objects in a more complete form, including their formal and forensic qualities.


The perfect example of this is the Warhol Amiga files. If Cory Arcangel hadn’t shown a strong interest in recovering those files, and if the Computer Club at Carnegie Mellon hadn’t had the technical expertise required to reverse engineer the software, would Andy Warhol’s floppy disks have stayed in their boxes at the Warhol museum forever, serving no purpose other than to add to the collection catalog? Another example is Jonathan Larson’s Word files. If Doug Reside hadn’t decided to migrate all 180 of Jonathan Larson’s floppy disks, then the “fast saves” versions of his lyrics for RENT would be frozen forever on those disks, and the glimpse they give into Larson’s creative process would remain tucked away and unknown.


This is the note Kirschenbaum ends on – that we’ve already lost too much, and that we need to start acting now. And ensuring that archivists and preservationists know exactly what they’re dealing with when it comes to digital media is the first step that needs to be taken.

Minding Individuality

What comes to your mind when you hear the term born-digital artwork? To me, it is an image of a website on my Mac laptop. It is not hard to see how limited and historically situated my imagination is. Though such a first impression may be trivial, the unquestioned assumptions that surround born-digital artworks inform the way we approach them in hopes of preserving them for future generations.

Two of the born-digital artworks Matt Kirschenbaum introduces in his Mechanisms: New Media and the Forensic Imagination (2008) suggest how our assumptions about digital art need to be challenged. One example is Agrippa, a work of William Gibson originally published in 1992. The text was said to be encrypted with a then state-of-the-art method in order to allow only a single reading experience of about 20 minutes. Contrary to the notion of digital artwork being fluid, Agrippa’s electronic text becomes inaccessible after a short period of time. The book in which the disk was embedded was likewise designed to fade upon exposure to light. Such material characteristics of Agrippa suggest a marriage between the form and the theme of Gibson’s work: fading autobiographical recollection. Suffice it to say that Agrippa is a nicely executed artist’s book project. Within a day of release, however, Agrippa started to venture into a new realm, challenging the notion of fixed art. That is, Agrippa’s text, which was said to be unhackable, was miraculously reproduced and posted online. More interesting still, this text, as Gibson himself acknowledges, has kept changing over the years. These textual reproductions, along with a few surviving pieces of Agrippa’s original media, are now the only access points that allow us to learn about the work. This complicates the assumption that digital artworks are ephemeral.

The second example Kirschenbaum provides is Mystery House, a game written by Roberta and Ken Williams in 1980. As Kirschenbaum offers a tour of its disk image, a 140-kilobyte electronic file made from the original floppy disk, it becomes apparent how the construction of the game itself is the main attraction. Recalling his childhood, Kirschenbaum writes: “normative play is perhaps the least interesting level on which to engage [Mystery House]” (129). The disk image, as Kirschenbaum walks us through it, shows players how machine-level instructions and screen-level text are at work simultaneously, blurring the distinction between what is stored and what we see on the screen, the distinction Kirschenbaum captures with the terms “forensic materiality” and “formal materiality.” In addition, Kirschenbaum sheds light on how a storage system like this complicates the idea of digital files’ fungibility. According to Kirschenbaum, the disk image retains traces of past activities. For instance, an action such as “deletion” does not remove the data but merely readies it to be overwritten, should that happen in the future. While each disk image carries the traces of its own unique activities, little attention has been paid to this idiosyncrasy, writes Kirschenbaum. He speculates that this oversight has to do with “screen essentialism” (27). In other words, we tend to emphasize the look of any Mystery House (just about what an emulator aims to achieve) rather than attend to the unique constitution of a particular Mystery House. This difference in emphasis can be described with the terms “allographic” and “autographic.”

Throughout the book, Kirschenbaum illustrates just how particular digital artworks can be. It seems to me that being conscious of digital artworks’ individuality would better equip us when thinking about what to preserve, how to preserve it, and why. The anecdote Cory Arcangel offers in “The Warhol Files” demonstrates how assumptions can hinder sound preservation practice. The Andy Warhol Museum’s acquisition and subsequent assessment of the painter’s Amiga computer, Arcangel tells us, were based on the assumption that the machine would retain files “presumably labeled along the lines of ANDY’s STUFF, ANDY’S DRAWINGS, etc.” Only, those files were not to be found, writes Arcangel. It later emerged, according to Arcangel, that Warhol had used an application called GraphiCraft to produce his bitmap drawings, and that this software did not allow files to be saved anywhere other than on GraphiCraft disks. Had curators not known about the idiosyncratic conditions under which Warhol labored, those commercial disks, and the drawings stored on them, might have been overlooked and lost to history.

Doug Reside, in his “‘No Day But Today’: A Look at Jonathan Larson’s Word Files,” also describes how all software is historically situated. This is easier to say than to do, I must add, especially when the software in question is as familiar as Microsoft Word. Concerning the textual variants among the digital records of Larson’s musical RENT, Reside entertains several possible interpretations of the variation. It turns out, according to Reside, that Microsoft Word 5.1 (the version Larson used to compose) had a function called “fast save.” This feature, unfamiliar to contemporary ears, does strange things such as appending revisions to the end of a file instead of overwriting them. Reside concludes that such records would “provide scholars and artists a fascinating glimpse into [Larson’s] creative process.” Such an inquiry becomes possible only when we learn to read these records within the historical context of the medium and its affordances, among other things. Needless to say, such historical awareness is crucial to archival practice.

Granted, we need to take into consideration the particular native environment within which a digital artwork was, and continues to be, shaped, in order to preserve the work’s significance as comprehensively as possible for future generations. I can imagine subject specialists having an important role to play in paying due respect to the individuality of digital artworks. Kirschenbaum’s walk-through of a digital artwork, for instance, is undoubtedly an invaluable contextual record. But I wonder how we can make this a feasible practice. Kirschenbaum’s description of Agrippa covers its changing environment, including the significance of the link to “404 File Not Found” and the variant typefaces rendering its ASCII transcriptions. How much attention would do justice to a digital artwork’s individuality?

It’s Not Always What it Seems

Week 3: Digital Forensics, Materiality, Fixity & is-ness

The inner workings of a computer have always been a bit of a mystery to me. I grew up when most people didn’t have a computer in the house, let alone several, and the Internet had never been heard of. While in school, I actually remember taking a computer class in which we were taught how Apple’s graphical user interface worked. Over the years, thankfully, I progressed beyond operating the mouse and learned how to use a variety of software applications. But the inner workings of the computer were still a bit vague: just visions of bits adding up to bytes and kilobytes and megabytes, and so on.


Kirschenbaum, in his book Mechanisms: New Media and the Forensic Imagination, recounts a story from his youth, when he stopped saving files to a 5 ¼-inch disk and began saving them to the computer itself. The storage was hidden away behind the hard plastic case of the computer. He explains that architecturally the personal computer didn’t really change, but the “psychological impact” of saving information to the computer instead of a removable floppy disk cannot be ignored. No longer labeling a disk, or having one to carry home after class, just felt different. The physical aspect of the digital work was taken out of his hands and sat concealed inside the computer. In other words, what happens in the storage device stays in the storage device, and if you’re like me, the details of it all weren’t something I necessarily needed to know.


According to Kirschenbaum, the storage mechanism of a computer is both a product and a process, and despite the work being created behind the scenes and hidden away as 1s and 0s, it is indeed physical in nature. It has a materiality to it. He goes on to describe in great detail the history of computing, disk arrays, tracks, sectors, read/write heads and hashing.


All of these hidden storage and transmission actions point to a very structured process that must consist of rules and standards in order for it all to work. However, Kirschenbaum refers to a forensic materiality, which “ . . . rests upon the principle of individualization, the idea that no two things in the physical world are ever exactly alike” (p. 10). He explains further that a computer is able to create an illusion of immaterial behavior, in which data can be identified without ambiguity, sent without anything being lost, and copied with no variance occurring. This illusion hides the fact that errors can occur anywhere along the line.
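This is exactly where routine fixity checking comes in. The sketch below (the file paths are hypothetical placeholders) shows the basic move: compute a cryptographic hash of the master, compute it again for the copy, and compare, which is how archivists pierce the illusion and notice when a “perfect” copy has in fact varied.

```python
# Minimal fixity check: hash a file at ingest, store the digest, and compare
# it against a later copy. The file paths here are hypothetical placeholders.

import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

original_digest = sha256_of("masters/mystery_house.dsk")
copy_digest = sha256_of("access/mystery_house.dsk")

if original_digest == copy_digest:
    print("fixity verified: the copy is bit-identical to the master")
else:
    print("fixity FAILURE: something changed between master and copy")
```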


These errors, whether a function of copying over old data, as in the case of the Jonathan Larson collection described by Reside, or of intentional tampering, as occurred with the Mystery_House.dsk game in Kirschenbaum’s book, could pass us by completely unnoticed. But through the use of a hex editor, these hidden artifacts come to light and provide additional forensic evidence and new insights. Reside’s article, for instance, points out how a hex editor’s recovery of deleted text allows us to see Larson’s creative process.
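For anyone curious what “finding deleted text” looks like in practice, here is a bare-bones version of what a hex editor or the Unix strings utility makes possible (the disk-image name is a placeholder I made up): scan the raw bytes of a file or disk image and print any runs of printable characters, including text the original application no longer displays.

```python
# A bare-bones "strings" scan: read the raw bytes of a file or disk image and
# print runs of printable ASCII, which can surface text the original software
# no longer shows (deleted passages, fast-save leftovers, other programs'
# fragments). The path below is a hypothetical placeholder.

import re

def printable_runs(path: str, min_len: int = 8):
    """Yield (offset, text) for runs of printable ASCII at least min_len long."""
    with open(path, "rb") as f:
        data = f.read()
    for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
        yield match.start(), match.group().decode("ascii")

for offset, text in printable_runs("larson_floppy_042.img"):
    print(f"0x{offset:08x}  {text}")
```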


These pieces of forensic evidence that get tucked away should make us question what in fact we might be copying – it’s not always what it seems. So, as a digital archivist, you have to ask, what versions do you keep? Or do you save all of them? Which version is the “authentic” or “authoritative” one? Or is that an impossible choice to make as a digital archivist?