One of the themes that I took away from the readings this week (and continuing from last week) is that what we see on our computer screens is a performance. A kind of theatrical performance, where what we view on the screen seems magical and perfect, but all the while a lot of chaotic movement and hard grunt work is transpiring behind the curtain.
Kirschenbaum asserts that digital objects can be defined in three ways: as physical (inscription), logical (the code that is interpreted by software), and conceptual (what is presented to us). While we deal on a daily basis with the conceptual aspects, and many of us are at least familiar with the logical ones, there has not been much literature focusing on the physical aspects of digital objects, and this is what Kirschenbaum wants to highlight. We remain in the grip of a “medial ideology, with many of the plain truths about the fundamental nature of electronic writing apparently unknown […] or overlooked” (45). He delves deep into how this ideology developed, how the fundamentals of digital inscription removed digital objects from human intervention, and how the hard drive was locked away in a box, turning it into a “book [that] can be read without being opened” (81). To look at the forensic materiality of digital objects is to understand what makes the magic happen, and it allows us a more complete examination of the digital object itself.
And what do we find when we look at the forensic materiality of digital objects? We find magnetic reversals on a large smooth disk, accompanied by a read-and-write head that acts as a signal processor, converting the analog magnetic reversals on the disk into formal digital binary material, and vice versa. This system creates a nonvolatile but variable environment for information inscription: what is stored persists without power, yet remains rewritable and reusable. And the question that arises when it comes to digital media is this: can we use the archival materials and bibliographic data provided by their forensic qualities to reveal a larger critical and interpretive program—or, in other words, reveal the artists’ original motives, thoughts, and milieu? Can we look at an artist’s hard drive and floppy disks to understand what she was thinking and what cultural norms and practices existed at the time of inscription, in the same way that we can look at a Rembrandt, at the materials he decided to use and the strokes he left on the canvas, and understand more about what was really going on at the exact moment Rembrandt started to paint it?
Kirschenbaum argues yes to this question, and after looking at all of the examples given in this week’s readings, I can readily agree with him. By looking at Warhol’s Amiga files, for example, we can determine from the time stamps left on the files that he completed those drawings within a short span of one another, even if Arcangel thinks the time stamps are incorrect. The disk image of Mystery House revealed fragments of two other games, Dung Beetles and Blitzkrieg. This gives us an inside view into the cultural and user context; if anything, it tells us that Mystery House was more important to that user than the other two games.
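The timestamp reasoning above can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual method used on Warhol’s files — the file names and the 24-hour window are invented, and real forensic work would read timestamps from the disk image’s own file-system metadata rather than from the host operating system:

```python
import os
import tempfile

def saved_within(paths, max_gap_hours=24):
    """Return True if every file's modification time falls within the window."""
    mtimes = [os.stat(p).st_mtime for p in paths]
    return (max(mtimes) - min(mtimes)) <= max_gap_hours * 3600

# Hypothetical demo: three "recovered" files whose mtimes we set two hours apart.
demo_dir = tempfile.mkdtemp()
base = 1_000_000_000  # an arbitrary reference time, in seconds since the epoch
paths = []
for i in range(3):
    p = os.path.join(demo_dir, f"drawing{i}.pic")
    open(p, "w").close()
    os.utime(p, (base, base + i * 7200))  # access time, modification time
    paths.append(p)

print(saved_within(paths))  # the three mtimes span four hours -> True
```

The same comparison could, of course, be run over timestamps extracted by forensic tooling from a disk image instead of over live files.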
The problem with analyzing this bibliographic data, however, is that because the computer is so successful at hiding its forensic qualities from human eyes, we very easily overlook them. Out of sight, out of mind. I had no clue about many of these technical features of hard drives, CDs, and floppy disks before reading this book; and I’ll admit most of the technical descriptions still went over my head. But I take it that this is Kirschenbaum’s point – we’ve separated ourselves from the physical materiality of digital processes for so long that we’ve forgotten how they work. And we need to relearn this information so that we can in turn save digital objects in a more complete form, including their formal and forensic qualities.
The perfect example of this is the Warhol Amiga files. If Cory Arcangel hadn’t shown a strong interest in recovering those files, and if the Computer Club at Carnegie Mellon hadn’t had the technical expertise required to reverse engineer the software, would Andy Warhol’s floppy disks have stayed in their boxes at the Warhol Museum forever, serving no purpose other than adding to its collection catalog? Another example is Jonathan Larson’s Word files. If Doug Reside hadn’t decided to migrate all 180 of Jonathan Larson’s floppy disks, then the “fast saves” versions of his lyrics for RENT would be frozen forever on those floppy disks, and the glimpse into Larson’s creative process would remain tucked away and unknown.
This is the note Kirschenbaum ends on – that we’ve already lost too much, and that we need to start acting now. And ensuring that archivists and preservationists know exactly what they’re dealing with when it comes to digital media is the first step that needs to be taken.
Brittany, I see your point that Mechanisms sets up a case for forensic disk imaging and grabbing every possible piece of information from a piece of digital media so as to know exactly what it is. But I’d also suggest that imaging at this level could actually enable incremental digital preservation, acknowledging that archivists don’t know (can’t possibly know) exactly what they have at hand, but that a baseline level of preservation shouldn’t wait. In other words: better to get whatever we can now and refine the details — privacy screening, long-term preservation, providing access — over time. What do you think?
I completely agree with you, Amy. If we wanted to connect this to last week’s readings, we could say disk imaging conforms with Level 1 of NDSA’s Levels of Digital Preservation (Protect Your Data). Having at least one bit-by-bit copy of the digital object at hand seems like a good place to start, even if we don’t have the time or resources to process it yet. And this is where more collaborative projects like the one that recovered Warhol’s Amiga pictures can relieve the museum of some of its burden by providing the technical knowledge to recover those files later on.
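At its simplest, that Level 1 step is just a raw, block-by-block read of the medium plus a fixity check. Here is a minimal Python sketch of the idea, with an in-memory buffer standing in for the physical device — real workflows would read from write-blocked hardware with tools like dd, Guymager, or BitCurator, and the sample bytes below are invented:

```python
import hashlib
import io

def make_disk_image(device, block_size=512):
    """Read every raw block from `device`; return the image and its SHA-256.

    A bit-by-bit copy ignores the file system, so deleted-file fragments
    and slack space are captured along with the visible files.
    """
    blocks = []
    while True:
        block = device.read(block_size)
        if not block:
            break
        blocks.append(block)
    image = b"".join(blocks)
    return image, hashlib.sha256(image).hexdigest()

# Stand-in for a floppy drive: raw bytes wrapped in a file-like object.
raw_medium = b"GAME DATA 1982" + b"\x00" * 1426
image, digest = make_disk_image(io.BytesIO(raw_medium))

# The image matches the medium exactly, and the stored digest lets a
# future archivist verify that no bits have changed in storage.
print(image == raw_medium)  # True
```

The checksum is what makes the copy archivally useful: years later, anyone can re-hash the image and confirm it is still the object that was captured.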
But is it safe for museums to simply store the hardware and wait for outside parties to approach them with preservation projects? I think museums need to take a more proactive approach – if Cory Arcangel had tried to do this project twenty or thirty years from now, those floppy disks might not be in the same good condition, and those images might never have been recovered. Whereas if the Warhol Museum had taken the first step of making disk images of those floppy disks as soon as they were acquired, then at least the information on them would be a little more secure for further preservation in the future.
No, it’s not safe (thanks for the prompt!) and I don’t imagine all institutions are waiting to be approached. Partnerships exist, and they take a number of forms that are relevant to what we’re talking about.
There’s the tool development angle: Archivists and archival institutions are actively involved in the development of open-source software at every stage of preserving born-digital material. They’re involved in developing tools for digital forensics (BitCurator and its component scripts), format identification and metadata extraction (JHOVE, DROID, ExifTool, etc.), capturing social media (Social Feed Manager), archival description and information management (ArchivesSpace, Archivists’ Toolkit), access (BCA Webtools, ePADD, Olive Executable Archive), and digital preservation lifecycle management (Archivematica). I think this kind of community contribution can have as much short- and long-term impact as consortial partnerships, and there are many access points for LAMs of all sizes and archivists at different levels of experience. For one thing, there are just so many tools that community input is essential to figure out which tools best meet which use cases, and how they can be calibrated to play nicely together. So there’s also no shortage of process-focused work, like what’s featured in this blog series on born-digital access.
Sharing hardware is another kind of collaboration. Following the release of the AIMS report (stands for “An Inter-Institutional Model for Stewardship”), consortia for processing digital media seemed like a real possibility. Not everyone has the equipment to (for example) digitize a wide range of AV formats, much less the expertise or money. A university or other organization with an AV specialization might collaborate with another institution equipped to image obscure storage media for legacy research data, and so on. So that’s one way to envision working together.
Avoiding redundancy and collecting to one’s strengths are two principles of archival collection development that seem to need reinterpretation when it comes to digital art and so on. I’ve heard it said that more web archives are better — “A world with one archive is a really bad idea” — an argument for redundancy for security’s sake, but also a case for democratizing digital preservation and collecting. Maybe it makes sense for different institutions to document different versions or experiences of digital creative work — different Afternoons, for example, or different collections of play-throughs and machinima created with a video game. It’s interesting to consider what new territory collaboration can explore when we’re limited by server space and data budgets rather than digitization hardware — a different set of limitations on collection development.
However (and this is a big however), Allison’s experience with small archives is undeniably commonplace. It’s clear that power and money are the prices of timely admission to many of these partnerships, even when smaller organizations would benefit as much as any.
I agree with you, Brittany, that museums, libraries, and archives may need to take a more proactive approach. Collaboration with outside parties seems extremely important here, and if they’re waiting for those parties to approach them, they might be waiting for a very long time.
Considering that most archives, etc., are under budget constraints and are unlikely to have the staff to do this, the idea of partnerships with organizations able to delve into the more technical aspects might be the best one. Is anyone aware of an institution currently doing something like this?
I’m not aware of any LAMs partnering with organizations that might better be able to help preserve, access and analyze this kind of data, but it seems like that would be an obvious decision for larger institutions (not that the obvious choice is always actually made).
My concern actually goes to the smaller LAMs, because I know from experience that their primary focus is usually on preserving the perceived important, tangible collection they’ve acquired. Analysis of the materials therein is frequently carried out by the researcher. Because of this, I’m wondering whether this might become a “ground-up” movement driven by the researchers.
As I’ve done historical research mostly in small, underfunded archives, I’ve found that the archivists there tend to focus on the larger, most important items OR on the kinds of items they are most familiar with preserving. There have been multiple times when I’ve found something truly valuable in an archive that had been overlooked or stuffed away in a box because there wasn’t time or money to properly preserve it. It’s really just a matter of practicality in those instances. Their funding, as well as their training, just doesn’t allow for the maintenance of their entire collection, let alone items with special needs.
I’m at a loss as far as how to solve that problem. We ARE losing data on antiquated technology all the time (beyond floppy disks — VHS and cassette tapes are also a huge problem). Do we radically change the training for MLS students so that they are better equipped for handling such materials when/if they do work at underfunded institutions? It seems like a big jump to incorporate into the program enough technological skill to allow archivists to analyze the data themselves. Would that intimidate potential applicants and detract from the number of archivists entering the field? Is it possible to preserve the physical medium without fully understanding the ins and outs of how the medium operates? These are all questions for which I don’t have a firm answer, but I agree with you all that they’re crucial to the growth of the field and the preservation of our history.
I don’t think that there is a way to preserve all data, especially what is on ‘antiquated technology’. There isn’t enough time or money to transfer everything off old tapes, films, or discs before they become unreadable, either through some physical affliction of the media or because there is nothing left to play them on. We need to accept that loss is a part of preservation and that our choices will determine what is still around for the next generation to see. While Warhol’s Amiga files are interesting and provide an insight into his work and process, are they really as important as his Campbell’s Soup Cans? Are the ‘fast saves’ of Larson’s work more important than the finished RENT? As mentioned, institutions have limited budgets and finite staff time, little of which can be devoted to the ‘problem children’, the items that require extra preservation, time, or equipment without being of demonstrated importance. With this approach we might lose something historically or culturally important — but can it really be that important if no one knows it exists?
I would say that we have to worry less about a finished copy of RENT or Warhol’s Soup Cans. These items have been deemed culturally important and have a strong community backing that would coalesce to ensure their survival. This can be seen particularly in the case of video game cultures. For those items we can look to the interested communities for help preserving them. In contrast, lesser-known but still culturally important items are neglected and have fewer backers interested in ensuring their longevity. This is true both for digital items and for the physical items that Alison mentioned above. The populace can be mobilized with either time or money if there is fear these major items would be lost, but would most people care if an obscure early digital artwork disappeared forever? Beyond simply lesser-known art or derivatives, institutions continually overlook underrepresented communities in the historical record through a focus on major cultural items. If we have more faith in amateur groups to support the dominant parts of social memory, perhaps we will have the time to ensure a more inclusive one.
I think it is rather ironic that preserving digital artworks with their forensic materiality in mind is most likely to require a large physical storage space (just like the non-digital artworks called for). This may shed light on another perspective on our “waiting for the scholars to come in” route. I recall that Rinehart and Ippolito, in their book “Re-Collection,” discussed having the digital art producers document what is essential to reproduce/preserve their artworks. Setting aside the question of artists challenging the very idea of preservation by adopting an ephemeral medium by design, I think bottom-up archiving practice may be worth pondering. For example, in addition to being a scholar, Matt Kirschenbaum is a fan of games and a collector of classic computers (as MITH’s Vintage Computers collection suggests: http://mith.umd.edu/vintage-computers/). Kirschenbaum’s walk-through demonstrated in his book “Mechanisms” is wonderful contextual information that helps us see what needs to be preserved in order for future generations to appreciate an artifact such as Mystery_House.dsk. I think Amy mentioned elsewhere the importance of archivists taking the initiative to reach out to the community. I wonder how we can encourage digital art enthusiasts to contact cultural institutions?
Tying together M. Gaffney’s and Setsuko’s comments, I think this is where the idea of social preservation comes in, as Kirschenbaum talks about. Do we open up collections as conceptualized in Rinehart’s and Ippolito’s Open Museum, letting the online community decide what to save and what to let go by preserving what most interests them?
Large institutions are already beginning to let enthusiasts do some of the preservation work for them with their print-to-digital collections. One example I know of is the Smithsonian Transcription Center (https://transcription.si.edu/), for which anyone can sign up and volunteer to transcribe the vast collection of historical documents in their possession. Of course, in this case the Smithsonian has already chosen what needs to be saved, and is only allowing digital volunteers to help with the transcription. Perhaps an institution could do something similar with their born-digital collections, advertising online that they have however many floppy disks, VHS, and cassette tapes that haven’t been touched, so that any interested researcher/enthusiast with expertise can come and help convert them to more stable formats. Or perhaps the museum could provide disk image copies directly, and the researchers/enthusiasts could then try to “hack” the files and see what treasures lie hidden within them. I think it’s something worth considering when facing reduced budgets, limited resources, and an increasing quantity of cultural knowledge that is waiting to be preserved.