It’s Not Always What It Seems

Week 3: Digital Forensics, Materiality, Fixity & is-ness

The inner workings of a computer have always been a bit of a mystery to me. I grew up when most people didn’t have a computer in the house, let alone several, and no one had heard of the Internet. While in school, I actually remember taking a computer class in which we were taught how Apple’s Graphical User Interface worked. Over the years, thankfully, I progressed beyond operating the mouse and learned how to use a variety of software applications. But the inner workings of the computer were still a bit vague – just visions of bits adding up to bytes and kilobytes and megabytes, and so on.


Kirschenbaum, in his book Mechanisms: New Media and the Forensic Imagination, recounts a story from his youth when he stopped saving files to a 5 ¼-inch disk and began saving them to the computer itself. The storage was hidden away, behind the hard plastic case of the computer. He explains that architecturally the personal computer didn’t really change, but the “psychological impact” of saving information to the computer, instead of a removable floppy disk, cannot be ignored. He explains that no longer labeling a disk or having one to carry home after class just felt different. The physical aspect of the digital work was taken out of his hands and was sitting concealed inside the computer. In other words, what happens in the storage device stays in the storage device – and if you’re like me, the details of it all weren’t something I necessarily needed to know.


According to Kirschenbaum, the storage mechanism of a computer is both a product and a process, and despite the work being created behind the scenes and hidden away as 1s and 0s, it is indeed physical in nature. It has a materiality to it. He goes on to describe in great detail the history of computing, disk arrays, tracks, sectors, read/write heads and hashing.


All of these hidden storage and transmission actions point to a very structured process that must consist of rules and standards in order for it all to work. However, Kirschenbaum refers to a forensic materiality, which “ . . . rests upon the principle of individualization, the idea that no two things in the physical world are ever exactly alike” (p. 10). He explains further that a computer is able to create an illusion of immaterial behavior, in which data can be identified without ambiguity, sent without anything being lost, and copied with no variance occurring. This illusion hides the fact that errors can occur anywhere along the line.
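This is where fixity checking comes in: if no two physical inscriptions are exactly alike, we need a way to verify that the logical content of a copy matches its source. A minimal sketch in Python, using a SHA-256 checksum as the fixity value (the sample bytes are invented for illustration, not drawn from any real collection), shows both halves of Kirschenbaum’s point – identical bit streams hash identically, while a single flipped bit, the kind of error that “can occur anywhere along the line,” is immediately exposed:

```python
import hashlib

def fixity_digest(data: bytes) -> str:
    """Return a SHA-256 checksum, a common fixity value in digital preservation."""
    return hashlib.sha256(data).hexdigest()

original = b"draft lyrics, version 1"          # hypothetical file contents
faithful_copy = bytes(original)                # a bit-perfect copy
corrupted_copy = bytearray(original)
corrupted_copy[0] ^= 0x01                      # flip a single bit "in transit"

# Identical bytes produce identical digests...
assert fixity_digest(original) == fixity_digest(faithful_copy)
# ...while even a one-bit error yields a completely different digest.
assert fixity_digest(original) != fixity_digest(bytes(corrupted_copy))
```

The digest can confirm that two copies are bit-identical, but notice what it cannot do: it cannot tell us which copy is the “authentic” one, which is exactly the archival question the readings leave open.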


These errors, whether a function of copying over old data, as in the case of the Jonathan Larson collection described by Reside, or of intentional tampering, as occurred with the Mystery_House.dsk game in Kirschenbaum’s book, could pass us by completely unnoticed. But through the use of a hex editor, these hidden artifacts come to light and provide us with additional forensic evidence and new insights. Reside’s article, for instance, shows how the hex editor’s recovery of deleted text makes Larson’s creative process visible.
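The reason a hex editor can surface “deleted” text is that deletion usually removes only the directory entry, not the bytes themselves, which linger until overwritten. A toy sketch of that idea, assuming an invented byte string as a stand-in for a disk image (this is not Reside’s actual workflow, just the same principle as the Unix `strings` tool), scans raw bytes for runs of printable characters:

```python
import re

def recover_text_runs(raw: bytes, min_len: int = 4):
    """Find runs of printable ASCII in raw bytes, as a hex editor or the
    Unix `strings` tool would surface deleted but not-yet-overwritten text."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.group().decode("ascii") for m in re.finditer(pattern, raw)]

# A toy 'disk image': live data interleaved with leftover deleted text.
disk_image = b"\x00\x00LIVE FILE\x00\xff\xffold draft lyric\x00\x00\x01"
print(recover_text_runs(disk_image))  # ['LIVE FILE', 'old draft lyric']
```

The “old draft lyric” here stands in for exactly the kind of residue Reside found in Larson’s files: content no ordinary file browser would show, but which the raw storage still preserves.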


These pieces of forensic evidence that get tucked away should make us question what in fact we might be copying – it’s not always what it seems. So, as a digital archivist, you have to ask: which versions do you keep? Or do you save all of them? Which version is the “authentic” or “authoritative” one? Or is that an impossible choice to make?

You’ve Come a Long Way, Baby

Hello to everyone, my name is Kerry Huller and I am in my second year of the Digital Curation and Archives program. I am getting close to finishing the degree and was excited to see a new class being offered in digital curation, which was also specific to the arts.

My undergraduate education was in photojournalism and over the years I have worked for various publications. Photography was strictly analog when I began, but digital tools slowly began to make an appearance. By the time I had my first permanent job as a newspaper photographer, everything about the process was digital. From these very early days in digital newspaper photography, circa 1998, it was clear that archiving the work was going to be a problem. The newspaper I worked for had made no plans for saving anything. Back then, only images that ran in the paper were being saved, and they were only getting backed up to the computer’s hard drive, until of course we began to run out of room. At that point, a disgruntled IT staff member elected to trash what little had been saved to eliminate the problem.

As the Smithsonian Interview Project points out, preserving photography and film, or any art that began as analog work, is on much more stable footing today. But the conservation of digital art beyond this typical analog-turned-digital variety is a very young field; in the Smithsonian article, Christine Frohnert estimates it is roughly 15 years old. All of the readings seem to stress that preservation just needs to start somewhere, anywhere. We need to work on it, experiment, and accept that it is an evolving process. As Catherine points out in her post, and as my own experience tells me, we will lose things. But if newspapers have learned not to store photographs only on a computer hard drive, and not to throw them away when that hard drive is full, then we have made progress.

It has been a messy path, but I do think collaboration is key to moving forward. Fino-Radin stresses communication with the artist, while the Smithsonian article and Rinehart and Ippolito’s book highlight working with a variety of stakeholders: artists, curators, conservators, archivists, programmers, and others. This leads me to a question that has come up constantly while I’ve been pursuing my MLS: how tech-savvy does an archivist need to be in today’s world? Yes, a large organization will have staff to fill these roles, but what about small institutions? What do you think is feasible for a staff that may consist of only a few people, or is perhaps a one-man show?