significance shifts

As we explore the more granular planning involved in digital art curation, we repeatedly encounter the idea that significance shifts. Whether it’s evolving re-interpretations of artworks in Re-Collection, the strange history of a video game platform in Racing the Beam, or the fluid readability and scope of Agrippa (as detailed in Mechanisms), it’s becoming clear that preservation over time involves multiple solutions in response to multiple meanings, use cases, and instances of any given artwork.

Why is who saving what, and how?

When it comes to preserving born-digital works, a lot of questions need to be raised, since there is no established consensus on which formal framework to use. There's the question of "who," involving the roles different people play in the lifetime of a work: the artist, the curator, the preservationist, and the consumer/audience. Next there's the "why": what makes this work worth saving at all? Then comes the "what": which components of the work do these groups decide to save, and what are we actually preserving about it? And finally there's the "how": putting a preservation plan into action.

The “who”: Creators, Curators, Conservators, and Consumers

First comes the artist, who creates the work.  The artist makes the initial creative decisions that make his/her work unique, whether intentionally or incidentally. Next comes the curator, who decides that the work is worth collecting and exhibiting and defends the work’s significance.  After that is the preservationist or conservator, who determines what to preserve and how.  Finally there is the audience/consumer and their role in supporting the work.

What makes born-digital works so complex is that the roles of these groups often bleed into one another: the artist creates an interactive work that gives the consumer a sense of authorship by making unique decisions that affect the work; conservators now ask artists for statements of intent to hear what they consider significant about the work; and fans of a work can prove crucial in providing the emulation software necessary for preserving it.

Furthermore, as Dappert and Farquhar insist, different stakeholders place their own constraints on a work. For instance, Chelcie Rowell discusses how Australian artist Norie Neumark used the software Macromedia Director for her 1997 work Shock in the Ear. The audience who experienced it originally had to load a CD-ROM into a Mac or Windows computer. The preservationists chose emulation as the best method for saving works like this one, and the emulators themselves were created by nostalgic enthusiasts. So each of the people involved placed constraints on the original work, in terms of hardware, software, and usage, and those constraints changed between creation and preservation. Dianne Dietrich concludes with this about digital preservation:

“As more people get involved in this space, there’s a greater awareness of not only the technical, but social and historical implications for this kind of work. Ultimately, there’s so much potential for synergy here. It’s a really great time to be working in this space.”

For this reason, it is becoming more important than ever to document who is doing what with the work, increasing accountability and responsibility. Which leads to…

The “why”: Preservation Intent Statements

As Webb, Pearson, and Koerbin argue, before we make any attempt to preserve a work we need to answer the "why." Their decision to write Preservation Intent Statements is a means of accomplishing this. For, as Webb et al. say, "[w]ithout it, we are left floundering between assumptions that every characteristic of every digital item has to be maintained forever."

And nobody has the time or resources to save every characteristic of every digital item. At least I don't. Trying to do so would be impossible, and even undesirable for certain works, where the original hardware and software become too costly to maintain.

This leads to a discussion of authenticity. As Espenschied points out in regard to preserving GeoCities, increased authenticity comes with a lower level of access, while a low barrier to access comes with a lower level of authenticity and a higher degree of lossiness. In the case of GeoCities, Espenschied says,

“While restoration work must be done on the right end of the scale to provide a very authentic re-creation of the web’s past, it is just as important to work on every point of the scale in between to allow the broadest possible audience to experience the most authentic re-enactment of Geocities that is comfortable for consumption on many levels of expertise and interest.”

And that gets at the heart of why we should bother to create Preservation Intent Statements before implementing any actual preservation actions. We need to establish the "bigger picture," the long-term vision of a particular work's value. Rowell also points out that there are different kinds of authenticity: forensic, archival, and cultural. Forensic and archival authenticity deal with ensuring that the object preserved is what it claims to be (if you've read Matt Kirschenbaum's book Mechanisms, you know this can be harder to achieve than you might think). Cultural authenticity is a much more complex issue: how do we respect the original context of the work while still ensuring a wide level of access?

And once we have decided on the best strategy, we then get into…

The “what” and the “how”: Significant Properties and Characteristics

Now that we’ve established the “bigger picture,” we get into the details of exactly how to capture the work for preservation. This is where Dappert and Farquhar come back in; they get quite technical about the difference between “significant properties” and “significant characteristics.” Their definition of significant characteristics goes like this:

“Requirements in a specific context, represented as constraints, expressing a combination of characteristics of preservation objects or environments that must be preserved or attained in order to ensure the continued accessibility, usability, and meaning of preservation objects, and their capacity to be accepted as evidence of what they purport to record.”

Sounds confusing, right? The way I understood it is that properties can be thought of like properties or attributes in code. In coding, a property is simply a named attribute of whatever we are building, whether a website, a game, or something else. Similarly, for a digital work, the property itself is abstract, like “fileSize” or “isVirusScanned.” We aren’t trying to preserve those properties on their own; rather, it is the pairing of a property with its value (like “fileSize = 1MB”) that we want to capture, and that pairing is what a characteristic of the work is. You wouldn’t save a property without its value, nor would you save a value without attaching it to a property. Significant characteristics go beyond the basic forensic/archival description of the object by capturing the context surrounding it. Thus, significant characteristics can evolve and change beyond the original work as the preservation environment changes and as different courses of action are taken, and all of those changes should be documented along the way through these significant characteristics, prioritized and listed in order of importance.
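To make the property/characteristic distinction concrete, here is a minimal sketch in Python. The class name, fields, and example values are my own illustration, not Dappert and Farquhar's actual schema: each characteristic pairs an abstract property with a concrete value and a priority, and the whole list is documented in order of importance.

```python
from dataclasses import dataclass

@dataclass
class Characteristic:
    prop: str      # the abstract property, e.g. "fileSize"
    value: str     # the concrete value, e.g. "1MB"
    priority: int  # lower number = more significant

# Hypothetical characteristics for a CD-ROM artwork; the values are illustrative only.
characteristics = [
    Characteristic("originalPlatform", "Macromedia Director on Mac OS", 1),
    Characteristic("fileSize", "1MB", 2),
    Characteristic("isVirusScanned", "true", 3),
]

# Document the property=value pairs in order of importance.
for c in sorted(characteristics, key=lambda c: c.priority):
    print(f"{c.prop} = {c.value} (priority {c.priority})")
```

The point of the sketch is simply that what gets recorded is never the bare property name but the pairing, ranked so that later preservation decisions know which characteristics can be sacrificed first.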

The last question that remains is… is anyone else’s mind boggled by all this?

The Death of the Significant Property: A Tragedy in Three Acts

Act I: What’s this?

In a defiant act of rebranding, Dappert and Farquhar shed the idea of “significant properties” entirely. Concentrating on the critical and definable elements of each preservation object, they aim to “focus their attention on preserving the most significant characteristics of the content, even at the cost of sacrificing less important ones.”

They achieve this scientific precision by introducing a standardized language and a workflow model, which allows anyone working in the field to relate to the process.

By focusing on the “significant characteristics” of each “entity,” Dappert and Farquhar train their assessment more closely on what each individual item is, its format and platform, and then on balancing the actions required to preserve it. Their preservation model accounts for changes in the optimal preservation format by building in feedback loops and a circular flow of preservation information.

In this way, the evanescent significant properties are discarded for characteristics and relationships that determine the most valuable aspects of the preservation object and the environment required to understand it.

Act II: Why this?

Coming at the matter from a different direction, the National Library of Australia chose instead to focus on the community it serves and that community’s evolving nature and needs, dedicating its efforts to ensuring access to the chosen materials in perpetuity. As Webb, Pearson, and Koerbin describe it,

“Like most things to do with managing digital collections, effective ways of making preservation decisions are evolving. … we (the digital preservation community) have no settled, agreed procedures for the full range of digital preservation challenges, nor even tentative plans we confidently expect to ensure adequate preservation for the next 100, 200 or 500 years.”

Because file types and platforms are such nebulous commodities, Webb, Pearson, and Koerbin explain that their institution’s previous efforts with one-size-fits-all definitions of significant properties and policies fell short of a feasible working strategy.

Webb et al. go on to explain how the National Library of Australia redefined its work strategy, as described above, by focusing on why something is important: why a user would want it. This re-envisioning led to redefining “significant properties” not as the building blocks of the preservation itself, but as the tent poles of the document that justifies the need for preservation.

Act III: Lazarus?

Despite the effort to rid the profession of a seemingly outdated idea, I would argue that the work of the NLA and of Dappert and Farquhar fits together magnificently, but in that order. It is enough to simplify the definition of the “significant property” to the aspect of the entity that makes it worthy of preservation and that initiates the statement of intent to preserve; then, once the priorities of the designated community have been established, the model developed by Dappert and Farquhar comes into play. Take the example of the Preservation & Access Framework for Digital Art Objects (PAFDAO) and its preservation of Shock in the Ear (1997) and other works developed in the same software, Macromedia Director: preserving the art depends on knowing the ins and outs of the program in which it was developed, a process that is much easier when conducted at scale. By this I mean that when a collection such as PAFDAO’s holds dozens of early works of multimedia digital art all built from the same software, understanding those ins and outs becomes easier, because the preservationists develop a sense for what is a quirk of the software, a facet of the art, or a fatal flaw in a particular file. These actions are the work of Dappert and Farquhar’s model; moving backwards to Webb et al. is the drive to determine why these works are valuable to the community, and even further back lies the underlying definition of the most significant of properties: that this is art, and every distinction going forward will be informed by that attribution.

Introduction

Hi, I am James H., a graduate MLS student at the University of Maryland. I am currently in my last semester and expect to graduate in the spring. I joined the class a little late due to a scheduling error, so I am submitting this a bit later than everyone else. I am interested in this course because I believe that digital technology is the future of information access and management, and that understanding digital curation will be very important for future archivists.

platforms and constraints

In Racing the Beam: The Atari Video Computer System (The MIT Press, 2009), Ian Bogost and Nick Montfort introduce platform studies as an approach to studying games and other digital media, tracing the history of the Atari VCS home video game console as a case study. Here’s how they define platform:

“Whatever the programmer takes for granted when developing, and whatever, from another side, the user is required to have working in order to use particular software, is the platform” (p. 2-3).

Platforms shape the actions of their users, which can cut two ways. The Atari VCS’ many limitations sparked creativity in game design, but the assumptions hidden in other platforms could have malign consequences. When preserving platforms and platform-dependent art, we’ll need to consider how best to make these influences explicit.

creativity from constraints

In order to preserve executable and/or reusable versions of software and digital artworks, we’ll need to document how constraints in platforms shape creative decisions. In Racing the Beam, a member of the design team for the Atari game Star Wars: The Empire Strikes Back recalls, “We prioritized the game elements we wanted to include and watched the list diminish as the available cartridge space was used up” (p. 128). This is one of many instances in which designers and gamers maximized what they could do within the Atari VCS’s limitations.

Bogost and Montfort write, “Technical innovations are often understood as the creation of new technology–new materials, new chip designs, new algorithms. But technical innovation can also mean using existing technical constraints in new ways, something that produces interesting results when combined with creative goals” (p. 53). The limits of preservation, such as our inability to completely document or perfectly save an old piece of software, offer their own set of restrictions. Preservation-related constraints can be detrimental to faithful reproduction, but they also free artists and curators to reinterpret the works, with “interesting results.”

Specific documentation and interoperable data might be the dream combination enabling Ippolito and Rinehart’s gross but effective concept of a “mother copy” (Re-Collection: Art, New Media, and Social Memory, Cambridge, MA: The MIT Press, 2014, p. 24). But where description and documentation inevitably fail could be where reuse really takes off.

hidden assumptions

While the restrictiveness of platforms can be good for creativity, hidden constraints and assumptions aren’t always beneficial. Game designers working with the Atari VCS seemed extremely knowledgeable about the limits governing their work, but that might not be true for artists working with software and hardware today. Platform studies suggests that we should continually interrogate the tools and systems we use, even as we build upon them.

I’m reminded of an article about “library as infrastructure” in which Shannon Mattern highlights problems with a popular “library as open platform” metaphor. Infrastructure is embedded, complicated, old and dirty, and comes freighted with history. Open platforms are ostensibly about encouraging anyone to remix library collections and metadata (for example) but can obscure the values on which the platforms run. While Mattern argues that infrastructure is closer to the reality of libraries than the open platform, her “infrastructure” is akin to “platforms” as framed in Racing the Beam.

As a complement to Bogost and Montfort’s observations about technological innovation, Mark Matienzo’s keynote for the 2015 LITA Forum wraps up a lot of key issues in building new technology upon old platforms. He questions how innovative or revolutionary a technology — such as linked data created from old classification systems — can actually be, so long as participation and “the power to name” are distributed as before. My first reaction to the talk was, “Read this if you are human and work with information.” But these concerns are especially important for us since preserving platforms and their products means documenting creative cultures and relying upon members of those cultures in the documentation process.

Matienzo might find common ground with glitch artist Scott Fitzgerald, who says:

In glitch art, more so than a lot of other art forms, I am a really big proponent of the idea that the process is more important. Part of the process is empowering people to understand the tools, understand the underlying structures, like what’s going on inside of a computer. So, as soon as you understand a system enough to know why you’re breaking it, then you have a better understanding of what the tool is built for.

hidden histories go on and on

As Matt Kirschenbaum, Mattern and Matienzo (and many, many others) suggest, we can push, break apart, and interrogate platforms by delving into the social and political histories of the hardware and software. We’ll probably find that the hidden histories just keep going.

For example, Bogost and Montfort mention that the Atari VCS used a processor and other hardware manufactured by Fairchild Semiconductor. I happen to have read a paper by digital media scholar Lisa Nakamura tracing the history of Navajo women’s involvement in the manufacture of these parts at Fairchild’s plant in Shiprock, N.M. “Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture” focuses on convoluted representations of these workers in Fairchild’s corporate archives, and the conspicuous absence of their actual testimonies in the archival record. This chance connection is based on my having just one other context for the name Fairchild Semiconductor, but it reinforces that platform studies are inseparable from studies of gender, labor, race, and class.

from platform studies to preservation strategy

All of this suggests that preserving digital art is a continuous process of investigation forward and backward in time from the moment of a digital object’s creation — if a singular moment can be identified at all.

Arms and Fleischhauer (2005) make two especially helpful contributions to how we might translate platform studies into preservation strategy. First, they conceive of digital formats as tied to the stages of a digital project’s life cycle (creation, preservation, and use). They call for archivists to investigate the full range of formats used and the relationships between content stored in each. Second, they enumerate specific sustainability and quality/functionality factors for promoting the longevity of digital formats. Each factor could serve as a way for archivists to enter into conversation with creators and users of digital media platforms, from whom we seek help.