Hi all! I’m Eric, and I’m in my second and (knock on wood) final year in the MLS program, focusing primarily on archives and special collections. Like many of you, my background is not in libraries; I got my undergrad in English and then worked for several years in international education and development. I’m currently working on a metadata-focused project in the University Archives at UMBC and a digitization project at UMD’s Special Collections in Performing Arts. My interests in this field are broad, but I ideally hope to work with arts-related materials (music or film in particular), and I’m interested in the digital humanities in general, so I was excited to hear about this course.
This week’s readings were a great overview of the challenges involved in digital and new media arts preservation. Re-Collection in particular raised intriguing questions in each chapter. I appreciated that the readings ran the gamut from detailing current preservation problems, to discussing the issues on a more theoretical level, to reviewing individual case studies and efforts to create more standardized guidelines for preservationists. Anchoring the big theoretical questions in specific real-world examples and institutions helped connect the readings to the work I’m doing now, and to the materials and institutions I hope to be involved with in the future.
The case studies in Re-Collection and the Fino-Radin article, however, also show how dauntingly complex these issues become once you get down to the details of a specific work. While I’m interested in the possibilities of emulation, the sheer number of technical formats makes it seem inevitable that many of these works will be impossible to save in their original form, so I was particularly inspired by the reinterpretation section of Ippolito and Rinehart’s four preservation techniques.
I found the concepts for new media metadata formats intriguing, in particular Rinehart’s proposal for a “score” for new media artworks. The comparisons to musical notation (which also came up in the Smithsonian survey interviews), as well as the links to preservation issues for other performance-based art forms like dance and theater, are great examples of how reinterpretation can work as a method of preservation. I find this avenue particularly exciting because it offers not just an opportunity to better preserve the work for future generations, but also a way to encourage new art, both now and in the future, by making these works available for use and reinterpretation by other artists. The potential for cultural institutions to facilitate this dynamic opens up possibilities for connecting libraries, archives, and museums to communities of artists and fans in a richer, more collaborative way. Looking forward to exploring all this more during the course!
Eric – like you, I also enjoyed the comparisons of new media artworks to musical scores and performances. I think it highlights the two integral parts of a new framework for preserving new media artworks: the artworks themselves and the technologies used to preserve them. Both of these aspects can be perceived in continuously changing ways. For instance, viewing the artwork as a comprehensive performance encompasses the individual instances in which the work was “performed” or viewed, the conceptual motives driving the artwork, and the physical media used for the work. Looking at the metadata as a musical score means thinking through the various aspects of the work and leaving behind the best possible blueprint for recreating or reinterpreting it. Additionally, preserving the various technologies used for that media, along with the context surrounding their use, can be a complex process – not just keeping the hardware in storage, but using the different methods suggested in the readings, such as emulation, migration, and reinterpretation. This is why I think framing the works as performances encourages us to “meet the work on its own terms” (as the Smithsonian interviews suggest) when it comes to preservation.
Perhaps I’m too optimistic about what scripters/coders have the time and ability to do, but I think emulation stands a better chance than one might expect, even given the sheer number of scripts and technologies involved (and there are a lot)! Thinking back to Koebler’s article from last week: Mozilla and Google were working to make sure that Flash files could still be played in browsers going forward, effectively emulating the SWFs using JavaScript and HTML5. As a gamer, I find it interesting to talk about emulation as a way of preserving things; that comment in the book was what really got me thinking about the functionality of the things we curate.
I think the metadata section pairs nicely here. While getting artists to use such formats might be challenging at first, certain incentives could push creators toward collecting this information. For instance, SEO (search engine optimization) is certainly on the minds of people who work on the internet, and if Google were better able to “read” such metadata (and I think it would be), then people would be more likely to adopt these systems. And if people use metadata like a score, that would hopefully lead to easier emulation down the line: specific technologies change, but there would still be some underlying scripts that remain constant.
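To make the SEO connection concrete, here’s a rough sketch (entirely my own invention, not something from the readings) of what machine-readable “score”-style metadata for an artwork might look like. It uses schema.org’s real VisualArtwork vocabulary, serialized as JSON-LD, which is the structured-data format search engine crawlers already parse; the specific artwork and field values are hypothetical.

```python
import json

# A hypothetical "score" for an imaginary net artwork, expressed with
# schema.org's VisualArtwork vocabulary. Embedded in a web page as JSON-LD,
# this is the kind of structured metadata search engines can index --
# the SEO incentive that might nudge creators toward documenting their work.
artwork_score = {
    "@context": "https://schema.org",
    "@type": "VisualArtwork",
    "name": "Example Net Artwork",          # invented title
    "artform": "Net art",
    "artMedium": "HTML, JavaScript",        # the technologies the work depends on
    "dateCreated": "1998",
}

# Serialize for embedding in a <script type="application/ld+json"> tag.
print(json.dumps(artwork_score, indent=2))
```

A fuller “score” in Rinehart’s sense would go beyond schema.org’s fields to record intended behaviors, dependencies, and acceptable substitutions, but even this minimal layer gives both crawlers and future preservationists something structured to work from.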