We All Digress with Digital Artworks

Imagine holding an exhibition, say five years from now, of digital artworks that we have carefully collected. What would you put on the artworks' labels?

[Image: artwork label]

I ask because these questions help me reflect on some of the points we collectively mused on during our class meetings this semester, as we digressed from our readings. My reflection covers these points precisely because they remain formative, and I would like to hear your thoughts on them.

Cultural Institutions 2.0

The first point has to do with three types of cultural institutions and their conventional and changing practices. Museums, libraries, and archives—the institutions most likely to be in charge of curating digital artworks—have traditionally valued their collections differently. As Rinehart and Ippolito explain in Re-Collection, the difference can be generalized as follows. Museums were mostly in charge of unique objects and took care of conservation as needed. Libraries and archives were traditionally in charge of documents. Generally, libraries held mass-produced resources and circulated them accordingly. Archives, in addition to published works, held unpublished resources and rare items, often associated with published authors, and rarely circulated their holdings.

To a certain extent, the distinction among the three types of cultural institutions seems to be reflected in some of the theoretical frameworks we have encountered. For instance, the museum and archival tradition of valuing the uniqueness of an object can be seen in the approaches of Kirschenbaum (media archaeology), Arcangel (the Warhol files), and Reside (Jonathan Larson's Word files). The way libraries circulate copies of published works can be seen in Rinehart and Ippolito's approach (dissemination/reinterpretation).

But here is the rub: the Library of Congress now has a Flickr collection. Because, let's face it, technically speaking, uploading pictures to the Internet is a form of publishing and mass production. No doubt such an image corpus would be helpful. In the short run, people can learn about the usage of an online publishing platform. In the long run, the corpus can be a window into 21st-century ways of life.

[Image: Library of Congress Flickr collection]

Other readings further suggest how the conventional cultural institutions' delegation model is blurring. Historical research on hardware, software, and file formats—such as that of Montfort & Bogost (platform studies), Ball (CAD), Manovich (Photoshop), and Eppink (GIF)—demonstrates how artworks that utilize a digital device at any stage of their production are essentially codependent on the nature of those devices.

To be fair, the material conditions of artworks had an influence long before the digital age. What is new, perhaps, is that we are now aware 1) that digital production devices are historically located and 2) that the pace of their obsolescence is fast and furious. That means, without a preservation effort, we would lose access to digital artworks: access both to the artwork itself and to its historical significance.

An interesting mash-up approach, perhaps, is MITH's classic computer collection, which showcases six classic computers: the KAYPRO 4, Apple IIe, Macintosh SE, Vectrex, Amiga, and Macintosh IIci. What is unique is how, on the Vectrex (a standalone video game console with a vector-based display), sit instructions and books that help us understand how the machine works and how sensational the Vectrex was when it first appeared. When we play cartridge games on the machine, moreover, we are reminded of the former owner's unbeatable high score (even as we inadvertently add our own marks to the machine's storage device, altering the object). It is such an assemblage of library items and archival/museum objects that enables me to interact with and appreciate a classic computer.

Yes, the alteration of the object may make archivists and museum folks wince and squirm, but, hey, emulation offers a way to spare the rare object from deterioration. Yes, the question of scope is always there, but can we think big for a moment and envision what the best practice might look like?

The biggest takeaway for me from this musing is this: we should not let our thinking be limited by conventional institutional practices, and we should seek cross-institutional collaboration as necessary to arrive at the best curatorial approach to digital artworks. Collaboration may come in the form of human resources and/or modes of operation.

Cultivating Digerati via Digital Artworks

The second point concerns the educational roles cultural institutions can play as they curate digital artworks. I am particularly interested in the idea of a curatorial show-and-tell exhibition of digital artworks. Going back to my prompt about the artwork label, I think the kinds of information we assembled for an archival information package should go into the label.

Why? My answer is twofold: 1) to showcase what kind of documentation is needed for preserving digital artworks, and 2) to make the artworks' invisible technological aspects visible.

The former would be an investment in future collaboration with artists, prompting them to consider how to compile documentation for the longevity of their work (should the artists agree to the work's sustainability). The latter has to do with promoting information literacy among audiences via digital artworks. Time and again, we've been reminded that the most successful technology is invisible. But as media studies scholars such as Elizabeth Losh strive to show, such invisibility can lend a hand to abuse. Sharing the archival information package with the audience can be an antidote. Moreover, public education, as Rinehart and Ippolito show, is in harmony with the traditional role of cultural institutions. Besides, I trust the hard work of archivists can be beneficial when exposed to the public eye.

What do you all think?

MAIP for Twitch, an Invitation to Edit

[Image: MAIP for Twitch]

NOTES ON THE REPOSITORY
The Model Archival Information Package (MAIP) that I composed for Twitch—a series of minimal one-button games—is currently hosted in a GitHub repository. I chose GitHub in accordance with the Open Source spirit of Processing, the programming language and surrounding community that bring digital artworks such as Twitch to life.

I hope that a MAIP like this will help enrich a collection such as "The Art of Programming" exhibition at the Computer History Museum. Currently, the exhibition features two programmers: Don Knuth and Jamie Zawinski. While I trust the exhibition will become more populated eventually, when I compare it to other exhibitions such as "Memory and Storage," with its 27 items and granular contextual information, I cannot help but wonder whether artistic endeavors with computers are prone to neglect. As the subtitle of "The Art of Programming" exhibition is "A Programming Language for Everyone," showcasing the evolution of Processing may be a perfect fit. Moreover, Processing could add breadth to the museum's collection by showing how some aspects of computing have little to do with military and financial interests.

WHY PRESERVE TWITCH
Twitch epitomizes the evolution of Processing. Preserving Twitch therefore allows the game to serve as a gateway to the history of Processing. Moreover, its preservation can be a pilot project for the growing number of software artworks created with Processing.

Diverse User Base
The latter rationale is especially important, given the immediate stakeholders of Processing. According to Casey Reas, co-creator of Processing and creator of Twitch, the creative community is the primary audience of Processing. Reas describes the motivation behind the invention as follows:

It’s not very common for artists and designers to be the primary authors of programming environment, but this is changing. I hope Processing has helped to demonstrate that we don’t need to rely only on what software companies market to us and what engineers think we need. As a creative community, we can create our own tools for our specific needs and desires.

As a matter of fact, within seven years of Reas and Ben Fry releasing Processing under an open-source license, the developer community had built 70+ libraries. Processing users work in fields including K-12 and higher education, the music industry, journal publishing, and the design and art industries. As a result, a language initially developed to teach computational graphic design literacy can now handle audio, electronics, and animation.

History of Programming in the Field of Art & Design
Concerning the former, Twitch embodies principles and an evolution of ideas that originate from the Massachusetts Institute of Technology (MIT). As Muriel Cooper, founder of MIT's Visible Language Workshop (a precursor to the Media Lab), wrote in a 1980 letter, the principle of Processing, too, revolves around having "the content, quality and technology of communication" inform "each other in education, professional and research programs." John Maeda, who was drawn to Cooper's vision and pursued an art degree after his MIT engineering training, later came back to work at the Media Lab with graduate students including Casey Reas and Ben Fry. In "When a Good Idea Works," published in the MIT Technology Review in 2009, Maeda connects the dots of how Reas and Fry's Processing came to be:

The starting point for their project was something that I can take credit for: the Design by Numbers (DBN) framework for teaching programming to artists and designers. I originally wrote DBN in the 1990s, but I couldn't get it to yield production-quality work. My graduate student Tom White made it into something that was much more functional. And then Fry and Reas took a crack at it. DBN limited users to drawing in a 100-by-100-pixel space, and only in grayscale—faithful to my Bauhaus-style approach to computational expression. But Fry and Reas figured that people needed color. They needed canvases larger than 100 by 100. They realized that this wasn't in line with my interests, so they went off and made their own system that gave users no restrictions at all.


In one sense, Processing arose as a response to practical problems. When Java first came out, it offered minimal support for sophisticated graphics processing. Drawing a line and stuff like that was possible, of course. But it couldn't do transparency or 3-D, and you were almost guaranteed to see something different on a Windows computer and a Mac; it was incredibly cumbersome to do anything that was both sophisticated and cross-platform. So Fry, who grew up hacking low-level graphics code as a kind of hobby, built from scratch a rendering engine that could make a graphically rendered scene appear the same in a Windows or a Mac environment. It was not just any renderer—it borrowed the best elements of Postscript, OpenGL, and ideas cultivated at the MIT Media Lab in the late Muriel Cooper's Visible Language Workshop.

What started at MIT did not stop there. It is fair to say that the Open Source spirit of Processing helped it gain popularity among the developer community. Such support has proven crucial as the programming environment continues to change. John Resig, the original author of jQuery, for instance, developed Processing.js to enable better implementation of Processing's visualization and animation on the web. Written in JavaScript, Processing.js converts Processing code written in Java and uses HTML5's <canvas> element to render images. As an antidote to the ailments of now-nearly-obsolete in-browser Java, this adaptation strategy was evolutionary, and Resig's work was highly praised in the developer community. As one comment on Reddit has it: "This is ridiculously well done. The simplicity of some of the example is fairly stunning."
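To make the mechanics concrete, here is a minimal sketch of how a Processing.js page of this era is typically wired together, modeled on the convention (used by Resig's example pages and by the init.js helper in my MAIP below) of a script tag typed application/processing that targets a canvas by id. The file names and the tiny Processing snippet are my own illustration, not Twitch's actual source:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- processing.js converts Processing (Java-syntax) code to JavaScript
         and renders it via HTML5 <canvas>; init.js scans the page for
         Processing script tags and starts the engine for each target canvas. -->
    <script src="processing.js"></script>
    <script src="init.js"></script>
  </head>
  <body>
    <canvas id="sketch" width="200" height="200"></canvas>
    <!-- Processing code embedded in the page, aimed at the canvas above. -->
    <script type="application/processing" target="sketch">
      void setup() {
        size(200, 200);
      }
      void draw() {
        background(0);
        // A dot that tracks the mouse: the sort of elementary interaction
        // from which Twitch's one-button games are built.
        ellipse(mouseX, mouseY, 10, 10);
      }
    </script>
  </body>
</html>
```

The point to notice is that the artist's code stays in Processing's Java syntax; the conversion to JavaScript happens in the browser.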

Twitch is currently showcased as one of 1174 Google Chrome Experiments that utilize the functionality of modern web browsers. But Twitch is much more than a series of minimal one-button games we can play, and its rich historical context needs to be documented.

WHAT TO COLLECT OF TWITCH
As such, I propose to preserve the ecosystem of Twitch in order to document how an Open Source project thrives. My MAIP is composed of the following digital assets, in alphabetical order: codes, demo, hosting, and people.

[Image: Twitch GitHub repository]

  1. codes: This folder contains an instruction for reverse-engineering Twitch alongside its source code files. There is also a credits document that details the licensing and due acknowledgement for each source code file.
  2. demo: This folder provides a document with a link to a YouTube video demonstrating how Twitch works. I also have the screen-recording file on my local drive, but it was too big to upload to the GitHub repository.
  3. hosting: This folder has a document with hosting instructions for reverse-engineering Twitch.
  4. people: This folder contains five documents describing the figures who played direct roles in Twitch's fruition. Namely, again in alphabetical order: Ben Fry, Casey Reas, John Maeda, John Resig, and Muriel Cooper.

* Here is the zip file of what is hosted on the GitHub repository: MAIP-Twitch. Editorial suggestions are most welcome at my GitHub repository.

Preservation of Twitch Ecosystem

[Image: Twitch ecosystem diagram]

WHY PRESERVE TWITCH?
Twitch—a series of minimal one-button games—epitomizes the evolution of Processing. Preserving Twitch therefore allows the game to serve as a gateway to the history of Processing. Moreover, its preservation can be a pilot project for the growing number of software artworks that incorporate Processing.

The latter rationale is especially important, given the immediate stakeholders of Processing. According to Casey Reas, co-creator of Processing and creator of Twitch, the creative community is the primary audience of Processing. In a 2008 interview, Reas spoke about Processing as follows:

It’s not very common for artists and designers to be the primary authors of programming environment, but this is changing. I hope Processing has helped to demonstrate that we don’t need to rely only on what software companies market to us and what engineers think we need. As a creative community, we can create our own tools for our specific needs and desires.

As a matter of fact, within seven years of Reas and Ben Fry releasing Processing under an open-source license, the developer community had built 70+ libraries. Processing users work in fields including K-12 and higher education, the music industry, journal publishing, and the design and art industries. As a result, a language initially developed to teach computational graphic design literacy can now handle audio, electronics, and animation. (You can learn more about the diverse user base in this Vimeo recording of a Carnegie Mellon lecture delivered by Reas and Fry.)

WHAT TO COLLECT OF TWITCH?
Preserving the source code of Twitch, together with the historical narratives surrounding its production, would document how an Open Source project thrives and comes to be. Here is a preservation package for the Twitch ecosystem.

1. Source Codes: Twitch is composed of (at least) 13 files*—3 JavaScript files and 10 HTML files. They need to be stored in a folder titled "play." The following are the existing files that are accessible from any user's web browser.

[Image: Twitch source code structure]

* see the ACQUISITION PLAN below

  • init.js: Searches a page for every <script type="application/processing" target="canvasid"> tag and loads each script into the target canvas with the proper id. It smooths the process of adding Processing code to a page and starting the Processing.js engine.
  • processing.js: Build script for generating processing.js. The version used for Twitch is 1.4.8. Written by John Resig (http://ejohn.org/), MIT licensed (http://ejohn.org/blog/processingjs/).
  • windowScript.js: Sets the size and position of the game windows as they open in sequence.
  • window0.html through window9.html: Pattern examples written by Casey Reas and Ben Fry, saved at http://ejohn.org/apps/processing.js/examples/topics/pattern.html, and ported from Processing to Processing.js. Each hosts one game written by REAS (www.reas.com): TWITCH (window0), FLOW (window1), PERILOUS BELT (window2), SLENDER VINES (window3), BOOMING CANNON (window4), ELECTRIC PYRAMID (window5), SMUGGLING (window6), SLIPPERY STREAM (window7), BATTLE FRONT (window8), and EERIE LABYRINTH (window9).
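The one-line description of windowScript.js above is the only record of its behavior I have, so here is a hypothetical reconstruction of what such a script might look like: plain JavaScript that opens the game windows in sequence at explicit sizes and positions. The file names come from the list above; the dimensions and coordinates are invented for illustration:

```javascript
// Hypothetical reconstruction of a windowScript.js-style helper.
// Twitch opens each game in its own browser window; a script like this
// would control where and how large each window appears.
var gameWindows = [
  { page: "window1.html", x: 100, y: 100 },  // FLOW (positions invented)
  { page: "window2.html", x: 160, y: 160 },  // PERILOUS BELT
  { page: "window3.html", x: 220, y: 220 }   // SLENDER VINES
];

var current = 0;

// Open the next game window in the sequence, sized and positioned explicitly.
function openNextWindow() {
  if (current >= gameWindows.length) return;
  var w = gameWindows[current++];
  window.open(
    w.page,
    "twitch-game-" + current,
    "width=300,height=300,left=" + w.x + ",top=" + w.y
  );
}
```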

2. Hosting Environment: In order to allow users to interact with Twitch, the source files need to be rendered in a web browser, and the browser needs to support the HTML5 <canvas> element. As of April 2016, the browsers that satisfy this requirement are Chrome 4.0+, Internet Explorer 9.0+, Firefox 2.0+, Safari 3.1+, and Opera 9.0+.

[Image: browser support for the HTML5 canvas element]
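A standard feature test can confirm that a given hosting environment meets this requirement; this is a generic sketch, not part of Twitch's own code:

```javascript
// Standard feature test: a browser that supports the HTML5 canvas API
// exposes a getContext method on a freshly created <canvas> element.
function supportsCanvas() {
  var el = document.createElement("canvas");
  return !!(el.getContext && el.getContext("2d"));
}

if (!supportsCanvas()) {
  // Processing.js cannot render here; a preserved copy of Twitch would
  // need an emulated browser (e.g., via oldweb.today) instead.
  console.warn("HTML5 <canvas> not supported in this browser.");
}
```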

3. Creators & Contributors: There are at least five key figures who contributed to the fruition of Twitch. While many more developers, artists, and educators shape the Processing community, the following persons played the crucial roles.

  • Casey Reas: See my Statement of Significance
  • Ben Fry: See my Statement of Significance
  • John Resig: In 2010, Resig—the author of jQuery—developed Processing.js (a JavaScript port of Processing) to enable better implementation of visualization and animation. Written in JavaScript, Processing.js converts Processing code written in Java and uses HTML5's <canvas> element to render images. The idea of translating Java to JavaScript suggests a wonderful adaptation strategy for a changing programming environment. Owing to Resig's work, Processing users do not need to abandon Processing (which rests on now-nearly-obsolete in-browser Java). All they need to do is include Resig's JavaScript file "processing-1.0.0.min.js" alongside their .pde file, just as they have always done (see the sketch after this list).
  • John Maeda: See my Statement of Significance
  • Muriel Cooper: See my Statement of Significance
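Here is a minimal sketch of the workflow described in the Resig entry above, assuming Processing.js's data-processing-sources convention for loading an external .pde file; the sketch file name is illustrative:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Resig's library; "processing-1.0.0.min.js" is the file the post names. -->
    <script src="processing-1.0.0.min.js"></script>
  </head>
  <body>
    <!-- Processing.js fetches the unmodified Processing sketch (.pde),
         converts it to JavaScript, and renders it into this canvas. -->
    <canvas data-processing-sources="mySketch.pde"></canvas>
  </body>
</html>
```

The artist's .pde file stays exactly as it was written for the Processing IDE; only the thin HTML wrapper is new.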

ACQUISITION PLAN?
1. Source Codes: It is best to acquire the source code from Casey Reas. This approach would allow me to acquire the .pde source files. *Otherwise, a web browser's "View Page Source" function (such as the one Google Chrome offers) lets me look at most of the source code. With the latter method, I can reconstruct the initial Twitch page (window0.html) and the first game window (window1.html). However, this approach only allows me to inspect the source code generated in the web browser; the files listed above are not complete enough to proceed to the second and subsequent game windows.

2. Hosting Environments: Current common web browsers support the HTML5 <canvas> element. Should a future browser update result in a malfunction of Twitch, a web browser emulation project such as oldweb.today may be of use.

3. Documentation of Creators & Contributors: Documenting each individual's contribution requires some research. Reas and Fry's rationale behind the development of Processing has become part of the core philosophy of the Processing community and is thus well documented on official websites such as Processing.org and the Processing Foundation. So, too, are the anecdotes about and praise of John Resig. His blog post, interviews, and Reddit's reaction to Processing.js are some of the records that can be preserved. Additionally, Resig's role as a leading figure of Khan Academy's Computer Science program and how he utilizes Processing.js as part of his pedagogy are relevant; these records should assist us in understanding the climate of the Twitch ecosystem.

[Image: John Resig. Image Credit: https://twitter.com/jeresig]

Project Top-notch Twitch

Twitch is a series of minimal, one-button games using Processing.js. Twitch is a work of Casey Reas, a software artist who "writes software to explore conditional systems of art." Twitch is also part of the Chrome Experiments project, which leverages Google Chrome's browser functions.

Why should we preserve Twitch? The significance of Twitch is multifold, and I hope that by the time you finish reading this post, your face twitches in pleasant surprise at how much this minimalist game entails.

Twitch & an Alliance of Visionaries

Twitch is enabled by an alliance of visionaries. The following three examples showcase how a digital artwork comes to be.

1. Casey Reas, the creator of Twitch. Trained at the Massachusetts Institute of Technology in Media Arts and Sciences, Reas utilizes generative software to produce dynamic graphics. Reas published Twitch in 2009 as a demonstration of the most elementary Processing.js features, such as drawing a circle and allowing user interaction by a click of the mouse. Reas's other state-of-the-art exhibitions and commissioned works take place internationally; prominent installations include one at the Victoria and Albert Museum in London, UK. Most of Reas's works, however, are stored in either print or otherwise static formats. Venturing to find apt preservation methods for Reas's work in its native digital media is doubtless of interest to his fellow contemporary artists and cultural institutions alike. Starting with the preservation of Twitch can be a feasible step toward a prospective, larger preservation project for digital artworks.

[Image: Casey Reas. Image Credit: http://www.amazon.com/Casey-Reas/e/B001JRWKOC]

2. Ben Fry, the co-creator of Processing. Processing.js, which enables Twitch, is a JavaScript port of a programming language called Processing. Setting aside how Processing.js works for now, let me explain the historical significance of documenting the development of Processing. Processing was conceived by Ben Fry and the aforementioned Casey Reas in 2001, while they were both graduate students at the MIT Media Lab. The idea behind the invention was to promote software literacy within the field of visual art and visual literacy within the field of technology. Processing is open source, and its continuous growth has been supported by community participation and collaboration. Processing is inspired by early programming languages such as BASIC, a language remembered from a time when user-friendly computers encouraged programming (unlike our contemporary, complex, doomed-to-be-gibberish programming languages). The syntax of Processing has been widely adapted, as seen in the elementary yet powerful DIY wiring kit Arduino and in Khan Academy's free online computer science tutorials.

[Image: Ben Fry. Image Credit: http://www.abetterworldbydesign.com/2015/conference/past-presenters/]

3. Muriel Cooper, the founder of the Visible Language Workshop. The Visible Language Workshop at MIT is a predecessor of what later became known as the MIT Media Lab. Processing credits many of its foundational ideas to the Workshop. Cooper founded the Visible Language Workshop in 1973 and directed it until her death in 1994.

In addition to her work as a graphic designer, which revolutionized the branding of the MIT Press, Cooper's Visible Language Workshop produced gifted software designers, including information architect Lisa Strausfeld and John Maeda, the former president of the Rhode Island School of Design (who taught Ben Fry and Casey Reas). Cooper herself can be regarded as a pioneering graphic designer and educator who envisioned "the content, quality and technology of communication [to] inform each other in education, professional and research programs," as she notes in her 1980 letter. The preservation of Twitch can serve as the tip of the iceberg in documenting the history of digital technology and art over the last two decades.

[Image: Muriel Cooper. Image Credit: https://mitpress.mit.edu/blog/university-press-week-throwback-thursday-featuring-muriel-cooper]

Twitch & Evolution of Programming Languages

Now let me shift to how Processing.js has come to enable a game such as Twitch. Processing.js is written in JavaScript. When executed, Processing.js converts Processing code (which uses Java syntax) to JavaScript and uses HTML5's <canvas> element to render it. Why is this important?

1. From Java to JavaScript. As Processing.js takes care of the conversion, longtime users of Processing and audiences of Processing-based visual art do not need to worry about Java-related glitches. Java was indeed a ruling language of the web during the mid-1990s. Java's compatibility issues and loading times, however, soon cast a shadow on its popularity, and now few web developers deploy Java-based applications by choice. Processing.js, therefore, comes to the rescue. It allows the great number of Processing users to retain their method of production by simply including the "processing-1.0.0.min.js" script file alongside the .pde file, just as they used to do with the Processing IDE. Twitch, seen in this light, is an epitome of the strove-to-be-seamless transformation of digital environments, embracing and adapting to change.

[Image: Java vs. JavaScript. Image Credit: http://forum.feed-the-beast.com/threads/i-have-a-question.99218/]

2. The Age of the Web Browser. Moreover, Twitch demonstrates the contemporary norm of generating files on the client side, a common practice that facilitates dynamic interaction by utilizing browser functions. For example, HTML5's <canvas> element—what Processing.js uses to create graphics—is supported by the five major browsers: Chrome 4.0+, Internet Explorer 9.0+, Firefox 2.0+, Safari 3.1+, and Opera 9.0+. Twitch, as a result, can address interesting questions concerning preservation and browser version history. Starting with the preservation of Twitch, the project can further address the evolution of browsers and their shift from being heavily text-based to multimodal graphic and audio rendering.

[Image: Timeline of Web Browsers. Image Credit: https://en.wikipedia.org/wiki/Timeline_of_web_browsers]

Saving Software Ecosystem

Our cultural gut tells us we should start saving software…

You notice that software preservation is a relatively new endeavor when you encounter the somewhat intuitive rationales that push its agenda. Concerning the preservation of Planetary, a visual music player developed for the iPad, Sebastian Chan and Aaron Cope describe their motive in their 2013 "Collecting the Present: Digital Code and Collections" as follows:

The benefits accrued by the ability for software and hardware industries to frequently “shed their skin” and start anew still outweigh the costs, and that is the landscape in which museums will continue to try and preserve objects in for the foreseeable future. To that end we see the ability and freedom for third parties to play and experiment with—to become comfortable and familiar with—Planetary’s source code as integral to any efforts to recruit developers in our preservation aims. Will some of what we see be still-born or not in line with the museum’s thinking? Probably. Will they be worth it in the long run? We choose to believe so.

Matthew Kirschenbaum, in his 2013 "An Executable Past: The Case for a National Software Registry," published in "Preserving.exe," acknowledges a similar condition that currently surrounds software preservation:

For decades the Library of Congress has also been receiving computer games, and in 2006 the games became part of the collections at the Culpeper campus. But while the Library registers the copyrights, what it means to preserve and restore vintage computer games—or any kind of computer software—is less clear. As yet there is only the beginning of a national agenda for software preservation, and precious little in the way of public awareness of the issue.

So we think we should save software, but what exactly should we save?

Granted that we take heed of the pace of software obsolescence—and of how highly our cultural practices are shaped by daily interactions with software—it is not difficult to imagine varying understandings of software preservation. Questions that concern archivists include: To what end do we preserve software? What is the scope of software preservation? Chan and Cope consider the functionality of the software to be of utmost importance. Therefore, for Planetary, it was the features of the software—"the interaction design and experience of manipulating and affecting a dynamic three dimensional system using a touchable interface"—that needed to be saved and reconstructed.

For Kirschenbaum, it is the historical context in which a given piece of software was conceived that demands documentation. With the example of Microsoft Word 2.0, released in 1991, Kirschenbaum entertains how a hidden, mischievous "WordPerfect monster" feature alludes to the then-ongoing rivalry between competing word processing applications.

Henry Lowood, in his "The Lures of Software Preservation" (also published in "Preserving.exe"), considers an alternative preservation method: the verification of data files. Lowood's approach questions "screen essentialism" (preserving the look of software) and encourages preserving software integrity by using signatures such as hashes and checksums.
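To make Lowood's suggestion concrete, here is a small sketch of such integrity verification using Node.js's built-in crypto module; the file path and the expected hash are placeholders, standing in for values that an archival information package would record at acquisition time:

```javascript
// Verify a preserved file's integrity by comparing its SHA-256 hash
// against the hash recorded at acquisition time (Lowood's "signature").
const crypto = require("crypto");
const fs = require("fs");

function sha256OfFile(path) {
  const data = fs.readFileSync(path);
  return crypto.createHash("sha256").update(data).digest("hex");
}

// Placeholder values: in practice the expected hash would come from
// the archival information package compiled when the file was acquired.
const expected = "0000000000000000000000000000000000000000000000000000000000000000";
const actual = sha256OfFile("play/window0.html");

console.log(actual === expected
  ? "Integrity verified: file matches its recorded checksum."
  : "Checksum mismatch: the file has been altered or corrupted.");
```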

Is the thorough-er, the better?

Despite the different emphases on the significant properties of software, one thing is certain when considering software preservation: software needs to be preserved as a whole package. The package may include, as in the case of Planetary, the software's early versions, change logs, and bug reports. The suggestions made in Erick Kaltman et al.'s 2014 "A Unified Approach to Preserving Cultural Software Objects and Their Development Histories" go as far as to include paper prototypes, the type of IDE, email correspondence, and interpersonal relationships in the documentation related to the development of the academic gaming software Prom Week. Mind you, however: while thorough documentation and preservation of relevant files are essential to saving software, such action calls for professional judgment and managerial strategies. In the words of Chan and Cope:

Because digital works exist in an equally digital life-support system, or ecosystem, absent preserving the entire dependency chain for a single digital object museums need to be able to conceptualize and articulate a strategy for demonstrating some kind of tangible proof for those objects in their collection which lack the physicality typically associated with our collections.

Keep it open!

Another consensus shared by Chan and Cope and Kaltman et al. is the benefit of openness in preserving software. Chan and Cope followed the original open-source policy of Planetary and conclude its advantage for preservation as follows:

The choice to both enable and encourage derivative works of Planetary was, and continues to be, seen as fundamental to our efforts to preserve “Planetary the software system” as opposed to “Planetary the iPad application.” Because of all the complexities inherent in preserving hardware and software systems, we are hoping that third party developers will translate the design and intent of Planetary to other systems, for example Google’s Android mobile operating system or to the JavaScript and WebGL programming environments found in modern web browsers.

Kaltman et al., in a similar vein, write:

Most scientific progress is based on reproduction and extension of others’ work, making that work easier to access and more transparent would open up more research avenues and increase scientific output. […] The more open a platform is, the better a chance it has for migration to other systems, and the more open a development process, the easier it will be for future researchers and students to understand and further the work. (7)

The benefit of open source seems to resonate with the idealistic rationale of the library. I wonder: what are the potential drawbacks of this approach? If we use this rationale, do you think we can persuade stakeholders to support a growing number of software preservation projects? Especially when, as Chan and Cope note above, developing new software is often cheaper than tinkering with existing software? Do we envision something equivalent to "a law library" or "a business library" with collections of software?