An AIP for a Digital Deep Cut: Kutiman’s Off Grid
Currently, my AIP for Off Grid is 100% “make-believe,” so unfortunately there is nothing yet to download. Still, I will provide details regarding each folder series—web pages, videos, documentary materials, and working files—and its subseries. Every series includes a readme document that provides contextual information to users. (Excuse the lack of good normalized file names! This factor was simply overlooked, as I was only creating dummy files to populate the model AIP.)
Web Pages
This is where the sites preserved using Archive-It will be housed. The folder includes:
• The current instance of www.kutiman.com (dedicated to Off Grid)
• The YouTube page for Off Grid
• The YouTube pages for all of Off Grid’s 95 component videos
• Websites that embed, critique, or provide write-ups on Off Grid
As discussed in my statement of preservation intent, the YouTube pages will be crawled at regular intervals to show change over time. Kutiman’s personal website will only be captured once, since it is a temporary showcase for Off Grid. Of course, I’ll intermittently peek at the site just in case Kutiman develops it in an unexpected direction.
The fourth bullet point gave me a bit of difficulty. Originally I had placed these websites in the Documentary Materials folder because I figured that they helped a future user understand how Off Grid was received and spread throughout the internet. Yes, these sites do provide that function, but sticking them in a folder separate from the “official” Off Grid sites betrays too much focus on the individual author. I am interested in preserving Off Grid as a window into participatory internet culture, not just as a set of cool videos. For this reason, I feel that my series and subseries need to walk the walk, even if it means that the Web Pages folder may look slightly intimidating to a user at first blush. But, hey, that’s what readme files are for.
Videos
This folder contains the access copies of Off Grid and all of its component YouTube videos. Using ClipGrab, all the videos will be saved as MPEG4s, as the format is the standard for streaming media and looks to remain well supported. The original format of each video will be documented in its metadata (ClipGrab makes it easy to identify the original format), and the videos will be saved at the highest quality available. Metadata generated for these videos will be stored in a separate file. I went with a PBCore application profile since it is well suited to audiovisual materials, widely recommended, and familiar to me.
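To make that metadata plan a bit more concrete, here is a minimal sketch (in Python) of the kind of per-video record I have in mind. The element names reflect my reading of PBCore’s instantiation elements and would need to be checked against the actual schema and application profile; the title, identifier, and format values are placeholders, not anything pulled from a real ClipGrab download.

```python
# Minimal sketch of a PBCore-flavored record for one downloaded video.
# Element names are my reading of PBCore's instantiation elements and
# should be verified against the schema; all values are placeholders.
import xml.etree.ElementTree as ET

def make_video_record(title, identifier, original_format, saved_format, quality_note):
    root = ET.Element("pbcoreDescriptionDocument")
    ET.SubElement(root, "pbcoreIdentifier", source="local").text = identifier
    ET.SubElement(root, "pbcoreTitle").text = title

    # One instantiation for the access copy saved out of ClipGrab.
    inst = ET.SubElement(root, "pbcoreInstantiation")
    ET.SubElement(inst, "instantiationIdentifier", source="local").text = identifier + "_access"
    ET.SubElement(inst, "instantiationDigital").text = saved_format  # e.g. "video/mp4"
    ET.SubElement(inst, "instantiationGenerations").text = "Access copy"
    # Record what ClipGrab reported as the original container, since the
    # access copy may have been converted to MPEG4.
    ET.SubElement(inst, "instantiationAnnotation",
                  annotationType="originalFormat").text = original_format
    ET.SubElement(inst, "instantiationAnnotation",
                  annotationType="quality").text = quality_note
    return ET.ElementTree(root)

record = make_video_record(
    title="Off Grid component video (placeholder title)",
    identifier="offgrid_component_001",        # placeholder identifier scheme
    original_format="WebM",                    # as reported by ClipGrab
    saved_format="video/mp4",
    quality_note="Highest quality available at capture",
)
record.write("offgrid_component_001_pbcore.xml", encoding="utf-8", xml_declaration=True)
```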
Documentary Materials
This folder contains materials that help provide additional context to future users and, as such, holds the largest potential variety of formats. The folder includes:
• PDFs of interview questions and responses sent to creators of the component videos via email form (thanks to Alice for the tip! P.S. would this need to go through IRB?)
• “Making of” videos (likely stored as MPEG4s, though none currently exist so I can’t be 100% sure at this time)
The only lingering doubt I have with this folder relates to the earlier problem I noted regarding the fourth bullet point in the Web Pages folder. Materials that document Off Grid also tend to embed the work and spread it throughout the internet. This means that a website featuring an interview with Kutiman about the making of Off Grid could plausibly belong in both the Web Pages and Documentary Materials folders. I decided that the Documentary Materials folder would be best suited for static documents (non-web pages) that discuss the work. Most of these documents don’t exist yet and will likely be generated through the efforts of my institution, which is another way of distinguishing them from the captured web pages.
Working Files
The working files are something that can only be obtained through Kutiman and, as such, they aren’t here! I am under the impression that the files were created in Sony Vegas Pro, but this has yet to be confirmed. This folder series is thus a placeholder until more information can be obtained. Still, knowing the affordances of the platforms used to produce YouTube videos is important for understanding participatory internet culture.
Conclusion
Two of the big challenges in designing this AIP were my inability to contact Kutiman and the relatively small amount of buzz Off Grid has gotten online—especially when compared to Thru You (*raises fist toward sky* Eriiiiiiiiiiiiiiiiiic!!!!!!!). As I mentioned in my statement of significance, the work is essentially viral-proof with its long running time and “out there” music. Also, it was only released in February, so it simply hasn’t been out very long. However, I think that my planning for the AIP allows things to be added quite easily in the future, such as the working files and interviews.
The ARSC-Files: The Truth Is Out There
Upon reading the bold statement at the beginning of the Preservation Reformatting chapter of ARSC’s Guide to Audio Preservation, “A restored version of a sound recording cannot be considered a preservation copy,” I had a moment of kneejerk skepticism. While I think that anyone can appreciate the truthiness of that claim, it feels a little weird to so authoritatively sidestep an important concept: digitization always results in the creation of a new digital object.

A point made by Trevor Owens in a 2012 post on The Signal comes to mind:
“The idea of digitization obfuscates the fact that digitization is not a preservation act. Digitization is a creative act.”
Owens is arguing that a digital surrogate should not simply be considered a handy duplicate, because digitization tools (and the preservationists who use them) will always be making decisions about what they are capturing. Some of these decisions may seem harmless or minuscule, but they are still judgments that prioritize certain significant properties. Furthermore, decisions about which materials get digitized and which ones don’t reflect the same kinds of value judgments. Still, I can understand where ARSC is coming from. “Warts and all” preservation copies allow scholars to scrutinize the artifactual qualities of items, and they also have a way of—to use a tired phrase—bringing history to life.
Still, too much faith in these preservation copies can lead to problems. Sarah Werner illustrates this point in her discussion of the limitations of the digitizations on Early English Books Online. The digitizations were largely drawn from microfilms of early books, so certain unexpected details can be lost or misinterpreted. The digitized title page of an elegy mourning the death of Prince Henry, for instance, comes from a mistakenly reversed negative, because the person who processed the microfilm didn’t believe that the text was actually white on a black background.
Interestingly, FADGI’s Technical Guidelines for Digitizing Cultural Heritage Materials does not subscribe to ARSC’s relatively dogmatic principles regarding verisimilitude. Take, for instance, this snippet that discusses FADGI’s view on the adjustment of master image files:
“There is a common misconception that image files saved directly from a scanner or digital camera are pristine or unmolested in terms of the image processing… Because of this misconception, many people argue that you should not perform any post-scan or post-capture adjustments on image files because the image quality might be degraded. We disagree. The only time we would recommend saving unadjusted files is if they meet the exact tone and color reproduction, sharpness, and other image quality parameters that you require.”
FADGI’s reasoning is a mix of ideological concerns and practical thinking. It recognizes that advocacy for adjusting master images may cost it some blue ribbons in the future (“First Place – Most Authentic in Show”); however, it also feels that “adjusting master files to a common rendition provides significant benefits in terms of being able to batch process and treat all images in the same manner.” Furthermore, multiple copies (master/raw, production, access) might create prohibitive storage costs.
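To picture what “adjusting master files to a common rendition” buys you in practice, here is a minimal sketch using Pillow, assuming a folder of TIFF scans. The specific adjustment (auto-contrast) and the folder names are my own placeholders for illustration, not settings prescribed by the FADGI guidelines.

```python
# Minimal sketch of batch-adjusting scans to a common rendition.
# The adjustment (auto-contrast) and folder names are placeholder
# assumptions, not FADGI-prescribed parameters.
from pathlib import Path
from PIL import Image, ImageOps

SOURCE_DIR = Path("masters_raw")        # hypothetical folder of scanner output
OUTPUT_DIR = Path("masters_adjusted")   # hypothetical folder for adjusted masters
OUTPUT_DIR.mkdir(exist_ok=True)

for tiff_path in sorted(SOURCE_DIR.glob("*.tif")):
    with Image.open(tiff_path) as img:
        # Apply the same adjustment to every file so the whole batch shares
        # a common rendition and can be processed uniformly downstream.
        adjusted = ImageOps.autocontrast(img.convert("RGB"))
        adjusted.save(OUTPUT_DIR / tiff_path.name, format="TIFF")
```

The point is simply that a uniform treatment is easy to script; whether that uniformity is worth losing the raw capture is exactly the question FADGI and ARSC answer differently.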
In their eyes, post-capture adjustments will result in insignificant data loss, and raw files are often more trouble than they are worth. Might this be FADGI taking proactive steps to avoid creating more situations like the one Sarah Werner described: producing facsimiles of things that don’t actually exist?
This makes me wonder: what is the cause for these contrasting perspectives? Is it something to do with the different materials being preserved? ARSC’s definition of preservation reformatting might provide a clue: “the process of transferring the essence or intellectual content of an object to another medium.” I’m not 100% certain what they mean by “essence”; perhaps they are referring to the differences between artifactual and informational qualities. I also see this as a nod to a precept of psychoacoustics: perception and meaning are not bound to one another. Still, the very idea that someone is prioritizing either essence or intellectual content seems to undermine the authenticity of any preservation copy.
Also, ARSC does recognize that audio files aren’t necessarily “unmolested” during the transfer from analog to digital. In my current field study at UMD Digital Conversion and Media Reformatting, I am following ARSC guidelines to digitize reel-to-reel recordings, which requires me to adjust the azimuth of the reel-to-reel player before transferring each recording into Adobe Audition. During this process I am essentially adjusting the playback head based on what sounds right to my ear. The resulting preservation master can’t be called 100% raw. But I suppose that “do the best you can out there” doesn’t make a very strong opening statement for your preservation guidelines!

Ultimately, I wonder how different the philosophies of ARSC and FADGI are in practice. They look pretty different on the page in terms of tone, with ARSC coming across as a bit dogmatic and FADGI as a relatively cavalier pragmatist, but are they really so divergent? Is FADGI “throwing away” any more data by virtue of post-capture adjustments than ARSC is by prizing the preservation copy? I suppose not, if you believe that digitization is a creative act in the first place.
Close Enough for Jazz: A Statement of Preservation Intent
Kutiman’s Off Grid (2016) is a work that uniquely captures a particular moment—a very volatile and exciting moment, at that—in the development of participatory internet culture. This is a period in which artists are experimenting with innovative ways to create and distribute their digital works to an audience that is able to provide immediate feedback. Of course, saying that Off Grid is the ultimate work that defines all of participatory internet culture would be silly; trying to sum up the movement with one piece would ignore how various works cross-reference one another, as well as the blurred line between audience and creator. My goal is to use Off Grid as a window into this online culture, and I plan on eventually collecting similar works to show the various paths along which the culture developed.
In order to best capture a snapshot of this scene, the core materials to be gathered are:
- Web pages: Kutiman’s homepage, Off Grid’s YouTube page, the YouTube pages of the videos used by Kutiman
- The individual videos (Off Grid and the videos that compose it)
- Web pages that embed the video or provide write-ups
There are also a few things that I will attempt to gather, though I can’t be sure I will succeed:
- Interviews with Kutiman and the other video creators
- Kutiman’s working files
Now that I’ve laid out the broad plan, I will go into detail regarding the different materials.
Web pages

As Kutiman says of his work, “It’s all going to be on the internet. It’s from the internet, and that’s where it belongs. You can link, you can dig in it and see the other musicians, read comments or something.” He makes a good point that resonates with the spirit of this preservation effort: capturing the look and feel of www.kutiman.com and the various YouTube videos is a very high priority. For instance, a future user will benefit from being able to click on the annotations embedded in the Off Grid video on YouTube. Similarly, the experience of sifting through the collage of annotations on Kutiman’s website should be preserved. Half the fun is utilizing these discovery-facilitating features! A user may also be interested in checking out the comments, Related Videos, or the counts for thumbs up/down for context.
Archive-It is a well-supported web archiving service with a good amount of customizability via its curator controls. Recently it has received a good amount of support for crawl configurations that archive YouTube content. The YouTube pages for Off Grid and its component videos will be crawled every few months in order to capture updates. (Once this project broadens, the number of crawls may be reduced or perhaps limited to one capture.) I should note that Kutiman’s homepage will only need to be captured once, as Off Grid is currently the entirety of the page and it will be removed once it is time to show off his newest creation. I am sort of pretending that I work at an institution that already subscribes to Archive-It, but there are also alternative tools that can be obtained for free, such as youtube-dl.
Grabbing the YouTube pages for the component videos is important because it will provide a glimpse into the online presence of their creators. Who are these people throwing up videos of themselves playing the sitar/saxophone/hurdy-gurdy, etc.? What are the Related Videos on their pages? Did their pages explode with views thanks to Kutiman? Have people visited via Off Grid and left comments? Many of these questions can be answered partially by viewing the Video Statistics feature in YouTube.
Obtaining web pages that embed Off Grid or provide write-ups is also important for context: understanding how the work was received and how it spread throughout the internet. These sites will be identified the old-fashioned way: browsing the web for a representative sampling.
Videos
So, if I am already grabbing all the individual web pages, why do I also want the raw videos? The videos will function in a sense as access copies. Not every user will need to view the entire web page; some may be more interested in just studying the individual videos. Furthermore, Archive-It isn’t perfect, and I have found claims that it occasionally has trouble with videos. If necessary, the videos can be inserted into a mock-up of the web page, similar to how Rhizome inserted PNGs into the source code of Legendary Account pages due to improperly functioning Flash-based visuals. This would sacrifice some of the interactivity (clicking on the annotations), but at least the look of the site and the ability to read comments and such would remain.
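If that fallback ever becomes necessary, the patching itself is not complicated. Here is a minimal sketch, assuming the archived page uses a standard YouTube iframe embed and that BeautifulSoup is available; the file names and the local path to the access copy are placeholders of mine, not anything drawn from an actual capture.

```python
# Minimal sketch of swapping a YouTube iframe embed for a local access copy
# in an archived page, so the page still "plays" without the live embed.
# File names and the assumption of an iframe-style embed are mine.
from bs4 import BeautifulSoup

with open("offgrid_page_capture.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

for iframe in soup.find_all("iframe"):
    if "youtube.com" in iframe.get("src", ""):
        # Replace the embed with an HTML5 video element pointing at the
        # locally stored MPEG4 access copy. The clickable annotations are
        # lost, but the surrounding page (comments, layout) stays readable.
        video = soup.new_tag("video", src="videos/offgrid_access.mp4")
        video["controls"] = ""
        iframe.replace_with(video)

with open("offgrid_page_mockup.html", "w", encoding="utf-8") as f:
    f.write(str(soup))
```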
In order to get the raw videos from YouTube, a free YouTube downloader and converter will be used. ClipGrab is a reliable and easy-to-use tool that will work perfectly for this project. Thankfully, Kutiman has provided the links to all 96 videos, so there won’t be any resources spent divining the origins of his raw materials; one simply needs to paste the video’s address in ClipGrab and then select the format and quality. All videos will be saved as MPEG4s, as the format is the standard for streaming media and looks to remain well supported. ClipGrab also allows one to obtain a video in its original form (MPEG4, FLV, or WebM), bypassing any conversion. The original form of the video will be documented in its metadata. Videos will be saved at the highest level of quality available.
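For the youtube-dl route mentioned above, a minimal sketch of the same grab-at-best-quality approach might look like the following. The output folder, the placeholder URL, and the choice to keep each video in its original container (documenting it via the sidecar metadata file rather than converting on download) are my assumptions; ClipGrab remains the tool named in the plan.

```python
# Minimal sketch of grabbing a video and its metadata with youtube-dl.
# The output folder, placeholder URL, and decision to keep the original
# container (recorded in the .info.json sidecar) are assumptions of mine.
import youtube_dl

ydl_opts = {
    "format": "best",                        # highest quality single file available
    "outtmpl": "videos/%(title)s.%(ext)s",   # keep the original extension/container
    "writeinfojson": True,                   # sidecar JSON records format, uploader, etc.
}

component_urls = [
    "https://www.youtube.com/watch?v=PLACEHOLDER",  # placeholder, not a real Off Grid link
]

with youtube_dl.YoutubeDL(ydl_opts) as ydl:
    ydl.download(component_urls)
```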
I should note that I am not interested in contacting the various creators in the hopes of obtaining uncompressed copies of the original videos. The YouTube videos seen by the public—the same videos that inspired Kutiman—were compressed, so it is important to preserve that aspect.
Interviews and Working Files
I am, however, interested in obtaining interviews with Kutiman and the various uploaders so that we can have a record of their impressions. I have been unable to find any so far, but Off Grid was only released this February so I will maintain a lookout.
My statement of significance for Off Grid does minimize the importance of the creator’s intent, giving more focus to how the work embodies participatory internet culture, but there is undeniable value in being able to understand the tools used by artists. After all, each and every video that gets uploaded to YouTube gets tinkered with in some sort of video editing software (well, maybe not all of them…), so knowing the affordances of these platforms helps one achieve a deeper understanding of the culture—from the creator’s side of things, that is. It is difficult to know what software Kutiman used because there aren’t many interviews regarding Off Grid—especially ones that delve into the nuts and bolts of its construction. I believe that Kutiman used Sony Vegas Pro for Thru You (thanks to Eric for the tip), but I can’t be sure that he stuck with it for Off Grid. He also hasn’t responded to contact attempts, so no luck there.
Ultimately, I feel that getting hold of the working files is within the scope of this preservation effort, but it is a lower priority and will need to be explored further in the future. Even if Kutiman gave us copies of all his working files, shelling out the money for Sony Vegas Pro is a bit too much considering that it will (currently) only be used to view one set of files and access will be limited. Besides, there are several videos on YouTube itself that discuss how to make YouTube videos; why not corral a bunch of those together for another preservation project? I think that a suitable alternative for this project is a brief documentary-style video in which Kutiman explains how he worked within the software. But this requires him to respond to my inquiries and be willing to participate. (He did make such a video for Thru You, so I suppose it can pull double duty until a “Making of” video for Off Grid is created.)
The Kids Are Alright (and so are their MP3s)
Vinyl Fetish
I spent four semesters of grad school as a teaching assistant for History of Popular Music, 1950s-Present. The class had a tendency to dwell on the first thirty or so years of the proposed sixty-five-ish, causing the last weeks of the semester to get juuuuust a little compressed, but the professor always saved the last day for his traditional farewell lecture: an invective against MP3s and YouTube compression. He would give a brief description of analog vs. digital audio recordings, rail on Apple earbuds, play a variety of samples to showcase different degrees of YouTube compression, and show the hazards of wanton ripping/compression, among other things.
Of course, the end goal of this “scared straight” style presentation was really to get students thinking hard about audio fidelity. Many of them were raised listening to heavily compressed music on YouTube and/or ripped MP3s of dubious quality. But perhaps there were more important questions to pose. How did we arrive at the particular media formats that are in wide use? What is audio fidelity to the people manufacturing new types of media? Or how about, as Jonathan Sterne asks in MP3: The Meaning of a Format:
“In an age of ever-increasing bandwidth and processing power, why is there also a proliferation of lower-definition formats?”
These questions can be answered in many ways because there is no grand narrative of media formats. Formats do not simply edge closer and closer to reality, shedding degrees of mediation along the way. Nevertheless, we see this type of thinking all over the place, especially in the branding and marketing of new formats and platforms.

Mediality and Compression
This brings us to the concept of mediality. Sterne describes this as “a quality of or pertaining to media and the complex ways in which communication technologies refer to one another in form or content.” Essentially, it means that media formats naturally cross-reference one another in a variety of ways and that each medium is shaped by a unique history that involves aspects of both technological and cultural practices, among other things. Sterne presents the 74-minute compact disc as an example; sources claim that the design and storage capacity of the CD were meant to mimic the size and portability of the then-popular cassette tape.
How exactly does this relate to the proliferation of the MP3 (and other lower-definition formats)? Well, without going into a tremendous amount of detail, the MP3 can be situated within a tradition of compression: that is, the development of ways to spread information efficiently to the widest area possible. Lossless compression stores redundant data in a sort of shorthand that a decoder can reconstruct without any, well, loss. Lossy compression does the same thing, but also permanently cuts out information that is deemed to be of lesser importance.
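A tiny Python illustration of the difference, using zlib for the lossless side and crude sample quantization as a stand-in for the much smarter psychoacoustic decisions an MP3 encoder actually makes:

```python
# Toy illustration of lossless vs. lossy compression. zlib is genuinely
# lossless; the "lossy" step is just crude quantization, a stand-in for
# the psychoacoustic trickery a real MP3 encoder performs.
import zlib

samples = bytes(range(256)) * 100   # pretend this is a chunk of audio data

# Lossless: every byte comes back exactly as it went in.
packed = zlib.compress(samples)
assert zlib.decompress(packed) == samples
print("lossless:", len(samples), "->", len(packed), "bytes, perfect round trip")

# "Lossy": throw away the low bits of each sample before compressing.
quantized = bytes(b & 0b11110000 for b in samples)
packed_lossy = zlib.compress(quantized)
print("lossy:   ", len(samples), "->", len(packed_lossy),
      "bytes, but the original can no longer be reconstructed")
```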
This is where my old professor’s lecture comes in. Audiophiles freak out about MP3s because you are actually losing aspects of the performance (such as dynamic range), as well as occasionally introducing artifacts (such as pre- and post-echoes). Still, you can’t argue with how nice ’n’ svelte these files are!
The Bright Side of Things
So, perhaps my professor was right about some things, but isn’t there a way to add a more positive spin to his lecture?
To again borrow from Sterne:
“Compression also allows media content to proliferate in new directions, where it might not otherwise have gone. Innovation in digital technology tends towards finding new sites, contexts, and uses for file compression, rather than eliminating them.”
There we go! Compression led to innovations like streaming on Netflix and putting annoying MP3 ringtones on our phones. Or how about the wild success of YouTubers or Twitch streamers? These new forms of broadcast content can quickly be produced on the cheap, receive feedback, and be tweaked accordingly. [Is this starting to sound like a lean startup methodology pep talk for creative types?] A high-quality, uncompressed video that is five minutes long could easily be around 50 gigs in size, so downloading would be a chore; compression algorithms can reduce the file to a tiny fraction of that size and make it easy to stream.
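Some back-of-the-envelope numbers behind that claim, assuming uncompressed 1080p at 24 frames per second and a typical streaming bitrate of roughly 5 Mbps (both figures are my assumptions, not anything measured):

```python
# Back-of-the-envelope math for the five-minute video claim.
# Assumptions (mine): uncompressed 1080p, 24 fps, 3 bytes per pixel,
# versus a typical ~5 Mbps streaming bitrate after compression.
width, height, bytes_per_pixel = 1920, 1080, 3
fps, seconds = 24, 5 * 60

uncompressed_bytes = width * height * bytes_per_pixel * fps * seconds
streamed_bytes = 5_000_000 / 8 * seconds    # 5 Mbps -> bytes/second * duration

print(f"uncompressed: {uncompressed_bytes / 1e9:.1f} GB")   # ~44.8 GB
print(f"streamed:     {streamed_bytes / 1e9:.2f} GB")       # ~0.19 GB
```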
The GIF also provides some wonderful examples: Reaction GIFs, cinemagraphs, and other types of GIF art (such as the output of Peekasso). In the case of the Reaction GIF, the practice became so widely adopted that Facebook now allows users to directly embed Reaction GIFs in messages—and there is even a small discovery platform that allows one to search for an appropriate GIF and view ones that are trending.
Whither MP3?
So why hasn’t the MP3 been supplanted yet? Perhaps it can simply be attributed to people enjoying the various artifacts that appeared on the formats that were popular in their formative years. Each of these artifacts—the hiss of vinyl, the flatly compressed sound of 4-track recorder demos passed around by grunge bands, etc.—is now rich with meaning and referenced by new recordings. Why should it be any different for MP3s? Or perhaps it is the fact that MP3s are so widespread that people just don’t feel any pressure to switch to something else. This makes me think of Eppink’s history of the GIF and how he states that attempts to one-up the GIF have failed because companies often fail to understand the affordances that made the GIF successful. Could this be why people aren’t flocking to Jay-Z’s Tidal service? Why people aren’t going nuts over Neil Young’s PonoPlayer (which, judging from the list of Most Requested Resolution Upgrades, seems to attract only listeners who reject anything that doesn’t fit comfortably on a Classic Rock Commercial-Free Power Hour)? Who knows what the future may hold.