My understanding of Augmented Reality (AR) and its affordances has evolved as a direct result of this project and some of the excellent conversations we had during our class sessions. When I proposed an AR poster set earlier in the semester, I knew that AR could be used to deliver digital content that a user would not otherwise access through the physical, printed version of a poster set. While my work this semester has confirmed this understanding, it has also broadened my ideas about the type of digital content that AR is advantageously positioned to deliver.
In the early stages of my project, I was focused on adapting digital content from a USHMM online exhibition and designing an AR experience that could serve as a digital, English-language extension of the exhibition’s related poster set. Beyond this original intent, a survey of all available resources related to the poster set led me to explore AR as a tool for inclusive design and accessibility. The USHMM poster set I used is available in 10 languages, including English and Spanish. The availability of translated content presented an opportunity to experiment with another compelling affordance of AR: unobtrusively providing translated text for users who would prefer to experience interpretive content in a language other than English.
In the end, I did create a prototype version of an English-language extension for the poster set centered on the Nazi regime’s use of propaganda. The prototype uses image recognition to connect users with contextualized primary-source materials related to individual posters in the set. Through AR, users can examine digital surrogates of Nazi propaganda posters overlaid with contextualizing text that explains the techniques Nazi propagandists used to communicate their message.
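HP Reveal handled the image recognition for me behind the scenes, but the underlying idea of matching a camera frame against registered trigger images can be illustrated with a toy sketch. The Python below is entirely my own illustration, not HP Reveal's actual (proprietary) method: it uses a simple average hash over a tiny grayscale grid and fires a trigger when a captured frame's hash is within a small Hamming distance of a registered poster's hash.

```python
# Toy illustration of image-recognition triggering. Hypothetical scheme:
# an "image" is a grid of grayscale values; an average hash summarizes it,
# and a trigger fires when a frame's hash is close to a registered poster's.

def average_hash(pixels):
    """Return a bit list: 1 where a pixel is above the image's mean value."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count positions where two equal-length bit lists differ."""
    return sum(x != y for x, y in zip(a, b))

def match_trigger(frame, triggers, max_distance=2):
    """Return the name of the registered poster whose hash best matches
    the frame, or None if nothing is within max_distance."""
    frame_hash = average_hash(frame)
    best_name, best_dist = None, max_distance + 1
    for name, poster in triggers.items():
        d = hamming(frame_hash, average_hash(poster))
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name

# Register a 4x4 grayscale "poster" and test with a slightly noisy capture.
poster = [
    [200, 200, 30, 30],
    [200, 200, 30, 30],
    [30, 30, 200, 200],
    [30, 30, 200, 200],
]
capture = [
    [190, 210, 40, 20],
    [205, 195, 35, 25],
    [25, 35, 190, 210],
    [20, 40, 205, 195],
]
triggers = {"propaganda_poster_1": poster}  # hypothetical trigger name
print(match_trigger(capture, triggers))  # matches despite pixel noise
```

Real AR platforms use far more robust feature-based matching, but the principle is the same: recognize the physical poster, then overlay the associated digital content.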
In creating the English-language extension prototype, I ran into a few minor problems when adding text overlays, which I created in the graphics editor PIXLR Pro, to an AR experience in HP Reveal Studio. Some overlays appeared blurry, and therefore illegible, when the experience was triggered through the HP Reveal mobile app. Through a frustrating cycle of trial and error, I learned that any text overlay I wanted to remain legible in HP Reveal’s AR output on a mobile device had to be set in at least 50-point type.
I also created a prototype AR experience that overlays all text on the English-language posters with Spanish text. The process that I refined during this project can easily be used to create AR translation experiences for the 8 additional languages in which the poster set is already available.
I was fortunate to produce two functional prototypes by the draft stage of this project, and the demo videos I created at that point remain the best representation of the prototypes in action, short of filming actual users testing them (which, to my delight, some of you were able to do during our in-class poster session).
As we learned from reading about IDEO’s iterative, human-centered design approach, involving users in the design process by engaging them in an ongoing dialogue, testing prototypes with them, and listening to and acting on their feedback results in better, more effective products. To move beyond the prototypes I created for this project, I hope to engage USHMM visitors in an iterative, human-centered design process in the coming months.
Here is a PDF version of my evaluation plan:
Here is my project poster, which presents a succinct overview of the background for this project, my methods, my prototypes, and my ideas for future directions: