Digital Proposal: Creating a Digital Repository of Syrian Archival Material

Project Description: As the Arab Spring rocked the social and political foundations of the Middle East, many foreign and domestic observers believed that Syria would survive the storm with its regime intact. In March 2011, however, protests broke out in the city of Deraa after the regime arrested and tortured several schoolchildren for writing anti-regime graffiti on a wall. As government repression and violence escalated, so did opposition to the Assad regime, soon plunging the country into a civil war that continues to this day.

The consequences of this conflict have been numerous and devastating. The human cost of the war alone has been enormous. According to the World Bank, the death toll has risen to almost 500,000 since 2011, with 5 million seeking refuge abroad and 6.3 million internally displaced. Syrian cultural heritage sites have been damaged or destroyed by military bombardments and deliberate targeted destruction carried out by the Islamic State of Iraq and the Levant. The violence and unsafe conditions have also prevented travel to the area. The U.S. State Department has issued a warning advising all private citizens against travel to Syria “due to terrorism, civil unrest, and armed conflict.”

The ongoing conflict poses significant problems for researchers of Syrian history. Conducting research there has never been easy, given language barriers and government censorship, but the current conflict has made it next to impossible. So what options are available to scholars, or even members of the general public, interested in learning from Syrian archives?

The idea for this project came to me while I was meeting with my advisor in my second week at AU. Several boxes were stacked in the corner of her office. She said that they contained documents from Damascus that she had collected for research on her previous book, and she commented that she should probably digitize them because it didn’t look like people would have access to that archive any time soon. How many other scholars of Syrian history have had similar thoughts?

My digital project will create an online archive of historical material from Syria. I will use Omeka to develop a digital archive to house documents and artifacts collected by scholars. Although this is a crowdsourcing project, the targeted pool of volunteers is small. I will begin by contacting scholars of Syria in the surrounding area to determine whether they possess documents from Syrian archives and whether they would be willing to digitize them.

Audience: The audience for this project is historians, academics, and researchers who are interested in studying Syrian history but are unable to visit Syrian archives due to current circumstances.

Existing Projects: Several projects have focused on documenting the civil war itself. One such project is the Syrian Archive, which is dedicated to collecting visual documentation of human rights violations in Syria that is “transparent, detailed, and reliable.” The U.S. Holocaust Memorial Museum also has an exhibit dedicated to keeping the Syrian crisis present in the minds of the public.

Other projects have explored digital methods for preserving Syria’s cultural heritage. Arachne is a site launched by the Orient Department of the German Archaeological Institute and the Museum of Islamic Art dedicated to creating an online archive of Syrian cultural objects. UNESCO has been similarly involved in generating international awareness of the destruction of “built, movable, and intangible” heritage in Syria.

The Wilson Center Digital Archive of declassified Cold War documents has provided a model for this project. While that archive is funded and run by the History and Public Policy Program at the Woodrow Wilson International Center for Scholars, its concept has provided inspiration. Documents from all over the world, including the former Soviet bloc, are digitized and sometimes translated, giving scholars access to information that would normally remain out of reach due to language barriers and travel constraints.

Plan for Outreach and Publicity: My first avenue of outreach will be to scholars of Syrian history in the surrounding area. I will expand my geographical reach from there. I will also reach out to organizations dedicated to the study of the Middle East, such as the Middle East Institute (MEI) and the Middle East Research Institute (MERI) for critical feedback on organization and publicity outreach.

Evaluation Plan: A successful project will collect material from several sources and successfully communicate with Middle Eastern research institutes to discuss outreach opportunities.

Citations

The Syrian Archive – https://syrianarchive.org/en

iDAI.objects arachne – https://arachne.dainst.org/project/syrher#Browse_project_content

United States Holocaust Memorial Museum – https://www.ushmm.org/confront-genocide/syria

Observatory of Syrian Culture – https://en.unesco.org/syrian-observatory/

Wilson Center Digital Archives – https://digitalarchive.wilsoncenter.org/

A Macroanalysis of FRUS: Topic Modeling Middle Eastern Policy

As I mentioned in my post on Jockers’ Macroanalysis: Digital Methods and Literary History, the methods described in his book have interesting implications for the study of U.S. foreign policy. For example, if one were to study the Eisenhower Administration’s Middle Eastern policy from his inauguration to his farewell address, the sheer volume of diplomatic correspondence alone would be mind-blowing. Just to examine the State Department’s publication Foreign Relations of the United States (FRUS) as it relates to the Middle East (excluding North Africa, with the exception of Egypt, and Southeast Asia), one would have to explore thirteen different volumes. Even if a historian could closely read all the documents contained in those volumes, it would be hard to see the forest for the trees.

Computational analysis is one method of seeing that forest. Topic modeling, in particular, allows you to analyze a large corpus of texts and identify words that tend to appear together across many documents. What sets topic modeling apart from tools like Ngram is that it can be done without knowing ahead of time which topics are most important. As Cameron Blevins writes in “Topic Modeling Martha Ballard’s Diary,” topic modeling has a lot of potential for analyzing historical source material. He concluded that the MAchine Learning for LanguagE Toolkit (MALLET) did a better job of grouping words than a human reader, in some cases creating word groups that he never would have predicted. This methodology allows a historian to extract patterns that would be missed during a microanalysis of the text.
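To make the idea concrete: MALLET itself is a Java command-line tool, but the core technique it implements (LDA fit by Gibbs sampling) can be sketched in a few dozen lines of Python. The sketch below is only an illustration of how topic modeling groups co-occurring words, not MALLET’s actual implementation, and the tiny “corpus” is invented for the example (it loosely evokes two diplomatic themes; it is not FRUS data).

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics=2, n_iters=200, alpha=0.1, beta=0.01, seed=0):
    """Fit a tiny LDA model via collapsed Gibbs sampling.

    docs: list of tokenized documents (lists of words).
    Returns, for each topic, a list of (word, count) pairs, most frequent first.
    """
    rng = random.Random(seed)
    vocab = {w for d in docs for w in d}
    V = len(vocab)

    # z[d][i] = topic currently assigned to the i-th word of document d
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    doc_topic = [[0] * n_topics for _ in docs]                # topic counts per document
    topic_word = [defaultdict(int) for _ in range(n_topics)]  # word counts per topic
    topic_total = [0] * n_topics

    # Initialize counts from the random assignments
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            t = z[d][i]
            doc_topic[d][t] += 1
            topic_word[t][w] += 1
            topic_total[t] += 1

    # Repeatedly resample each word's topic given all other assignments
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                doc_topic[d][t] -= 1
                topic_word[t][w] -= 1
                topic_total[t] -= 1
                # Weight each topic by P(topic | document) * P(word | topic)
                weights = [
                    (doc_topic[d][k] + alpha)
                    * (topic_word[k][w] + beta) / (topic_total[k] + V * beta)
                    for k in range(n_topics)
                ]
                t = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = t
                doc_topic[d][t] += 1
                topic_word[t][w] += 1
                topic_total[t] += 1

    return [sorted(topic_word[k].items(), key=lambda kv: -kv[1])
            for k in range(n_topics)]

# Invented toy corpus: two loosely "diplomatic" themes
docs = [
    "oil pipeline concession oil revenue".split(),
    "treaty alliance defense treaty pact".split(),
    "oil concession pipeline oil".split(),
    "alliance pact defense treaty".split(),
]
for k, words in enumerate(lda_gibbs(docs, n_topics=2)):
    print(k, [w for w, _ in words[:3]])
```

On a real corpus like FRUS, the same procedure (after removing stop words) is what lets the model surface recurring clusters of vocabulary without anyone specifying the topics in advance.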

Using topic modeling to analyze FRUS has the potential to reveal a great deal about U.S. foreign policy in the Middle East. What messages were diplomats most concerned about conveying to foreign dignitaries? What were the most frequent or pressing issues facing policymakers? Did those patterns shift with each new presidential administration? How did they correlate with events on the ground or change during election years? These questions would be difficult for a single historian to answer through a close reading of the documents in FRUS, but they become possible with topic modeling software.

There are limits to this, of course. MALLET groups a limited number of topics in an unsupervised model, which can often create what Jockers calls a black box. The goal of a historian is to interpret the results and draw conclusions, but some topics may be incomprehensible. This doesn’t necessarily have to be a bad thing. Some of the topics produced will be unclear or spurious due to the presence of ‘stop words,’ but the clear topics can still be conceptualized and interpreted. As Jockers states, “we do no disservice to the overall model, and we in no way compromise our analysis” (129).

Therefore, my print project will utilize MALLET to perform LDA topic modeling of the Eisenhower Administration’s foreign policy in the Middle East. Using the State Department’s FRUS volumes from 1953 to 1960 relating to Mideast policy, I will explore which topics are most prevalent throughout that period. Depending on time and the availability of resources, I will also compare topics from the Eisenhower Administration to those of the Truman, Kennedy, Johnson, Nixon, Ford, and Carter Administrations to examine how topics shifted or remained important at different stages of the Cold War and under different presidents.

Macroanalysis: Applying Computational Analysis to History

If you’re in a history graduate program, odds are you like to read. A lot. You’ve probably already discovered which areas of history interest you the most, and you’ve made it your mission to read as much about them as possible. But on your first trip into the archives in search of primary sources, you probably realized with a sense of dread that it’s physically impossible to read and properly analyze everything that’s out there.

Computers have made this process significantly easier. Archives are digitizing and transcribing more and more historical documents, and we can run keyword and phrase searches to locate sources. But as Matthew Jockers writes in Macroanalysis: Digital Methods & Literary History, “Close reading, digital searching, will continue to reveal nuggets, while the deeper veins lie buried beneath the mass of gravel layered above” (9).

So how can we access the deep recesses of the archives without engaging in a close reading of the sources we find? According to Jockers, the answer lies in computational analysis. In light of technological advancements, Jockers argues that historians and other practitioners of the humanities need to embrace new methodologies for utilizing the massive amount of information available to us through digitization. Macroanalysis, or distant reading, allows scholars to look at patterns and trends as they appear within an entire collection. Here are a few examples of how Jockers uses macroanalysis in the field of literary history.

Jockers examined 758 works of Irish-American literature over a period of 250 years by using metadata embedded in bibliographies. He was able to identify larger trends in the literature based on the gender of the author, the geographical region in which the work was produced, and the setting in which the story takes place (among other variables). With the full texts available digitally, Jockers demonstrated how computing software could trace word use, book length, and ethnic markers. In subsequent chapters, Jockers examined how macroanalysis could provide insight into the style, nationality, theme, and influence of a large body of literature by looking at something as simple as the use of “the” or how many times an author used a particular pronoun. He describes how literary scholars could use correlation coefficients, topic modeling, and information cascades to draw conclusions about how texts relate to each other over time.

Some of you may be thinking that this sounds a bit too quantitative for you. (It sounds too much like math to me, and I hate math!) After all, the humanities are about contextualization and interpretation, which are absent from a distant reading of the literature. Jockers agrees with you. He points out that there are significant drawbacks to conducting macroanalytical research. Massive data sets produce outliers and exceptions. There is always a possibility that the metadata is incorrect (Jockers points out that in many instances the publication dates and authors’ genders were wrong). Computational analysis cannot provide context and interpretation. And in many cases, sources and books are not available digitally due to copyright laws and the significant time it takes for archives to digitize their collections.

The main point Jockers is trying to make is that macroanalysis should not take the place of microanalysis. They “should and need to coexist” (9). A macroanalysis of literary or historical data can help scholars better understand the broad patterns and trends of their research interest. Those patterns and trends, taken from a massive data set, provide the context in which a close reading of a smaller set of texts can take place. 

Jockers’ book is an excellent resource for historians resistant to change. He is not lauding computational analysis as the new way of conducting research in the humanities; he is urging scholars not to ignore a tool that can provide new insights into their field of study.

As a diplomatic historian, I find this methodology intriguing. If you have ever perused Foreign Relations of the United States (the last volume I looked at was 4,752 pages long in EPUB form and presented only selected State Department documents on the Suez Crisis), you will recognize the benefits of having a tool that can identify trends in the language and style of diplomatic correspondence or determine which themes are most prevalent. As David Allen and Matthew Connelly suggest in “Diplomatic History after the Big Bang: Using Computational Methods to Explore the Infinite Archive,” distant reading can reveal which kinds of documents are more likely to be redacted, or rank documents according to their relevance to your research interests (83). The results of this type of analysis are not sufficient by themselves, but when used to inform, conceptualize, and direct your microanalytical research, they are extremely beneficial.

Works Cited:

Allen, David, and Matthew Connelly. “Diplomatic History after the Big Bang: Using Computational Methods to Explore the Infinite Archive.” In Explaining the History of American Foreign Relations, 3rd ed. Cambridge: Cambridge University Press, 2016.

Jockers, Matthew L. Macroanalysis: Digital Methods and Literary History. Urbana and Chicago: University of Illinois Press, 2013.

Introducing Erica

My name is Erica Devine, and I am a first-year PhD student at American University. My current research interests lie in the history of U.S. foreign policy in the Middle East. I took a rather circuitous path to where I am now. I first dabbled in the political history of the Middle East as an undergraduate at Providence College, where I became fascinated with studying the roots of current conflicts in that region and their relation to U.S. policy. I graduated with a degree in political science and history. With degree in hand, I went off to Connecticut as a Teach For America corps member to teach 8th grade social studies. After four years in the trenches, I decided to pursue a master’s in American history at Oxford University, where my research focused on U.S. intervention in Syria in the early Cold War. Upon my return stateside, and after another year of teaching 7th grade humanities, I realized that I wanted to pursue research and higher education.

One important lesson I learned as a teacher is that history needs to be accessible. Teaching American history to 13- and 14-year-olds forced me to consider how historical scholarship can be tailored to younger students and non-academics. This is especially important in the field of U.S. foreign policy in the modern Middle East, where new scholarship has very real implications in the present day. My aim for this course is to learn how I can use digital technologies to create an online presence for myself and make my research accessible to both academics and non-academic students of history. Additionally, my goal is to understand how digital tools and sources are influencing new scholarship in my field and how I can incorporate those new methods in my own research.