UCL Centre for Digital Humanities


Computation and the Humanities

The field now known as Digital Humanities (DH) is almost 70 years old. However, we have no comprehensive histories of its research trajectory or its disciplinary development. This book by Julianne Nyhan and Andrew Flinn makes a first contribution towards remedying this by uncovering, documenting, and analysing many of the social, intellectual and creative processes that helped to shape DH research from the 1950s until the present day.

Table of Contents

  1. Introduction
  2. Why Oral History?
  3. Individuation Is There in All the Different Strata: John Burrows, Hugh Craig and Willard McCarty
  4. The University Was Still Taking Account of universitas scientiarum: Wilhelm Ott and Julianne Nyhan
  5. hic Rhodus, hic salta: Tito Orlandi and Julianne Nyhan
  6. They Took a Chance: Susan Hockey and Julianne Nyhan
  7. The Influence of Algorithmic Thinking: Judy Malloy and Julianne Nyhan
  8. I Would Think of Myself as Sitting Inside the Computer: Mary Dee Harris and Julianne Nyhan
  9. There Had to Be a Better Way: John Nitti and Julianne Nyhan
  10. It's a Little Mind-Boggling: Helen Agüera and Julianne Nyhan
  11. I Heard About the Arrival of the Computer: Hans Rutimann and Julianne Nyhan
  12. I Mourned the University for a Long Time: Michael Sperberg-McQueen and Julianne Nyhan
  13. It's Probably the Only Modestly Widely Used System with a Command Language in Latin: Manfred Thaller and Julianne Nyhan
  14. Getting Computers into Humanists' Thinking: John Bradley and Julianne Nyhan
  15. Moderate Expectations, Tolerable Disappointments: Claus Huitfeldt and Julianne Nyhan
  16. So, Into the Chopper It Went: Gabriel Egan and Julianne Nyhan
  17. Revolutionaries and Underdogs
  18. Conclusion


Introduction

Julianne Nyhan and Andrew Flinn

This chapter begins with an introduction to Digital Humanities (DH) and outlines its development since c.1949. It demonstrates that the application of computing to cultural heritage has been ongoing for some 70 years, yet the histories of DH have, until recently, remained mostly unwritten. After exploring some of the particular difficulties that attend any attempt to write such histories, we explain in detail the approach that we have taken in this book. We close by asking why histories of DH are needed and why it is essential to undertake them.

Why Oral History?

Andrew Flinn and Julianne Nyhan

This chapter begins with an overview of the histories of oral history and its use within different branches of academic and public history. Focussing next on the study of communities, it briefly explores the contested, fuzzy and fluid meaning of the term 'community' before examining the application of oral history to community histories, including academic and professional communities. It discusses some of the ethical challenges at stake in this type of historical research, including the multifaceted relationship between the interviewer and the interviewee, and the choice of which 'significant' lives are privileged to tell the story of the community (and therefore which significant lives and perspectives might be missing). Before outlining some of the issues surfaced by using oral history to document foundational stories of DH as a discipline, this chapter looks briefly at the use of oral history in some other analogous professional and academic settings. In conclusion, the chapter reflects on the suitability of oral history in telling these community stories by asking who owns these histories and how that ownership is manifested.

Individuation Is There in All the Different Strata

John Burrows, Hugh Craig and Willard McCarty

This oral history interview between Willard McCarty (on behalf of Julianne Nyhan), John Burrows and Hugh Craig took place on 4 June 2013 at the University of Newcastle, Australia. Harold Short (Professor of Humanities Computing at King's College London and a Visiting Professor at the University of Western Sydney in the School of Computing, Engineering and Mathematics) was also present for much of the interview. Burrows recounts that his first encounter with computing took place in the late 1970s, via John Lambert, who was then the Director of the University of Newcastle's Computing Service. Burrows had sought Lambert out when the card-indexes of common words that he had been compiling became too difficult and too numerous to manage. Craig's first contact was in the mid-1980s, after Burrows put him in charge of a project that used a Remington word processor. At many points in the interview Burrows and Craig reflect on the substantial amount of time, and, indeed, belief, that they invested not only in the preparation of texts for analysis but also in the learning and development of new processes and techniques (often drawn from disciplines outside English Literature). Much is said about the wider social contexts of such processes: Craig, for example, reflects on the sense of possibility and purposefulness that having Burrows as a colleague helped to create for him. Indeed, he wonders whether he would have had the confidence to invest the time and effort that he did had he been elsewhere. Burrows emphasises the network of formal and informal, national and international expertise that he benefitted from, for example, John Dawson in Cambridge and Susan Hockey in Oxford. They also reflect on the positive effect that the scepticism they sometimes encountered had on their work.
As central as computing has been to their research lives, they emphasise that their main aim was to study literature, and continuing to publish in core literature journals (in addition to DH journals) has been an important aspect of this. Though they used techniques and models that are also used by Linguists and Statisticians, their focus has remained on questioning rather than answering.

The University Was Still Taking Account of universitas scientiarum

Wilhelm Ott and Julianne Nyhan

This oral history interview between Wilhelm Ott and Julianne Nyhan was carried out on 14 July 2015, shortly after 10am, in the offices of pagina in Tübingen, Germany. Ott was provided with the core questions in advance of the interview. He recalls that his earliest contact with computing was in 1966 when he took an introductory programming course in the Deutsches Rechenzentrum (German Computing Center) in Darmstadt. Having become slightly bored with the exercises that attendees of the course were asked to complete he began working on programmes to aid his metrical analysis of Latin hexameters, a project he would continue to work on for the next 19 years. After completing the course in Darmstadt he approached, among others such as IBM, the Classics Department at Tübingen University to gauge their interest in his emerging expertise. Though there was no tradition in the Department of applying computing to philological problems they quickly grasped the significance and potential of such approaches. Fortunately, this happened just when the computing center, up to then part of the Institute for Mathematics, was transformed into a central service unit for the university. Drawing on initial funding from the Physics department a position was created for Ott in the Tübingen Computing Center. His role was to pursue his Latin hexameters project and, above all, to provide specialised support for computer applications in the Humanities. In this interview Ott recalls a number of the early projects that he supported such as the concordance to the Vulgate that was undertaken by Bonifatius Fischer, along with the assistance they received from Roberto Busa when it came to lemmatisation. He also talks at length about the context in which his TUSTEP programme came about and its subsequent development. 
The interview strikes a slightly wistful tone as he recalls the University of Tübingen's embrace of the notion of universitas scientiarum in the 1960s and contrasts this with the rather more precarious position of the Humanities in many countries today.

Hic Rhodus, Hic Salta

Tito Orlandi and Julianne Nyhan

This interview was carried out in Rome, Italy on 17 October 2014 at about 09:00. Orlandi recounts that his earliest memory of a computer dates to the 1950s when he saw an IBM machine in the window of an IBM shop in Milan. Around 1960, together with his PhD supervisor Ignazio Cazzaniga, he engaged in some brief exploratory work to see what role punched card technology might play in the making of a critical edition of Augustine's City of God. His sustained take-up of computing in the 1970s arose from the practical problem of managing the wealth of information that he had amassed about Coptic manuscripts. He was aware from an early stage of the possible limitations of computational approaches: his early encounters with the work of Silvio Ceccato left him wary of approaches to cybernetics. He identifies the work of the applied mathematician Luigi Cerofolini, who taught him UNIX, among other things, as having been central to his understanding of methodological issues. In relation to theory, he emphasises the impact that understanding Turing's Universal Computing Machine made on him. Indeed, his work on the significance of modelling to Humanities Computing (see, for example, the discussion in Orlandi, T. (n.d.)) preceded that of McCarty (2005). In addition to questioning inherited beliefs about the origins of DH, particularly in regard to the role of Fr Roberto Busa S.J., in this interview Orlandi argues that DH has not given sufficient attention to the fundamentals of computing theory.

They Took a Chance

Susan Hockey and Julianne Nyhan

This interview was carried out via Skype on 21 June 2013. Hockey was provided with the core questions in advance of the interview. Here she recalls how her interest in Humanities Computing was piqued by the articles that Andrew Morton published in the Observer in the 1960s about his work on the authorship of the Pauline Epistles. She went on to secure a position in the Atlas Computer Laboratory where she was an advisor on COCOA version 2 and wrote software for the electronic display of Arabic and other non-ASCII characters. The Atlas Computer Laboratory was funded by the Science Research Council and provided computing support for universities and researchers across the UK. While there she benefitted from access to the journal CHum and built connections with the emerging Humanities Computing community through events she attended starting with the 'Symposium on Uses of the Computer in Literary Research' organised by Roy Wisbey in Cambridge in 1970 (probably the earliest such meeting in the UK). Indeed, she emphasises the importance that such gatherings played in the formation of the discipline. As well as discussing her contribution to organisations like ALLC and TEI she recalls those who particularly influenced her such as, inter alia, Roberto Busa and Antonio Zampolli.

The Influence of Algorithmic Thinking

Judy Malloy and Julianne Nyhan

This interview was carried out via Skype on 11 August 2015 at 20:30 GMT. Malloy was provided with the core interview questions in advance. Here she recalls that after graduating from university she took a job as a searcher/editor for the National Union Catalog of the Library of Congress. About a year after she arrived, Henriette D. Avram began work on the process of devising a way to make the library's cataloguing information machine readable (work that would ultimately lead to the development of the MARC format (Schudel 2006)). Malloy recalls this wider context as her first encounter, of sorts, with computing technology: though she did not participate in that work it made a clear impression on her. She had learned to programme in FORTRAN in the 1960s when working as a technical librarian at the Ball Brothers Research Corporation. She had also held other technical roles at Electromagnetic Research Corp and with a contractor for the Goddard Space Flight Center, which was computerising its library around the time she worked there. She recalls that she did not use computers in her artistic work until the 1980s (when she bought an Apple II for her son). However, she had been working in an interactive, multimedia and associative mode for some time before then, as evidenced by the card catalog poetry and electronic books that she created in the 1970s and early 1980s. In this interview she traces the importance of card catalogs, Systems Analysis and algorithmic thinking to many aspects of her work. She also reflects on why it was that the idea of combining computing and literature did not occur to her (and also was not practically feasible) until a later stage in her career.
Among other topics, she reflects on the kinds of computing and computing environments that she encountered, from the reactions in the 1960s of some male engineers to the presence of a female technical librarian in the mainframe room to the thrill of discovering the community that was connected via the Whole Earth 'Lectronic Link (The WELL).

I Would Think of Myself as Sitting Inside the Computer

Mary Dee Harris and Julianne Nyhan

This oral history interview was conducted on 3 June 2015 via Skype. Harris was provided with the interview questions in advance. Here she recalls her early encounters with computing, including her work at the Jet Propulsion Lab in Pasadena, California. Despite these early encounters with computing she had planned to leave it behind when she returned to graduate school to pursue a PhD; however, the discovery of c.200 pages of a Dylan Thomas manuscript prompted her to rethink this. Her graduate study was based in the English Department of the University of Texas at Austin, which did not have an account with the computer centre, and so it was necessary for her to apply for a graduate student grant in order to buy computer time. Her PhD studies convinced her of the merits of using computers in literary research and she hoped to convince her colleagues of this too. However, her applications for academic jobs were not initially successful. After working in Industry for a time she went on to secure academic positions in Computer Science at various universities. During her career she also held a number of posts in Industry and as a Consultant. In these roles she worked on a wide range of Artificial Intelligence and especially Natural Language Processing projects. Her interview is a wide-ranging one. She reflects on topics like the peripheral position of a number of those who worked in Humanities Computing in the 1970s and her personal reactions to some of the computing systems she used, for example, the IBM 360. She also recalls how she, as a woman, was sometimes treated in what tended to be a male-dominated sector, for example, the Physics Professor who asked "So are you going to be my little girl?"

There Had to Be a Better Way

John Nitti and Julianne Nyhan

This oral history conversation was carried out via Skype on 17 October 2013 at 18:00 GMT. Nitti was provided with the core questions in advance of the interview. He recalls that his first encounter with computing came about when a fellow PhD student asked him to visit the campus computing facility of the University of Wisconsin-Madison, where a new concordancing programme had recently been made available via the campus mainframe, the UNIVAC. He found the computing that he encountered there rather primitive: input was in uppercase letters only and via a keypunch machine. Nevertheless, the possibility of using computing in research stuck with him and when his mentor Professor Lloyd Kasten agreed that the Old Spanish Dictionary project should be computerised, Nitti set to work. He won his first significant NEH grant c.1972; up to that point (and, where necessary, continuing for some years after) Kasten cheerfully financed out of his own pocket some of the technology that Nitti adapted to the project. In this interview Nitti gives a fascinating insight into his dissatisfaction with both the state and provision of the computing that he encountered, especially during the 1970s and early 1980s. He describes how he circumvented such problems not only via his innovative use of technology but also through the many collaborations he developed with the commercial and professional sectors. As well as describing how he and Kasten set up the Hispanic Seminary of Medieval Studies he also mentions less formal processes of knowledge dissemination, for example, his so-called lecture 'roadshow' in the USA and Canada where he demonstrated the technologies used on the dictionary project to colleagues in other universities.

It's a Little Mind-Boggling

Helen Agüera and Julianne Nyhan

This interview was carried out between London and Washington via Skype on 18 September 2013, beginning at 17:05 GMT. Agüera was provided with the core questions in advance of the interview. She recalls that her first encounters with computing and DH came about through her post at the National Endowment for the Humanities (NEH), where she had joined a division that funded the preparation of research tools, reference works and scholarly editions. Thus, she administered grants to a large number of projects that worked, at a relatively early stage, at the interface of Humanities and Computing, for example, Thesaurus Linguae Graecae. In this interview she recalls some of the changes that the division where she worked made to its operating procedures in order to incorporate digital projects. For example, in 1979 a section was added to application materials asking relevant projects to provide a rationale for their proposed use of computing or word processing. She also discusses issues like sustainability that became apparent over the longer term and reflects on some of the wider trends she saw during her career. Computing was initially taken up by fields like Classics and lexicography that needed to manage and interrogate masses of data and thus had a clear application for it. She contrasts this with the more experimental and exploratory use of computing that characterises much of DH today.

I Heard About the Arrival of the Computer

Hans Rutimann and Julianne Nyhan

This oral history interview was conducted between Hans Rutimann and Julianne Nyhan via Skype on 15 November 2012. Rutimann was provided with the core questions in advance of the interview. Here he recalls that his first encounter with computing was at the Modern Languages Association (MLA), c.1968/9. Following a minor scandal at the organisation, which resulted in the dismissal of staff connected with the newly arrived IBM 360/20, Rutimann was persuaded to take on some of their duties. After training with IBM in operating and programming he set about transferring the membership list (about 30,000 contact details) from an addressograph machine to punched cards. After the computer's early use to support such administrative tasks the MLA began investigating the feasibility of making the research tool called the MLA International Bibliography (information about accessing the present-day version of the bibliography is available at https://www.mla.org/bib_electronic) remotely accessible. Rutimann worked with Lockheed to achieve this. It was in Lockheed's information retrieval lab that Dialog, an online information retrieval system, was developed (see Summit 1967). He vividly recalls how he travelled the 3000 miles to San Francisco to deliver the magnetic tape to Lockheed so that they could make the database available online. He "jumped for joy" when, once back in New York, the data was available to him via the newly acquired terminal of the MLA. While making clear that his roles in MLA, Mellon and the Engineering Information Foundation have primarily been enabling ones (and to this we can add advocacy, strategy and foresight) he also recalls the strong influence that Joseph Raben had on him and mentions some of the projects and conferences that he found particularly memorable.

I Mourned the University for a Long Time

Michael Sperberg-McQueen and Julianne Nyhan

This interview took place on 9 July 2014 at dh2014, the Digital Humanities Conference that was held in Lausanne, Switzerland that year. In it Sperberg-McQueen recalls having had some exposure to programming in 1967, as a 13-year-old. His next notable encounter with computing was as a graduate student when he set about using computers to make a bibliography of secondary literature on the Elder Edda. His earliest encounters with Humanities Computing were via books, and he mentions the proceedings of the 'Concordances and the Dictionary of Old English' conference and a book by Susan Hockey (see below) as especially influential on him. In 1985 a position in the Princeton University Computer Center that required an advanced degree in Humanities and knowledge of computing became available; he took on the post while finishing his PhD dissertation and continuing to apply for tenure-track positions. Around this time he also began attending the 'International Conference on Computers and the Humanities' series and in this interview he describes some of the encounters that took place at those conferences and contributed to the formation of projects like TEI. As well as reflecting on his role in TEI he also compares and contrasts this experience with his work in W3C. On the whole, a somewhat ambivalent attitude towards his career emerges from the interview: he evokes Dorothy Sayers to communicate how the application of computers to the Humanities 'overmastered' him. Yet, he poignantly recalls how his first love was German Medieval languages and literature and the profound sense of loss he felt at not securing an academic post related to this.

It's Probably the Only Modestly Widely Used System with a Command Language in Latin

Manfred Thaller and Julianne Nyhan

This interview took place on 9 July 2014 at dh2014, the Digital Humanities Conference that was held in Lausanne, Switzerland that year. In it Thaller recalls that his earliest memory of encountering computing in the Humanities dates to c.1973 when he attended a presentation on the use of computational techniques to map the spatial distribution of medieval coins. The difficulties of handling large, paper-based datasets were impressed upon him as he compiled some 32,000 index cards of excerpts for use in his PhD thesis. When he later encountered statistical standard software at the Institute for Advanced Studies in Vienna he found that such software could not be beneficially applied to historical data without first transforming in some way the historical data under study (indeed, the formalisation of historical and cultural heritage data is an issue that recurs in this interview, much as it did in Thaller's research). In light of his experience of the problems of using such software 'out of the box' to work with historical data he went on to teach himself the programming language SNOBOL. Within a few weeks he had joined a project on daily life in the Middle Ages and was building software to manage the descriptions of images that the project compiled and stored on punched cards. Having contributed to various other projects with computational elements, in 1978 he took up a post at the Max Planck Institute for History in Göttingen. As well as discussing the research he carried out there, for example, CLIO/kleio, a database programming system for History with a command language in Latin, he discusses the immense freedom and access to resources that he benefited from. He also goes on to discuss some of the later projects he worked on, including those in the wider context of digital libraries, infrastructure and cultural heritage.

Getting Computers into Humanists' Thinking

John Bradley and Julianne Nyhan

This interview took place in Bradley's office in Drury Lane, King's College London on 9 September 2014 around 11:30. Bradley was provided with the interview questions in advance. He recalls that his interest in computing started in the early 1960s. As computer time was not then available to him he sometimes wrote out in longhand the FORTRAN code he was beginning to learn from books. One of his earliest encounters with Humanities Computing was the concordance to Diodorus Siculus that he programmed in the late 1970s. The printed concordance that resulted filled the back of a station wagon. The burgeoning Humanities Computing community in Toronto at that time collaborated both with the University of Toronto Computer Services Department (where Bradley was based) and the Centre for Computing in the Humanities, founded by Ian Lancashire. Aware of the small but significant interest in text analysis that existed in Toronto at that time and pondering the implications of the shift from batch to interactive computing, he began work as a developer of Text Analysis Computing Tools (TACT). He also recalls his later work on Pliny, a personal note management system, and how it was at least partly undertaken in response to the lack of engagement with computational text analysis he noted among Humanists. In addition to other themes, he reflects at various points during the interview on models of partnership between Academic and Technical experts.

Moderate Expectations, Tolerable Disappointments

Claus Huitfeldt and Julianne Nyhan

This interview was conducted on 11 July 2014 at the Digital Humanities Conference in Lausanne, Switzerland. Huitfeldt recounts that he first encountered computing at the beginning of the 1980s via the Institute of Continental Shelf Research when he was a Philosophy student at the University of Trondheim. However, it was in connection with a Humanities project on the writings of Wittgenstein that he learned to programme. When that project closed he worked as a computing consultant in the Norwegian Computing Center for the Humanities and in 1990 he established a new project called the 'Wittgenstein Archives', which aimed to prepare and publish a machine-readable version of Wittgenstein's Nachlass. Here he discusses the context in which he began working on the encoding scheme, A Multi-Element Code System (MECS), that he developed for that project. The influence of MECS went beyond the Wittgenstein Archives. According to Ore (2014) 'when XML itself was under development, the idea of well-formed documents (as different from documents valid according to a DTD or schema) was taken into XML from MECS'. In addition to discussing matters like the trajectory of DH research and his early encounters with the conference community he also discusses some of the fundamental issues that interest him like the role of technology in relation to the written word and the lack of engagement of the Philosophy community with such questions. Ultimately he concludes that he does not view DH as a discipline, but rather as a reconfiguration of the academic landscape as a result of the convergence of tools and methods within and between the Humanities and other disciplines.

So, Into the Chopper It Went

Gabriel Egan and Julianne Nyhan

This interview took place at the AHRC-organised Digital Transformations Moot held in London, UK on 19 November 2012. In it Egan recalls his earliest encounters with computing when he was a schoolboy along with some memories of how computers were represented in science fiction novels, TV programmes and advertising. His first job, at the age of 17, was as a Mainframe Computer Operator. He continued to work in this sector throughout the 1980s but by the end of the decade he recognised that such roles would inevitably disappear. In 1990 he returned to university where he completed a BA, MA and PhD over the next 7 years. He recalls his shock upon returning to university as he realised how little use was then made of computers in English Studies. Nevertheless, he bought a relatively cheap, second-hand Sinclair Z88 and took all his notes on it. Later he also digitised his library of 3000 books, destroying their hard copy versions in the process. The interview contains a host of reflections about the differences that computing techniques and resources have made to Shakespeare Studies over the past years, along with insightful observations about the contributions and limitations of DH. In this interview Egan describes himself as a 'would be Digital Humanist'; indeed, it is the landscape that he describes from this vantage point that makes his interview so interesting and useful.

Revolutionaries and Underdogs

Julianne Nyhan and Andrew Flinn

Taking the work of Passerini (1979) and Portelli (1981) as a theoretical backdrop, this chapter will describe, contextualise and interpret a narrative (or 'story') that was recalled in a number, but not all, of the oral history interviews. This narrative concerns interviewees' experiences of having been ignored, undermined or marginalised by the mainstream academic community. For the purposes of discussion we will refer to this as the 'motif of the underdog'. We will complement this analysis of the oral history interviews by looking to the scholarly literature of the field and examining a theme that often occurs there, namely DH's supposedly revolutionary status (referred to below as the 'motif of the revolutionary'). Our analysis will raise the question of how DH managed to move from the margins towards the mainstream while continuing to portray itself as both underdog and revolutionary. Drawing on literature from social psychology, the history of disciplinarity and the wider backdrop of oral history, we will argue that the motifs discussed here can better be understood in terms of their function rather than their internal coherence.


Conclusion

Julianne Nyhan and Andrew Flinn

In this concluding chapter we explore some of the ways that the oral history interviews included in this book can be 'read'. We give particular attention to an approach to the interviews that we find intriguing and productive: how they reinforce, extend or problematise current scholarship on the history of DH, or the history of computing more generally. A case in point is the nature of the relationship that existed between DH and the wider computing industry, especially from the 1950s to the 1970s. We argue that the interviews included here, and the oral history methodology that underpins them, help to recover a more nuanced picture of the origins and history of DH (and computing in the Humanities more generally). They grant insights into the social, cultural, intellectual and creative processes that shaped the field's uptake and development and address how such processes were sometimes aided and sometimes hindered by external circumstances. They also provide new insights into the role of individual agency in the way they address some of the experiences and motivations of individuals who contributed to the development of this field. Such experiences are otherwise very difficult, if not impossible, to investigate using the extant professional literature. In this way, we believe that this book pushes forward the current boundaries of scholarship on the history of DH.