Computing for, in, and of the Humanities: an Oxford perspective

Lou Burnard
Humanities Computing Unit
Oxford University
13 Banbury Road
Oxford OX2 6NN, England, UK

This presentation will take a descriptive and pragmatic stance on the vexed question of where "humanities computing" should be located in our intellectual landscape at the end of the millennium. Applications of information technology in humanities subjects at Oxford University go back to the early seventies: over the subsequent two decades, we have explored both the highways and the dead-ends of the field, arriving eventually at a structure which, we hope, maximizes the usefulness of computing technologies within the Humanities faculties and departments of our University at least. The chief components of this structure, to some extent institutionalized within the Humanities Computing Unit (HCU), are:

- the provision of expert advice;
- a context for collaborative development activities;
- the development and provision of scholarly digital resources.

In each case, equal attention must be paid to the needs of both pedagogy and research.

After a brief historical review, I will outline the current facilities and services provided by three components of the HCU: the Centre for Humanities Computing, which provides high-level support for computing facilities within the University's Humanities faculties and departments; the CTI Centre for Textual Studies, one of a small number of nationally funded subject-specific centres charged with promoting the take-up of IT in teaching across UK higher education; and the Oxford Text Archive, one of the new resource providers of the UK Arts and Humanities Data Service. I will also describe the HCU's planned Humanities Computing Development Team, a new Oxford initiative which aims to formalise the collaborative and synergistic energies underlying the most successful applications of IT in the Humanities, such as the Oxford "Virtual Seminars for teaching English Literature" JTAP project.

I will argue that these and other services have evolved in response to a set of needs which we are only now beginning to perceive clearly: our future strategy must be to sharpen that perception in the post-Dearing world. With the massive increase in the availability and take-up of basic computing methods and techniques, our key contribution should be to focus on their integration into teaching, learning, and research: not by supplanting traditional methods, but by enriching them, and by developing agreed standards and procedures for evaluating their effectiveness.

The technology now at our disposal has the capacity to transform the hermeneutic and analytic tradition of the humanities, through such mechanisms as the large-scale availability of resources in digital form and the modelling of humanistic understanding which those encodings embody. I will argue that a proper understanding of the scope and applicability of such resources is essential to the continuation of the humanistic endeavour. Paradoxically, the new technology in some respects recreates aspects of an ancient world: the international and trans-cultural nature of modern scholarship recalls the Middle Ages, as do our current preoccupations with textual ontologies and the quest for a universal language. Yet in many respects the technology has irrevocably changed our ways of engaging with our cultural inheritance, transforming both that inheritance and the version of it which we pass on to our successors.

Central to this is the adoption of (apparently) reductionist quantitative methods in support of the (allegedly) ineffable processes of humanistic explication. I shall argue that, for example, the extraction of quantitative or stochastic data by means of the (ostensibly) objective computer provides us with both a new set of tools and a useful additional perspective from which to understand the processes by which meaning is invested in texts and other cultural artefacts. In no way, however, can such methods relieve us of our responsibility for assessing the implications of the data thus provided, nor therefore "dehumanize" the debate about those implications. On the contrary, they add to the evidence at our disposal, enriching the process of explication rather than reducing it.