The Mind Tool: Edward Vanhoutte's Blog

Research notes and musings on Humanities Computing and (Electronic) Textual Editing.

Updated: 2017-12-06T10:39:16.259+01:00


Hey, This is Your Journal - [The Editor's Cut]
Editorial - LLC. The Journal of Digital Scholarship in the Humanities 29/1


Even in academia, where much is done on a voluntary basis, it is generally acknowledged that good work deserves fair remuneration. A lot of labour is involved in publishing a peer-reviewed scholarly journal and, whether access to the publication is open or subscription-based, there is always a cost involved. In the real world, no one expects anything to be free, except for a smile and the sun, maybe.

Over the last couple of years I have been contacted by a handful of scholars who announced that they no longer want to contribute to or review for LLC because they object to 'giving away their research and peer reviewing for free to a publisher who charges readers and makes a profit'. I regret that this point of view demonizes LLC by romanticizing the ideal of Open Access publication.

In the first part of this editorial, I'd like to take the opportunity to explain why this perspective on LLC is false on at least three points and why a decision not to devote any time or effort to LLC directly affects Open Access publications. In the second part of this editorial I present a report on the past record-breaking year of publishing LLC. The Journal of Digital Scholarship in the Humanities.

1. Three points to consider before turning your back

1.1. Ownership & Copyrights

Let me start by emphasizing that LLC is not owned by the Press but by the European Association for Digital Humanities (EADH, formerly known as ALLC). Every five years, EADH re-negotiates the contract or puts the publication of the Journal – not the ownership – up for tender. A very important and substantial part of the negotiations concerns author and readership services, terms and conditions. For example, as an author publishing in LLC, you do not 'give away your research': you retain all copyrights. All you do is sign a licence which gives EADH the right to have your work published on their behalf by the Press. This is clearly stated in the footer of the first page of each published paper.
This means that, as an author, you retain the right to make a preprint version of the article available on your own personal website and/or that of your employer and/or in free public servers of preprints and/or articles in your subject area, provided that, where possible, you acknowledge that the article has been accepted for publication in LLC.

It is true that, for the moment, you cannot make the accepted postprint manuscript available in the same way during the first 24 months after publication, but this term is currently under discussion. It is also true that, for the moment, you can never make the PDF of the final typeset and published version of your article available for free, e.g. in institutional repositories, for the simple reason that there is copyright involved in the typesetting by the Press. Just as the Press respects the copyrights of the authors and the licences to the EADH, we should respect the copyrights of the Press. There are legal systems to obey, after all. However, both the EADH and the LLC Editorial Team are in constant discussion with the Press to see what could be done about these restrictions. One of the outcomes of this ongoing and open discussion has been the freely accessible online publication of the DH2012 conference issue (LLC 28/4) for a period of three months after publication. This will certainly be a recurring initiative which could hopefully be extended to other issues as well.

So what do you get out of it, apart from retaining your copyrights? Well, your publication is included in a highly esteemed scholarly Journal with a long tradition, a wide distribution, and a high reputation. Further, the peer review process helps to improve your paper, and the copy-editing and typesetting help to improve your paper's readability. Your paper is published both in print and online, where reference links to cited work are included and related data files can be linked to the article. Your published paper can be accessed by ca. 600 personal subscribers, and scholars and students in over 3,500 institutions worldwide. Your publication is[...]

The Gates of Hell - Guest Lecture Würzburg, 13 December 2012


Recently, I stumbled across a documentary about Auguste Rodin's monumental sculpture La porte de l'Enfer, and decided I could use The Gates of Hell as a metaphor for telling the history of the use of computing in the Humanities and the transition from Humanities Computing to Digital Humanities as a name for the field. This coincided with my attempts to find an angle for a guest lecture I was invited to give at the Lehrstuhl für Computerphilologie und Neuere Deutsche Literaturgeschichte of the Julius-Maximilians Universität Würzburg (Germany) on 13 December. My last visit to Würzburg, as a keynote speaker at the 2011 Annual Conference and Members' Meeting of the TEI Consortium (12 October 2011), had been very enjoyable indeed, but bringing in chocolates again would be pushing it a bit. Moreover, I was not lecturing in the prestigious Würzburg Rezidens, but at the equally prestigious University of Würzburg, where I met a very attentive audience of students of Digital Humanities.

As it goes with guest lectures, at least in my case, the eventual contents of the lecture hardly ever reflect the title which was communicated long before the lecture was put together. When I sent my title to Armin Volkmann, who invited me to teach in his course, Text and Image based Digital Humanities: providing access to textual heritage in Flanders seemed a good title. However, when I discovered the story behind Rodin's Gates of Hell, I changed my mind and elaborated on the metaphor to talk about the history and definition of literary and linguistic computing, Humanities Computing, and Digital Humanities. In order to relate to the previously communicated title, I divided the lecture in two parts:

1. History of the use of computing in the Humanities

Slides: Text and Image based Digital Humanities: providing access to textual heritage in Flanders - Guest Lecture Würzburg, 13 December 2012, from Edward Vanhoutte (on Slideshare).

Video: Parts 1-4 (embedded).

2. Demonstration of DH Projects from Flanders

In the second part, I demonstrated some of the projects we realized at the Centre for Scholarly Editing and Document Studies (CTB) of the Royal Academy of Dutch Language and Literature (KANTL) in Flanders:

  • TEI by Example
  • The Letters of Van Nu en Straks
  • De trein der Traagheid by Johan Daisne - Digital Edition
  • Dialectcorpus Pieter Willems

And one project of the Computational Linguistics & Psycholinguistics Research Centre (CLiPS) of the University of Antwerp:

  • Stylene

Further work

Since teaching at Würzburg, I have elaborated on the metaphor and the themes addressed in the lecture for a forthcoming book chapter entitled The Gates of Hell: Digital@Humanities@Computing.

Acknowledgements

I would like to thank Armin Volkmann for the invitation to lecture in his course, and especially Mareike Laue, who took care of all the practical arrangements and did a wonderful job filming and editing the lecture. [...]

Ruling the Screen: compromising decisions and decisive compromises - DRH 99


I was lucky that my first paper at an international conference got published right away, and in two different places. I presented 'Where is the editor? Resistance in the creation of an electronic critical edition' at the DRH Conference (Digital Resources for the Humanities) in Glasgow in 1998. The original paper was published in Human IT (1/1999: 197-214), where my name was vikingized as 'Edvard'. A revised version appeared a year later in DRH 98. Selected papers from Digital resources for the Humanities 1998 (Marilyn Deegan, Jean Anderson & Harold Short (eds.), London: OHC, 2000, p. 171-183).

My second international paper, however, didn't make it into publication, partly because it was too sketchy and reported on research in progress. The main aim of 'Ruling the Screen: compromising decisions and decisive compromises', which I presented at DRH99 in London, was to introduce the Electronic Streuvels Project and to report on the work so far. I focused especially on six decisive compromises I had to make because of the financial and infrastructural context of the project. Two of these compromises concerned the encoding architecture for the textual variation and the design of a project-specific DTD for the encoding of letters.

Because this paper was never published, my use of nested tags instead of the TEI parallel segmentation method to generate an inclusive view of all variant versions of the text in the edition was misunderstood by the encoding community when the electronic critical edition of De teleurgang van den Waterhoek was published in 2000 by Amsterdam University Press. The venture was not so much about documenting the textual variation among the different versions of the novel as about creating a model and an interface by which parts of the text could be optically compared to one another independently. Criticism was voiced by Dan O'Donnell, for instance, in his review of the electronic edition in Literary and Linguistic Computing (17/4 (2000): 491-496).

O'Donnell pointed out that my solution was a poor one because it strayed significantly from the TEI definition of the element and because it ignored several features of the TEI standard intended for precisely the type of functionality that I was suggesting. O'Donnell suggested that a combination of <app> and related apparatus elements could have been chosen within the TEI Guidelines, and that these could have been used to link the variant versions to the orientation text. These choices, however, were the result of one of the compromising decisions outlined in the current paper, namely that I had to use TEI Lite for reasons of time and financial constraints. O'Donnell rightly pointed out that adding them to the TEI Lite DTD wouldn't have been too difficult, but I've always been against modifying a digested DTD like TEI Lite in order to lift it up to the level of full TEI. The choice was also one of ease of formatting. With only a basic knowledge of SGML and TEI, I was unable to do much in the way of transformations, and the low-cost, low-tech SGML publication suite Multidoc was perfect for my purpose of getting out an electronic edition in 21 months' time. The nested tag construction also served my model of the linkeme, on which I elaborated in my article 'A Linkemic Approach to Textual Variation. Theory and Practice of the Electronic-Critical Edition of Stijn Streuvels' De teleurgang van den Waterhoek' (Human IT, 1/2000: 103-138).

In the current paper I also presented the DTD I had produced for the encoding of modern correspondence materials. This DTD was the very first attempt at what later became the DALF scheme. Important to know, when reading this paper, is that we were then still living in the SGML world.

Abstract

At the end of every discussion on textual criticism and scholarly editing, there is this question about lay-out: 'How will the editor present his or her theories, findings and editorial decisions to the interested public?' The already published scholarly editions in paper seem to have op[...]
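For readers unfamiliar with the parallel-segmentation method that O'Donnell's review pointed to, here is a minimal sketch of the idea: every point of variation is wrapped in an `<app>` element whose `<rdg>` children carry the readings of the individual witnesses, so that one encoded text contains all versions at once and any single witness's text can be extracted from it. The fragment and the witness sigla (`#MS1`, `#MS2`) are invented for illustration; this is not the encoding of the Streuvels edition itself.

```python
import xml.etree.ElementTree as ET

# Hypothetical TEI fragment using the parallel-segmentation method:
# each <app> groups the variant readings (<rdg>) of all witnesses inline.
TEI = """<p xmlns="http://www.tei-c.org/ns/1.0">The
  <app>
    <rdg wit="#MS1">waterfowl</rdg>
    <rdg wit="#MS2">water-hen</rdg>
  </app>
  crossed the dyke.</p>"""

NS = {"tei": "http://www.tei-c.org/ns/1.0"}


def text_for(witness, xml_text=TEI):
    """Render the paragraph as one witness sees it: at each <app>,
    keep only the <rdg> whose wit attribute names that witness."""
    root = ET.fromstring(xml_text)
    parts = [root.text or ""]
    for child in root:
        if child.tag == "{http://www.tei-c.org/ns/1.0}app":
            for rdg in child.findall("tei:rdg", NS):
                if witness in (rdg.get("wit") or "").split():
                    parts.append(rdg.text or "")
        parts.append(child.tail or "")
    # collapse the pretty-printing whitespace into single spaces
    return " ".join("".join(parts).split())


print(text_for("#MS1"))  # The waterfowl crossed the dyke.
print(text_for("#MS2"))  # The water-hen crossed the dyke.
```

The same inline apparatus thus serves both purposes discussed above: it documents the variation exhaustively, and a simple traversal can regenerate each version for side-by-side comparison.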

First steps in Digital Humanities


Back in 1995, at Lancaster University, where I undertook an MA in Mediaeval Studies (with 'ae'!), I met Professor Meg Twycross, who turned out to become one of the most influential women in my life. At a time when it was still possible to read through the complete internet (which we attempted in the computer labs at night) and when we were trying out different nicknames on IRC chat channels, Meg Twycross not only caught my attention with her tremendously well taught courses on Medieval literature and culture and Paleography, but especially with the pilot for the York Doomsday Project which she was building at that time. In one of my night-long Internet sessions, I came across Stuart Lee's Break of Day in the Trenches Hypermedia Edition, which also exploited hypertext as a didactic means in the teaching of literature and culture. This appealed so much to me that I started to build similar editions of poems by Hugo Claus when I was a research assistant at the University of Antwerp in 1996. This was picked up by people from the Department of Didactics at the University of Antwerp, who invited me to present at a conference on teaching Dutch in secondary education. I presented my first conference paper on 15 November 1996 under the title "Retourtje Hypertekst. Een reis naar het hoe en waarom van hypertekst in het literatuuronderwijs" and a revised version was published as:

However, this wasn't my first publication. Before this article came out, I had already published two pieces about the same matter:

  • 'Oorlogspoëzie en HyperTekst: Gruwel of Hype?' WvT, Werkmap voor Taal- en Literatuuronderwijs, 20/79 (najaar 1996): 153-160.
  • 'Met een Doughnut de bib in. Over de rol van HyperTekst in het literatuuronderwijs' Vonk, 25/5 (mei-juni 1996): 51-56.

In the following years, I wrote some more on this subject:

  • 'De geheugenstunt van hypertekst.' Leesidee, 3/10 (december 1997): 777-779.
  • 'De soap 'Middeleeuwen.' Leesidee, 3/9 (november 1997): 697-698.
  • 'Het web van Marsua.' Tsjip/Letteren, 7/3 (oktober 1997): 9-12.

Meg Twycross not only stimulated my interest in the use of hypertext for literary studies and through this for models of electronic editions, she also charged me with an important mission which changed my life forever. One day, when I was awarded the County College Major Award which came with a cheque for £250, I asked her what I should do with that money and she told me to go away, learn everything I could about SGML and come back and tell her. The first book I bought with the prize money was Charles Goldfarb's The SGML Handbook.

Both the interest in the hypertextual modelling of scholarly editions and the markup of texts using SGML have shaped what I have been doing ever since. And the only person to kindly blame for it is Meg Twycross.

What if your paper doesn't make it into print?


Recently a graduate student at UCL emailed me with a request for access to a number of conference papers I presented at the beginning of my academic career, none of which made it into a publication. Apart from the abstracts, which were published in the conference book, nothing of the research or argument presented in these papers survives.

Is this a bad thing? Not necessarily. One of the reasons why they were never turned into a publication is probably because they were simply not good enough. Another reason may have been that the 20 minute presentation didn't have enough body to write out as a full paper submission to a Journal or a chapter in a book. A third, and more plausible, reason is that I was too busy doing other stuff to revisit the conference paper and rewrite it as an academic paper.

Nevertheless, I do think they have some value to some people, including myself. I am interested in the history of the field and in the evolution of ideas - I can't read an article by my colleagues without looking up the previous publications on which the argument builds - and I sometimes find it difficult to reconstruct a history of thoughts because documentation is lacking. Therefore, I decided to dig up my old conference presentations and make them publicly available on this blog over the coming weeks. For me personally, it'll probably be a confronting revisit of my first steps in academia, but it will hopefully generate a better understanding of the provenance of my current ideas.

For my own documentation and for the sake of contextualisation I will provide each paper with a short introduction explaining the circumstances of the research and the occasion of the presentation. I'll also try to reconstruct which conference papers were the inspiration for published papers.

Being Practical. Electronic editions of Flemish literary texts in an international perspective


This is the text of my lecture at the International Workshop on Electronic Editing (9-11 February 2012) in the School of Cultural Texts and Records at Jadavpur University, Kolkata, India. The slides of this lecture have been published on Slideshare.

Keep it cool: the electronic edition & the fridge

Over the last couple of years, I have been observing my children's continuous development of skills with growing amusement. And, as those amongst you who are parents or grandparents will agree, kids sometimes really amaze you. From the age of two, my boys have known, for instance, how to operate a fridge. As far as I can recall, neither their mother nor their father taught them how it worked, and I'm pretty sure the grandparents didn't tutor them privately either. Nevertheless, they have since been very successful in opening the door of the fridge, exploring (if not rummaging through) the contents, finding what they are craving, picking up one or two incidentally found extras on their way out, and running off with their treasures after having closed the door again. They also noticed quite early on that the light is operated automatically on opening and closing the door and that they don't need to use a switch for that. While I was witnessing one of their recent scavenger hunts, it occurred to me that the fridge was the perfect model for what we have been looking for, for almost two decades now, in the design of electronic textual editions. A fridge is an intuitively designed repository of a diverse range of foods from which anyone may quarry what they need. It offers an ideal storage space for a selection of fresh meat, fish, vegetables, fruit, and dairy products as well as for semi-prepared foods, finished dishes, and leftovers. It is also the most economical and safest option for defrosting foods.
Although there is a generally acknowledged plan by which a fridge should be filled – bottles go in the inside of the door, vegetables and herbs go in the boxes at the bottom, meat goes on the bottom shelf and dairy products go on the top shelf – the internal organisation of the foods on the shelves is decided on by whoever fills it up, and can be changed according to the insights and preferences of any user. The products can, for instance, be grouped according to food group, meal, user frequency, size and so on. The fridge can be refilled, products can be replaced by fresher ones and new products can be introduced. Another feature is that one only needs some pieces of paper and a couple of magnets or post-it notes to annotate the contents of the fridge, put up shopping lists, or leave instructions about the next meal. By the same technique the appearance of the fridge is altered on a daily basis by moving around the notes, introducing new ones, taking old ones off, embellishing the outside with various collections of fridge magnets or with your children's artistic creations. The fridge's main function is to preserve foods over a certain period of time and to offer easy access to a wide range of products from within people's homes. And fridges are available in many models with various features like freezing compartments, ice makers and water dispensers which extend the fridge's central function. Unfortunately, it must be admitted, a fridge can't cook you a meal. Electronic editions, by comparison, or at least the electronic editions we want to be making, should be intuitively designed 'repositories of information, from which skilled scholars might quarry what they need' as Peter Robinson stipulated once (Robinson, 2003b). Michael Sperberg-McQueen reminded us that 'any edition records a selection from the observable and the recoverable portions' of an 'infinite set of facts to the work being edited.' 
(Sperberg-McQueen, 2002) He mentions the apparatus of variants, glosses for some words, historical or literary annotations an[...]

Editorial - LLC. The Journal of Digital Scholarship in the Humanities 27/1


It has been an exciting year for the Digital Humanities in general and for LLC in particular. Not only has the Journal managed to raise its subscriptions by 25%, submissions have gone up by an amazing 300% compared to the previous year. In 2011, LLC received 140 submissions, 80% of which got a decision within 3 months. The accepted papers were published on Advance Access within 6 weeks after the final decision, and all four issues in 2011 appeared on time or even ahead of time. The issues were packed with slightly more papers than the previous year, and the Journal wants to grow further and negotiate a larger page budget with its stakeholders. This is necessary because of the increase in submissions and the current acceptance rate of 55.10%.

The large number of submissions is explained by four evolutions in the Digital Humanities. First of all, the Digital Humanities are in very good health around the globe, as was recently demonstrated by the infographic 'Quantifying Digital Humanities' which Melissa Terras published. More research is being funded, which results in a higher number of theoretical and methodological papers as well as papers presenting research data and results. Publishing these papers will remain the core focus of LLC, and I invite the readership to keep on submitting their papers to the Journal. Second, because the Digital Humanities are by definition interdisciplinary, new clusters of thematic research are being formed. This is reflected in the growing number of fine proposals for thematic issues the Journal receives. Although there is still room for unsolicited copy in the 2012 volume, we're already planning thematic issues for the 2014 volume. Together with the success of the Digital Humanities worldwide, a large number of taught courses, MA and PhD programmes are being organised.
I point this out as a third evolution which influences the Journal, because the introduction of the short paper in LLC especially appeals to young scholars enrolled in or graduating from these programmes and has pushed the number of submissions up. The short paper offers young scholars the opportunity to get acquainted with the publication procedures of a peer-reviewed Journal. At the same time, the Journal profits content-wise from the submission of exciting reports on innovative and ongoing research. A fourth evolution in the Digital Humanities is the continuously growing production of publications, which are covered in the Journal by the intensive book review activity of our Reviews Editors.

The Journal also saw some changes during the last year. Huw Price has left OUP as Acting Publisher and LLC is now being looked after by Sarah Scutts. I'd like to thank Huw for all the hard work and the many fruitful discussions about the nature and the future of the Journal. I am looking forward to working with Sarah and her team. Also, Eva Gooding, who has been the Journal's Production Editor for many years, has handed over to Alexandra McAuley. I wish to extend my thanks to Eva for managing the production of the Journal so well, and I have no doubt that the Journal is in good hands with Alexandra.

At the end of 2011 we said goodbye to Stéfan Sinclair and Femke Vandevelde, who have both been instrumental in the transition of the Journal between Editors. Stéfan Sinclair was appointed Associate Editor of LLC in 2005 and has served the Journal for seven years. Unfortunately, his new appointment as Associate Professor of Digital Humanities at McGill University, on which I want to congratulate him, is incompatible with the work on the Journal. Stéfan will continue his involvement with the Journal as a member of the Editorial Board, to which I welcome him.
Femke Vandevelde has been with the Journal as Reviews Editor for only one year, but together with Ron Van den Branden, she has added a new dynamic to this ever-expanding task. Femke has left her position at the Royal Ac[...]

So You Think You Can Edit? The Masterchef Edition


This is the text of my keynote address at the 2011 Annual Conference and Members' Meeting of the TEI Consortium (Wednesday, 12 October 2011) in the Toscana Saal of the prestigious Würzburg Rezidens. A revised article version of the text will be published in the Journal of the Text Encoding Initiative. Both this blogpost and the journal article come without the chocolates.

The slides of this keynote have been published on Slideshare. Read Toma Tasovac's comments on So meta.

1. Introduction

1.1. Apologies

First, let me say how honoured I am to have been asked to be the opening keynote speaker at this 2011 Annual Conference and Members' Meeting of the TEI Consortium. I have to confess, however, that I was surprised at first, then I became seriously nervous, and in the end a feeling of relief came over me. I was surprised because my academic work on the topic of Philology in the Digital Age has only produced four published articles over the last five years. Moreover, of the seven scholarly editions I've published so far, only one is a digital edition, and by digital standards it was published in the dark middle ages of SGML. Even worse, both scholarly editions I'm working on at the moment will be published as books. I know that many of you have been more actively involved with the theme of this conference, which makes me quite nervous. I also realised that many of you will probably tweet comments about what I'm saying, if you haven't already done so. If you do, be sure to include my twittername @evanhoutte and the hashtag #tei2011 in your comments, so I can prolong my nervousness till after I've read it all tonight. But then I thought, what the hell. If the programme committee of this conference wants me to present the opening keynote, they must have had at least one good reason to think I'm fit for the job. Unless of course I was the only one left who hadn't declined their invitation.
I was charmed by their implied conviction that what I could tell you would be interesting enough, even if it was said from a spectator's point of view, because I feel that's the position I'm gradually moving into due to my huge involvement with the administrative side of my job. Whatever their motive was, the fact is that you're stuck with me for the following hour or so. So sit back and let me entertain you.

1.2. Me

For those who don't know me yet, I'm the director of research and publications of the Royal Academy of Dutch Language and Literature in Belgium, or should I say Flanders. The length of my job title is inversely proportional to its importance. Gone are the days when the Royal Academy, which celebrates its 125th anniversary this year, was of any political importance. The Royal Academy of Dutch Language and Literature was founded in 1886 with the explicit task of designing a uniform spelling for Dutch and of promoting and facilitating the use of Dutch as a language for literature, science, scholarship and higher education. Since all of these initial goals were reached ages ago, and since we have language laws in Belgium protecting the status of all three of the country's official languages, the Royal Academy lost its influence and prestige and only recently discovered new opportunities to act as a moderate player in the cultural and academic field in Flanders. As part of the action to set new challenges for the Academy, I was asked to set up the Royal Academy's research department in the year 2000 and to concentrate its activities around two main topics: the scholarly edition of important literary works and cultural documents from Flanders and the building of historical linguistic corpora. From its start, I have pushed humanities computing as the centre's methodology and TEI as the expressive language for its research. In the first two years, the Centre for Scholarly Editing and Document Studies or CT[...]

No Such Thing as a Battle of Firsts


Yet, isn’t it true that all new ideas arise out of a milieu when ripe, rather than from any one individual? (Busa, 1980, p. 84)

Reading through the immense literature available about the early history of modern computing, it is tempting to consider the historiography of modern computing as a battle of firsts. Often, the chronicles of technical evolutions are used to put a forgotten pioneer in the limelight or to contradict or corroborate someone’s claims of originality and invention. It is indeed true that each design or machine, whether it is an analog or a digital, a mechanical or an electronic computer, can always claim to be the first of its kind or the first at something which is now generally considered essential to the design of the modern computer. But since reinvention and redefinition are at the heart of the history of modern computing, real firsts hardly exist. This doesn’t mean that the alleged firsts can’t be truly influential pioneers, or as Maurice Wilkes put it: ‘People who change the world stand facing two ways in history. Half their work is seen as the culmination of efforts of the past; the other half sets a new direction for the future.’ (Wilkes, 1995, p. 21) Additionally, history shows that the same technology can develop independently in different places at the same time. This is true for ideas and hardware, as well as for applications. Charles Babbage’s (1791-1871) revolutionary idea to use punched cards both for numerical storage and control, for instance, is generally acknowledged as an essential step in the evolution of stored-program computers, and Babbage is fully credited for it. However, Babbage adopted his idea from the silk-weaving industry, where punched-card programmed weaving is attributed to Joseph Marie Jacquard (1752-1834) around 1801. (Menabrea, 1961 [1842], p. 232; Lovelace, 1961 [1843], p. 264-265) But Jacquard had predecessors in France as early as the 1720s.
Research by the Austrian computer pioneer Adolf Adam (1918-2004) has shown that programmed weaving was also invented in Austria between 1680 and 1690. (Zemanek, 1980, p. 589) In the nineteenth century, Herman Hollerith (1860-1929) rediscovered, patented, and popularized punched cards as data storage media when he built his tabulators for the US Census of 1890. Babbage’s original double use of the perforated media for both data storage and operation control, however, was rediscovered by Howard H. Aiken (1900-1973) for application in his Automatic Sequence Controlled Calculator (ASCC), later to be referred to as the Harvard Mark I – not to be confused with the Manchester Mark I. Programs were read from paper tape, and data was read from IBM punched cards. The ASCC was built by IBM (International Business Machines), completed in 1943, and publicly announced and demonstrated in 1944. The wide news coverage marks that event as the beginning of the computer age for some historians of technology and computing. (Ceruzzi, 1983, p. 43) But in Germany, Konrad Zuse (1910-1995) had managed to build a working automatic and programmable calculating machine independently of any British or American project by December 1941. Isolated by the political reality of that time, Zuse had conceived and built his machine – later to be called the Z3 – on his own and with his private money. Data was input directly via a numerical keyboard, and the calculator could run programs from perforated 35mm movie film. But the direct and demonstrable influence in the use of perforated media from Babbage over Hollerith to Aiken is absent in the work of Zuse, who always claimed that he had no knowledge of Babbage or of his work at that time (Zuse, 1980, p. 611).

For other historians, the invention and description of the stored-program principle marks the beginning of the history of modern computing. But that beginning can’t be def[...]

Paul Otlet (1868-1944) and Vannevar Bush (1890-1974)


If Vannevar Bush’s visions of associative trails in the Memex and a World Wide Web-like infrastructure of interconnected Memexes look advanced, visionary, and revolutionary for their time, consider the following passage from Paul Otlet’s Monde: essai d’universalisme, published in 1935:

Man would no longer need documentation if he were assimilated to a being become omniscient, in the manner of God himself. At a less ultimate degree, an instrumentation acting at a distance would be created, combining at once radio, Röntgen rays, cinema, and microscopic photography. All the things of the universe, and all those of man, would be recorded at a distance as they occurred. Thus would be established the moving image of the world, its memory, its true double. From a distance, anyone could read the passage which, enlarged and limited to the desired subject, would be projected onto an individual screen. Thus, anyone in his armchair could contemplate creation, in its entirety or in certain of its parts. (Otlet, 1935, p. 390-391)

Unlike Vannevar Bush, who with his Memex proposal wanted to tackle the quantitative problem of information overload in the exact sciences, Paul Otlet (1868-1944), a Belgian lawyer, bibliographer, and ‘utopian’ internationalist, proposed solutions for the qualitative problem of information overload in the sociological sciences. To that end he proposed ‘the creation of a kind of artificial brain by means of cards containing actual information or simply notes or references’ (Otlet, 1990a [1891], p. 16) on the social sciences, with the information broken down into four categories: facts, interpretation of facts, statistics, and sources (Otlet, 1990a [1891], p. 15):

The ideal [...] would be to strip each article or each chapter in a book of whatever is a matter of fine language or repetition or padding and to collect separately on cards whatever is new and adds to knowledge.
These cards, minutely subdivided, each one annotated as to the genus and species of the information they contain, because they are separate could then be accurately placed in a general alphabetical catalogue updated each day. (Otlet, 1990a [1891], p. 17)

The choice of separate cards allowed for indefinite addition, continuous interfiling, repetitive manipulation and classification, and direct utilisation. This he called the Monographic Principle. A hierarchical arrangement of the scientific nomenclature could then be the basis for the production of a catalogue of cards, establishing practical links between the catalogue, its contents, and the referred publications. To this end he developed the Universal Decimal Classification (UDC), which is still in use in many academic libraries in Europe.

The monographic principle and the decimal classification system allowed Paul Otlet to manage a vast amount of data and run a knowledge information centre in the Palais du Monde or Mundaneum. His collection grew at an exponential pace from 1895 onwards and by 1934 consisted of over 15 million entries, which could be consulted on the premises and by mail. This research service was kept in operation till the early 1970s. It was a manual enterprise comprising the following steps, described by Rayward (1991):

- Translation of subject requests into UDC numerical search terms
- Manual searching and retrieval of relevant monographic entries
- Removing the entries from the files
- Copying the entries by hand or using a typewriter
- Refiling the entries
- Mailing the duplicated results of the enquiry to the enquirer

No wonder Otlet was constantly on the look-out for the mechanization or automation of the several steps involved in this procedure. Data duplication and data retrieval and access, especially, were at the heart of his interest in the applica[...]

The Mind Tool


With The Mind Tool. Computers and their Impact on Society, Neill Graham wrote a classic textbook ‘on both the promise and the threat of the computer’ (Graham, 1976, p. xi), illustrated with graphs and black-and-white photographs. The book offered suggestions for further reading and review questions after each chapter, included a hands-on section on BASIC, and came with an instructor’s manual (Daughenbaugh, 1986) and, from its fifth edition (1989) onwards, a test bank of multiple-choice questions. The book is conceived in two parts, the first of which takes up the nature of the computer and introduces the student to what the computer is, how it works, how it is programmed, and what it can do. The second part discusses the applications of computers in many areas of society such as medicine, politics, transportation, business and finance, crime, employment, information access, and the fine arts. This last chapter addresses the visual arts, computer music, and computers and literature, but devotes only just over half a page to literary concordances, stylistic studies, and readability research.

Grace Hopper’s and Steven Mandell’s Understanding Computers (1984) was also conceived from an explicitly didactic point of view, with elaborate supplementary educational material gathered in an accompanying study guide for the student, a complete instructor resource package with transparency masters reducing administrative effort for the lecturer, and a test bank of nearly 1,000 multiple-choice questions. Additional media included a student-oriented audio cassette and interactive microcomputer software packages for laboratory sessions. The book itself covers most of the topics of Graham’s book, but with more graphs and full-colour pictures. It also includes an introduction to BASIC, and each chapter concludes with revision and discussion questions.
Although there is a chapter devoted to computers in science, medicine, and research and design, no mention is made of any humanities research. The section on computers in the arts, moreover, focuses only on supporting capabilities in stage lighting, dance notation, word processing, and poetry emulation.

By 1987 these books had become, respectively, the fourth and third most frequently used textbooks in the teaching of computing to humanities students (Rudman, 1987a). The champions were Computer Methods for Literary Research by Robert Oakman (1980) and A Guide to Computer Applications in the Humanities by Susan Hockey (1980).

Now, more than ever, I experience the truth in Graham’s analysis that:

Few products of technology are so important to the public and yet so poorly understood by them as is the computer. When people think of computers they are apt to think of the dire warnings they have heard concerning the dangers of computerized data banks, or perhaps of someone who received an erroneous computerized bill and had trouble getting it corrected. Relatively few people have any information as to what computers actually do or how they can be used for the benefit of humanity. (Graham, 1976, p. xi)

I called this blog The Mind Tool for two reasons. First, it wants to provide information as to what computers can do for the benefit of the humanities in general and the discipline of textual editing in particular. Second, it serves as a mind tool to me, freeing my desk from scribbled-on loose pieces of paper, post-it notes, notebooks, napkins, and beer mats. As such I hope to entertain both the technology-sceptics of traditional academe and the growing community of humanities computing professionals[...]