Forms of Equivalence: Bertillonnage and the History of Information Management

2018 
Late in the nineteenth century, the French civil servant and anthropologist Alphonse Bertillon developed a system of criminal identification that sought to classify human beings on individual standardized cards, each containing a consistent set of biometric measurements and observations. This process, which came to be known as "Bertillonnage," disassembled the visual form of the human body into small pieces of data that police forces used to individuate, and thus identify, single human beings within populations of millions. In this paper, we investigate Bertillonnage as a system that exemplified the most sophisticated approaches to organizing and retrieving data at the turn of the twentieth century. We also demonstrate that the techniques it implemented, which turned on a purely functional equivalence between the operations of information systems and those of the human mind, made thinkable a number of subsequent practices well known to the history of information management. We argue that the physical infrastructure of Bertillonnage served as a set of grubby material practices that exercised a form of technological inertia over later information architectures. Without suggesting a direct, causal relationship, we note that certain of the imperatives and strategies that governed the history of modern digital computing, which scholars have long asserted grew out of the nineteenth-century culture of information, also structured core features of Bertillonnage.