Automating Linguistics offers an in-depth study of the history of the mathematisation and automation of the sciences of language. In the wake of the first mathematisation of the 1930s, two waves followed: machine translation in the 1950s and the development of computational linguistics and natural language processing in the 1960s. These waves proved pivotal for the work on large computerised corpora in the 1990s, made possible by the unprecedented technological development of computers and software.
Early machine translation was devised as a war technology originating in the sciences of war, amid an amalgam of mathematics, physics, logic, the neurosciences, acoustics, and emerging sciences such as cybernetics and information theory. Machine translation was intended to provide mass translations for strategic purposes during the Cold War. Linguistics, by contrast, did not belong to the sciences of war and played a minor role in the pioneering projects of machine translation.
Comparing the two trends, the present book reveals how the sciences of language gradually integrated the technologies of computing and software, resulting in a second-wave mathematisation of the study of language, which may be called mathematisation-automation. This integration took on various shapes contingent upon cultural and linguistic traditions (the USA, the former USSR, Great Britain and France). By contrast, working with large corpora in the 1990s, though enabled by the unprecedented development of computing and software, was primarily a continuation of traditional approaches in the sciences of language, such as the study of spoken and written texts, lexicography, and statistical studies of vocabulary.
Chapter 3. The war effort, the technologisation of linguistics and the emergence of applied linguistics
Chapter 4. The computational turn and formalisation in Neo-Bloomfieldian distributionalism
Chapter 5. Information theory: the transfer of terms, concepts and methods
Chapter 6. From MT to computational linguistics and natural language processing
Chapter 7. Machine translation of semantics and lexicon
Chapter 8. The French linguistic tradition and external reception of the computational mathematisation of language
Chapter 9. Automatic documentation and automatic discourse analysis. Specificity of Harris’s reception in France
Chapter 10. The empiricist turn of automation-mathematisation
Chapter 11. General Conclusion
Jacqueline Léon is a senior researcher emeritus at the Centre National de la Recherche Scientifique (CNRS) in France. After several years working on natural language processing for discourse analysis, her research turned to conversation analysis and the history of dialogue theories. Since 1992, she has been working at the Laboratoire d’Histoire des Théories Linguistiques (CNRS, Université de Paris) on the history and epistemology of contemporary language sciences.
This unique volume will appeal to academic and professional researchers, historians, translators, and students in the field of linguistics.