Successful decoding of encrypted messages by machines during World War II led some scientists, most notably Warren Weaver, to view the translation process as essentially analogous to decoding (Hutchins, "The development and use of machine translation systems and computer-based translation tools").
The concept of machine translation in the modern age can be traced back to the 1940s. Warren Weaver, Director of the Natural Sciences Division of the Rockefeller Foundation, wrote to his friend Norbert Wiener on 4 March 1947, shortly after the first computers and computer programs had been produced: Recognising fully, even though necessarily vaguely, the semantic difficulties because of multiple meanings, etc.
Even if it would translate only scientific material where the semantic difficulties are very notably less, and even if it did produce an inelegant but intelligible result, it would seem to me worth while. Also knowing nothing official about, but having guessed and inferred considerable about, powerful new mechanized methods in cryptography - methods which I believe succeed even when one does not know what language has been coded - one naturally wonders if the problem of translation could conceivably be treated as a problem in cryptography.
When I look at an article in Russian, I say "This is really written in English, but it has been coded in some strange symbols.
I will now proceed to decode". Have you ever thought about this? As a linguist and expert on computers, do you think it is worth thinking about? (Cited in Hutchins.) Weaver was possibly chastened by Wiener's pessimistic reply: "I frankly am afraid the boundaries of words in different languages are too vague and the emotional and international connotations are too extensive to make any quasi-mechanical translation scheme very hopeful."
But Weaver remained undeterred and composed his famous Memorandum, titled simply "Translation", which he sent to some 30 noteworthy minds of the time.
It posited in more detail the need for and possibility of MT. Thus began the first era of MT research.
The first generation (henceforth 1G) of MT systems worked on the principle of direct transfer; that is to say, the route taken from the source language text to its target language equivalent was a short one, consisting essentially of two processes: dictionary lookup and local re-ordering. A direct system would comprise a bilingual dictionary containing potential replacements, or target language equivalents, for each word in the source language.
A restriction of such MT systems was therefore that they were unidirectional and, unlike the systems that followed, could not accommodate many languages. Rules for choosing correct replacements were incorporated but functioned only at a basic level; although there was some initial morphological analysis prior to dictionary lookup, subsequent local re-ordering and final generation of the target text, there was no scope for syntactic analysis, let alone semantic analysis.
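The direct-transfer pipeline described above can be sketched in a few lines. This is a toy illustration only: the dictionary entries and the single re-ordering rule are invented here and do not reflect any specific historical system.

```python
# Toy 1G "direct transfer": word-by-word dictionary lookup followed by
# one basic local re-ordering rule (English adjective-noun -> French
# noun-adjective). All data below is invented for illustration.

BILINGUAL_DICT = {  # source word -> target language equivalent
    "the": "le",
    "red": "rouge",
    "house": "maison",
}

ADJECTIVES = {"red"}
NOUNS = {"house"}


def direct_translate(sentence):
    words = sentence.lower().split()
    # Process 1: dictionary lookup, each word replaced independently.
    replaced = [BILINGUAL_DICT.get(w, w) for w in words]
    # Process 2: local re-ordering, applied to adjacent word pairs only.
    out, i = [], 0
    while i < len(replaced):
        if i + 1 < len(words) and words[i] in ADJECTIVES and words[i + 1] in NOUNS:
            out.extend([replaced[i + 1], replaced[i]])  # swap the pair
            i += 2
        else:
            out.append(replaced[i])
            i += 1
    return " ".join(out)


print(direct_translate("the red house"))  # -> "le maison rouge"
```

Note that even this tiny example already mistranslates: "maison" is feminine, so "le" is the wrong article, but a word-for-word system has no syntactic or semantic analysis with which to notice.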
Inevitably this often led to poor-quality output, which certainly contributed to the severe criticism of MT in the Automatic Language Processing Advisory Committee (ALPAC) report of 1966, which stated that it saw little use for MT in the foreseeable future. We can say that both technical constraints and the lack of a linguistic basis hampered MT systems.
The system developed at Georgetown University, Washington DC, and first demonstrated at IBM in New York in 1954, had no clear separation of translation knowledge and processing algorithms, making modification of the system difficult.
In the period following the ALPAC report the need was increasingly felt for an approach to MT system design which would avoid many of the pitfalls of 1G systems. By this time opinion had shifted towards the view that linguistic developments should influence system design and development.
Indeed, it can be said that the second generation (2G) of "indirect" systems owed much to linguistic theories of the time. Modularity is an important design feature of 2G systems: in contrast to 1G systems, which operate on a 'brute force' principle in which translation takes place in one step, the steps involved in analysis of the source text and generation of the target text ideally constitute distinct processes.
We will look first of all at interlingual systems, or rather those claiming to adopt an interlingual approach. Although Warren Weaver had put forward the idea of an intermediary "universal" language as a possible route to machine translation in his letter to Norbert Wiener, linguistics was unable to offer any models to apply until the 1960s.
By virtue of its introduction of the concept of "deep structure", Noam Chomsky's theory of transformational generative grammar appeared to offer a route towards "universal" semantic representations and thus appeared to provide a model for the structure of a so-called interlingua.
An interlingua is not a natural language; rather, it can be seen as a meaning representation which is independent of both the source and the target language of translation.
An interlingua system maps from a language's surface structure to the interlingua and vice versa. A truly interlingual approach to system design has obvious advantages, the most important of which is economy, since an interlingual representation can be applied for any language pair and facilitates addition of other language pairs without major additions to the system.
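The economy argument can be made concrete with a toy pipeline. The predicate-style meaning representation and the function names below are hypothetical, chosen purely for illustration; no real system's formalism is implied.

```python
# Toy interlingua architecture: one analyser per source language maps the
# surface text to a language-independent meaning; one generator per target
# language maps that meaning back out. The representation here is invented.

def analyse_en(sentence):
    """English analysis -> interlingual representation (toy coverage)."""
    if sentence == "the dog sleeps":
        return ("SLEEP", ("DOG", "definite"))
    raise ValueError("outside toy coverage")


def generate_fr(meaning):
    """Interlingua -> French (toy coverage)."""
    pred, (arg, _det) = meaning
    if pred == "SLEEP" and arg == "DOG":
        return "le chien dort"
    raise ValueError("outside toy coverage")


def generate_de(meaning):
    """Interlingua -> German (toy coverage)."""
    pred, (arg, _det) = meaning
    if pred == "SLEEP" and arg == "DOG":
        return "der Hund schläft"
    raise ValueError("outside toy coverage")


# One analysis serves every target generator: adding a new target language
# means writing one new generator, with no change to existing modules.
meaning = analyse_en("the dog sleeps")
print(generate_fr(meaning))
print(generate_de(meaning))
```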
The next section looks at "transfer" systems. In a transfer model the intermediate representation is language-dependent, there being a bilingual transfer module whose function is to mediate between the source language and target language intermediate representations.
Thus we cannot say that the transfer module is language-independent. The nature of these transfer modules has obvious ramifications for system design: adding another language to a system necessitates not only modules for analysis and synthesis but also additional transfer modules, whose number is dictated by the number of languages already in the system and which grows quadratically with the number of languages required.
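The arithmetic behind this contrast is simple: with n languages, a transfer architecture needs one directional transfer module for every ordered pair of languages, i.e. n(n-1) modules, whereas an interlingua architecture needs only one analysis and one generation module per language, i.e. 2n. The helper names below are mine.

```python
def transfer_modules(n):
    """Directional transfer modules for n languages: one per ordered pair."""
    return n * (n - 1)


def interlingua_modules(n):
    """Analysis plus generation modules for n languages via an interlingua."""
    return 2 * n


for n in (2, 4, 8):
    print(f"{n} languages: transfer needs {transfer_modules(n)}, "
          f"interlingua needs {interlingua_modules(n)}")
```

At two languages the transfer design is actually cheaper (2 modules against 4), but the quadratic term soon dominates: eight languages require 56 transfer modules against only 16 interlingua modules, and adding a ninth language to the transfer system would require 16 further transfer modules on its own.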
An important advance in 2G systems when compared to 1G was the separation of algorithms (software) from linguistic data (lingware). In a system such as the Georgetown model, the program mixed language modelling, translation and the processing thereof in one program.
This meant that the program was monolithic, and it was easy to introduce errors when trying to rectify an existing shortcoming.

Aims
It is the aim of this module to explore some of the aspects and challenges in Human Language Technologies (HLT) that are of relevance to Computer Assisted Language Learning (CALL).