Machine Translation in the Age of LLMs

Machine translation (MT) is the automated conversion of text from one language to another. It is a subfield of computational linguistics (CL) and natural language processing (NLP). In recent years, MT has undergone a significant shift: statistical MT, which dominated research for decades, has given way to neural machine translation (NMT), which relies on a single end-to-end neural network.

The inception of MT dates to 1947, when Warren Weaver introduced the concept, shortly after the development of the first general-purpose electronic computer, the Electronic Numerical Integrator and Computer (ENIAC), in the preceding year. Methodologically, MT approaches fall into two broad families: rule-based methods and corpus-based methods. Rule-based machine translation (RBMT) techniques have traditionally been used to translate text from one language into another using bilingual dictionaries and manually crafted rules. However, manual rule crafting is laborious, and the rules are difficult to maintain or transfer across domains and languages. Scalability to open-domain and multilingual translation therefore remains elusive for rule-based systems.

MT researchers nevertheless persevered in improving translation quality. The 1978 launch of SYSTRAN's commercial rule-based translation system and, later, the advent of large bilingual corpora ushered in corpus-based methods, including example-based machine translation (EBMT), statistical machine translation (SMT), and neural machine translation (NMT).

EBMT retrieves similar sentence pairs from bilingual corpora at translation time, yielding high-quality results when suitable pairs are found. SMT, introduced by Brown et al. in 1990, learns translation knowledge automatically from large amounts of data, diverging from rule-based approaches. SMT's adoption was initially limited by its complexity and by the dominance of RBMT. However, word-alignment toolkits (GIZA and GIZA++) and phrase-based SMT methods revolutionized the field, eventually leading to open-source systems such as Moses.

NMT, emerging in 2014, maps source language into a dense semantic representation, employing attention mechanisms for translation. It gained rapid online deployment, outpacing SMT in adoption. Subsequent developments like convolutional sequence-to-sequence models and the Transformer further elevated translation quality.

Artificial Intelligence Techniques for MT

In recent years, significant progress has been made in neural machine translation (NMT). A standard NMT model has two main components: an encoder network and a decoder network. The encoder converts the source sentence into a real-valued vector representation, and the decoder generates the translation from it. This loosely mirrors human translation: the model first comprehends the entire source sentence and then produces the target sentence word by word.

NMT differs from earlier techniques like RBMT and SMT in that it can learn translation knowledge directly from training data without requiring manually created features and rules. This end-to-end framework has made NMT the dominant method in machine translation.

NMT Model: A typical NMT model is built on standard recurrent neural networks (RNNs) or their variants. Given a source sentence, the encoder RNN compresses it into hidden states that capture the source information; the decoder RNN then consumes these states and generates the translation word by word.
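As a sketch of this encoder-decoder pattern, the following minimal PyTorch example wires a GRU encoder to a GRU decoder; the vocabulary sizes and dimensions are illustrative assumptions, not details from the article:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses the source sentence into hidden states."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src_ids):
        # src_ids: (batch, src_len) token indices
        # Returns one hidden state per source word, plus the final state
        return self.rnn(self.embed(src_ids))

class Decoder(nn.Module):
    """Generates the target sentence one word at a time."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, prev_token, hidden):
        # prev_token: (batch, 1) -- the previously generated target word
        output, hidden = self.rnn(self.embed(prev_token), hidden)
        return self.out(output), hidden  # logits over the target vocabulary
```

At inference time, the decoder is seeded with the encoder's final hidden state and fed its own previous output until an end-of-sentence token is produced.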

RNNs tend to lose information over long sequences. To address this, researchers introduced the attention mechanism. Unlike the "hard alignment" used in traditional SMT, attention provides a "soft alignment," linking each target word to multiple source words with different weights, which significantly improved translation quality and contributed to NMT's success. Encoding the source bidirectionally, so that each position captures both past and future context, improved quality further.
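Concretely, soft alignment reduces to a weighted average: the current decoder state scores every encoder state, and a softmax turns the scores into alignment weights. A minimal dot-product version (one of several scoring functions proposed in the literature) might look like this:

```python
import torch
import torch.nn.functional as F

def attend(decoder_state, encoder_outputs):
    """Soft alignment: weight every source position, then average.

    decoder_state:   (batch, hid_dim)          current decoder hidden state
    encoder_outputs: (batch, src_len, hid_dim) one state per source word
    """
    # Dot-product score between the decoder state and each source state
    scores = torch.bmm(encoder_outputs, decoder_state.unsqueeze(2))
    weights = F.softmax(scores, dim=1)  # (batch, src_len, 1), sums to 1
    # Context vector: the attention-weighted sum of source states
    context = (weights * encoder_outputs).sum(dim=1)  # (batch, hid_dim)
    return context, weights.squeeze(2)
```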

Multilingual Translation: Translating between different languages presents significant challenges due to variations in language structures and morphology. Resource-rich languages such as English and Chinese benefit from abundant training data, while resource-poor languages often lack parallel data.

To address this, methods like back-translation were introduced, in which a reverse-direction model translates target-language monolingual text back into the source language, producing synthetic parallel pairs that augment the training data. Unsupervised translation methods have also emerged, allowing translation without any parallel corpora.
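In outline, back-translation is a simple data-augmentation loop: the reverse model turns target-side monolingual text into synthetic source sentences, and the synthetic pairs are mixed with the real parallel data. A schematic sketch, where reverse_model is a hypothetical stand-in for any trained target-to-source system:

```python
def back_translate(monolingual_target, reverse_model, real_pairs):
    """Augment parallel data with synthetic (source, target) pairs.

    monolingual_target: sentences in the target language only
    reverse_model:      hypothetical target->source translation model
    real_pairs:         human-translated (source, target) pairs
    """
    synthetic_pairs = []
    for tgt_sentence in monolingual_target:
        # The reverse model produces a (possibly noisy) synthetic source
        synthetic_src = reverse_model.translate(tgt_sentence)
        synthetic_pairs.append((synthetic_src, tgt_sentence))
    # The forward model is then trained on real plus synthetic data
    return real_pairs + synthetic_pairs
```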

Multilingual NMT models, which employ a single model for many languages, have also gained wide acceptance. Depending on how many languages are involved on each side, they are classified as one-to-many, many-to-one, or many-to-many models.
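One widely used recipe for steering a single multilingual model is to prepend a token naming the desired target language to each source sentence, so that one set of parameters serves many directions. A sketch of that preprocessing step (the tag format is illustrative):

```python
def tag_for_target_language(source_sentence, target_lang):
    """Prefix the source with a target-language token such as '<2es>'.

    A multilingual model trained on data tagged this way learns to
    choose the output language from the tag alone.
    """
    return f"<2{target_lang}> {source_sentence}"

# One model, many translation directions, selected by the tag:
print(tag_for_target_language("How are you?", "es"))  # <2es> How are you?
print(tag_for_target_language("How are you?", "de"))  # <2de> How are you?
```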

Simultaneous Translation: Simultaneous translation (ST) strives to provide real-time translation with minimal lag between the source speech and the target output. Traditional ST systems use a cascaded pipeline of automatic speech recognition (ASR), MT, and text-to-speech (TTS).

One challenge in ST is deciding when to begin translating: starting too early can degrade translation quality, while waiting too long increases latency. Fixed policies, which follow predefined read/write schedules, and adaptive policies, which segment the source dynamically, have been proposed to manage this trade-off.
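The best-known fixed policy is wait-k: read k source words, then alternate one write per read until the source ends, and finally flush the rest of the translation. A toy sketch of the schedule, where translate_step is a hypothetical placeholder for an incremental decoder and the target length is assumed to roughly match the source length:

```python
def wait_k_translate(source_words, k, translate_step):
    """Fixed wait-k policy: read k words, then alternate write and read.

    source_words:   the incoming source stream, as a list of words
    k:              how many source words to wait for before writing
    translate_step: hypothetical function mapping (words read so far,
                    partial output) to the next target word
    """
    read, output = [], []
    for i, word in enumerate(source_words):
        read.append(word)                                # READ action
        if i + 1 >= k:                                   # initial wait is over
            output.append(translate_step(read, output))  # WRITE one word
    # Source exhausted: flush the remaining target words
    while len(output) < len(source_words):               # assumes ~1:1 length
        output.append(translate_step(read, output))
    return output
```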

In addition, research is progressing toward end-to-end ST models that integrate ASR and MT in a unified framework. These models aim to reduce error propagation and improve efficiency, although they face difficulties due to the scarcity of end-to-end training data.

Applications of MT

Applications of MT span various domains, driven by its cost-effectiveness and high translation quality.

Text Translation:

  • Webpage Translation: MT facilitates rapid access to foreign-language information online. Users can copy and paste webpage content or input URLs to view pages in their preferred language.
  • Scientific Literature Translation: Researchers, engineers, and students use MT to read scientific papers and patents in their language. Domain adaptation enhances terminological accuracy.
  • E-commerce Translation: Transnational online businesses employ MT to translate websites, product information, and manuals, facilitating global trade and improving customer service.
  • Language Learning: MT systems offer rich functionalities, including translations, high-quality dictionaries, and sentence pair examples, aiding users in vocabulary acquisition and comprehension.

Image Translation:

  • Multilingual Image Captioning: This MT subfield combines computer vision and language generation to describe image content in multiple languages, aiding language studies.
  • Optical Character Recognition Translation: The system first recognizes the characters in an image (OCR) and then translates the recognized text. This is useful for menus, street signs, and product descriptions while traveling; a minimal two-stage sketch follows this list.
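A minimal sketch of that two-stage pipeline, using the real pytesseract OCR wrapper and a hypothetical translate function standing in for any MT system or API:

```python
import pytesseract          # wrapper for the Tesseract OCR engine
from PIL import Image

def translate_image(image_path, translate, target_lang="en"):
    """Two-stage image translation: recognize characters, then translate.

    translate: hypothetical MT callable (text, target_lang) -> text
    """
    # Stage 1: optical character recognition on the input image
    recognized_text = pytesseract.image_to_string(Image.open(image_path))
    # Stage 2: machine translation of the recognized text
    return translate(recognized_text, target_lang)
```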

Speech Translation:

  • Simultaneous Translation: Recent advances in simultaneous translation (ST) have enabled real-time translation in many settings, including international conferences, online meetings, and watching videos with foreign-language content.
  • Portable Translation Devices: Voice translation devices, popular for language learning, travel, and business negotiations, leverage MT to provide seamless communication.

Additionally, MT has ventured into creative fields such as generating poems and couplets, with models producing a poem line by line, each new line conditioned on the ones before it.

GPT Models for MT

Recent advances in NLP, particularly in large-scale language modeling, have brought remarkable improvements to MT and many other NLP tasks. The emergence of large language models with diverse capabilities, including MT, has opened new opportunities for building more effective translation systems. Among these, recent Generative Pre-trained Transformer (GPT) models have garnered significant attention for their capacity to generate coherent, context-aware text.

In the evaluation by Hendy et al. (2023), the GPT model text-davinci-003 exhibits strong translation performance, particularly for high-resource languages, often rivaling state-of-the-art systems. This is especially notable given the zero-shot setting, in which the model receives no parallel examples at all. The evaluation supplements lexical metrics with human judgments to gain deeper insight into the results.
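text-davinci-003 has since been retired, but the zero-shot prompt pattern carries over directly to OpenAI's current chat completions endpoint. A minimal sketch, with an illustrative model name and a prompt format of our own choosing rather than the exact one from the paper:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def zero_shot_translate(sentence, src_lang="German", tgt_lang="English",
                        model="gpt-4o-mini"):
    """Zero-shot MT: an instruction plus the sentence, no parallel examples."""
    prompt = (f"Translate the following {src_lang} sentence into {tgt_lang}.\n"
              f"{src_lang}: {sentence}\n{tgt_lang}:")
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

print(zero_shot_translate("Maschinelle Übersetzung ist nützlich."))
```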

Challenges and Future Directions in MT

Despite significant strides in MT, substantial room for improvement remains. While certain metrics and benchmarks may suggest that machines outperform humans in translation, these metrics may not capture essential qualities such as adequacy and fluency. An adequate translation conveys the right message at the right time, emphasizing key points and omitting non-essential ones. Human interpreters excel at this, adapting speed and emphasis to context; current MT systems lack such nuanced capabilities.

To address these challenges, the following measures can be taken:

  • Develop New Evaluation Metrics: The field needs metrics that go beyond surface completeness. Metrics should reward systems for emphasizing crucial content and penalize unnecessary translation, capturing emphasis, synchronization, and comprehension; the sketch after this list illustrates the limits of today's lexical metrics.
  • Enhance Robustness: MT systems should be more robust and capable of handling slight source sentence variations without drastically altering translations. Human-like error tolerance and the ability to correct errors are essential.
  • Address Data Sparseness: Resource-poor language pairs and domains pose data sparsity challenges for NMT methods. Innovative techniques are required to improve translation quality in these scenarios.
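To make the first point concrete: standard lexical metrics such as BLEU count n-gram overlap with a reference, so an adequate paraphrase can score worse than a near-verbatim output that subtly changes the message. A small illustration with the sacrebleu package (the example sentences are our own):

```python
import sacrebleu  # pip install sacrebleu

reference = ["The minister stressed that the deadline will not move."]

# Adequate paraphrase: same message, different surface form
paraphrase = ["The minister emphasized that the deadline is fixed."]
# Near-verbatim output that subtly weakens the message
verbatim = ["The minister stressed that the deadline will not move much."]

# BLEU rewards surface overlap, not adequacy:
print(sacrebleu.corpus_bleu(paraphrase, [reference]).score)  # low
print(sacrebleu.corpus_bleu(verbatim, [reference]).score)    # high
```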

In conclusion, achieving high-quality MT demands new methods that blend symbolic rules, knowledge, and neural networks. Real-world applications continuously generate more data, fostering the rapid development of advanced MT techniques.

References and Further Reading

Hendy, A., et al. (2023). How Good Are GPT Models at Machine Translation? A Comprehensive Evaluation. arXiv. https://doi.org/10.48550/arXiv.2302.09210

Koehn, P. (2020). Neural Machine Translation. Cambridge University Press. https://doi.org/10.1017/9781108608480

Wang, H., Wu, H., He, Z., Huang, L., & Church, K. W. (2022). Progress in Machine Translation. Engineering, 18, 143-153. https://doi.org/10.1016/j.eng.2021.03.023

Tan, Z., et al. (2020). Neural Machine Translation: A Review of Methods, Resources, and Tools. AI Open, 1, 5-21. https://doi.org/10.1016/j.aiopen.2020.11.001

Last Updated: Sep 25, 2023

Written by Dr. Sampath Lonka

Dr. Sampath Lonka is a scientific writer based in Bangalore, India, with a strong academic background in Mathematics and extensive experience in content writing. He has a Ph.D. in Mathematics from the University of Hyderabad and is deeply passionate about teaching, writing, and research. Sampath enjoys teaching Mathematics, Statistics, and AI to both undergraduate and postgraduate students. What sets him apart is his unique approach to teaching Mathematics through programming, making the subject more engaging and practical for students.
