ADAPT Machine Translation Research Features in Slator
Focused research has highlighted that ChatGPT is capable of enhancing “adaptive MT”, the real-time improvement of new translations based on user feedback. Yasmin Moslem, an ADAPT Machine Translation researcher at DCU, recently co-authored a paper with colleagues Rejwanul Haque and Professor Andy Way that was featured in the leading publication Slator: Language, Industry, and Intelligence. The paper, titled Adaptive Machine Translation with Large Language Models, is the result of several months of experiments exploring a less common approach to adaptive MT: learning from similar translations (known as ‘fuzzy matches’) found in approved translation memories.
Speaking about the research, Moslem wrote: “Instead of asking the model to translate a sentence or providing random examples, it turns out that showing the model 1-10 domain-specific translation pairs similar to the sentence to be translated can improve the translation of the new sentence immediately.” The method seems to be especially useful for high-resource languages.
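The idea described above can be illustrated with a minimal sketch: retrieve the translation-memory pairs most similar to the new source sentence and place them in the prompt ahead of it, so the model completes the final translation in context. The translation-memory data, the `build_prompt` helper, and the use of `difflib.SequenceMatcher` as a stand-in for a real fuzzy-matching engine are all illustrative assumptions, not details from the paper.

```python
from difflib import SequenceMatcher

def fuzzy_matches(tm, source, k=3):
    """Rank translation-memory (source, target) pairs by surface
    similarity to `source` and keep the k closest matches.
    SequenceMatcher is a simple stand-in for a real fuzzy-match engine."""
    scored = sorted(
        tm,
        key=lambda pair: SequenceMatcher(None, pair[0], source).ratio(),
        reverse=True,
    )
    return scored[:k]

def build_prompt(tm, source, src_lang="English", tgt_lang="Spanish", k=3):
    """Assemble a few-shot prompt: k fuzzy matches as in-context
    examples, followed by the new sentence left for the model to finish."""
    blocks = [f"{src_lang}: {s}\n{tgt_lang}: {t}"
              for s, t in fuzzy_matches(tm, source, k)]
    blocks.append(f"{src_lang}: {source}\n{tgt_lang}:")
    return "\n\n".join(blocks)

# Toy translation memory (hypothetical example data)
tm = [
    ("The printer is offline.", "La impresora está desconectada."),
    ("Restart the device.", "Reinicie el dispositivo."),
    ("The scanner is offline.", "El escáner está desconectado."),
]
prompt = build_prompt(tm, "The router is offline.", k=2)
print(prompt)
```

The prompt ends with the untranslated sentence, so a large language model asked to continue the text produces the target-language translation, informed by the domain-specific pairs shown just before it.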
ChatGPT is a chatbot developed by OpenAI, built on top of its GPT-3.5 family of large language models and fine-tuned using both supervised and reinforcement learning techniques.
The paper was also featured on the Global NLP Lab YouTube channel, which provides an engaging analysis of the work. Watch the full video here.
For more information, click here.