TranScribe

English to Hindi Language Translation

Overview

Our goal is to translate English to Hindi using four different models. We evaluate each model by its BLEU score and compare the scores to identify the most proficient of the four for English-to-Hindi translation.
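
Each model's Hindi output is scored against reference translations with BLEU. A minimal sketch of that scoring step, assuming the sacrebleu package (the sentences below are illustrative placeholders, not the project's data):

```python
# Minimal BLEU-scoring sketch. Assumes: pip install sacrebleu
# The hypothesis/reference sentences are illustrative placeholders.
import sacrebleu

hypotheses = ["यह एक परीक्षण वाक्य है।"]    # model outputs, one string per sentence
references = [["यह एक परीक्षण वाक्य है।"]]  # one reference stream, parallel to hypotheses

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```

The same scoring is applied to the output of each of the four models, and the resulting scores are compared directly.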

This repository contains four machine learning models for Neural Machine Translation (NMT):

  • Helsinki-MT: This is a Transformer-based NMT model from the Helsinki-NLP group, trained on the OPUS corpus collection. It supports translation between a wide variety of language pairs.
  • Meta-NLLB-200: This is a state-of-the-art NMT model from Meta AI that supports translation between 200 languages. It is based on the No Language Left Behind (NLLB) project, which aims to develop high-quality machine translation capabilities for most of the world's languages.
  • Transformer: This is the original Transformer model for NMT, which was first introduced in the paper "Attention is All You Need" by Vaswani et al. (2017).
  • Neural Machine Translation using LSTM: This is a template for building a sequence-to-sequence NMT model from scratch using LSTM networks.

Features

Helsinki-MT:

  • Supports a wide range of languages: Trained on the OPUS corpus collection, which includes parallel data for a large number of language pairs.
  • Fast and efficient: Delivers fast translation speeds and efficient resource utilization.
  • Easy to use: Provides a simple API for straightforward integration into applications (see the usage sketch after this list).
  • Open-source: Freely available for use and modification under the MIT License.
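
A minimal usage sketch with the Hugging Face transformers library, assuming the Helsinki-NLP/opus-mt-en-hi checkpoint (the checkpoint name and example sentence are illustrative, not necessarily the exact ones used in the notebooks):

```python
# Minimal sketch: English-to-Hindi translation with an OPUS-MT checkpoint.
# Assumes: pip install transformers sentencepiece
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-hi")
print(translator("How are you today?")[0]["translation_text"])
```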

Meta-NLLB-200:

  • Unprecedented language coverage: Supports translation between 200 languages.
  • State-of-the-art performance: Achieves high translation quality across various language pairs.
  • Built for diverse languages: Handles low-resource languages effectively, promoting accessibility.
  • Continuous development: Actively maintained and updated by Meta AI researchers.
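
A minimal usage sketch, assuming the distilled 600M-parameter NLLB-200 checkpoint on the Hugging Face Hub; NLLB uses FLORES-200 language codes, so source and target languages are passed explicitly:

```python
# Minimal sketch: English-to-Hindi translation with NLLB-200 (distilled 600M).
# Assumes: pip install transformers sentencepiece
from transformers import pipeline

translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",  # FLORES-200 code for English (Latin script)
    tgt_lang="hin_Deva",  # FLORES-200 code for Hindi (Devanagari script)
)
print(translator("How are you today?")[0]["translation_text"])
```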

Transformer:

  • Pioneering architecture: Introduced the novel Transformer architecture, revolutionizing NMT.
  • Highly flexible: Can be adapted to various language tasks beyond translation.
  • Extensive research base: Supported by a vast body of research and development efforts.
  • Easy to interpret: Allows visualization and analysis of the attention mechanism.
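
The interpretability comes from the attention weights themselves. A minimal NumPy sketch of scaled dot-product attention, the core operation defined in Vaswani et al. (2017) (shapes and inputs are toy placeholders):

```python
# Minimal NumPy sketch of scaled dot-product attention:
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights                     # output, attention map

# Toy example: 3 query positions, 4 key/value positions, dimension 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
output, attn = scaled_dot_product_attention(Q, K, V)
print(output.shape, attn.shape)  # (3, 8) (3, 4)
```

The returned attention map is what gets plotted (e.g., with Matplotlib) to see which source tokens each target position attends to.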

Neural Machine Translation using LSTM:

  • Tailored to your needs: Build a model optimized for specific languages or domains.
  • Full control over training: Fine-tune hyperparameters and experiment with different architectures.
  • Open for exploration: Investigate and develop new NMT techniques and approaches.
  • Empowers creativity: Allows researchers and developers to push the boundaries of NMT.
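
A minimal Keras skeleton for such an encoder-decoder, matching the TensorFlow 2.x requirement below (vocabulary sizes and dimensions are illustrative placeholders):

```python
# Minimal sketch of an LSTM encoder-decoder for NMT (trained with teacher
# forcing). Vocabulary sizes and dimensions are illustrative placeholders.
from tensorflow.keras import layers, Model

src_vocab, tgt_vocab, embed_dim, units = 8000, 8000, 256, 512

# Encoder: embed source tokens, keep only the final LSTM states.
enc_in = layers.Input(shape=(None,), name="source_tokens")
enc_emb = layers.Embedding(src_vocab, embed_dim)(enc_in)
_, state_h, state_c = layers.LSTM(units, return_state=True)(enc_emb)

# Decoder: initialized with the encoder states, predicts the next target token.
dec_in = layers.Input(shape=(None,), name="target_tokens")
dec_emb = layers.Embedding(tgt_vocab, embed_dim)(dec_in)
dec_out, _, _ = layers.LSTM(units, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c]
)
probs = layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```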

Requirements

  • Python 3.6+
  • TensorFlow 2.x
  • PyTorch (optional)
  • NumPy
  • Matplotlib (for visualization, optional)

References

Helsinki-MT: Tiedemann, J. & Thottingal, S. (2020). "OPUS-MT - Building open translation services for the World." Proceedings of EAMT 2020.

Meta-NLLB-200: NLLB Team et al. (2022). "No Language Left Behind: Scaling Human-Centered Machine Translation." arXiv:2207.04672.

Transformer: Vaswani, A. et al. (2017). "Attention Is All You Need." NeurIPS 2017. arXiv:1706.03762.

Neural Machine Translation using LSTM: Sutskever, I., Vinyals, O. & Le, Q. V. (2014). "Sequence to Sequence Learning with Neural Networks." NeurIPS 2014. arXiv:1409.3215.

Contributors

Acknowledgement

We express our sincere gratitude to Animesh Sir for his invaluable contributions to this work. His insightful comments, corrections, and inspiration have greatly enriched our understanding and improved the quality of our research.